Shweta Bachchan is the latest famous name to turn author, landing a big interview in the Times of India even before the book is out. The book is quite likely to succeed because of who she is (I read the free excerpt and I'd probably buy the book to read on a flight). Shah Rukh Khan's daughter, meanwhile, graced the cover of Vogue though she isn't an actress yet. Again, despite the online naysayers, her identity helps sell the mag. So your identity can be quite a valuable asset, and despite grumbles about ethics and nepotism, commercial entities will put a premium on it. (Chetan Bhagat, by the way, had trouble finding a publisher for his first book; one reviewer was put off by the fact that he had enclosed a marketing plan along with the manuscript!) So there's the hard way to success and the short way, and it's quite clear that Shweta won't be changing her name any time soon.
But what if you're not lucky enough to be born to a famous parent? China has a solution for that: you can build up your social credit score by doing 'good' things like repaying loans, following traffic rules and paying your taxes, and you lose points for 'bad' things like, say, cheating in an online game. The trouble is that the government decides the rules and you probably can't opt out. Sesame Credit from Ant Financial (an Alibaba affiliate) operates on the same principles but is voluntary, and opting in can bring tangible benefits ranging from better dating options to waivers on housing deposits. 'Would I share my data for money?' is a question we are going to be asking ourselves more and more. Our identity gives us a foundation of trust which can be used in lieu of physical collateral, but in order to develop that trust we have to be open to being observed in a variety of situations.
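To make the mechanics concrete, here is a toy sketch of a points-based scoring model in Python. The events, weights and base score are entirely made up for illustration; neither China's system nor Sesame Credit publishes its actual rules, which is rather the point.

```python
# Toy points-based social score. All weights and the base score are
# hypothetical -- real systems do not disclose their scoring rules.
WEIGHTS = {
    "loan_repaid": +30,
    "taxes_paid_on_time": +15,
    "traffic_violation": -20,
    "cheated_in_online_game": -50,
}

def score(events, base=600):
    """Start from a base score and apply each observed event."""
    return base + sum(WEIGHTS.get(event, 0) for event in events)

print(score(["loan_repaid", "taxes_paid_on_time"]))            # 645
print(score(["cheated_in_online_game", "traffic_violation"]))  # 530
```

Note that whoever controls the `WEIGHTS` table controls your life outcomes, and an unrecognised event silently scores zero: the opacity is built into the design.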
I grew up in a village in Tamil Nadu where not much was private. People could see you visiting the bank a couple of times a week and draw their own inferences. Not wearing the same gold jewellery would raise the question of whether it had been converted into new pieces or just pawned. A telegram would send a delegation of concerned villagers to visit; ditto a walk to the hospital. Socially approved behaviour – going to church every day, for example – would raise your standing, as would your kids getting a high "rank" in school. Watching a movie in the theatre would drop your score a bit, as would wearing racy clothes or skipping church on a Sunday. Log kya kahenge ('what will people say') was a definite consideration. In a village there was no real way to opt out of this social score without exiting the community. The new social scoring models are built on similar principles, and while opting out may not require you to become a hermit, there may be an actual financial price for your lack of trust capital.
I keep seeing "Ethics of AI" discussed online. AI can't have ethics; only the corporations and people who build it can. If you see a process skewed towards discrimination – in the pursuit of business goals – should you speak up or assume this is business as usual? Remember, not all discrimination is deliberate. AI rules are being composed in cultures and countries other than your own, and they may not gel with the norms you consider appropriate. We must speak up.