“We know where you are. We know where you’ve been. We can more or less know what you’re thinking about.” That was the former executive chairman of Google, Eric Schmidt, trying to convince users that the tech giants did care about their privacy, ironically enough. But that was in 2010.
Fast forward eight years, and a lot has changed. The world has been rattled by revelations that the personally identifiable data of about 50 million Facebook users was harvested by an analytics firm. Since then, the skeletons haven’t stopped tumbling out, with the news that the NaMo app asks for as many as 22 permissions from users, and that the official Congress app, since deleted, was vulnerable to a data breach.
Bruce Schneier, an American security technologist and fellow at Harvard University’s Berkman Klein Center for Internet & Society, in his book Data and Goliath, says: “Google knows what kind of porn each of us searches for, which old lovers we still think about, our shames, our concerns, and our secrets.”
So, what does any of this mean for us, the lay users?
It may be helpful to start by asking what this ‘data’ is. “Whenever you use any service on your phone or browser, you end up giving a lot more information than you consciously realise. This includes not just the content of your interactions, but also metadata and so on,” says Nayantara Ranganathan, manager of the Internet Democracy Project’s Freedom of Expression programme. Metadata is, simply put, data about your data. So, for example, your location information, what time you were home, how many times you made calls to a certain number, and so on.
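Ranganathan’s point can be made concrete. The sketch below uses an entirely hypothetical call log (made-up numbers, times and cell-tower names, not any real app’s format) to show how much the metadata alone reveals, without a single word of message content:

```python
# Illustrative only: a hypothetical call log containing no content,
# just metadata of the kind Ranganathan describes.
from collections import Counter

call_log = [
    {"number": "+91-98xxxx1111", "time": "22:45", "cell_tower": "Adyar"},
    {"number": "+91-98xxxx1111", "time": "23:10", "cell_tower": "Adyar"},
    {"number": "+91-98xxxx2222", "time": "09:05", "cell_tower": "Guindy"},
]

# Who you call most often falls straight out of the metadata.
most_called = Counter(c["number"] for c in call_log).most_common(1)[0][0]

# So does where you spend your nights (calls after 9 p.m.).
night_towers = {c["cell_tower"] for c in call_log if c["time"] >= "21:00"}

print(most_called)   # +91-98xxxx1111
print(night_towers)  # {'Adyar'}
```

Three records are enough to infer a closest contact and a likely home neighbourhood; a month of such records, as Ranganathan notes later, exposes far more.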
“This is known as behavioural data,” says Sunil Abraham, executive director of The Centre for Internet & Society, “which includes how fast or slow you scrolled, how long you stayed on a page, how many times you went to a particular part of a website, and so on.”
Bhajan or you?
This is not just data gathered by the large Facebook and Gmail apps, but also by a lot of the smaller ones. An app that plays bhajans, for example, may mine your data and share it. And what do the third parties do with this? Well, the idea is to simply embed you further in a consumerist panopticon.
“Surveillance capitalism is the business model of the Internet, and all social media apps make their money collecting data on users and monetising that,” says Schneier.
“Lots of apps have no revenue generation. Their only benefit is data,” says Manan Shah, founder and CEO of Avalance Global Solutions, a cyber security firm. In fact, he says, apps like WhatsApp are the obvious suspects while the smaller ones, like the bhajan one, slip under the radar.
All of it is part of ‘lead generation’, the process of identifying potential customers for a service or business. “A call-centre is useless without data,” Shah says. “If I want to sell you an antivirus, for instance, a company will identify filters: who owns a computer, who has already purchased an antivirus, and so on. I can then target that user. This filtered data is often your full name, bank details, data about your debit and credit cards.” Abraham says there is another fairly obvious purpose for all this data collection: to get you to spend as much time on the platform as possible.
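The filtering Shah describes is mechanically simple. A minimal sketch, on invented records and invented field names, might look like this:

```python
# Hypothetical lead-generation filter of the kind Shah describes:
# made-up profiles with made-up attributes, for illustration only.
leads = [
    {"name": "A", "owns_computer": True,  "has_antivirus": False},
    {"name": "B", "owns_computer": True,  "has_antivirus": True},
    {"name": "C", "owns_computer": False, "has_antivirus": False},
]

# Apply the filters: owns a computer, hasn't already bought an antivirus.
targets = [p["name"] for p in leads
           if p["owns_computer"] and not p["has_antivirus"]]

print(targets)  # ['A']
```

The value of the harvested data lies precisely in these attributes: the more of them a broker holds, the finer the filter, and the more precisely a call-centre or advertiser can target you.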
This explains why, for example, the suggested searches when you Google something are often tailored in an eerie manner. If you search for a word, the second suggestion may offer to translate it into the local language: Tamil if you’re in Chennai, Marathi if you’re in Mumbai.
This is a product of profiling your location data as well as your behavioural data. “Imagine the kind of insights your location information over the course of a month can expose: your residence, where you spend your mornings, your route to work, your loved one’s residence, and more,” says Ranganathan.
“Users are often not aware that they’ve given their consent to sharing this data,” says Nikhil Pahwa, digital rights activist. “The terms and conditions of every app are so complicated and voluminous that often you have no way of knowing what something is being used for and what you’ve given your permission to. That’s a failure of the kind of consent we have today,” he says. If an app developer, quips Pahwa, puts in a condition saying the user will name their first child after the app, the user is more than likely to click on ‘I agree’.
While the failure to make consent transparent is illegal, data collection in itself is a grey area. And what constitutes ‘misuse’ of data is murky because of the lack of regulations and clear guidelines. “What if a salon has your phone number and sends an SMS saying your haircut is due,” asks S. Anand, CEO of data science firm Gramener. “Would you consider that misuse?”
It gets more ominous.
We’ll use it some day
At present, India has no law to stop apps from sharing your data with data brokers or data analytics firms. “The tendency has been to collect as much data as you can, even if it isn’t relevant to your business today, because it might be some day or, better still, it might be valuable to others,” says Amba Kak, a Mozilla technology policy fellow. “This is why we need a law to say — collect what you need, not what you want.”
If, as an Indian citizen, your data is breached, misused or sold today, there is little you can do about it. “At most, users can be more vigilant about the apps they download, what permissions they give, and evaluate whether there are better alternatives,” says Ranganathan.
“One can approach a court and seek redress under the IT Act,” says Abraham, “but only if you have suffered a loss of property or money. If your data has been breached or leaked, and you haven’t suffered a monetary or property loss, there’s nothing you can do.”
The Justice Srikrishna committee, set up in July, is right now working on a draft data protection bill. The committee published a white paper last November, and a final report is expected by the end of May. “The white paper itself looks fantastic,” Abraham tells me.
An ideal data protection law, says Kak, “will reflect the Supreme Court’s recent decision that all interference with the right to privacy must be necessary and proportionate.”
If data sharing is inevitable in the digital age, then it could be made illegal, for instance, to share data that can identify individuals. Anand says, “This could be done by replacing all names with a new random name or by aggregating total purchases by store and product rather than by individual purchase.”
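Anand’s two suggestions, replacing names with random identifiers and aggregating by store and product, can be sketched in a few lines. The records, names and columns here are all hypothetical:

```python
# Illustrative sketch of the two techniques Anand describes,
# on made-up purchase records.
import uuid

purchases = [
    {"name": "Asha", "store": "S1", "product": "soap", "amount": 40},
    {"name": "Asha", "store": "S1", "product": "soap", "amount": 40},
    {"name": "Ravi", "store": "S1", "product": "soap", "amount": 40},
]

# 1. Pseudonymisation: replace each real name with a random identifier,
#    consistently, so records stay linkable without identifying anyone.
pseudonyms = {}
for row in purchases:
    row["name"] = pseudonyms.setdefault(row["name"], uuid.uuid4().hex[:8])

# 2. Aggregation: keep only totals by store and product,
#    so no individual purchase survives at all.
totals = {}
for row in purchases:
    key = (row["store"], row["product"])
    totals[key] = totals.get(key, 0) + row["amount"]

print(totals)  # {('S1', 'soap'): 120}
```

Note the trade-off: pseudonymised data still allows per-person analysis (and so carries some re-identification risk), while aggregated data gives that up in exchange for much stronger privacy.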
So in an era where we have been casually asked to accept that ‘data is the new oil’, who is the biggest loser? “Framing ‘data’ as the new oil is dangerous,” says Ranganathan. Kak agrees: “This is a tired analogy that doesn’t seem to get us anywhere except to recognise that data is a source of profit for the private sector.” She prefers Turkish sociologist Zeynep Tufekci’s framing, where we think of data privacy like clean air or safe drinking water. “It is a public good that we need to safeguard as a collective through laws that make controllers of data accountable,” says Kak.