This is the fourth report in our series of event coverage from Delhi and Bangalore on India’s Data Protection Law.

“There is no other fundamental right you can sign away contractually [than privacy]. You cannot sign yourself into slavery in any democratic country under law. So how is it that we think that by ticking a box we can sign away our right to privacy?” Anja Kovacs, director of the Internet Democracy Project, said at the #NAMAprivacy discussion on the data protection bill in Delhi last month. The Srikrishna committee’s bill makes many decisions, some celebrated and some contested, about the founding principles and goals of a data protection law.

The following are key points made at the #NAMAprivacy discussions in both Delhi and Bangalore.

Consent and notice

“Smart speakers – you enter into a contract with them because you bought the smart speaker. But the guest who comes into your house, with whom you are having a conversation, hasn’t. There is personal information being collected from conversations with smart speakers. It is not possible to provide consent in a situation like that,” said Rahul Matthan, Partner at Trilegal.

  • Consent not fundamental in the bill: To have meaningful consent, where you’re not under duress, you need an environment that enables it. We’re reducing consent to just ticking a box. Consent isn’t serving as the pillar of autonomy that the Puttaswamy judgement recognises it should be. In principle, there is consent, but the patriarch of the family — in this case the government — can step in any time to make an exception. (Kovacs)

  • Meaningfulness of consent is a work in progress: Consent, and meaningful consent especially, won’t happen overnight; it’s an evolving process globally. (Amba Kak, Mozilla)

  • Define ‘public interest’: For the state, consent almost doesn’t matter anymore. Everything is justified with ‘public interest’. Even internet shutdowns are. What do the state and the law define as public interest? (Kovacs)

  • Digital exhaust: The question is not about what I post or don’t post, it’s about how much a platform knows about me — that’s not always fully informed by what I choose to publish to the public. Where I’m going, what I’m liking, all this digital exhaust is monetisable. It can also be weaponised. (Kak)

  • Making informed digital decisions: On people giving away their data for free: every time we do training on digital security and policy, once people understand where their data is going, questions come up immediately. If we can affect behaviour like that in two days of training, where does that leave the ‘free services’ argument? Most people have no idea. We are not making informed decisions. (Kovacs)

  • Unbundle consent: Bundled consent is a problem today — one tickbox, a kind of take-it-or-leave-it consent where you are denied the service altogether if you don’t want to part with certain data. Going from one box to many boxes is an evolving process which the GDPR is encouraging. For sensitive personal data, there’s a granular definition of what specific consent means: a requirement on the data fiduciary to seek separate consent from the data principal for different categories of data and different processing actions. (Kak) A minimal sketch of this follows at the end of this list.

  • Permissions on apps and consent: Two versions ago, Google started penalizing app developers who did not unbundle their permissions. If I, as a bank app, refused to let you transact because you didn’t give me location, that would make me a moron. This is a significantly less pervasive problem today than it was a few years ago. (Sameer Nigam, Founder & CEO, PhonePe)

  • Incremental consent: This bill doesn’t talk about a framework for incremental consent, that is, asking for fresh consent before putting data you already hold to a new use. That’s a problem that needs to be addressed. (Nigam)
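
To make the unbundling and incremental-consent points concrete, here is a minimal sketch in Python. Everything in it (the ConsentRecord class, the category and purpose names) is hypothetical rather than drawn from the bill or any real consent-management library; the idea is simply that consent is recorded per data category and processing purpose, and that reusing already-collected data for a new purpose fails until fresh consent is obtained.

```python
from dataclasses import dataclass, field

@dataclass
class ConsentRecord:
    """Hypothetical per-user consent store: one flag per
    (data category, processing purpose) pair, instead of a
    single take-it-or-leave-it tickbox."""
    grants: dict = field(default_factory=dict)

    def grant(self, category: str, purpose: str) -> None:
        self.grants[(category, purpose)] = True

    def has_consent(self, category: str, purpose: str) -> bool:
        # Absence of an explicit grant means no consent, never a default opt-in.
        return self.grants.get((category, purpose), False)

def process(record: ConsentRecord, category: str, purpose: str) -> None:
    if not record.has_consent(category, purpose):
        # Incremental consent: data collected for one purpose cannot be
        # reused for a new one without asking the data principal again.
        raise PermissionError(f"no consent for {category!r} / {purpose!r}")
    print(f"processing {category} for {purpose}")

user = ConsentRecord()
user.grant("location", "fraud-detection")
process(user, "location", "fraud-detection")   # allowed: consent is on record
try:
    process(user, "location", "ad-targeting")  # new purpose: needs fresh consent
except PermissionError as err:
    print(err)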

Accountability

“The law should not serve those who come from a position of privilege. You have to account for the worst-case scenario of the most vulnerable people. These are the kind of people who are being increasingly marginalised in an organised manner,” said Vidushi Marda, Policy Advisor at ARTICLE19.

  • Pressure to comply in the bill: The law’s obligations on how data fiduciaries can collect data, and how it can be used and stored, can be very powerful. If subject to an audit, everyone must be able to demonstrate compliance. That deterrent of penalties is important. (Kak)

  • Obligations regardless of consent: The bill says consent must be meaningful; but more importantly, obligations to keep data principals’ information private still apply. It’s not because consent is meaningless, but because there is often an inequality of bargaining power between the fiduciary and the principal. In that case, accountability should apply. (Kak)

  • Bureaucratisation is necessary: The documentation requirement may seem like burdensome bureaucratisation, but this is how fiduciaries stay aware of their data flows and how they can be held accountable. (Kak)

  • ‘Reasonable expectations’: The ‘reasonable expectation’ argument for privacy is flawed, since we may one day say ‘oh, the government is spying on us, so privacy is dead’. What we do with such laws instead is to raise that aspiration — irrespective of what we want to change or not, we are trying to go for a regime change. We need hard accountability standards. (Kak)

  • Notice fatigue: Why are we worrying about notice for every form of data collection? It shouldn’t matter when a data fiduciary is not going to process the data in a way that harms a data principal. We should require notice, and the entire framework, only when data is being used to violate someone’s privacy. (Matthan)

Data processing and next-generation rights

“The bill doesn’t account for most of the emerging technologies. It doesn’t have the principle of data minimisation,” Marda said.

  • High standards for collection: Isn’t it about identifiability? Given that my address can be used to find out my income status, my caste, my phone number or other details about me, shouldn’t we have a higher standard for what data we collect in the first place? (Marda)

  • Next-gen rights like processing objection: It’s important to put in place protections for next-generation rights, like being able to ask for an explanation of decisions based on automated algorithms. (Kovacs)

  • Right to be forgotten not thought through: The right to be forgotten does nothing for technologies like blockchain and immutable ledgers, where it is impossible to delete information. This, even as the same regulators who make these laws are pushing for those very technologies. This right has not been entirely thought through in the bill. (Jyoti Panday) A toy illustration of why erasure and immutable ledgers conflict follows this list.

  • AI works even on public info: The premise on which AI (Artificial Intelligence) technology is based is that it can find intimate information from what is publicly available; that’s a feature of the technology. (Marda)

  • Inputs vs process in AI: You can’t always hold algorithms responsible, because you don’t always understand how they work. That is why it is important to understand what data was used to build the model that took a given decision. (Marda)

  • Right to explanation of automated processing: The bill doesn’t have a right to explanation; it would have given data principals who have been subjected to a decision based on automated processing of their data the right to know the logic behind that particular decision. By ignoring this specific aspect of emerging technologies, the bill makes the other safeguards less important. (Marda) A simplified sketch of such an explanation follows this list.

  • Sensitive personal data vs personal data: That the bill has different standards for processing sensitive personal data and personal data is in itself a big question mark for me. If a lot of applications are based on processing data that isn’t always [classed as sensitive], why do we have different standards of collection? (Marda)

  • Fair and reasonable processing was a sufficient requirement: Section 4 [which says that processing has to be fair and reasonable] is my ray of hope in the bill. It would have been fine if the entire act was just this. But the fact that they’ve gone into all the details means that this is not going to override everything else. Everything else in Chapter 2, purpose limitation, collection limitation and all of that, is an example of what is fair and reasonable data processing. (Matthan)

  • Metadata: The bill’s silence on metadata again goes back to how data is used today and the premise on which much of it is collected. (Marda)
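
Two of the points above lend themselves to small illustrations. First, Panday’s collision between the right to be forgotten and immutable ledgers. The toy hash chain below is a simplified stand-in for a blockchain, with invented function names rather than any real ledger’s API; it shows that because each entry commits to the hash of the one before it, editing or erasing any record invalidates verification of the chain from that point on.

```python
import hashlib
import json

def entry_hash(prev_hash: str, payload: dict) -> str:
    # Each entry commits to the previous entry's hash.
    blob = prev_hash + json.dumps(payload, sort_keys=True)
    return hashlib.sha256(blob.encode()).hexdigest()

def append(ledger: list, payload: dict) -> None:
    prev = ledger[-1]["hash"] if ledger else "genesis"
    ledger.append({"payload": payload, "hash": entry_hash(prev, payload)})

def verify(ledger: list) -> bool:
    prev = "genesis"
    for entry in ledger:
        if entry["hash"] != entry_hash(prev, entry["payload"]):
            return False
        prev = entry["hash"]
    return True

ledger = []
append(ledger, {"user": "A", "data": "address"})
append(ledger, {"user": "B", "data": "phone"})
append(ledger, {"user": "C", "data": "email"})
print(verify(ledger))                    # True: the chain is intact

ledger[1]["payload"]["data"] = "erased"  # try to "forget" user B in place
print(verify(ledger))                    # False: the edited chain no longer verifies
```

Honouring an erasure request on such a structure means either rewriting the chain, which defeats its purpose, or never recording personal data on it in the first place.

Second, the right to explanation Marda describes. The sketch below uses a deliberately transparent, invented linear scoring model (real automated decisions are rarely this legible, which is why she ties accountability to knowing the input data as well); the per-feature contributions are the kind of ‘logic behind that particular decision’ a data principal could be shown.

```python
# Hypothetical loan decision: a linear score over applicant features.
# Weights and threshold are invented for illustration only.
WEIGHTS = {"income_lakhs": 0.8, "existing_loans": -1.5, "years_employed": 0.4}
THRESHOLD = 2.0

def decide(applicant: dict) -> tuple:
    """Return the decision plus each feature's contribution to it."""
    contributions = {k: WEIGHTS[k] * applicant.get(k, 0) for k in WEIGHTS}
    return sum(contributions.values()) >= THRESHOLD, contributions

approved, why = decide({"income_lakhs": 5, "existing_loans": 2, "years_employed": 1})
print("approved:", approved)  # False: score 1.4 is below the 2.0 threshold
for feature, contribution in sorted(why.items(), key=lambda kv: -abs(kv[1])):
    print(f"  {feature}: {contribution:+.1f}")  # existing_loans drove the rejection
```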

Originally published in MediaNama.