Consent is seen as a manifestation of control over one’s body, but what does it mean in conversations around data governance?

“…the big thing that feminism put on the table is that the autonomy in which the idea of consent is embedded is never individual, it is always relational… You have to create the conditions that enable them to be free and equal conditions and so to express consent meaningfully,” said Dr Kovacs during the inaugural session of PrivacyNama 2021, a global conference on privacy regulations held by MediaNama on October 6 and 7. Kovacs was discussing consent in digital services through the lens of feminist theories of consent.

In a discussion moderated by Nehaa Chaudhari, a partner at Ikigai Law, the panel – Beni Chugh of Dvara Research; Dr Anja Kovacs, Director of the Internet Democracy Project; Amber Sinha, Executive Director of the Centre for Internet and Society; Jhalak Kakkar, Executive Director of the Centre for Communication Governance; and Mark Andrejevic, professor at Monash University – discussed biometric data collection and the consent and privacy-preserving processes that accompany it.

Consent has to be more meaningful

Consent in sexual relations is discussed in strikingly different terms than consent in data governance, Dr Kovacs observed. “If we look at how consent on the internet consists of ticking a box where supposedly we can completely sign away our right to privacy […] we have said there is an essence of this right that needs to be protected, and yet on the internet, certainly with privacy, this seems to go away,” she said.

Consent is the biggest dark pattern

“I’m probably going to say something which is really provocative. But I think consent is the biggest dark pattern of them all,” Chugh said. She elaborated: “I mean consent as it exists, it is not free, it is not a contract amongst equals… it does not give you complete information, it can be changed unilaterally. If we were to apply the lens of contracts to consent, it would fail tremendously, right? …So I don’t think that consent really needs dark patterns.” Dark patterns in privacy policies, terms and conditions, or user interfaces are design choices that discreetly trick you into giving your consent.

Consent is modular

Referring to dark patterns, Sinha said that there should be scope for negotiation in obtaining consent – through modular or granular consent.

Kovacs concurred: “We really have to think also, in these specific situations, where do power relationships come in and what do we need to do to make sure that the data that is available cannot be misused against whoever is the weakest partner in that relationship.” She also pointed to other general practices related to consent:

It isn’t clear to us at all why you should be forced to give data to third parties, if that third party company is not doing something that essentially improves the service that you are availing of. Then there are changes to what can happen with your data after you have given it because at the moment terms and conditions are so broad that basically, you literally say like here is my data, now you do with it whatever you want. — Kovacs said.

However, Chugh’s remarks on modular consent highlighted the practical challenges it poses to companies collecting data. “There’s a lot of conversation around having modular consent, which is to say that the person, at the time of consenting, can say that out of the five data points that you are collecting from me, I’m okay with just three being collected and two [data points] you should not collect. Our providers tell us that the biggest kind of challenge is to make that information trickle through your system and say that for this one particular person, you are not supposed to use this particular data set. How do you educate all your automated systems out there to say that this person has expressed an explicit kind of reservation in sharing or using this data point?” Chugh said.
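Chugh’s point is, at bottom, an engineering one: granular consent only works if every downstream system checks a per-user permission record before it touches a data point. As a rough, hypothetical sketch in Python (the ConsentRecord and ConsentRegistry names, fields, and default-deny rule below are illustrative assumptions, not anything the panel described), such a check might look like this:

```python
from dataclasses import dataclass, field
from typing import Dict, Set


@dataclass
class ConsentRecord:
    """Per-user record of which data points the user has agreed to share (hypothetical)."""
    user_id: str
    permitted_fields: Set[str] = field(default_factory=set)

    def allows(self, field_name: str) -> bool:
        # Default-deny: a field may be processed only if it was explicitly permitted.
        return field_name in self.permitted_fields


class ConsentRegistry:
    """Central lookup that downstream automated systems query before using a data point."""

    def __init__(self) -> None:
        self._records: Dict[str, ConsentRecord] = {}

    def register(self, record: ConsentRecord) -> None:
        self._records[record.user_id] = record

    def filter_record(self, user_id: str, data: Dict[str, object]) -> Dict[str, object]:
        # Strip any field the user has not consented to before it reaches a pipeline.
        record = self._records.get(user_id)
        if record is None:
            return {}  # no consent on file: share nothing
        return {k: v for k, v in data.items() if record.allows(k)}


if __name__ == "__main__":
    registry = ConsentRegistry()
    # The user agrees to three of the five requested data points.
    registry.register(ConsentRecord("user-42", {"name", "email", "city"}))

    raw = {"name": "A.", "email": "a@example.com", "city": "Pune",
           "phone": "…", "fingerprint_template": "…"}
    print(registry.filter_record("user-42", raw))
    # -> {'name': 'A.', 'email': 'a@example.com', 'city': 'Pune'}
```

The default-deny filter is one way of making an explicit reservation “trickle through” automated systems: a pipeline that never asks the registry simply receives nothing.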

Consent is logical

Professor Andrejevic illustrated the logic of consent with an analogy: “When you enter a building, you are not required to look at all of the building inspection notices […] in order for you to consent to enter that building. Instead, one of the ways in which society functions is by constructing buildings in ways that there are systems in place to ensure that the thing won’t fall down on you when you go in […] to give yourself the ability to decide whether or not you want to enter that building.” Andrejevic added that consent can be constrained to an individual model because, in cases where social structures don’t rest on the logics of consent, “we’ve already built our priorities into how we want those things constructed. So that when we enter them, we know that they are meant to be following those rules.”

Consent is being used to legitimise inequality

“We are creating new inequalities of a type that Civil Rights Movements have fought against for decades now,” Kovacs said, referring to how data related to bodies has led to a fundamental restructuring of relationships between people and other people, people and big business, and people and governments. “Essentially consent is being used at the moment to legitimise this growing inequality,” Kovacs added.

She gave the example of starvation deaths caused by failures of Aadhaar biometric authentication to drive home that “how we talk about data really should shift to put bodies fundamentally back in the center of this debate.”

Respecting the demand for privacy

“Just beyond the harm that can occur from, you know, sharing of personal data, at the very baseline there is an expression of privacy, there is a claim of privacy which people don’t want violated, and I think this should be the starting point of all our conversations,” Chugh said, referring to the results of a survey conducted by Dvara Research on people’s willingness to share data. According to the research, nearly 30% of respondents were comfortable sharing their data – including phone numbers, photographs, and fingerprints – with their neighbours, while 42% would be uncomfortable sharing it simply because it was their private information. Chugh stressed that there is an intrinsic demand for privacy that needs to be respected, rather than framing the debate solely around curbing the harms of personal data sharing.

Later, Chugh said that because people care about privacy, private sector businesses, and impact investors in particular, should treat it as an important area of consideration.

Originally published in MediaNama.