An online discussion on bodies and data protection turned to biometric tech regulation, specifically what works and what doesn’t.

"Our data and our body are so intimately interwoven that what we call our virtual bodies and our physical bodies, the line that is supposedly dividing these two becomes irrelevant, even if you can argue that maybe it exists," Dr. Anja Kovacs said during the inaugural session of PrivacyNama 2021, a global conference on privacy regulations held by MediaNama on October 6 and 7. The session focused on the use of biometric technologies such as facial recognition, thermal scanning, and fingerprint scanning, and what it takes to govern them.

In a discussion moderated by Nehaa Chaudhari, a partner at Ikigai Law, Beni Chugh of Dvara Research; Dr. Anja Kovacs, Director of the Internet Democracy Project; Amber Sinha, Executive Director of the Centre for Internet and Society; Jhalak Kakkar, Executive Director of the Centre for Communication Governance; and Mark Andrejevic, professor at Monash University, spoke about the obstacles to effective regulation of biometric tech and data collection in India, such as:

Lack of incentives for clear privacy policies

"I feel that there is, I think, enough that we have to work with in terms of being able to provide that conceptual model in the form of a privacy policy, or a privacy notice, to an individual, and there is absolutely no incentive right now for companies to actually invest in that," Amber Sinha said, referring to the flawed models of consent-seeking documents that are supposed to be clear, informed, and granular. Further, Sinha recommended risk-based and impact assessments of biometric technology, so that aggressive and granular measures can be targeted at the risks those assessments identify.
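
To make Sinha's suggestion concrete, here is a minimal sketch of what a risk-scoring step in a biometric impact assessment could look like. The factor names, weights, and thresholds are entirely illustrative, not drawn from any existing framework:

```python
# Hypothetical sketch of risk scoring in a biometric impact assessment.
# Factor names, weights, and thresholds are illustrative only.

RISK_FACTORS = {
    "data_is_immutable": 3,         # biometrics cannot be revoked like passwords
    "covers_children": 3,
    "centralised_storage": 2,
    "shared_with_third_parties": 2,
    "no_opt_out": 2,
    "used_for_inference": 1,        # e.g. mood or "criminality" prediction
}

def risk_score(deployment: dict) -> int:
    """Sum the weights of every risk factor the deployment triggers."""
    return sum(w for factor, w in RISK_FACTORS.items() if deployment.get(factor))

def required_measures(score: int) -> str:
    """Map the score to increasingly aggressive and granular mitigations."""
    if score >= 8:
        return "prohibit, or require prior regulatory approval"
    if score >= 4:
        return "mandatory impact assessment, audits, strict retention limits"
    return "standard notice-and-consent obligations"

cctv_in_schools = {"data_is_immutable": True, "covers_children": True,
                   "centralised_storage": True, "no_opt_out": True}
print(required_measures(risk_score(cctv_in_schools)))  # -> highest tier (score 10)
```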

Ambiguous legal standing leaves room for SOPs

Referring to the ambiguity around the legal standing of biometric technologies deployed by the government, Jhalak Kakkar said, "Typically these kinds of technologies in the Indian context have certainly worked on SOPs which are not even usually in the public domain, and I think the first thing I would really like to see is actually legislation that is in the public domain to which I can hold the government accountable…"

Kakkar further said that she would want the government to disclose the storage processes such a system uses, how long the data is retained, what processing it is subject to, and how it is shared. She also asked for more disclosure around the procurement of such technology.
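
The disclosures Kakkar lists lend themselves to a machine-readable format. A minimal sketch in Python, with hypothetical field names not taken from any existing standard:

```python
# Hypothetical machine-readable disclosure covering the fields Kakkar asks
# for. Field names and values are illustrative, not from any real standard.
from dataclasses import dataclass

@dataclass
class BiometricSystemDisclosure:
    system_name: str
    legal_basis: str                 # the public legislation authorising the system
    storage_process: str             # where and how biometric data is stored
    retention_days: int              # how long the data is retained
    processing_purposes: list[str]   # what processing the data is subject to
    shared_with: list[str]           # every agency or vendor the data reaches
    procurement_record: str          # tender or contract reference

disclosure = BiometricSystemDisclosure(
    system_name="example-frt-deployment",       # hypothetical system
    legal_basis="none published",               # the gap Kakkar flags
    storage_process="centralised, vendor-managed server",
    retention_days=365,
    processing_purposes=["identification", "attendance tracking"],
    shared_with=["local police"],
    procurement_record="not in public domain",
)
print(disclosure)
```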

Stakeholders not being given enough chances to submit feedback

"I would really wonder if civil society and interested stakeholders are going to have a chance to really come to the table and give their perspective and push back against some of the thinking that may emerge in a new version of the bill…" Kakkar said about India's Personal Data Protection Bill, which was first introduced in 2019 and is currently under consideration by a Joint Parliamentary Committee. Later, Sinha also expressed reservations about the draft of the bill, saying, "Essentially what the draft does is it takes power away from the private sector and gives it to the government – that needs to change."

Lack of will to ensure compliance with Puttaswamy judgement

While talking about how the landmark Puttaswamy judgement is violated by the government with no consequences, owing to minimal judicial review and legislative oversight, Sinha said, "We are certainly in a position where we have a truly epochal judgment and a fairly well-established judicial standard that is not really being followed." He further said that there is neither the political will nor sufficient power among other stakeholders to address this non-compliance.

Varying interpretations of Puttaswamy judgement

"I do think where I see the whole confusion coming from is the early interpretation of Puttaswamy itself, the privacy judgment, where there seems to be the three-prong test laid out but nobody seems to be applying it," Chugh said while commenting on the private sector's compliance with the Puttaswamy judgement. "So is there something in between which is missing: this is the constitutional wisdom, but what does the person on the ground who wants to run a business take away from it, right? How do they apply it, how do they internalise it?" she added. According to Chugh, there need to be frameworks that guide the industry on complying with the judgement's three-prong test.

Taking stock of the problem

During the discussion, the participants raised points on the consequences of biometric data collection and the deployment of related technology.

Power relationships

"There are levels of control that we are now seeing and supposedly accepting that we just didn't do earlier," Kovacs said, referring to biometric and location-based data collection by apps, employers, and governments. During the session, Kovacs also drew on feminist theories and concepts to argue that consent is relative. "You have to create the conditions that enable them to be free and equal conditions and so to express consent meaningfully," she said, adding that current practices around consenting to e‑governance programs fail this standard, producing unequal power relationships.

Impact on children's development

Kakkar raised concerns over children's development under surveillance in schools, referring to a study that found that children as young as one and a half years old change their behaviour when they are being watched.

"Can you imagine what impact this is going to have on the child's psyche growing up?" — Kakkar on CCTVs being installed in Delhi's government schools.

"Notions of autonomy, you know, and beyond privacy, beyond autonomy, like our ability to function as human beings and the society and… and you know, we're going to be monitoring them in a place where they learn, and that's just going to undermine setting a strong foundation for a free and democratic society," Kakkar said.

Reinforcing old prejudices

"All of this kind of inferential use, which seems quite, I don't know, far-fetched and an attempt to resuscitate long-debunked ways of thinking about how our bodies work, is being explored and built into these systems," Andrejevic said, talking about facial recognition and other biometric technologies that claim to be able to infer a person's mood, sexual orientation, or criminal tendency. He also said that much of the legislation he had seen around such technologies concerned itself more with identification than with inferential and possibly discriminatory uses, and he flagged the covert mechanisms by which such data is collected. "Automation makes that form of data collection very powerful," he said.

"We use biased training data sets with flawed sampling, which overall under-represent minority communities, and then AI systems make decisions based on their training on these data sets, which sort of perpetuates and really embeds historical bias and discrimination into our society," Kakkar said. She further said these systems were often transferred from high-resource contexts (such as western countries) to low-resource contexts — something that regulators must think about.
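
Kakkar's point about under-representation can be illustrated with a toy simulation: a classifier fitted on data that under-samples one group ends up with a far higher error rate for that group. This is a deliberately simplified sketch, not a model of any real biometric system:

```python
# Minimal simulation of how under-representation in training data becomes
# disparate error rates. Purely illustrative; not any real system.
import numpy as np

rng = np.random.default_rng(0)

def sample(group_mean, n):
    """Two-class data for one demographic group (class signal at +/-1 in dim 0)."""
    X = rng.normal(group_mean, 1.0, size=(n, 2))
    y = rng.integers(0, 2, n)
    X[:, 0] += np.where(y == 1, 1.0, -1.0)   # class-dependent shift
    return X, y

# Training set: 950 samples from the majority group, only 50 from the minority.
Xa, ya = sample(group_mean=0.0, n=950)    # majority group
Xb, yb = sample(group_mean=3.0, n=50)     # under-sampled minority group
X = np.vstack([Xa, Xb]); y = np.concatenate([ya, yb])

# Nearest-centroid classifier fitted on the skewed training set: the class
# centroids are pulled toward wherever the majority group lives.
centroids = np.array([X[y == c].mean(axis=0) for c in (0, 1)])
def predict(Z):
    d = np.linalg.norm(Z[:, None, :] - centroids[None], axis=2)
    return d.argmin(axis=1)

for name, mean in [("majority", 0.0), ("minority", 3.0)]:
    Xt, yt = sample(mean, 2000)
    print(name, "error rate:", (predict(Xt) != yt).mean())
# The minority group's error rate comes out several times higher.
```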

What does effective regulation look like?

Effective regulation depends on the context: "In the regulation of AI systems, context really matters. AI systems are being developed in the western world and very often just being translated fairly blindly into other contexts, right. They are being translated from high-resource contexts to low-resource contexts. They are being translated from a context where they are trained on particular kinds of data sets […] to being deployed in an Indian context where we have such a diverse society with many ethnic groups," Kakkar said.

Effective regulation makes privacy competitive: "The best thing that regulation can do is to make a market out of privacy, which is to say, make privacy a parameter on which private providers kind of compete," said Chugh.

Risk-based regulation is the best we have: "Risk-based regulation is like an educated compromise, because we know that we're not going to have enough State capacity and enough institutional bandwidth to go and regulate every person out there who has data," said Chugh. "So what we do is we kind of filter through to see who are the biggest carriers of data, collectors of data, and see if we can enforce the regulatory regime on to them pretty well, and then a significant share of the market is covered. So is risk-based regulation the best approach? No, but it's probably the best that we have," she added.
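
The filtering Chugh describes boils down to a simple prioritisation rule: rank data-holding entities by scale and sensitivity, and spend scarce enforcement capacity on the top tier. A minimal sketch, with hypothetical thresholds and records:

```python
# Hypothetical sketch of the filtering Chugh describes: rank data-holding
# entities by scale and concentrate scarce enforcement capacity on the top
# tier. Thresholds and records are illustrative, not from any regulation.

entities = [
    {"name": "large_platform", "users_millions": 300, "holds_biometrics": True},
    {"name": "mid_fintech",    "users_millions": 25,  "holds_biometrics": True},
    {"name": "local_retailer", "users_millions": 0.2, "holds_biometrics": False},
]

def is_high_risk(e, user_threshold=10):
    # Scale and sensitivity drive the tiering: a small number of entities
    # accounts for most of the data held, and so for most of the risk.
    return e["users_millions"] >= user_threshold or e["holds_biometrics"]

priority = [e["name"] for e in entities if is_high_risk(e)]
print(priority)   # -> ['large_platform', 'mid_fintech']
```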

Comments have been lightly edited for clarity and brevity.

Originally published in MediaNama.