How Covid-19 helped government control your data and you
by Radhika Radhakrishnan
Pandemic-era data-enabled surveillance—via digital tracking, disinformation, a variety of apps, phones and drones—undermines data privacy, autonomy and dignity of individuals and specifically disadvantages women, research shows.
This piece was first published in Article 14.
Bangalore: Indians over 60, or over 45 with comorbidities, can now register for vaccination against Covid-19, but mainly through a government portal or the Aarogya Setu application. Even if registration is offline, your details will be added to a government database and you will get a certificate with a digital QR code.
Such data-enabled surveillance undermines data privacy, autonomy and dignity of individuals and specifically disadvantages women, research has shown, including my own.
The reported spike in domestic violence during the Covid-19 lockdown focused attention on an ingrained problem in a country where over 25% of women in seven states reported domestic violence in the latest round of the National Family Health Survey (NFHS).
Anecdotal data from NGOs and the National Commission for Women recorded a spike in this already serious problem (see our earlier story here). Less remarked upon was how the spread of the pandemic also resulted in increased surveillance of citizens, including women.
The breakdown of offline systems during the lockdown made it harder for women to access state helplines to report domestic violence. Mobile phones are often shared at home and so, said Bishakha Datta, executive director of Point of View, a nonprofit, women “were scared of leaving a trace of the number that they were calling”.
The gender gap in digital access, including access to mobile phones, is backed by data: 38% of women use mobile phones, compared to 71% of men.
Women are restricted from owning mobile phones in order to control and monitor their movement, wrote Anja Kovacs and Nayantara Ranganathan in a February 2017 paper on mobile phone bans in India, published by the Internet Democracy Project. Further research characterises such gendered restrictions as a means of surveillance.
During the lockdown, this control hindered women from reporting violence, which in turn exacerbated inequalities within already unequal homes, as I reported in my November 2020 paper, I took Allah’s name and stepped out: Bodies, Data and Embodied Experiences of Surveillance and Control during COVID-19 in India.
“The state must provide non-technological alternatives for women to report domestic violence, classifying them as essential services,” I wrote in a December 2020 policy brief, Protecting Bodies and Rights in Disease Surveillance during COVID-19 in India.
Going Beyond Data Privacy
Data protection policies attempt to protect data from violations. Data is viewed as a resource (“the new oil”): a disembodied asset that becomes valuable only once it is extracted from people. When surveillance is carried out within this policy framework, the focus is narrowly on violations of data.
But something more happens to people when their data are controlled by others: a loss of control over their own bodies.
This can be seen in the restriction of women’s access to mobile phones described above, and their resulting inability to protect their own bodies from violence during the lockdown.
Hence, it is important to protect bodily integrity, autonomy and dignity through data protection policies, I wrote in my paper. These are more intimate than data privacy, and carry with them the burden of justifying a breach of the body.
We can protect these by highlighting interconnections between people’s bodies and their data.
For marginalised communities, inequalities turn into stigma. One way this happens is through digital disinformation on social media.
For instance, a fake video that went viral on Facebook during COVID-19 showed Delhi residents imposing a ban on Muslim vendors. Following this, buyers began asking vendors to produce identity cards to ascertain their religion.
“Vegetable sellers who are Muslims were boycotted, people were saying don’t buy vegetables… from Muslims,” said Drishti Agarwal and Manju Rajput from the Aajeevika Bureau, an organisation that works with labour movements.
By fuelling existing prejudices about marginalised communities, digital disinformation promotes surveillance, and its harms spill over as physical harms.
This is why I argued in my December 2020 policy brief that social media platforms should incorporate fact-checking of posts and flag posts that may contain disputed claims or be misleading.
State surveillance exacerbates these inequalities within homes and communities.
Another way the pandemic has increased surveillance is through home quarantine apps that extend the state’s gaze within the home. Karnataka’s Quarantine Watch app required hourly selfies taken at home to verify that people were in quarantine. For women, sharing private images with state authorities can expose them to voyeurism, slut-shaming or predatory actions, I found.
Data that may make specific groups more vulnerable to harm, and that are not strictly necessary to contain the spread of disease, should not be collected. Data collection should also be limited when there are other means of inferring the same information.
Punjab’s Ghar Ghar Nigrani app was mandatory for ASHA workers for door-to-door community surveillance. “If someone has a cough in the village, and an ASHA enters that in the app, if they get called for enquiry, they blame the ASHA for putting them in that situation. ASHAs are facing violence due to it,” said Ranjit Kaur, a union leader in Punjab for the All India ASHA Worker and Facilitators Union.
Public Mobility Restrictions Through Criminalisation
Surveillance results in mobility restrictions. When such surveillance is enforced by criminalising people’s actions without understanding their social contexts, it is not the disease but the bodies of people that are controlled through their data.
One such method of surveillance was the use of drones during the lockdown. “People were not getting medicine because if they left the house, drones were coming and monitoring, and people were scared, so they’d go back inside,” (translated from Hindi) said Salma (name changed), a nurse and a Muslim woman from Jharkhand.
The fear of drones was partly because people had not been informed why drones were being deployed in their localities. A submission by the Internet Democracy Project in response to the government’s Draft Unmanned Aircraft System Rules, 2020, recommends that the “state should ensure that the people that are affected or may be impacted should be informed about the nature, purpose and implications of the use of this technology”.
Some states also released public lists with the personal details of people, traced through their mobile phone data, who were suspected of having been in the vicinity of the Nizamuddin Markaz in Delhi when a religious congregation was organised there by the Tablighi Jamaat. As Article 14 reported in April 2020, the misuse of these data started a spiral of Islamophobia, leavened by fake news and hyperbole.
On one such list was Salim (name changed), a Kashmiri Muslim who said he was not in that area that day. Since then, Salim said he felt that the police “are keeping an eye on me… they can stop me from going anywhere… if they have a database, they can find me”.
For Salim, the tracking of his data was akin to tracking his physical body. He said he considered leaving his phone at home whenever he went out after this.
Exclusion From Essential Services And Rights
During COVID-19, Aadhaar was mandatory for accessing hospital tests, rations, and pensions, causing people to lose out on their entitlements, and even their lives.
“If your… Aadhaar card is the identity… people who have no identity card saying they are transgender, they might be in problem if they become Covid-19 positive,” said Raina Roy, a trans woman and director of Samabhabona, an organisation that works with trans communities.
The government’s Aarogya Setu app is ostensibly a contact-tracing tool. It was supposed to be voluntary, but was made mandatory for gig workers.
“This is not your constitutional right to mandate people to download [the app]… How do you have the right to stop someone’s livelihood and daily bread-and-butter because they did not download it?” said Shaik Salauddin, national general secretary of the Indian Federation Of App-Based Transport Workers (IFAT).
As a form of social protection, the Jharkhand government promised Rs 1,000 to each migrant stranded outside the state during COVID-19. But in order to access this money, migrants needed to download the Jharkhand Corona Sahayata App, which required registration with a selfie, bank account details, and Aadhaar number.
Yet, many migrants I spoke to could not receive money through the app due to technical errors. “We are just waiting for almost a month now, but we have not seen any money,” Satya (name changed), a migrant from Jharkhand, told me.
Aadhaar, Aarogya Setu, and the Jharkhand Corona Sahayata app are digital requirements that collect people’s data and profile them into categories.
Bezwada Wilson, the head of Safai Karmachari Aandolan that represents manual scavengers, has argued: “Any marginalized community… wants to destroy its existing identity… but in your Aadhaar, my occupation, where I come from, everything will be there.”
Categories are thus reified through a fixed digital identity.
These digital categories also often do not reflect physical realities. Aarogya Setu is prone to errors, false positives, and false negatives; people have been forcibly quarantined through it despite testing negative for COVID-19.
Fingerprint authentication issues with Aadhaar for accessing rations have also led to starvation deaths. On the Jharkhand Corona Sahayata App, a labour group stranded in Bihar was shown to be in Jharkhand due to GPS issues, making them ineligible to receive money. A migrant’s selfie was not accepted by the same app because his beard looked different from his Aadhaar card picture.
Digital identities are thus given precedence over corporeal identities, and the categories they slot people into hamper their autonomy to make decisions about their lives, becoming a form of social control. This excludes people from accessing state services and rights.
The harms and violations discussed here are not new, but they are increasingly being inflicted in ways that were not possible before. Current policy frameworks are not enough to respond to this scenario because they are blind to the harms of data surveillance upon the bodies of people.
To ensure that human rights are protected in the digital age, a fundamental shift is required in how policies conceptualise data and its connections to bodies. Policies responding to COVID-19 that aim to protect personal data must be broadened to respond to embodied harms to ensure that individuals retain control over not just their data, but also their datafied bodies.
Author’s note: This article draws upon qualitative research carried out for a paper that can be accessed here.