Embodied surveillance during COVID-19 in India: A feminist perspective
by Radhika Radhakrishnan
In March 2020, Salim (name changed), a Kashmiri Muslim, woke up to calls from concerned relatives and found that his name and phone number had been released on a public government list. The list contained the details of around 650 people suspected to have been in the vicinity of the Nizamuddin Markaz Mosque in Delhi at the time of a religious congregation organised there by the Tablighi Jamaat, an Islamic missionary movement. The Indian government claims this event caused a spike in COVID-19 cases in India, and the police used mobile phone data to trace people who had been in the vicinity. Salim told me he had not been in the area that day, and he did not know why his name was on the list, or why it had been made public.
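How could someone who was never there end up on such a list? One plausible mechanism, offered here only as a hedged sketch and not as a documented account of the actual investigation, is a “tower dump”: pulling the logs of every phone that connected to cell towers near the venue during the event window. The tower IDs, dates, and logic below are illustrative assumptions.

```python
# Hypothetical sketch of a "tower dump" trace: how a coarse location method
# can sweep in people who were never at an event. The tower IDs, dates, and
# logic are illustrative assumptions, not a documented account of the actual
# investigation.
from datetime import datetime

# A single cell tower can serve phones from hundreds of metres to several
# kilometres away, so "connected to a tower near the venue" is a loose proxy
# for "attended the congregation".
TOWERS_NEAR_VENUE = {"tower_ND_017", "tower_ND_018"}
EVENT_WINDOW = (datetime(2020, 3, 13), datetime(2020, 3, 15))

def suspects_from_tower_dump(connection_logs):
    """connection_logs: iterable of (phone_number, tower_id, timestamp)."""
    suspects = set()
    for number, tower, timestamp in connection_logs:
        if tower in TOWERS_NEAR_VENUE and EVENT_WINDOW[0] <= timestamp <= EVENT_WINDOW[1]:
            suspects.add(number)  # attendee, resident, and passer-by alike
    return suspects
```

Under a method like this, living or walking near a venue is indistinguishable from attending an event there, which is one way a name like Salim’s could end up on a public list.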
After this incident, Salim said, he felt a constant fear whenever he heard a knock on his door, thinking the police might have come to arrest him: “They are keeping an eye on me, I felt… They can stop me from going anywhere… if they have a database, they can find me.” (translated partly from Hindi). For Salim, the state tracking his location data was experienced as the state tracking his body. The emotion was so palpable that, after this incident, he considered leaving his phone at home every time he went out. In today’s world, our bodies are so deeply intertwined with our data that Salim needed to physically sever the connection with his phone to avoid being controlled through his data.
Salim’s experiences of fear and public targeting are not just violations of data privacy. These intimate emotions, and the material consequences for his physical mobility, point to something more devastating that current data protection frameworks are unable to see. The most common understanding of data in these frameworks is of a resource independent of our bodies and available for human exploitation: the new oil. Anja Kovacs has pointed out that such a construction of data erases the connections between data and bodies. Because we have separate policies to protect our data and our bodies, the harms of surveillance that Salim experienced are reduced to data violations. To capture the true extent of these harms, the body must be visibilised in constructions of data.
Salma (name changed) is a nurse and a Muslim woman living in a Muslim locality in Jharkhand where drones have been used to monitor people’s movements and identify COVID-19 lockdown violators. One evening during the lockdown, she received a call to assist a pregnant woman in her neighbourhood who was in a critical condition during childbirth. The locality had been sealed and no ambulances were being allowed in. Considering it her duty to help, she recalls hiding in the dark, close to the footpaths, to avoid being seen by the drones, and walking quietly to the woman’s house. Incidents of this nature were common, she said: “people were not getting medicine because if they left the house, drones were coming and monitoring, and people were scared, so they’d go back inside” (translated from Hindi). The digital images captured by drones are a means to discipline physical bodies into staying at home without police officials having to be present. The privacy of the data collected by drones is a concern, but the greater concern is data facilitating the control of bodies and regulating their mobility from a distance. Guidelines on public mobility may be needed to control community transmission. But when they are enforced in a manner that evokes fear, without taking into account people’s socio-economic contexts and needs, it is not the disease that is being controlled but people’s bodies, through the threat of data collection.
While drones operate in the public eye, this form of disciplinary power also extends to the intimate space of the home through COVID-19 home-quarantine apps. The Karnataka government released the Quarantine Watch app, requiring all home-quarantined persons to upload geo-tagged mobile phone selfies every hour to prove their presence at home, failing which a First Information Report (FIR) would be registered against them. Feminist studies show that women are uncomfortable filing cyber violence complaints because they distrust the state officials who can view the private images on their phones during an investigation. This discomfort is likely to extend to sending state officials selfies from inside the home. Misuse of this data would not just be a data violation; it could easily extend to voyeurism, slut-shaming, and predatory actions that threaten a woman’s bodily integrity.
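To see what an hourly geo-tagged selfie regime implies, consider a minimal sketch of how a server-side compliance check might work. The function names, 100-metre radius, and violation messages below are assumptions for illustration, not details of the actual Quarantine Watch system.

```python
# Hypothetical sketch of a server-side quarantine compliance check. This does
# NOT reflect the actual Quarantine Watch implementation; the function names,
# radius, and violation messages are assumptions.
import math

EARTH_RADIUS_M = 6_371_000  # mean Earth radius in metres

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in metres between two (lat, lon) points."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2
    return 2 * EARTH_RADIUS_M * math.asin(math.sqrt(min(1.0, a)))

def check_hourly_selfie(selfie_gps, home_gps, radius_m=100):
    """Flag a quarantined person if the selfie's geo-tag falls outside an
    assumed compliance radius around their registered home address."""
    if selfie_gps is None:  # a missed upload is itself treated as a violation
        return "violation: no hourly selfie received"
    distance = haversine_m(*selfie_gps, *home_gps)
    if distance > radius_m:
        return f"violation: selfie geo-tagged {distance:.0f} m from home"
    return "compliant"
```

Even in this simplified form, the design treats a missed upload or a small GPS drift as potential grounds for a police complaint, which helps explain why such systems are experienced as threats rather than safeguards.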
Another app released during COVID-19 by the Indian government, jointly developed with industry stakeholders, is Aarogya Setu, a contact-tracing app. It has been criticised over the proportionality, legality, and necessity of its data collection. Beyond these valid concerns, the app categorises bodies as ‘low-risk’ or ‘high-risk’ based on self-reported health data and data from nearby mobile devices that have the app installed. Since the app is mandatory for accessing railways and metros, these categories determine whether a person can use these essential services. Companies with access to employees’ health data through the app may also use it to determine their pay and insurance, affecting their livelihoods.
Moreover, the categories individuals are slotted into by the app may not correspond to their physical realities, for several reasons. First, not everyone in a person’s vicinity may have a smartphone or the app installed, especially given the digital divide. Second, since the app depends on self-reporting that precedes medical testing, self-reported symptoms may not be reliable given how often the disease is asymptomatic. Third, people may hesitate to self-report accurately because of the stigma associated with the disease. Fourth, the app can produce false positives and false negatives. Yet the digital reality constructed by the app takes precedence over our physical realities: people have been forcibly quarantined through alerts generated by the app despite testing negative for COVID-19. The data on the app determines whether our bodies are treated as diseased or healthy, irrespective of whether that data corresponds to our bodies. Data is not only embodied; it also takes precedence over physical bodies in determining access to essential services.
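A deliberately simplified sketch makes the gap between digital category and physical body concrete. The fields, threshold, and logic below are illustrative assumptions, not Aarogya Setu’s actual algorithm.

```python
# Illustrative, hypothetical risk categorisation. This is NOT Aarogya Setu's
# algorithm; the fields, threshold, and logic are invented to show how digital
# categories can diverge from physical reality.
from dataclasses import dataclass

@dataclass
class Person:
    self_reported_symptoms: bool  # unreliable: stigma, asymptomatic cases
    has_smartphone: bool          # the digital divide: no phone, no data
    risky_contacts_logged: int    # counts only contacts who also run the app

def categorise(p: Person, contact_threshold: int = 1) -> str:
    """Assign a 'digital body' a risk label from app data alone."""
    if not p.has_smartphone:
        # The app cannot see this person at all; they default to 'low-risk'
        # regardless of their actual exposure.
        return "low-risk"
    if p.self_reported_symptoms or p.risky_contacts_logged >= contact_threshold:
        return "high-risk"
    return "low-risk"

# A heavily exposed person without a smartphone is invisible to the system...
print(categorise(Person(False, False, 0)))  # -> low-risk (a false negative)
# ...while a healthy person flagged by one stray Bluetooth ping is not.
print(categorise(Person(False, True, 1)))   # -> high-risk (a possible false positive)
```

In a scheme like this, it is the label, not the body, that decides who may board a train.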
The app was also made mandatory for gig workers as a safety measure. But even if the app were more accurate, whose safety is at stake here? Shaik Salauddin, the national General Secretary of the Indian Federation of App-Based Transport Workers (IFAT), questioned why the app was mandatory only for Uber drivers and not for passengers: “If I am a driver, and someone gets into my cab, how will the driver know if the person is positive or negative?… If the company or government has any care for the driver’s safety, then why don’t you provide gloves and masks to the people?” (translated from Hindi). While data may provide important insights, it cannot keep people safe as an end in itself, especially when it is not in the control of the people whose bodies generate it.
On the contrary, though justified as a safety measure, surveillance enabled through data can lead to violence. Shubhangi of the Association For Advocacy and Legal Initiatives (AALI), Lucknow, a feminist legal advocacy and resource group working with survivors of domestic violence, shared that during the COVID-19 lockdown, women who have access to mobile phones at home “are being subject to violence because they are spending ‘too much time’ on the telephone.” Women using shared family phones, meanwhile, were unable to report domestic violence through helplines after offline modes of reporting broke down during the lockdown. Bishakha Datta, the Executive Director of Point of View, an organisation that works on gender and sexuality, shared: “People are scared of leaving a trace of the number that they are calling.” Restrictions are often culturally placed on women’s access to mobile phones and the Internet because the liberatory potential of data threatens patriarchal boundaries. Controlling women’s access to data is a way of controlling their bodies, preventing them from using phones and reporting violence. Feminists have written extensively about the home being an unsafe and unequal space for women. Surveillance ensures that these inequalities within the home are kept in place, legitimised, and structurally reproduced.
Sometimes it is not the persons being surveilled but those carrying out the surveillance who face violence through it, if they hold less power in that dynamic. Consider Accredited Social Health Activist (ASHA) workers, women largely from marginalised class and caste locations who have been on the front line of community healthcare during the pandemic. The Punjab government launched the Ghar Ghar Nigrani app for ASHA workers to carry out house-to-house COVID-19 surveillance. Ranjit Kaur, a Punjab union leader for the All India ASHA Worker and Facilitators Union, said: “If someone has a cough in the village, and the ASHA enters that in the app, if they get called for enquiry, they blame the ASHA for putting them in that situation. ASHAs are facing violence due to it.” (translated from Hindi). Many incidents of violence against ASHA workers during their door-to-door surveillance tasks have been reported. Fearing that the data collected by the app could lead to them being incarcerated or quarantined, people resort to violence to keep their data-as-bodies safe. The question of safety must once again be placed in its social context: whose safety is the state looking out for? Over a million ASHAs went unpaid by the state for months during COVID-19, and were not provided safety gear. When over a hundred ASHA workers protested these conditions during the lockdown, the Delhi police filed a case against them.
Visibilising the connection between bodies and data brings its power relations and contexts to light, so that the specific harms arising from its violation can be made explicit. In all the cases discussed here, surveillance undermines not just data privacy but the bodily integrity, autonomy, and dignity of individuals. While these violations are not new, they now take place in opaque, digitally mediated ways that were not possible before. Data protection policies must move away from dematerialised constructions of data as a resource and adopt an embodied approach, so that individuals have agency over not just their data but also their bodies. Such an approach would ensure that our human rights are protected within legal frameworks in the digital age.
Originally published by the Observer Research Foundation.