This article is part of Privacy in the Pandemic, a Future Tense series.
In India, the COVID-19 crisis is turning out to be the perfect excuse for the government to consolidate its pre-pandemic surveillance ambitions. Since May 4, as restrictions began to ease, the government has mandated the installation of its contact-tracing smartphone app for anyone who works—whether for the government, a private company, or in the gig economy—or uses public transport. Failing to do so is a criminal offense, according to the home affairs ministry. (People without smartphones are supposed to report in via a toll-free interactive voice response system.) Unsurprisingly, digital rights and civil society organizations are pushing back on the new directive.
Known as Aarogya Setu, the application came under criticism right from the start for its lack of data protection for users. Aarogya Setu, which is available in 11 languages, has self-assessment tools to check whether users may be at risk. It also informs users if they have crossed paths with a COVID-19-positive patient and provides information on self-isolation. The app's developers assure users that their information is secure and all data is anonymized, but questions remain about information collection, purpose limitation, and data storage. On May 5, a white-hat hacker named Robert Baptiste even called out the app maker, the National Informatics Centre, over some disturbing privacy flaws that could lead to the disclosure of COVID-19 patients' locations. Despite the evidence, the NIC denied that user data was at risk.
At the point of installation, the app requires users to submit six data points (including name, age, gender, and travel history), which are then exported to an external server. This, says Sidharth Deb of the Internet Freedom Foundation, a digital liberties organization in India, is worrisome.* There is no clarity about who can access the server, no sunset clause (that is, no point at which the data will definitely be deleted), and no transparency about how the data is anonymized. Deb calls it a classic case of technological solutionism, “technology deployed without adequate dialogue of whether it’s, first, efficacious, and second, what the safeguards [are] to ensure that these systems won’t be abused.”
There are already alarming indications of mission creep with Aarogya Setu, which was, at first, voluntary. There is talk of a new fitness-band-like monitoring device that will integrate with the app. Already the app is being used as an e-pass, allowing you to travel (it will reportedly be mandatory for anyone who wishes to fly), and it may be linked to telehealth providers. New smartphones are likely to come with the app preinstalled.
There is real danger that Aarogya Setu could be a gateway to nationwide surveillance. National security, personal safety, dispersal of essential services, and now disease surveillance—in the past few years, the Indian government has used all of these as pretexts to infringe more and more on privacy. The country has already seen an unbridled drive toward digitalization, automation, and surveillance, and the COVID-19 crisis has added a new layer to this, one that could have far-reaching humanitarian, social, and economic consequences.
For years, activists have been tussling with the government about a biometric ID project called Aadhaar. This is a unique 12-digit number that the authorities want to assign to every Indian citizen, tied to their fingerprints and retinal scans. Introduced as a tool to weed out illegal subsidy transfers and cut down on corruption, Aadhaar has seen an alarming scope creep. It’s now linked, often without consent, to mobile phone connections, bank accounts, income tax identity numbers, pension schemes, passports, electoral IDs, and more. The Aadhaar database is believed to be compromised, resulting in millions of fake accounts, instances of Aadhaar-enabled fraud, misdirection of welfare schemes, and surveillance of specific demographic groups.
The right-wing Bharatiya Janata Party government has also been trying to create a National Register of Citizens, ostensibly to weed out illegal immigrants. But it’s more alarming when viewed in conjunction with the Citizenship Amendment Act, which passed in late 2019 and was largely seen as anti-Muslim. The act led to widespread protests around the country, the most famous one being the women-led Shaheen Bagh sit-in in Delhi. The protest and other peaceful demonstrations across the country were cut short by the pandemic. Since then, the government has been using the cover of the lockdown to (illegally) detain several key protesters under allegations of “rioting.” It is this legacy that makes many worry that COVID-19 tracking measures may morph into surveillance tools of a very different kind.
Vidya Subramanian, who works on the sociology of technology at the Centre for Policy Studies at the Indian Institute of Technology Bombay, says that most people buy the myth that a little bit of data is a small price to pay for safety, however one defines that. These small inroads, then, become forever normal. “The public health care crisis is being used to push technological developments that we don’t yet know will even work. … That’s how you establish an authoritarian regime, [and] really bad practices for exploitative data extraction,” she says.*
Marginalized populations—women, LGBTQ people, the poor, sex workers, religious minorities—find themselves disproportionately exploited in these systems. There are already signs that India’s race, caste, and religious prejudices—inflamed by the right-wing government and its goal of creating a Hindu nation—will be exacerbated post-COVID-19, creating a perfect playing field for trying out restrictive technologies.
“Some degree of surveillance may be necessary to contain the pandemic,” says Radhika Radhakrishnan of the Internet Democracy Project. But “currently, the state is less focused upon surveilling the virus, and more focused upon surveilling the bodies of people.” She and others have emphasized that measures like contact tracing cannot be entirely technology driven. “It is nonnegotiably critical to have humans in the loop, and to have meaningful alternatives for those who do not have access to technology.”
Correction, May 11, 2020: This piece originally misspelled Sidharth Deb’s first name and misgendered Vidya Subramanian.
Future Tense is a partnership of Slate, New America, and Arizona State University that examines emerging technologies, public policy, and society.