The Internet Democracy Project welcomes the consultation by the Ministry of Electronics and Information Technology on the draft personal data protection bill (“draft Bill”). We hope that responses will be made public in the interest of a transparent process.
The draft Bill introduces several important features but, taken together, these do not translate into legislation that succeeds in shifting the power imbalance in favour of individuals and vesting citizens with the rights they need, given the complexity and obscurity of data flows and business models, and the State’s own ambitions of governance through data.
The right to informational privacy is integral to human dignity and autonomy, as confirmed by the plurality opinion in Puttaswamy (2017). In the context of a data protection bill, it would be operationalised through control over data: the autonomy to decide how and where it may be used and by whom. The draft Bill does not succeed in vesting such control with data principals.
The ‘gains’ emerging from what is called the digital economy are in large part due to the phenomenon of surveillance capitalism and the enabling infrastructure of the targeted advertising industry. Addressing the worst excesses of data monetisation in the digital economy means tackling these issues. It is clear that most of these thorny issues stand no chance of being addressed if the draft Bill is passed in its current shape. The unambiguous espousal of the use of personal data for prevention of contravention of law (Sections 17(2)(a) and 43) and of profiling for scoring (Section 17(2)(e)), as well as the absence of rights against profiling and automated decision-making, contribute to the draft Bill’s inability to fully respond to these sticky issues in the digital economy.
Many of our misgivings about the draft Bill and report are due to a lack of conceptual clarity in the report. Crucially, the report conflates “collective values” with the limited restrictions to the right to privacy recognised in the plurality judgment in the Puttaswamy case (2017), and thereafter with the legitimate interests of the State. Some restrictions placed by the State in certain circumstances can arguably be in the collective interest. However, “collective values” and “common good” are not interchangeable with ease of data sharing within the Government. The interests of data principals and the State do not always align; indeed, the report itself recognises that it is safe to proceed on the assumption that the State is prone to excess. And yet the report makes this slippage in multiple places. This conflation forms the basis of wide State exemptions and grounds for processing by the State, which is one of the main problems with the draft Bill.
The report states that it is the understanding of the drafting committee that the State has a duty to put in place a data protection framework that, while protecting citizens from the dangers to informational privacy originating from State and non-State actors, serves the common good. The draft Bill fails in protecting citizens from dangers arising from State actors, does not go far enough in protecting citizens from non-State actors and conflates ‘common good’ with State prerogatives.
Far too many definitions, categories and procedures are left to be clarified at a later stage, either through delegated legislation by the Data Protection Authority or through interpretation. Given the power the government holds over the appointment of the Authority’s various wings and its continued functioning, leaving ambiguities to later-stage clarification is a serious concern.
The general obligation under Section 4 that data fiduciaries process personal data of data principals in a ‘fair and reasonable’ manner as a normative standard is vague, and does not provide sufficient safeguards to data principals. Apart from being vague, this is a highly subjective standard in the context of data protection, as one person’s sense of what is fair and reasonable does not correspond to another’s. One’s determination of what might be ‘fair and reasonable’ would also depend on one’s social and political location in society, rendering this standard hard to apply. To take an incident in recent public memory: what constituted harm in the mind of an important public official with dominant caste, gender and class identity when his phone number and address were revealed was different from what it would be for many women, trans persons and gender non-conforming persons, for whom the revelation of such personal information can have a range of harmful consequences.
The criteria used to determine the purpose of processing under Section 5(1) (clear, specific and lawful) are weakened by the standard of reasonable expectation that follows in Section 5(2). What is reasonably expected today is necessarily shaped by the absence of any regulation of data processing for several decades, and therefore cannot serve as a useful limiting standard.
In addition to purpose and collection limitation, ‘data minimisation’ should be one of the features introduced into the draft Bill. It is the general principle that one should use the minimum amount of data to fulfil a particular purpose. This principle should be included to create an obligation to ensure that the personal data sought to be collected for processing is in fact indispensable for the activity in question.
The detailed fields of information to be supplied under the notice requirement in Section 8 are a good measure. However, notice would continue to put a large cognitive burden on data principals. Solutions for operationalising notice should be given more attention.
Notice requirements under Section 8 should apply to government processing of personal data as well. Clear information about how the State processes information for carrying out its functions is the first step in making the State transparent and accountable. Section 8(1)(e) specifies that the basis of processing and the consequences of the failure to provide such personal data be notified when processing of personal data is based on grounds under Sections 12 to 17, and of sensitive personal data on grounds under Sections 18 to 22. This is absolutely necessary; however, it is unclear why the various grounds have only been mentioned under Section 8(1)(e), when they should be applicable across all the different sub-sections.
We appreciate that the normative value of consent is acknowledged. We recognise that it cannot be the sole feature on which the empowerment of data principals rests. However, it is currently undermined by, among other things, the other wide grounds of processing.
Consent withdrawal under Section 12 should be meaningful. If all legal consequences of the withdrawal are to be borne by the data principal, even where, say for example, the purpose of processing is altered and notice given, this puts the data principal in a position of powerlessness.
Section 13, or the processing of personal data for the functions of the State, is so broad that it undermines the purpose of the draft Bill. As many of the processing functions of the State are carried out in partnership with private companies, holding them to a different standard than data fiduciaries processing in a non-State capacity would shortchange the latter as well as harm citizens. Even if Section 37 were to limit what data processors can do with the data shared by a data fiduciary under a contract, this broad ground remains a concern.
Section 15, or processing of personal data necessary for prompt action, can be done on the ground of ‘breakdown of public order’ — a broad ground which is ripe for interpretation. As internet shutdowns in the country have shown, similar mandates to tackle public emergencies and ensure public safety have generally been construed in the service of censorship and stifling of dissent, among other ends.
The ground for processing of personal data for purposes related to employment is too broad, allowing for increasingly intrusive workplace surveillance methods. Particularly, Section 16 (1)(d) which allows processing if it is ‘necessary’ for ‘any other activity relating to the assessment of the performance of the data principal who is an employee of the data fiduciary’, allows many kinds of surveillance to thrive, including profiling on the basis of social skills, personal circumstances etc. Section 16 (1)(d) should be omitted.
Processing of personal data for reasonable purposes is also overbroad. Where ‘necessary’ and ‘public purpose’ have not been better defined, this ground can end up leaving far too much room. In addition to narrow and clear definitions, if this section is retained, data principals should have the option to opt out of processing for public purposes.
Further, ‘prevention… of any unlawful activity’ is absolutely unacceptable as a reasonable ground. Technologies promising prevention of crime are riddled with problems of bias amplification and unfair targeting. Preventive technologies are a classic case where the firms dealing in them profit from their uptake, while the risks are socialised.
Processing of publicly available personal data is another dangerous ground. That this has been specified as a ground for which the Authority can outline reasonable purposes, even as there has been recent public uproar against social media surveillance hubs, is sufficient indication of the disregard for what is considered ‘reasonable’ in the country. The Authority also has the discretion to make notice requirements under Section 8 inapplicable. This is harmful, and should be omitted.
This chapter institutes rights like the right to data portability under Section 26, which are very much welcome. However, a number of standard-issue rights that empower data principals in the age of large-scale data processing are missing.
A right to erasure should be included in addition to the right to data portability. Providing the latter in the absence of the former only allows data principals to exercise choice and fosters a competitive ecosystem; it does not give them control over where their data continues to reside. The data storage limitation in Section 10 requires that data be deleted once the purpose of processing is complete, but in the absence of this being enshrined as a right, its value is much diminished, as data principals lack the tools to enforce it.
The right to be forgotten under Section 27 only provides a right to limited disclosure. This is insufficient, as it does not let data principals exercise their autonomy. However, a right to erasure should be balanced with a right to freedom of speech and expression, as has been attempted for the current, limited right to limit disclosure, so that information about, say, public officials in the public domain or that is in the public interest is not erased.
A right to object to profiling, especially where it can affect the legal rights of persons, should be included, and, crucially, should be applicable to government agencies as well. It is concerning that an explicit provision has been created for the Authority to make a certain kind of profiling possible, by allowing it to specify credit scoring as a reasonable ground for processing. While traditional credit scoring comes with its own set of biases, algorithmic credit scoring takes this flawed and discredited method and multiplies the harms that credit scoring systems produce. It should be great cause for worry that allowing conclusions to be drawn from datasets which have nothing to do with financial transactions is a selling point of such systems: existing credit scoring companies draw on data from social media profiles, on what games users play, and so on. Pushing access to banking and credit in a large way, on the one hand, while subjecting data principals to unfettered profiling on the basis of their data, on the other, is exploitative. The horrors of such scoring have been widely documented in other parts of the world.
Further, a true commitment to the collective good can be shown only by recognising that the draft Bill should apply to automated processing (including profiling) of not just personal data, but also non-personal data which leads to conclusions being drawn about groups. Without this, the many paragraphs dedicated in the report to the insufficiency of individual rights to empower citizens would be empty rhetoric. In the GDPR, this is recognised in Articles 4(4) and 22, although the extent to which it provides protection has been criticised and can do with further improvement as well.
A right to explanation should be part of the bundle of rights. Various jurisdictions are already moving towards algorithmic legibility: the GDPR enshrines a right to explanation, and the New York City Council has passed a law requiring algorithmic accountability. Making opaque decision-making systems subject to scrutiny is one of the most important things that data protection legislation should do, so that citizens have knowledge of, and recourse against, algorithmic decision-making that can affect their rights.
The right to confirmation and access under Section 24 should be strengthened. As it currently stands, data principals will only receive ‘a brief summary’ of processing activities undertaken by the data fiduciaries. This is not sufficient and should be expanded to require a copy of data held by the data fiduciary to be made available to the data principal.
This Chapter brings home many of the fundamental conceptual faults of the draft Bill. The many justifications in the report for blanket data localisation requirements are not fully substantiated, and the requirements positively increase the vulnerability of data principals and decrease their autonomy and choice.
One of the justifications offered in support of data localisation requirements for personal data is that they would increase the safety of such data. Given the weak safeguards in the draft Bill, this is untrue — if anything, localisation increases the chances of data principals’ data coming under overly broad government surveillance. Even assuming the premise of enhanced safety were true, if data fiduciaries are free to keep copies of personal data outside the territorial limits as long as a copy is stored in India, the claims of enhanced safety are defeated. Such a requirement would also prevent smaller entities from providing services to persons based in India, impeding the choice of data principals.
By missing the chance to put in place accountability measures that allow the government to access personal data of data principals only in narrow circumstances where necessity and proportionality have been established, enforcement in fact suffers. Take, for example, the most pertinent negotiations at the moment: those with the United States, home to most of the tech giants with a dominant presence in India, under the Clarifying Lawful Overseas Use of Data Act (CLOUD Act). The Act requires, among other things, clarity of procedure in order to create a smooth channel for requests for personal data for law enforcement purposes.
Where absolutely required for clear and specific reasons, localisation requirements can be imposed on specific sectors, such as data processed for the purposes of defence, but a blanket requirement should not be imposed, as it ultimately hurts user choice and makes users vulnerable to greater surveillance.
Such a blanket data localisation requirement also imposes a very heavy burden on power generation in the country, a cost that has not at all been factored into the analysis of whether such requirements are desirable. It should be considered.
The term ‘critical personal data’ under Section 40 has, in any case, not been defined in the draft Bill, and its purpose here is unclear.
The draft Bill does not go far enough to ensure the independence of the Data Protection Authority. Given that the bill vests numerous functions in the Authority — from research, advisory roles, rulemaking, monitoring, inquiry, and building public awareness to adjudication powers — the Authority should be independent. The composition and appointment of the Authority under Section 50 is entirely up to the Central government. This is a serious compromise to the Authority’s independence.
The composition of the Authority under Section 50 should include a requirement for representation of minority groups. Similarly, under Section 68 of the draft Bill, the appointment of the Adjudication Wing is entirely up to the Central Government. This, too, seriously compromises the adjudicatory functions of the Authority.
We support the proposal for the Authority to be governed by a managing board. We add that such a body should have representation of data principals, including specific representation of minorities, and that clear terms of reference should be specified under the draft Bill itself. Any further rulemaking by the Authority will benefit immensely from a required open and transparent process of consultation. This allows for inputs about innovations in processing, as well as in privacy-enhancing technologies, that the Authority might have missed, and gives all those affected by new rules the opportunity to help shape them. The quality and extent of engagement that TRAI consultations have seen is a testament to this, and the format of the TRAI consultations — with Consultation Papers, comment rounds, the opportunity for counter-comments, public availability of all inputs, and Open Houses enabling direct dialogue — should be prescribed in the Bill itself for consultations by the Data Protection Authority.
This Chapter gives wide room for discretion to the Central Government — for example, in Section 98 ‘Power of Central Government to issue directions in certain circumstances’, Section 106 ‘Bar on processing certain forms of biometric data’, Section 104 ‘Power to exempt certain data processors’, and Section 107 ‘Power to make rules’. This makes the Authority amenable to the government in power and compromises its independence.
As noted before, the non-application of the draft Bill to non-personal data should be reconsidered, as conclusive anonymisation is not possible, and harm can also accrue from certain kinds of processing of de-identified data sets.
The substitution of Section 8(1)(j) of the Right to Information Act, 2005 by Section 112 and the Second Schedule undermines the RTI Act. As at least one member of the Central Information Commission has noted, the substitution uses loose language that allows information which ‘relates’ to personal data and is ‘likely’ to cause harm to be withheld, once it is ascertained that such harm outweighs the public interest. We support the joint statement by RTI and privacy activists on the amendments to the RTI Act proposed by the Srikrishna Committee.