Designing for Democracy: Does the Personal Data Protection Bill 2019 Champion Citizen Rights?
Though the Personal Data Protection Bill has been hailed as a success for citizen rights, a closer analysis reveals evidence to the contrary.
This is part of a six-article series on questions surrounding data, privacy, artificial intelligence, and more. You can read the introduction here.
The Personal Data Protection (PDP) Bill, 2019 has elicited responses ranging from the congratulatory to the critical. Hailed by some as progressive for user rights and data privacy, the bill nevertheless takes away with one hand what it is perceived to be giving with the other. The sleight of hand lies in notionally offering a vision of a healthy user privacy framework that is ultimately trumped by the breadth of exemptions the government permits itself when helping itself to user data. Though the bill’s name rhetorically positions it as one that privileges users and their data, much of the bill, in the final analysis, is concerned with how the Indian government positions itself with regard to the handling of user data by corporations and by itself.
The landmark Puttaswamy judgment, which affirmed that privacy is a fundamental right, especially in light of the challenges posed by the ubiquity of the digital, forms the backdrop against which these conversations unfold. While the ball was set rolling by the Srikrishna expert committee, civil society stakeholders regarded the committee’s discussions as somewhat suspect owing to irregularities in its composition: the majority of its members had a record of being sympathetic to Aadhaar and of opposing the Puttaswamy judgment’s pro-privacy position. Notably, civil society objections highlighted the “complementarity” (Chisti 2017) of the right to information with the right to privacy, which allows citizens access to decisions taken by the government with regard to their personal and non-personal data:
“This includes matters of privacy, surveillance, aggregation of data, the commercial collection of data and its use and, more broadly, data used to restrict constitutional and other rights” (Chisti 2017).
Disregarding these objections, the Srikrishna Committee report recommended the publishing of the PDP Bill, and though the bill was open for consultation till October 2018, it has not been revealed which recommendations, and whose, shaped its final form. This version was accompanied by a report on the Indian digital economy, ambitiously (though possibly somewhat misleadingly) titled “A Free and Fair Digital Economy: Protecting Privacy, Empowering Indians” (Committee of Experts under the Chairmanship of Justice B N Srikrishna nd). The PDP Bill was introduced in the Lok Sabha in 2019 and is currently open for comments from stakeholders (Agarwal 2020).
The PDP Bill has given the government yet another opportunity to flex its paternalistic muscle: much is made of its protectionist strategy of data sovereignty against the continued onslaught of data colonialism enacted by foreign technology corporations such as Facebook and Google. Commerce and Industry Minister Piyush Goyal made this assertion at the G20: “The data of a country […] is best thought of [as] a collective resource, a national asset, that the government holds in trust, but rights to which can be permitted” (Agrawal 2019). The phrasing is redolent of the other well-worn metaphor of data as the “new oil,” used time and again to justify the collection of citizen data by as many means as possible. As a speaker at Medianama’s roundtable on the PDP Bill put it:
“Just as oil found on private land is not private, but is the State’s, the argument here is also that data belongs to the government, and it can take it over. This is a very problematic way to look at non-personal data” (Jalan 2020).
In its current form, then, the PDP Bill seemingly positions itself as magnanimous with regard to the rights of the user, or the data principal (as users are characterised in the bill). The first Indian bill to directly address user privacy, it aligns itself with some tenets of the European Union’s (EU) General Data Protection Regulation (GDPR), allowing for rights to data portability and explainability, as well as the right to object to the processing of one’s data and to decisions arrived at from such processing. The bill mandates the establishment of a Data Protection Authority (DPA), which, on paper, safeguards the interests of data principals by challenging irregularities on the part of data fiduciaries, which include both private companies and the state (though the latter with considerable caveats, as will be discussed later). The right to explainability empowers the DPA to inform data principals as to how their data might be used, made possible by a suggested mechanism called the consent manager: a portal that allows data principals to see how their data is being used and by whom. By putting this in place, the bill ensures that challenges to data fiduciaries are ring-fenced by what the DPA deems legitimate, rather than permitting alternative modes of inquiry, such as filing right to information (RTI) requests.
However, the seeming largesse of creating the DPA in the interests of citizens is already compromised by the proposed composition of the DPA itself: the selection committee that will choose the DPA’s members has no representation from the judiciary, in contrast to the previous version of the bill, which included the Chief Justice of India in the line-up. In its current form, it is difficult to gauge how truly representative the DPA will be of citizen and consumer rights, and this lack of clarity might not be unintentional.
Since 2014, the Aadhaar-plus-National Population Register (NPR) regime has been characterised by the government’s obsessive focus on collecting as much information about Indian residents as possible, which is in turn linked to centralised databases. The increasing emphasis on compulsory enrolment (while Aadhaar is still not mandatory, the NPR, which facilitates enrolment in the National Register of Indian Citizens [NRC], is) and the linkage of the two pose a particular threat to personal data, as the NPR is not regulated by any privacy safeguards.
Against this backdrop, Section 35 of the PDP Bill considerably eases the government’s task of collecting data to compulsorily register its citizens, flagrantly disregarding the scope allowed by the Puttaswamy judgment, which permits only “necessary and proportionate” processing of data by the government under a limited number of conditions. In addition, the bill’s emphasis on data localisation allows the government to collect data on transactions that, before the bill was tabled, would ordinarily have been processed outside the country.
In a move that further compromises user anonymity, the bill also suggests that social media intermediaries (defined in Section 26 as “significant data fiduciaries” due to the volume of data they handle and process) should verify their users by means of government IDs. This, along with increasingly interventionist provisions that demand platforms use “automated tools or appropriate mechanisms, with appropriate controls, for proactively identifying and removing or disabling public access to unlawful information or content,” ostensibly in a bid to stem misinformation, dramatically tightens the stranglehold on citizens’ rights and freedom of speech.
Before we establish that this bill is less of a success for citizen rights than it purports to be, it is necessary to understand the infrastructural interventions that define the relationship between data principals and fiduciaries. Overwhelmingly, these relationships are enacted on platforms: infrastructures that are simultaneously media and systems of governance, shaped both by design elements such as affordances and by transnational, geopolitical formations, a concept called platform logic (Schwarz 2017).
Platform logic has completely shaped the Indian citizen’s relationship to data and their attitudes towards it. Platforms mediate every aspect of digital life: friendships, personal relationships, consumption, as well as the citizen’s relationship to the state. India is currently a “platform society,” a term coined by Van Dijck et al that “emphasizes the inextricable relation between online platforms and societal structures. Platforms do not reflect the social: they produce the social structures we live in” (Van Dijck et al 2018).
One of the means by which the bill operationalises the DPA’s mandate to protect users is the assignment of a “data trust score” to data fiduciaries. Fiduciaries are granted these scores based on the robustness of their “privacy by design” policies, on the grounds described below:
Figure 1: An excerpt from the Personal Data Protection Bill, 2019
Interestingly, none of these clauses specifically speaks to the design of the interface itself, which plays an absolutely crucial role in ensuring user privacy. In a report published by the CNIL, a French agency championing individual liberties in the digital world, the interface is described as “the first object of mediation between the law, rights and individuals” (le Grendal 2019). The interface of platforms plays a considerable role in creating a frictionless experience of the digital encounter, in large part coating the bitter pill of the reality of database logic. The database, ultimately, is a structured means of storing data that permits easy triangulation of information about each individual: a set of algorithmic means by which to conjure up a layer of insights not necessarily apparent to the user.
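To make “database logic” concrete, consider a minimal sketch (the tables, columns, and records below are entirely hypothetical, purely for illustration): two datasets that look innocuous in isolation, once joined on a shared identifier, surface an insight about a named individual that neither dataset exposes on its own.

```python
import sqlite3

# Hypothetical illustration of "database logic": two datasets that seem
# innocuous in isolation can be joined on a shared identifier to yield
# an inference about an individual that neither dataset contains alone.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# Dataset 1: a loyalty programme's purchase log (hypothetical).
cur.execute("CREATE TABLE purchases (user_id TEXT, item TEXT, store_pincode TEXT)")
cur.executemany(
    "INSERT INTO purchases VALUES (?, ?, ?)",
    [("u42", "glucose test strips", "560001"),
     ("u42", "insulin syringes", "560001")],
)

# Dataset 2: an ID-verification record held by the same fiduciary (hypothetical).
cur.execute("CREATE TABLE kyc (user_id TEXT, name TEXT, employer TEXT)")
cur.execute("INSERT INTO kyc VALUES ('u42', 'A. Kumar', 'Acme Ltd')")

# The join "triangulates": it links a named, employed individual to a
# probable medical condition, an insight the user never volunteered.
for row in cur.execute(
    "SELECT k.name, k.employer, p.item "
    "FROM kyc k JOIN purchases p ON k.user_id = p.user_id"
):
    print(row)  # ('A. Kumar', 'Acme Ltd', 'glucose test strips'), ...
```

No interface surfaces this join to the user; it happens entirely in the layer that the frictionless front end is designed to conceal.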
Simply put, control over the creation of these insights is at the heart of user privacy. What currently stands in for the legislation of user privacy are two tactical feints, now well rehearsed by platforms both corporate and governmental: “notice and choice” and “digital/privacy literacy” (Helm and Seubert 2019). Both principles derive from the increasing responsibilisation of the individual under neoliberalism: “the process of transferring responsibility from one actor to another, usually from state agencies to individual social actors” (Wakefield and Fleming 2009).
The “notice and choice” paradigm hinges on the quality of the choice, and of the information, offered by the platform when negotiating user consent. As a letter by concerned petitioners put it, any consent framework must be both relevant to people and take into consideration Indian realities, so that citizens are able to differentiate between manufactured and informed consent. However, this model has become more akin to a bait and switch, in which users, coerced by the threat of exclusion (from welfare schemes, as in the case of Aadhaar), take on the responsibility of keeping their own data safe and help “platforms to distract attention from their responsibility.” Similarly, privacy literacy assumes individual users are empowered enough to protect their own privacy and, conversely, puts the onus of blame on the individual when they fail to do so. As Helm and Seubert put it:
"Regimes of responsibilization can further be considered problematic because they mistakenly perceive the ideal of agency through education as a catchall solution even in situations where the problem is not as much a lack of individual literacy as it is a structural one that refers to power asymmetries created through exposure, surveillance and the asymmetric distribution of data power."
To expect that, in a country like India, citizens who are largely disenfranchised by poverty and illiteracy should have to constantly look over their shoulders to protect their own data from their own government demonstrates a gross desertion of care. Some of the most dangerous assumptions made regarding users are predicated on a homogeneity of mental models with regard to concepts such as privacy and consent. Consent, as defined in Section 11 of the bill, must be free, informed, specific, clear, and capable of being withdrawn as easily as it is given. However, most Indian users have neither this understanding of nor this relationship with consent when it comes to using technology: even the most educated are barely aware of what rights they are relinquishing when they tick the “I accept” box at the bottom of an end user license agreement. The recent case of the national tea chain Chaayos using facial recognition, seemingly without any privacy safeguards in place, raises the question of how such relationships around consent will be negotiated by data principals and fiduciaries. In a hypothetical scenario, Bailey and Bhandari (2020) write how in such situations data principals will be able to “manage their consent, through an interoperable platform. Thus, the Chaayos customer can write to Chaayos, either directly or to the consent manager, to exercise her rights of confirmation, access, correction, erasure, and data portability. The withdrawal of consent can also take place through the consent manager.”
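What Section 11 asks of consent can be made concrete in a short sketch. Everything below is an assumption for illustration (the ConsentRecord type, its field names, and the withdraw method appear nowhere in the bill): it simply records consent as specific to one purpose, traceable to the notice that informed it, and revocable in a single step, the way a consent manager of the kind Bailey and Bhandari describe might hold it.

```python
from dataclasses import dataclass, field
from datetime import datetime
from typing import Optional

@dataclass
class ConsentRecord:
    """A hypothetical record of consent with the properties Section 11
    demands: free, informed, specific, clear, and as easy to withdraw
    as it is to give. None of this structure is specified in the bill."""
    principal_id: str   # the data principal who consents
    fiduciary_id: str   # the data fiduciary requesting consent
    purpose: str        # "specific": tied to a single stated purpose
    notice_shown: str   # "informed": the notice the user actually saw
    given_at: datetime = field(default_factory=datetime.utcnow)
    withdrawn_at: Optional[datetime] = None

    def withdraw(self) -> None:
        # "Withdrawn as easily as it is given": one step, with no
        # further conditions imposed on the principal.
        self.withdrawn_at = datetime.utcnow()

    @property
    def active(self) -> bool:
        return self.withdrawn_at is None

# A consent manager would hold and mediate such records on the
# principal's behalf:
record = ConsentRecord(
    principal_id="principal-001",
    fiduciary_id="chaayos",  # hypothetical identifier
    purpose="facial recognition at checkout",
    notice_shown="in-store privacy notice, v1",
)
record.withdraw()
print(record.active)  # False
```

Even granting such a mechanism, each of those fields presumes a user who reads notices, distinguishes purposes, and knows the withdrawal step exists.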
This sort of expectation of a proactive, granular response from users, who are constantly bombarded with requests for their data in order to use any service at all, is unreasonable and unrealistic. In addition, the government has explicitly exempted itself from being a significant data fiduciary, which means that none of these methods of recourse apply to it, leaving citizens utterly without insight into how their data is being used.
What is still unclear is how the proposals in the bill will be enacted to ensure that “transparency is meaningful in practice” (Wagner et al 2020). To assure such transparency, the design of interactions, interfaces, and user experience needs serious reconsideration, in order to intentionally avoid the use of dark patterns, which Ducato and Morique (2018) define as:
"...choice architectures used by many websites and apps that exploit individuals’ biases and heuristics to maliciously push users into doing something that otherwise they would not have done, if properly informed. Such practices are becoming particularly controversial in the data protection domain, where users are de facto forced to give consent and accept a specific privacy setting, decided by the operator of the website or the app."
Consequently, exercises such as the audit[1] of significant data fiduciaries by the DPA should, ideally, ensure that companies are compliant not only with regard to processing data but also in how accessible they make their privacy policies, through language, meaning, and interface, so that users are not at a disadvantage when using apps and websites that request their data.
An even more intractable issue is what happens to the ownership of ancillary user data. If a data principal uses the right to portability to port information about themselves from one platform to another, they will inevitably pull along not only information about themselves but their entire social graph (information about the users in their network; think Facebook friends, and perhaps even friends of friends), and there is currently no provision in the bill to account for such a situation.
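A toy sketch makes the problem visible (the export format, the function, and the data are all invented for illustration): even a minimal portability export of one user’s account drags in identifying information about everyone in their network, none of whom consented to the transfer.

```python
# Hypothetical illustration of the social-graph problem with portability:
# exporting one principal's account necessarily exports data about others.
users = {
    "asha":  {"name": "Asha",  "friends": ["bilal", "chen"]},
    "bilal": {"name": "Bilal", "friends": ["asha"]},
    "chen":  {"name": "Chen",  "friends": ["asha"]},
}

def export_for_portability(user_id: str) -> dict:
    """Build a portability payload for one user (invented format)."""
    profile = users[user_id]
    return {
        "profile": {"name": profile["name"]},
        # The social graph: data about *other* principals, carried along
        # without any consent mechanism on their part.
        "connections": [users[f]["name"] for f in profile["friends"]],
    }

print(export_for_portability("asha"))
# {'profile': {'name': 'Asha'}, 'connections': ['Bilal', 'Chen']}
```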
Ownership becomes even more complicated once a data principal’s data has been used by a machine learning algorithm in combination with other datasets: how one tracks the provenance of that data in order to claim it remains contested.
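A small sketch illustrates why (everything here is hypothetical, and “training” is reduced to a simple average): even if each source dataset is fingerprinted before processing, the resulting model parameter is an aggregate from which no individual record’s contribution can be recovered or reclaimed.

```python
import hashlib

# Hypothetical sketch: provenance can be logged at the dataset level, but
# once records are merged and reduced to model parameters, an individual's
# data can no longer be isolated or "claimed back" from the result.
dataset_a = [("u1", 4.0), ("u2", 6.0)]  # records from one fiduciary
dataset_b = [("u3", 8.0)]               # records from another

def fingerprint(dataset) -> str:
    """Dataset-level provenance: a hash of the records as ingested."""
    return hashlib.sha256(repr(dataset).encode()).hexdigest()[:12]

lineage = {"dataset_a": fingerprint(dataset_a),
           "dataset_b": fingerprint(dataset_b)}

# A stand-in for "training": the merged records collapse into one number.
merged = dataset_a + dataset_b
model_parameter = sum(value for _, value in merged) / len(merged)

print(lineage)          # we know *which datasets* went in...
print(model_parameter)  # ...but 6.0 carries no trace of u1, u2, or u3.
```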
These questions and examples bring us back to the question of privacy literacy raised earlier in this article. That is, how citizen-friendly can the PDP Bill truly be when it puts most of the onus on the citizen to track every move their own data makes? John Berger (2003) once wrote:
“Democracy is a proposal (rarely realized) about decision making; it has little to do with election campaigns. Its promise is that political decisions be made after, and in the light of, consultation with the governed. This is dependent on the governed being adequately informed about the issues in question, and upon the decision-makers having the capacity and the will to listen and take account of what they have heard.”
In order for this bill to fully represent the citizen, be they located at the last mile or at the cutting edge, consultative processes that prioritise how technology is used and experienced must be put in place; otherwise, it runs the risk of being a document that guarantees its constituents’ rights only on paper.
The following comic, titled ‘Dad and His Invisibility Mundu,’ is a speculative piece from the perspective of an 8-year-old, describing the manifestations and possible lived experiences that affect families when the legality around data and privacy moves away from what is humane and just.