Police surveillance technology in India reinforces caste prejudice
A team of lawyers and activists says that introducing surveillance tools into the criminal justice system amplifies its bias
From healthcare services to policing and vaccination drives, the Indian government has rolled out a series of ambitious technology platforms in an attempt to streamline services for its population of 1.3 billion. Yet a number of these projects, including Aadhaar, the country’s controversial biometric identification system, and mass CCTV surveillance, have exacerbated the marginalization of poor and vulnerable groups.
In May, a team of lawyers and researchers from the Criminal Justice & Police Accountability Project, a Bhopal-based initiative focusing on the criminalization of marginalized caste communities, published an essay on the website of The Transnational Institute, an international research organization linking scholars and policymakers. It outlined how the use of law enforcement technology, including biometrics and video surveillance, is accelerating caste-based discrimination in India’s second largest state, Madhya Pradesh.
The team examined the police treatment of socially excluded groups in Madhya Pradesh. I spoke with Nikita Sonavane, a lawyer and the project’s co-founder, to find out more.
Coda: What drove you to study the links between law enforcement technology and caste prejudice in India?
Nikita Sonavane: We are on the path to digitizing the policing system. So, as people who work with the communities that are constantly targeted by the police, we’ve seen that the surveillance methods — the way that certain communities have been policed historically — are something that will be bolstered by the advent of new technology. For us it was important to see what the possible ramifications of this sort of wide-scale digitization in India could be.
Coda: Law enforcement agencies in India have long monitored certain caste groups because they are perceived to be “likely” to commit crimes. What is digital surveillance doing to worsen this kind of caste prejudice?
NS: The principle of criminal law is innocent until proven guilty. That principle is already overturned for a lot of these communities because their criminality is presumed. And now that criminality and inequality will be digitally encoded. To put it very simply, it will give rise to this parallel digital caste system.
Coda: In your essay you write about India’s Crime and Criminal Tracking Network & Systems (CCTNS), which links every police station in the country online in real time and allows law enforcement to access a digital repository that includes police crime reports and individuals’ biometric data, such as photographs and fingerprints. What are your key concerns here?
NS: The CCTNS has information not just about the person who has been considered to be a habitual offender — in terms of where they live, what kind of assets they own and other demographic details about them — but also has details about their friends and their family members.
Coda: We’ve seen Prime Minister Narendra Modi’s government launch technology-based platforms like CoWin during the pandemic, in order to boost vaccination drives. CoWin has received widespread criticism on grounds such as digital exclusion. Why is this government so keen on technology?
NS: It’s twofold. For the government it’s always a question of being able to exert a greater degree of control over its citizens, particularly since 2014, when the current government came to power.
With the CoWin portal, the vaccination program has been reduced to a sort of lottery system at best. Because we’re living in a country where there is an extensive digital divide, the majority of the population of this country will not be able to access that portal and therefore will not be able to get vaccinated.
Coda: You refute the government’s claim that systems such as the CCTNS will allow for “objective,” “smart,” error-free, algorithm-based policing. Do you think such technology could be efficient in any scenario?
NS: Absolutely not. The idea that technology is going to make something that is inherently biased, oppressive and rooted in principles of casteist predictive policing is absolutely flawed. We have already seen that happen with CCTNS. Because in several states these sorts of systems have become the basis for surveilling certain communities, certain neighborhoods.
This is not an implementation question, this is a design question that cannot be fixed by technology or anything else. If anything, it will create this facade of efficiency and nothing more.