
Automated surveillance technology that uses drones to spot problematic human behavior in crowds is going to be tested at the Technozion and Spring Spree festivals at NIT Warangal, reports The Verge. Lead researcher Amarjot Singh of the University of Cambridge claimed that the system identifies violent poses with 94% accuracy. However, this accuracy drops as more people enter the frame (as there would be at a festival): to 79% with 10 people in the frame, for example.

Police surveillance has grown without much scrutiny in recent years, and the laws governing it have grey areas in which a lot of video surveillance technology currently operates. Reported applications include behavior recognition, as in the case of the surveillance drones reported in The Verge, and facial recognition linked with police records, including tagging personal information with Aadhaar and sharing it across states.

An increasing number of cities have police using various kinds of surveillance databases to get better information on suspects and potential criminals. These databases, to which individual policemen can add people's information, have some disturbing implications. Several cities are using facial recognition software to help policemen keep track of criminals.

Surveillance of everyone, not just criminals or suspects

There are several cities where CCTV camera networks scan everyone on the street and match their faces against a database of suspects and criminals. Here is a partial list:

  • In 2015, Surat became the first city in India to deploy real-time surveillance through facial recognition systems when it implemented NEC India's FaceWatch in collaboration with Innovative Telecom & Softwares. The system uses live feeds from a growing network of CCTV cameras and can be used to monitor for crime in real time. It is capable of facial recognition as well as automatic number plate recognition. Also, "It automatically matches faces against a database of 30,000 criminal mugshots and can alert the police immediately of anyone on a watchlist." By August, Surat had 604 cameras in 114 locations, covering 10% of the city, with plans to add another 900 cameras in a year and bring the total to 2,500 in two years.
  • In 2015, Hyderabad police launched vehicle mounted CCTV cameras with a 360 degree view and ability to store footage for 15 days.
  • In 2016, Mumbai made operational 4,617 CCTV cameras hooked to the RTO control room, backed by 1,000 GPS-fitted vehicles coordinating with the control room, with the objective of tackling law and order, fighting and preventing crime, regulating traffic and detecting traffic-related offences. These cameras are also capable of automatic number plate recognition as well as facial recognition. Additional Chief Secretary (Home) K P Bakshi told the Indian Express, "We can search for an individual all over the city. The cameras will identify the face of a wanted criminal. The camera will also pick out faces of persons roaming around continuously in one place. The nearest police van will then be alerted about the person's location."
  • In 2016, 160 CCTV cameras were installed in Visakhapatnam as part of a hi-tech surveillance network.
  • In 2016, in Vijayawada in Andhra Pradesh, NEC's facial recognition system was used to identify suspects and criminals at the Krishna Pushkaram religious event, which sees around 50 million pilgrims attending to take a holy bath in the Krishna river.
  • In 2017, Jaipur police trialed a facial recognition system with cameras installed outside the Ganesh temple at Modi Doongri and controlled from the command and control centre called "Abhay". The FRS would scan the people before it and match them against a database of serial offenders and suspects.
  • In 2018, cameras with facial recognition technology are expected to be in use in local trains on the Central line in Mumbai by the end of the year, at a total cost of Rs 276 crore. The cameras "will store facial details of commuters (for 10 days). The cameras with facial recognition software would help trace past movements of any offender on a local train and arrest the person when he travels next." A total of 11,160 cameras will be procured - 76 cameras for each rake, with at least 6 cameras in each coach.
  • In 2018, Hyderabad city police are matching the faces of everyone on the city's streets against a database of one lakh criminals, from the control room at the Facial Recognition Analytics unit at the Commissioner's office at Basheerbagh. IT Cell in-charge K. Sreenath Reddy said that the local police are alerted only when the resemblance is more than 70 per cent.
  • Thiruvananthapuram police are using 233 cameras in their surveillance network of the city.
  • Paradip in Odisha is to get a CCTV surveillance camera network within a month.
  • Retired ACP Dhoble (of hockey-stick-wielding moral policing fame) is now in the process of getting facial recognition software for the city, and believes it needs to be created with the "help" of his son Kshitij, who specialized in Artificial Intelligence at Auckland University. An effort that initially began with the goal of tracing missing people has expanded its objective to "tracking criminals" as well. "Meanwhile, they began compiling the information of all 15,847 police stations in India and uploaded it on the site. One aspect of the site is uploading the information of these police and stations. The other is to spot child beggars, labourers and send it to the site."

Police database for use with mobile app - FaceTagr

This is a database of criminal records that can be used with facial recognition software (FaceTagr) installed on the Android mobile phones of beat policemen and inspectors working in the field. When a policeman scans a suspect's face, the mobile app returns the police cases filed against, and the police station limits of, the criminal the face matches with. Being expandable, the database has the potential to store the records of criminals across the country.
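The matching step described here - compare a scanned face against stored records and return a hit only above some similarity threshold - can be sketched as follows. This is a minimal illustration, not FaceTagr's actual implementation: real systems derive face embeddings from a trained neural network, while here the embeddings, the record IDs and the 0.7 threshold (echoing the 70 per cent resemblance figure reported from Hyderabad) are all made-up assumptions.

```python
# Illustrative sketch of threshold-based face matching. NOT FaceTagr's
# actual implementation: real systems compute embeddings with a trained
# neural network; here the "embeddings" are random vectors.
import numpy as np

def cosine_similarity(a, b):
    """Cosine similarity between two embedding vectors."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def match_face(probe, database, threshold=0.7):
    """Return (record_id, score) of the best match above threshold, else None."""
    best_id, best_score = None, -1.0
    for record_id, embedding in database.items():
        score = cosine_similarity(probe, embedding)
        if score > best_score:
            best_id, best_score = record_id, score
    return (best_id, best_score) if best_score >= threshold else None

# Hypothetical database of record embeddings (names and sizes invented)
rng = np.random.default_rng(0)
db = {f"case_{i}": rng.normal(size=128) for i in range(1000)}

# A noisy re-capture of a face already in the database
probe = db["case_42"] + rng.normal(scale=0.05, size=128)
print(match_face(probe, db))
```

The point of the threshold is the trade-off the Hyderabad figure hints at: set it low and the app returns matches for innocent look-alikes; set it high and genuine matches are missed. Nothing in the public reporting says how the deployed systems chose theirs.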

The application, originally built by Vijay Gnanadesikan, CEO of Haliscape Business Solutions, to help rescue children by matching records of missing and found children, was first trialled for police use in Chennai.

  • In 2017, FACETAGR was adopted by the T Nagar police station of Chennai, beginning with a database of 12,000 criminals. An additional 40,000 suspects were added to the app to improve the chances of police identifying faces. The app is used by policemen to "scan" suspects; once a suspect is scanned, the app returns information about them.
  • In 2018, Chennai police will expand the use of FACETAGR to include interstate criminals by extending the data used by the application to other southern states. Currently the database has information on 67,000 criminals, including information sent by the Puducherry Crime Records Bureau, and is awaiting data from Andhra Pradesh, Telangana, Kerala and Karnataka. The application is in use in 10 out of 12 police districts and is installed on the phones of beat constables. 18 inspectors, sub-inspectors and 150 beat police of Washermanpet were the latest to get the app, with "700 criminals in A, A plus, B and C categories".
  • Chittoor adopted the app in December 2017 with data of 10,000 sandalwood smugglers and 3,000 suspected criminals.
  • Puducherry also adopted FACETAGR, in March 2018.

e-Petty

The e-Petty app is being used across Telangana state to book cases for minor crimes under sections of the IPC, the City Police Act, the Gaming Act, COTPA 2003, the Motor Vehicle Act and the Town Nuisance Act. The app can record photographic and video evidence from the crime scene and photographs of suspects, and can generate an automatic chargesheet based on the evidence. The app also tracks individuals' previous cases and can identify repeat violators, because it links profiles online with Aadhaar card numbers.

Hyderabad

Hyderabad is probably the most surveilled city in the country. The Integrated People Information Hub pulls data from dozens of sources to create profiles of individuals that include not just their own comprehensive information, but that of parents as well. It is a data hoarding machine gone rogue, where there appears to be no reason or reasonable suspicion required to put citizens under surveillance. The surveillance includes call records, social media, relatives and friends, utilities and more.

Questions raised

The use of aggregated databases and Artificial Intelligence in large-scale applications is new in India, and the laws do not yet provide the necessary support for, or restrictions on, their implementation. There is no doubt that information is power, and information on suspects and criminals empowers police to do their jobs better. But the lack of proper laws, policies, protocols and facilities for the police to record and access information securely has led to the adoption of various technologies in an ad hoc manner with little oversight.

However, large-scale use of such applications raises several serious questions:

  • Is it constitutional to treat every person as a potential criminal? When everyone entering the range of a facial-recognition-enabled camera is scanned and matched against databases of criminals, it amounts to intrusive surveillance. India lacks a data protection law or a law defining the contours of privacy; however, the recent robust arguments against surveillance, and observations by judges in the constitutional challenge to Aadhaar, are very clear that Indians do have a right to privacy and that surveillance violates this right.
  • Data ownership: FaceTagr is owned by Haliscape Business Solutions Pvt Ltd of Chennai. NEC is a global organization. It is unclear who owns or protects the data in these databases and what restrictions exist against its misuse.
  • Data access: Cortica, a foreign AI company, has formed a partnership with the Best Group to analyze CCTV footage from public cameras to predict crime. While it may be a technologically challenging goal, a foreign company with considerable ties to foreign intelligence has capabilities and access to individuals on Indian streets. The software can use data not just from video cameras but from satellite and drone footage as well, and can analyze human behavior, including differentiating between the nature of crowds - a routine market crowd versus a protest, for instance. In the case of Mumbai, a company run by a software professional and a retired police official appears to have access to information from all police stations in India and is proceeding to build a database! It is unclear how and why software under development by private individuals has access to nationwide sensitive data.
  • A market of the gullible: The lack of proper evaluation, or of policies requiring specific standards, has left the police of India a ripe target for companies selling surveillance products, who may exploit the real need for collecting information, or corrupt insiders, to gain contracts. Many of the technologies described here have not been subjected to robust testing and have no published research about their quality. Some of the stories describe extensive installations that become defunct or are not of adequate quality to begin with, as in the case of Visakhapatnam, left with 3 working cameras out of 160 within 2 years of installation at massive public expense. Others describe extremely efficient systems, but ones that violate the rights of the citizens they are supposed to serve. This risks spending public funds on purposes and methods that may not be in the public interest. There is an urgent need to consult independent experts, digital rights law researchers and other professionals without conflicts of interest to put together guidelines for data collection for surveillance, data destruction once its purpose is served, securing of that data to prevent misuse, and policies on who should have access, with a transparent process for granting such access.
  • Who is a criminal or suspect: It doesn't take a lot for police to consider someone a suspect and there is little oversight. There is no warrant or independent authority required to initiate surveillance against anyone. Such a database has the capacity to take the local prejudices of police across state lines and cause considerable harassment to individuals in all areas covered by such databases.
  • Utility: While there is obviously a need for police to monitor suspects in order to gather evidence, the legality and utility of randomly spotting them on the street is debatable. What is the utility of someone say.... suspected of having conducted a robbery... being spotted in another state - if it even is the same person?
  • Technological limitations: Such "identification" is inherently probabilistic and can be wrong. A good example would be the Welsh police wrongly identifying over two thousand people as potential criminals when they used Facial Recognition at the 2017 Champions League final in Cardiff in a crowd of 170,000 spectators. This has the potential to create a lot of harassment as well as waste police resources when applied to the far bigger numbers of people on the street in Indian cities.
  • Bypassing consent: A person suspected by the police and asked to come for questioning has rights. They can agree or refuse and the police cannot actually force them to say.... stand in a line up to be identified without any due process. Or they may wish to have a lawyer present when interacting with a policeman as a suspect. However, use of software such as this allows a beat constable to completely arbitrarily scan people who may not even realize that they are actually in a situation with the law where they may need to exert choices to protect their interests.
  • Human rights: As often happens when the state adopts technology, the advantages of the technology have been understood and promoted, but there appears to have been little consideration given to human rights implications of falsely accused individuals, potential for corruption through entering or removing entries on the database for bribes or blackmail, consequences of false positives to innocents and other potential fallout. There needs to be better consultation by the state when adopting such technologies with professionals (other than those providing the technology as a solution) to assess the wider impact beyond the immediate problem the technology aims to solve and mitigate the potential for harm.
  • Ability to maintain technology: Of the 160 cameras installed in Visakhapatnam in 2016, only 3 were working in 2018 - and one of those, pointed at the ground, was useless.
  • Aggregated or discrete databases? It is not known whether the databases used to identify criminals through CCTV or the FaceTagr app or e-Petty are linked where they coexist. Aggregation of data across these databases has even more potential for the violation of rights of citizens.
  • Magnifying social prejudices: A simple statistical reality is that positives - whether real or false - will be higher among those who get scanned more. In a country with considerable documented evidence of prejudice against religious minorities and underprivileged castes, classes and communities, the use of such software has the potential to magnify and endorse the prejudices that cause their targeting. Take, for example, reported cases of slums being raided and all the men in them being asked to identify themselves. The chances of these men being identified - correctly or falsely - will always be higher than those of, say, a person living in a gated society where such raids are unheard of, simply because their faces will get scanned more often.
  • Use of Aadhaar for profiling: The e-Petty app used in Telangana is a clear use of Aadhaar for profiling - something the government has consistently denied in the Supreme Court.
  • Lack of appropriate digital security: Apart from data being shared across state borders, hosted on private servers or made accessible to foreign companies - issues of policy, to determine what is appropriate and what is not - there are outright failures of digital security, which result in unintended and unauthorized access to the very sensitive data being collected. Researcher Kodali, for example, had pointed out that the Hyderabad police were using a third-party portal to record and geotag crime. The portal, which had very poor security for the purpose it was being used for, had allowed crime reports to be indexed by search engines for years, including the names of rape victims - which is illegal in India.
  • Lack of independent audit or testing: No information is available on the accuracy of the systems used for large-scale CCTV surveillance or for scanning individuals with a mobile app. The lower the accuracy, the more such systems will waste police resources chasing dead ends and harassing citizens.
  • A need for legislation: It is undeniable that the police need effective ways to access databases to find information on suspects and criminals on the fly. It is also inevitable that this will involve a certain degree of invasion of privacy in the interests of conducting investigations. However, this cannot simply be left to whatever software developers believe can be done or police wish to adopt. There needs to be a regulatory framework that identifies the situations in which such use is legitimate and protects citizens from being arbitrarily entered into databases as suspects. There should also be regulation of what information should remain local and what should be disseminated - a local suspected of robbery does not need to be found across state borders, but an absconding criminal found in the footage of a murder should be. There is also a need for legislation to remove names from the databases when people are no longer suspects - for example, when the cases they were suspected in are closed with others charged.
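The base-rate arithmetic behind several of the points above can be made concrete. The false-positive rate in the sketch below is an assumed, illustrative figure - no accuracy numbers have been published for the systems described here - but the crowd size matches the Cardiff example:

```python
# Back-of-the-envelope base-rate arithmetic for scanning whole crowds.
# The 0.1% false-positive rate is an assumption for illustration; no
# accuracy figures are published for the Indian deployments described.

def expected_false_positives(crowd_size, watchlist_members_present, fp_rate):
    """Innocent people wrongly flagged when everyone in a crowd is scanned."""
    innocents = crowd_size - watchlist_members_present
    return innocents * fp_rate

# A crowd the size of the 2017 Cardiff final (~170,000), with perhaps 10
# genuine watchlist members present, scanned at 99.9% per-face accuracy.
flagged = expected_false_positives(170_000, 10, 0.001)
true_hits = 10  # best case: every genuine match is caught

print(round(flagged))                     # innocent people flagged: 170
print(true_hits / (true_hits + flagged))  # precision: share of alerts that are real
```

Even at an accuracy far better than any published field result, the sheer number of innocent faces scanned means alerts are dominated by false positives - under 6% of alerts in this sketch point at the right person, the same pattern as the South Wales Police flagging over two thousand people at Cardiff. Applied to the far larger daily footfall of an Indian city street, the number of wrongly flagged citizens only grows.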

Further reading:

  • Research published by the Center on Privacy and Technology at Georgetown Law, "The Perpetual Line-Up" on the unregulated use of public surveillance by law enforcement and the risks.
  • Technological bias: While MediaNama was not able to find any research about FaceTagr specifically, "Face Recognition Performance: Role of Demographic Information" by the FBI about accuracy of Facial Recognition in various population demographics is an interesting read on the biases caused by how the system is "trained" to recognize faces.
  • Policy Paper on Surveillance in India by the Centre for Internet & Society

In a rather bewildering tweet today, Madhu Kishwar asked, "Do champions of #RightToPrivacy realise that if 2 women hadn't complained, #babaRamRahim doings in his "Gufa" covered under right to privacy?" The tweet was so absurd that she was met with a barrage of retorts and taunts from the very people she was taunting.

I guess if we are a country just growing into our rights, many debates of this sort will be needed, where clear talking will help more than sarcasm at someone's ignorance. The right to privacy isn't an alien concept, either. We had it already: whether it was stated or not, we had protection against someone violating our space. That is how stalking, spy cams, leaking passwords and the like are already illegal, even though the right itself was explicitly stated only recently. The world has not changed all that much in terms of what is "right" and "wrong". Privacy cannot suddenly make criminal things legitimate - that was just propaganda to influence the case into denying our right, and even the government now agrees we have a right to privacy. What has changed is the position of those whose privacy gets invaded by powerful players like big companies or the government (this judgment emerged directly from a constitutional challenge to Aadhaar on the grounds of privacy) - players who had the power to fudge an unstated right and interpret it to their convenience. Now that the right is explicit, they will no longer be able to fudge so easily.

What the right to privacy will actually entail and exclude as per law will soon be determined, but the general meaning of the term endorses the right of a person to withhold or reveal information about themselves. There is also an implicit requirement that information revealed in confidence must not be shared beyond the purpose it was explicitly authorized for. This is the rock Aadhaar will founder on - a mandatory and unaccountable database of private information on citizens cannot coexist with even the most shoddily defined privacy. And this 9-judge bench has given a most excellent verdict. But I digress.

Publicly available information is not covered by the right to privacy. For example, board results (because there seem to be a lot of jokes about keeping marksheets private from parents). You may refuse to reveal, but if the information is public, they will have access to it anyway. Certifications (no, the Prime Minister and Smriti Irani's degrees don't get covered by privacy either - they have stated the information themselves, what they are refusing to do is provide proofs for officially filed documents). This also goes for information submitted as proofs, etc. For example, if you have to provide proof of address to start a bank account, your right to privacy does not include starting a bank account without proof of residence. But yes, it definitely includes an obligation on the bank to not share it with third parties or use it for purposes other than verifying your address (for example, sending credit card spam).

As a fundamental right, Ram Rahim still has the right to privacy. Just because he is a convicted rapist does not mean you can make personal and confidential information about him public without his consent (public functions are public). Access to personal information can also be mandated for various reasons - this is where the grey areas lie. You have a right to withhold your bank balance, and what you spend your money on, from me - but do you have the right to withhold it from the income tax department? The standard understanding is no, because tax is your duty as a citizen. Others, more extreme, argue that we voluntarily provide our information to the tax authorities, and others don't and choose to be raided instead. The government will no doubt soon be launching some form of propaganda to impose Aadhaar through such grey areas in spite of the recent clobbering in court. But it won't be easy, because there really is no way to prove that the information is necessary in the way knowing income is directly necessary to assess income tax. Nor is such access arbitrary or unlimited: you need a warrant to enter and search premises, for example. You can't randomly check whoever you suspect; you have to prove the need and get a warrant.

Similarly, your right to privacy means you have the right to reveal, at your discretion, what personal information you choose to share. Applied to sex and rape, it would protect homosexuals, for example - which is why they got so excited about the clarity of the wording. Or at least those not engaging in "unnatural sex" in a situation that could be called "public", unless one of them revealed their "crime". Short of homosexuality explicitly being legal, this is considerably better than their previous precarious position of not knowing what boundaries and personal rights they could count on. However, it doesn't protect Ram Rahim from the two victims who complained, because, as their personal experience, the victims were perfectly entitled to reveal it to anyone they wished - even if it had been consensual. And it wasn't, which makes it a flat-out crime. Crimes are mostly committed in private, but if there is enough evidence, or a complaint, an investigation will attempt to access all relevant information. The most robustly defined right to privacy in the world cannot protect a rapist from conviction if his crime is proved, and it cannot prevent an investigation against an accused either. It is a thin line between a legitimate investigation or ethical whistleblowing and a breach of privacy - which is why exposés are so often accompanied by defamation suits.

The right to privacy is a right of persons, not organizations. If it had been not the victims but a third party who came to know about the rapes and complained, the illegality of the act would make the disclosure count as whistleblowing.

Organizations too can have requirements of confidentiality, but they don't have a right to information about them requiring consent to be shared - because they aren't people. Confidentiality requirements of organizations are usually explicit. There are things you can talk about (work timings, coffee maker sucks) and things you can't (trade secrets, business strategies). If the organization was willing to own the rapes as official business of the organization and not a crime that could not be revealed without breaching confidentiality agreements, they are free to sue the whistleblower or the complainants, but a crime that gets exposed remains a crime. An organization that claims it to be its official business would be a criminal organization.

This is also why you have (and need better) whistleblower protection laws - so that confidentiality cannot be used as an excuse to cover up crimes and persecute whistleblowers.

Hope all is clear now.