Facial recognition technology (FRT) is a system that identifies or verifies a person’s identity by analysing their face. Because of its surveillance and control functions, it has a wide range of applications, including in shopping malls, airports, stadiums, concerts, and law enforcement. Whatever its merits, it puts an individual’s privacy and fundamental rights at risk.
What is FRT?
Face Recognition Technology (FRT) is an automated process that compares two photographs of faces to assess whether they belong to the same person. A photograph is first uploaded to the facial recognition software, and certain distinguishing features of the face are then evaluated using a feature analysis algorithm. These evaluations are transformed into mathematical representations known as face templates. To determine whether two faces match, a template is compared against the templates already stored in a database. Apart from helping law enforcement agencies identify perpetrators of crimes, FRT can also be used to verify identities by comparing known templates with captured photographs, an application deployed at border check posts and by immigration agencies. In 2017, the Qingdao police used face recognition technology to identify twenty-five wanted individuals, one of whom had been wanted since 2007. The FRT in that case captured several photographs and videos of each suspected individual and converted them into data, which was reviewed and compared against photographs in the police department’s database, reportedly identifying a person with 98% accuracy in under 20 minutes.
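The pipeline described above — extract features, encode them as a numeric template, then compare templates against a database — can be sketched in a few lines. The vectors, threshold, and similarity metric below are illustrative assumptions for the sake of the sketch, not the internals of any specific FRT product:

```python
import math

def cosine_similarity(a, b):
    """Cosine similarity between two equal-length face templates."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

def is_match(template_a, template_b, threshold=0.95):
    """Declare a match when similarity clears the threshold.

    The 0.95 threshold is a hypothetical value; real systems tune it
    to trade off false accepts against false rejects.
    """
    return cosine_similarity(template_a, template_b) >= threshold

# Toy 4-dimensional templates; production systems use vectors with
# hundreds of dimensions produced by a neural network.
probe = [0.12, 0.80, 0.33, 0.45]
gallery = [0.11, 0.79, 0.35, 0.44]
print(is_match(probe, gallery))  # near-identical templates -> True
```

Note that the choice of threshold is where accuracy claims such as “98%” are made or broken: set it too low and innocent people are flagged as matches; set it too high and genuine matches are missed.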
FRT Applications in India
The deployment of FRT has spread across different institutional authorities, mainly for the purposes of security and identification. Introduced in 2018, the Digi Yatra Policy uses facial recognition to identify passengers, currently at six airports. The use of the Automated Facial Recognition System (AFRS) to track and reunite missing children was approved by the High Court of Delhi in Sadhan Haldar v The State NCT of Delhi. Yet of the more than 10,000 matches the FRT produced across the country, only about 3,000 children were actually identified. As for police authorities, the city of Hyderabad has established a “Command and Control Center” (CCC) that draws on more than 600,000 cameras for facial recognition to track individuals. Nor is this technology limited to the state of Telangana: at least 15 other FRT systems have already been deployed by state and central governments for security and surveillance. In recent years, various states have made routine use of fingerprint and face recognition technology to stop and inspect individuals. The use of such FRT systems and closed-circuit television (CCTV) cameras across different demographics is transforming essential public spaces into privacy-invading zones.
FRTs invading our privacy
In Justice K. S. Puttaswamy (Retd.) v Union of India, a nine-judge bench of the apex court delivered a landmark verdict holding the right to privacy to be a fundamental right safeguarded by the Constitution. It is a natural right, deduced from the golden triangle of articles of the Constitution of India as an essential component of the right to life and liberty. It attaches to the person, covers all information about that person and the choices they make, and is regarded as inalienable and fundamental. It safeguards individuals from state surveillance of their homes, their movements, and their behaviour. As a result, every state action that infringes on an individual’s right to privacy must be subjected to judicial scrutiny. The verdict was essential to prevent future dilution of the right and to address the issues presented by the digital age; hence the apex court adopted a broad reading of the fundamental rights in our Constitution. According to the ruling, individual liberty must extend to digital spaces, and individual autonomy and privacy must be safeguarded. Yet there are no regulations in India that permit the government or its agencies to deploy facial recognition technology, and no legislation specifies the method by which data may be gathered using a CCTV camera. With regard to the right to privacy, Section 69 of the Information Technology Act, read with the Information Technology (Procedure and Safeguards for Interception, Monitoring and Decryption of Information) Rules, 2009, states,
“Where the Central Government or a State Government or any of its officers specially authorised by the Central Government or the State Government, as the case may be, in this behalf may, if satisfied that it is necessary or expedient to do in the interest of the sovereignty or integrity of India, defence of India, security of the State, friendly relations with foreign states or public order or for preventing incitement to the commission of any cognizable offence relating to above or for investigation of any offence, it may, subject to the provisions of sub-section (2), for reasons to be recorded in writing, by order, direct any agency of the appropriate Government to intercept, monitor or decrypt or cause to be intercepted or monitored or decrypted any information generated, transmitted, received or stored in any computer resource.” In order to invoke the above provision, the government must record in writing the reasons for which such activities are being undertaken. It is important to note that no such reasons, as required by sub-section (2) of Section 69 of the IT Act, 2000, have been provided to date. There is no explanation as to why the State is tracking individuals’ sensitive data through the FRT systems deployed across India.
Function Creep in the implementation of FRTs
Function creep can be defined as the incremental extension of a technology or system’s use beyond its initial purpose, particularly when this results in probable privacy violations. Function creep had already occurred by 2019, when the Delhi Police evaluated the personal details of anti-CAA protestors using the Automated Facial Recognition System (AFRS), a prominent example of FRT. These personal details were compared against a database of at least two lakh people, even though the police were permitted to use FRT only to find missing children. The system’s function has been extended, and there is no clarity on what FRT could be used for, how it is regulated, or whether it is governed at all. This could result in the use of excessive force by the police, or in situations where specific groups are singled out without any lawful justification. It clearly indicates that the use of FRT, a mass surveillance system disguised as a security tool, infringes our fundamental right to privacy. The application of FRT for mass surveillance could be devastating, as it would threaten an individual’s right to free speech and expression, right to movement, and even the right to protest, as illustrated above.
Meanwhile, the High Court of Telangana is preparing to hear a PIL on FRT and its implications for the right to privacy on 15 January 2022. The petition argues that the use of FRT is wide-ranging, far-reaching, and designed for mass surveillance, and is therefore disproportionate; the state must demonstrate probable cause before deploying it. The way the State is using this new technology raises serious concerns about the government’s regard for the civil liberties guaranteed by the Constitution. Without a doubt, the government should adopt novel technologies in its operations. But in doing so, the State must guarantee that their application complies with citizens’ fundamental rights.
Author(s) Name: Bharat Manwani (Gujarat National Law University, Gandhinagar)
 ZDNet Australia Staff, ‘Face recognition set for take-off in Australia’ (CNET Australia, 23 January 2008) <https://www.cnet.com/news/face-recognition-set-for-takeoff-in-australia/> accessed 7 January 2022
 Agence France-Presse, ‘From ale to jail: facial recognition catches criminals at China beer festival’ (The Guardian, 1 September 2017) <https://www.theguardian.com/world/2017/sep/01/facial-recognition-china-beer-festival> accessed 7 January 2022
 ENS Economic Bureau, ‘Facial recognition at airports: Government launches Digi Yatra’ (The Indian Express, 5 October 2018) <https://indianexpress.com/article/business/aviation/facial-recognition-at-airports-government-launches-digi-yatra/> accessed 7 January 2022
 Sadhan Haldar v The State NCT of Delhi (2017) WP (Crl) 1560/2017
 U Sudhakar Reddy, ‘8.3 lakh cameras in Telangana, Hyderabad turning into surveillance city: Amnesty’ (The Times of India, 10 November 2021) <https://timesofindia.indiatimes.com/city/hyderabad/8-3l-cameras-in-t-hyd-turning-into-surveillance-city-amnesty/> accessed 7 January 2022
 Soibam Rocky Singh, ‘Facial recognition technology: law yet to catch up’ (The Hindu, 31 December 2021) <https://www.thehindu.com/news/cities/Delhi/facial-recognition-technology-law-yet-to-catch-up/article33458380.ece> accessed 7 January 2022
 Justice K. S. Puttaswamy (Retd.) and Anr. v Union of India and Ors (2017) 10 SCC 1
 Anil Chandra Pradhan v L.I.C. And Others (1993) IILLJ 1080 Ori.
 Information Technology Act 2000, s 69 (1)
 Rina Chandaran, ‘Use of facial recognition in Delhi rally sparks privacy fears’ (Reuters, 30 December 2019) <https://www.reuters.com/article/us-india-protests-facialrecognition> accessed 7 January 2022
 Sparsh Upadhyay, ‘Telangana High Court Issues Notice On PIL Challenging Deployment Of Facial Recognition Technology (FRT) In Telangana’ (Live Law, 3 January 2022) <https://www.livelaw.in/news-updates/telangana-high-court-notice-pil-challenging-deployment-facial-recognition-technology> accessed 7 January 2022