Railway security system has its eyes on you with AI-based cameras

National transporter ramps up project to install AI-based cameras that can recognise faces

Railways, artificial intelligence, security, cctv cameras
As facial recognition systems become widespread, rights activists want to know how data they collect is used and shared
Dhruvaksh Saha New Delhi
5 min read Last Updated : May 25 2025 | 9:37 PM IST
The next time you are at a crowded railway station, look around. Cameras that recognise faces are watching, part of Indian law enforcement’s growing reliance on advanced surveillance technology for security and crime prevention, even as activists worry about the impact on privacy.
 
The Indian Railways, which carries 7 billion passengers annually, has its own police force and surveillance systems to secure and monitor stations, trains and other facilities. The national transporter has put up facial recognition system (FRS) cameras at more than 200 stations, but is using only a portion of their surveillance strength. It is running a pilot that will make seven model stations fully FRS-enabled through an integrated analytics server. Howrah, Sealdah, New Delhi, Chhatrapati Shivaji Maharaj Terminus (Mumbai), Secunderabad, Danapur, and Chennai stations will be completely equipped with real-time video surveillance in a couple of years.
 
The cameras identify or verify individuals by using artificial intelligence and algorithms to compare images and video frames against a database of faces. FRS was launched in 2019 at Bengaluru station and in 90 days it detected 47 people with criminal records in railway premises, according to the Railway Protection Force (RPF).
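The matching step described above is typically done by converting each face into a numeric "embedding" and comparing it against stored embeddings. A minimal sketch of that comparison, with a hypothetical database and a made-up similarity threshold (the Railways' actual vendors, models, and thresholds are not public):

```python
import numpy as np

# Hypothetical 128-dimensional face embeddings; real systems derive
# these from a neural network applied to each detected face image.
rng = np.random.default_rng(0)
database = {name: rng.normal(size=128) for name in ("person_a", "person_b")}

def best_match(probe, database, threshold=0.6):
    """Return the database entry most similar to the probe embedding,
    or None if no similarity score clears the threshold."""
    def cosine(u, v):
        return float(np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v)))
    scores = {name: cosine(probe, emb) for name, emb in database.items()}
    name = max(scores, key=scores.get)
    return (name, scores[name]) if scores[name] >= threshold else None

# A probe close to person_a's stored embedding should match it;
# the small noise simulates a new photo of the same face.
probe = database["person_a"] + rng.normal(scale=0.05, size=128)
print(best_match(probe, database))
```

In production systems the database lookup is indexed for speed so that millions of faces can be scanned per video frame, but the core idea is this nearest-neighbour comparison.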
 
“The pilot project was sanctioned under the Nirbhaya Fund for around ₹33.6 crore. Besides crime prevention, the technology has a wider use — it will bring speed to investigations of missing persons or abducted children, efficient crowd monitoring, and women’s safety,” said a railway law enforcement official.
 
The government used facial recognition cameras at the two-day G20 Summit in New Delhi in September 2023 and the weeks-long Kumbh Mela in Prayagraj early this year. For the Mela, information about more than 1,000 terrorism suspects was uploaded to the FRS software. As many as 116 FRS and 1,000 regular security cameras scanned railway stations and their vicinities to feed live footage to a control room at Prayagraj train junction.
 
Tool of law
 
The RPF coordinates and shares information with other law enforcement agencies and it also investigates crimes on request from state police and other organisations. The government expects that FRS will help in investigations and expedite inter-agency coordination, particularly due to the analytics server which allows common access to enforcement agencies.
 
The railway ministry, in July 2023, started work on a Central Command and Control Centre (C4), a ₹18-crore facility being built in two phases in New Delhi. The building will serve as the main site for data and CCTV analysis and cybersecurity operations for all 18 zones of the Indian Railways.
 
“There are many requests at the local level. With FRS and (C4), time taken on station-to-station analysis and coordination will not be needed, as the integrated database can identify and share progress in real-time upon requisition,” said the official.
 
The government will study results at the seven model stations to decide how advanced technology can be used to secure railway infrastructure at a wider scale.
 
The facial recognition project has helped secure Mumbai's suburban railway network, according to multiple officials. Western Railway had reportedly detected 528 people with criminal records and solved seven robbery cases with the help of FRS and security cameras as of October 2024.
 
While FRS helps law enforcement, the technology's limited accuracy creates problems that worry privacy advocates.
 
“False positives are unavoidable — especially when it comes to Indian faces. Since the technology was not created in India, its foundation does not identify the nuances or unique characteristics of the faces in our region. So, there are cases when that happens but the margin of error is factored in the scope of the investigation,” said the official.
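The "margin of error" the official refers to is a tunable trade-off: a stricter match threshold produces fewer false accepts (wrong people flagged) but more false rejects (genuine matches missed). An illustrative simulation with made-up score distributions, not railway data:

```python
import numpy as np

# Simulated similarity scores: genuine pairs (same person) tend to
# score high, impostor pairs (different people) tend to score low.
# The distributions below are assumptions chosen for illustration.
rng = np.random.default_rng(1)
genuine = rng.normal(loc=0.8, scale=0.1, size=10_000)
impostor = rng.normal(loc=0.3, scale=0.1, size=10_000)

def error_rates(threshold):
    """False accept rate and false reject rate at a given threshold."""
    far = float(np.mean(impostor >= threshold))  # impostors wrongly flagged
    frr = float(np.mean(genuine < threshold))    # genuine matches missed
    return far, frr

for t in (0.4, 0.55, 0.7):
    far, frr = error_rates(t)
    print(f"threshold={t}: false accepts {far:.1%}, false rejects {frr:.1%}")
```

Raising the threshold pushes false accepts toward zero while false rejects climb, which is why operators quote a margin of error rather than eliminating it. Models trained mostly on non-Indian faces shift the impostor distribution upward for local populations, worsening this trade-off.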
 
The Internet Freedom Foundation, a digital rights advocacy group, has cited several studies showing that FRS is inaccurate, especially for people of colour (Indians included) and women.
 
“Components of facial recognition technology, such as computer vision systems are inherently non-transparent and their decisions are not easy to understand even by the people who built them. When such systems make an error, the developers or the operators deploying them cannot tell what reasoning the machine has done to get this error, let alone correct it,” said the Foundation.
 
The Railways’ FRS technology is partly imported: South Korean video surveillance manufacturer IDIS Global, Barcelona-headquartered Herta Security, and Russia-based NTech Lab are among its vendors. As an ecosystem of domestic vendors develops and the technology learns to recognise Indian faces, the government hopes to get more local participation. 
 
Data safety
 
Officials said that the analytics server is secure and that the most advanced cybersecurity measures are in place at government undertakings such as the Centre for Railway Information Systems.
 
“An important safeguard measure that the Railways has is that none of this data will be shared with any third party or private entity. Given the sensitive nature of the data and the analytics, there is no monetisation in question,” said the official quoted above.
 
The state’s increasing use of FRS has faced scrutiny over data privacy concerns and prompted digital rights activists to question the need for a measure many see as overly intrusive and not credible enough to meet the criteria laid down in the Justice K S Puttaswamy vs Union of India judgment.
 
With FRS becoming prevalent in public life, experts believe the processes around cybersecurity and disclosure should be transparent.
 
“Given that this will inevitably be the future, the systems should be secure by design and should be auditable for anyone who wants to stress-test them. Ideally, there should be notice to users like for any other platform trying to digitally collect your data, but the trend is that in common and critical infrastructure, consent is considered implicit,” said a partner at a Delhi-based technology policy consulting firm.

Topics: Artificial intelligence, Railways security, CCTV cameras
