
US town turns to AI surveillance to fight crime, privacy fears rise

Dunwoody, Georgia, rolls out AI cameras, drones, and gunshot detectors to reduce crime, but critics warn of surveillance overreach and privacy erosion

Police in Georgia are using AI-powered cameras, drones and detectors to fight crime, but critics warn of privacy risks and lack of oversight.

Is AI the next Big Brother? US town deploys AI surveillance to fight crime | Photo: Pexels

Vasudha Mukherjee


A small city in Georgia has joined the growing list of US municipalities deploying artificial intelligence-driven surveillance technology to fight crime.
 
According to a report by Forbes, police in Dunwoody, a suburb of Atlanta, have rolled out monitoring tools supplied by Flock Safety, a surveillance technology company headquartered in Atlanta. These tools include cameras, gunshot detectors and drones.
 
Flock Safety founder Garrett Langley believes the company could help eradicate crime in the US within 10 years by combining AI tools with social programmes. The company is also developing an AI platform called Nova, designed to integrate surveillance data with public records—a move critics have deemed a major privacy risk.
 
 

How AI surveillance works

In Dunwoody, police have deployed more than 100 Flock cameras, gunshot detectors, and drones. The devices automatically capture licence plates and vehicle characteristics such as model, colour and visible damage, uploading the information to Flock’s cloud-based platform.
 
Law enforcement can then search the database, track vehicles across jurisdictions, link incidents, analyse live feeds, and use AI tools to transcribe 911 calls.
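To make the workflow concrete, here is a minimal sketch of what such a vehicle-record search might look like. Flock's actual platform and API are proprietary and not described in the report, so every name, field and function below is a hypothetical assumption for illustration only:

```python
# Hypothetical sketch of an ALPR (automatic licence plate recognition)
# record store and search, loosely modelled on the workflow described
# above. Flock's real system is proprietary; this schema is assumed.
from dataclasses import dataclass
from datetime import datetime

@dataclass
class PlateRead:
    plate: str          # licence plate text extracted by the camera
    colour: str         # vehicle colour inferred by the vision model
    model: str          # vehicle model inferred by the vision model
    damage: str         # visible damage noted, e.g. "dented left door"
    camera_id: str      # which roadside camera produced the read
    jurisdiction: str   # city or county the camera belongs to
    seen_at: datetime   # capture timestamp

def search(reads, plate=None, colour=None):
    """Filter uploaded reads by any combination of attributes."""
    hits = reads
    if plate:
        hits = [r for r in hits if r.plate == plate]
    if colour:
        hits = [r for r in hits if r.colour.lower() == colour.lower()]
    return sorted(hits, key=lambda r: r.seen_at)

def track_across_jurisdictions(reads, plate):
    """List where and when a plate was seen, in time order."""
    return [(r.seen_at, r.jurisdiction, r.camera_id)
            for r in search(reads, plate=plate)]

if __name__ == "__main__":
    db = [
        PlateRead("ABC123", "red", "sedan", "none", "cam-41",
                  "Dunwoody", datetime(2025, 9, 1, 9, 15)),
        PlateRead("ABC123", "red", "sedan", "none", "cam-07",
                  "Sandy Springs", datetime(2025, 9, 1, 9, 42)),
    ]
    print(track_across_jurisdictions(db, "ABC123"))
```

Chaining reads by plate across camera owners is what lets a single query follow a vehicle between jurisdictions, which is also the capability critics flag as the core privacy concern.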
 
Each camera costs $3,000–$3,500, plus subscription fees. Dunwoody PD pays roughly $500,000 annually for 105 devices and software, the report said.
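A quick back-of-envelope check of those figures, using only the numbers quoted in the report (the split between hardware and software in the $500,000 is not itemised, so this derives simple averages):

```python
# Back-of-envelope check of the costs quoted above; only the figures
# from the report are used, and the hardware/software split is assumed
# to be unknown.
unit_cost_low, unit_cost_high = 3_000, 3_500   # per-camera price range
devices = 105                                  # devices in Dunwoody
annual_total = 500_000                         # annual spend, devices + software

hardware_low = unit_cost_low * devices         # ~$315,000
hardware_high = unit_cost_high * devices       # ~$367,500
per_device_per_year = annual_total / devices   # ~$4,762

print(f"Hardware range: ${hardware_low:,}-${hardware_high:,}")
print(f"Average annual cost per device: ${per_device_per_year:,.0f}")
```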
 

What is Flock Safety?

Founded in 2017, Flock Safety operates in 49 US states. The company runs more than 80,000 cameras nationwide and says its technology helps solve more than one million crimes a year.
 
Backed by venture capital firm Andreessen Horowitz, Flock was last valued at $7.5 billion.
 
It is one of several emerging companies offering AI-based surveillance to law enforcement.
 

What does US law say?

There is no comprehensive federal law governing AI use in policing. Instead, individual states and cities have enacted their own rules. Illinois, Vermont, Massachusetts, and New Jersey impose restrictions such as warrant requirements or rules limiting use to investigations of serious crimes. Oregon and New Hampshire prohibit linking facial recognition with police body cameras.
 
At the federal level, President Donald Trump’s One Big Beautiful Bill Act originally included a 10-year moratorium on state-level AI regulation, a provision that would have curbed local efforts to restrict surveillance technology; the Senate stripped it from the bill before passage.
 

AI laws in Europe and India

The European Union’s AI Act classifies real-time biometric surveillance, including public facial recognition, as an “unacceptable risk” and largely bans it. Other high-risk AI applications must pass strict assessments and maintain human oversight.
 
India, however, does not yet have dedicated laws for AI surveillance. While the Supreme Court has recognised privacy as a fundamental right, the Digital Personal Data Protection Act, 2023 gives the government broad exemptions for national security and law enforcement—leaving AI monitoring largely unregulated.
 

Criticism of AI surveillance

Critics have long warned that weakening such oversight invites abuse, and recent incidents suggest those concerns are well founded.
 
Earlier this year, a Washington Post investigation revealed that New Orleans police secretly partnered with a nonprofit to use live facial recognition cameras to make arrests, many of which were for non-violent crimes. This violated a 2022 city ordinance that allowed the nearly 200 cameras to be used only to track specific suspects in violent cases.
 
The incident was dubbed a “nightmare scenario” of government mass surveillance, with experts warning that facial recognition often makes errors—especially for people of colour, women, and older adults. Several wrongful arrests have been linked to the technology.
 
Flock itself has come under fire for deploying devices without proper permits. The company has also faced criticism for sharing surveillance data across state lines in ways that, critics say, could aid enforcement of state abortion bans and federal immigration actions.


First Published: Sep 09 2025 | 12:00 PM IST
