New AI toolkit spots child sexual abuse media online

Press Trust of India | London
Last Updated: Dec 04, 2016 | 12:28 PM IST
Scientists have developed artificial intelligence software that can automatically detect new child sexual abuse photos and videos on online networks and help prosecute offenders.
There are hundreds of searches for child abuse images every second worldwide, resulting in hundreds of thousands of child sexual abuse images and videos being shared every year.
The people who produce child sexual abuse media are often abusers themselves, said researchers including those from Lancaster University in the UK.
Spotting newly produced media online can give law enforcement agencies the fresh evidence they need to find and prosecute offenders, they said.
However, the sheer volume of activity on peer-to-peer networks makes manual detection virtually impossible.
The new toolkit automatically identifies new or previously unknown child sexual abuse media using artificial intelligence.
"Identifying new child sexual abuse media is critical because it can indicate recent or ongoing child abuse," said lead author Claudia Peersman from Lancaster University.
"And because originators of such media can be hands-on abusers, their early detection and apprehension can safeguard their victims from further abuse," said Peersman.
There are already a number of tools available to help law enforcement agents monitor peer-to-peer networks for child sexual abuse media, but they usually rely on identifying known media.
As a result, these tools are unable to assess the thousands of results they retrieve and cannot spot new media as they appear.
The Identifying and Catching Originators in P2P Networks (iCOP) toolkit uses artificial intelligence and machine learning to flag new and previously unknown child sexual abuse media.
The new approach combines automatic filename and media analysis techniques in an intelligent filtering module. The software can identify new criminal media and distinguish it from other media being shared, such as adult pornography.
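The published research is the authoritative source for how this fusion works; purely as an illustration, the following minimal Python sketch shows how a filename cue and a media classifier's score might be combined in a filtering step. All names, cue terms, weights and thresholds here (filename_score, media_score, the 0.4/0.6 weighting) are invented for illustration and are not iCOP's actual design.

    import re

    # Placeholder cue pattern; a real system would use curated lexicons
    # and trained models. Deliberately kept abstract here.
    CUE_PATTERN = re.compile(r"(cue_word_a|cue_word_b)", re.IGNORECASE)

    def filename_score(name: str) -> float:
        """Toy filename channel: 1.0 if a cue term appears in the filename, else 0.0."""
        return 1.0 if CUE_PATTERN.search(name) else 0.0

    def fused_score(name: str, media_score: float,
                    w_name: float = 0.4, w_media: float = 0.6) -> float:
        """Weighted fusion of the filename channel and a media classifier's score."""
        return w_name * filename_score(name) + w_media * media_score

    def flag_for_review(name: str, media_score: float, threshold: float = 0.5) -> bool:
        """Route a shared file to human review when the fused score crosses a threshold."""
        return fused_score(name, media_score) >= threshold

In practice the media score would come from a trained image or video classifier, and flagged items would go to a human analyst rather than being acted on automatically.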
The researchers tested iCOP on real-life cases, and law enforcement officers trialled the toolkit.
It was highly accurate, with a false positive rate of only 7.9 per cent for images and 4.3 per cent for videos.
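For readers unfamiliar with the metric, the false positive rate is the share of innocuous files a tool wrongly flags. A minimal sketch of the calculation, with invented counts:

    def false_positive_rate(false_positives: int, true_negatives: int) -> float:
        """FPR = FP / (FP + TN): the fraction of benign items wrongly flagged."""
        return false_positives / (false_positives + true_negatives)

    # Illustrative numbers only: 79 wrongly flagged out of 1,000 benign images is 7.9%.
    print(false_positive_rate(79, 921))  # 0.079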
It also complemented the systems and workflows officers already use. And since the system can reveal who is sharing known child sexual abuse media, and show other files shared by those people, it will be highly relevant and useful to law enforcement.
The research was published in the journal Digital Investigation.

