Business Standard

Artificial intelligence software can spot child sexual abuse media online

ANI  |  Washington D.C. [US] 

Artificial intelligence software can now help cops spot new or previously unknown child sexual abuse media and prosecute offenders.

The toolkit, described in a paper published in Digital Investigation, automatically detects new child sexual abuse photos and videos in online peer-to-peer (P2P) networks.

The new approach combines automatic filename and media analysis techniques in an intelligent filtering module, which can identify new criminal media and distinguish it from other media being shared, such as adult pornography.
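The paper's implementation details are not given here; as a rough, purely illustrative sketch (all names and weights are hypothetical, not from iCOP), a filtering module of this kind might fuse a filename-based classifier score with a media-content classifier score and flag a file only when the combined score crosses a threshold:

```python
# Hypothetical sketch of a two-signal filter: combine a filename-analysis
# score with a media-analysis score (both assumed to lie in [0, 1]) and
# flag only files whose fused score exceeds a threshold. Weights and
# threshold are illustrative placeholders, not values from the paper.

def fused_score(filename_score: float, media_score: float,
                w_name: float = 0.4, w_media: float = 0.6) -> float:
    """Weighted combination of the two classifier outputs."""
    return w_name * filename_score + w_media * media_score

def should_flag(filename_score: float, media_score: float,
                threshold: float = 0.8) -> bool:
    """Flag a file only when both signals together are strong enough."""
    return fused_score(filename_score, media_score) >= threshold

# Weak signals from both classifiers are not flagged;
# strong signals from both are.
print(should_flag(0.1, 0.2))   # False
print(should_flag(0.9, 0.95))  # True
```

Combining two independent signals this way is one common reason such filters can separate criminal media from other shared content, such as adult pornography, better than either signal alone.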

"The toolkit" automatically detects new child sexual abuse photos and videos in online peer-to-peer (P2P) networks.

Spotting newly produced media online can give law enforcement agencies the fresh evidence they need to find and prosecute offenders.

"Identifying new child sexual abuse media is critical because it can indicate recent or ongoing child abuse," said lead study author Claudia Peersman from Lancaster University.

"And because originators of such media can be hands-on abusers, their early detection and apprehension can safeguard their victims from further abuse," Peersman added.

The research behind this technology was conducted by researchers at Lancaster University, the German Research Center for Artificial Intelligence (DFKI), and University College Cork, Ireland, as part of the international research project iCOP (Identifying and Catching Originators in P2P Networks), funded by the European Commission Safer Internet Program.

The people who produce child sexual abuse media are often abusers themselves - the US National Center for Missing and Exploited Children found that 16 percent of the people who possess such media had directly and physically abused children.

The iCOP toolkit uses artificial intelligence and machine learning to flag new and previously unknown child sexual abuse media.

The researchers tested iCOP on real-life cases and enforcement officers trialed the toolkit.

It was highly accurate, with a false positive rate of only 7.9 percent for images and 4.3 percent for videos.
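For context, a false positive rate is the share of benign items the system wrongly flags. A minimal sketch (the counts below are invented to match the reported 7.9 percent image rate, not figures from the study):

```python
def false_positive_rate(false_positives: int, true_negatives: int) -> float:
    """FPR = FP / (FP + TN): the fraction of benign media wrongly flagged."""
    return false_positives / (false_positives + true_negatives)

# Illustrative only: 79 benign images wrongly flagged out of 1,000 benign
# images would correspond to the ~7.9% image rate reported for iCOP.
print(round(100 * false_positive_rate(79, 921), 1))  # 7.9
```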

It was also complementary to the systems and workflows the officers already use. And since the system can reveal who is sharing known child sexual abuse media, and show other files shared by those people, it will be highly relevant and useful to law enforcement.

"With iCOP we hope we're giving police the tools they need to catch child sexual abusers early based on what they're sharing online," said Peersman.

(This story has not been edited by Business Standard staff and is auto-generated from a syndicated feed.)

