CRIME

New AI Is On The Hunt For Child Sex Abusers

The system could flag abusive online media much sooner

Illustration: R. A. Di Ieso
Dec 02, 2016 at 3:12 PM ET

A new artificial intelligence program could help reduce how long it takes to catch child sexual abusers.

Around the world, hundreds of thousands of child sexual abuse images and videos are shared online every year. Police already use automated tools to track this material, such as Microsoft's PhotoDNA, Child Protection System, and Roundup. But those tools can only flag images that already appear in a database of known child pornography.

The new software can identify problematic content as it first enters circulation in online communities and networks. Those newly surfaced images are among the most pressing, both for catching abusers and for helping victims.

Developed by academics from several fields and research centers, the AI is part of a project called Identifying and Catching Originators in P2P Networks (iCOP) and has been packaged into a toolkit that has already proven successful: Interpol and other law enforcement agencies throughout Europe are using it, said computational linguist Claudia Peersman, a leader of the project.

“The existing tools match the files that are being shared through existing databases, but our program detects new data,” Peersman told Vocativ. “If you look at peer-to-peer networks, images are shared at a pace that’s just not feasible for any human to go through [manually].”

Peersman embarked on the project after hearing an Interpol agent describe his “spine-chilling” experiences seeing children as young as toddlers victimized in online pornography. She says the software is intended to identify new victims, those who were recently abused or are still being abused, more quickly than current methods allow. For now it is limited, however: it cannot yet search the dark net, the parts of the web hidden from search engines where much of this content is shared. “It’s a lot more challenging to work on,” she said.

The software, which can be used alongside law enforcement’s existing databases, works by first picking out filenames that suggest child pornography as files are uploaded to peer-to-peer networks in real time. Those filenames include numerals indicating ages, names of sexual acts, and known keywords like “lolita,” “kdquality,” “pt” (an abbreviation for preteen), and “childlover,” as well as variations that swap numerals for letters, like “l0l1ta”.
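To make the filename step concrete, here is a minimal Python sketch of keyword matching with numeral-for-letter normalization. The keywords are those cited above; the substitution map, helper names, and matching logic are illustrative assumptions, not a description of iCOP’s actual code.

    import re

    # Keywords cited in the article; a real deployment would rely on much
    # larger, law-enforcement-curated lists. The short token "pt" is omitted
    # here because naive substring matching would trip on ordinary words.
    KEYWORDS = {"lolita", "kdquality", "childlover"}

    # Undo common numeral-for-letter swaps (e.g. "l0l1ta" -> "lolita").
    # The exact normalization iCOP applies is not described in the article.
    LEET_MAP = str.maketrans({"0": "o", "1": "i", "3": "e", "4": "a", "5": "s"})

    def is_suspicious(filename: str) -> bool:
        """True if the filename matches a known keyword after normalization."""
        name = filename.lower().translate(LEET_MAP)
        collapsed = re.sub(r"[^a-z]", "", name)  # drop digits and separators
        return any(keyword in collapsed for keyword in KEYWORDS)

    print(is_suspicious("l0l1ta_vid.avi"))      # True
    print(is_suspicious("holiday_photos.zip"))  # False

A production system would also need token-aware matching and patterns for age numerals, which this sketch leaves out.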

iCOP’s software is then able to automatically pick out images and videos that may depict child sexual abuse by identifying subtle movements characteristic of abuse, along with other cues like the skin characteristics of the people in the media. iCOP can also filter by IP address location, which lets law enforcement agencies monitor images being uploaded within their own jurisdictions.
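The article does not say how the media analysis or the jurisdiction filter is implemented. As a rough sketch of how classifier output and IP filtering might be combined, the Python below uses a hypothetical Candidate record, invented confidence scores, and placeholder documentation address ranges in place of a real geolocation database.

    import ipaddress
    from dataclasses import dataclass

    # Hypothetical record for a flagged upload; iCOP's internal data model
    # is not described in the article.
    @dataclass
    class Candidate:
        filename: str
        uploader_ip: str
        abuse_score: float  # classifier confidence that the media depicts abuse

    # Placeholder CIDR blocks (RFC 5737 documentation ranges) standing in for
    # an agency's jurisdiction; a real system would use a geolocation database.
    JURISDICTION = [ipaddress.ip_network("203.0.113.0/24"),
                    ipaddress.ip_network("198.51.100.0/24")]

    def in_jurisdiction(ip: str) -> bool:
        addr = ipaddress.ip_address(ip)
        return any(addr in net for net in JURISDICTION)

    def triage(candidates, threshold=0.9):
        """Keep high-confidence hits uploaded from inside the jurisdiction."""
        return [c for c in candidates
                if c.abuse_score >= threshold and in_jurisdiction(c.uploader_ip)]

    hits = triage([Candidate("suspect_01.avi", "203.0.113.7", 0.97),
                   Candidate("cat_video.mp4", "192.0.2.1", 0.12)])
    print([h.filename for h in hits])  # ['suspect_01.avi']

Thresholding on a classifier score before human review is a common triage pattern; the cutoff here is arbitrary.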

When iCOP was trialed on real-life cases handled by law enforcement agencies that were guaranteed anonymity, the false positive rate, the share of flagged files that were not in fact child pornography, was less than 8 percent.
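As a worked example of that metric, under the article’s definition and with invented counts (not figures from the iCOP trials):

    # Purely illustrative counts; not data from the iCOP trials.
    flagged = 1000        # files the system flagged as suspected abuse material
    false_alarms = 78     # flagged files that turned out to be innocuous

    print(f"False positive rate: {false_alarms / flagged:.1%}")  # 7.8%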

Peersman notes that AI of this kind may eventually be used to identify forms of online fraud, such as romance scams on dating websites. Her team is already investigating other applications and working to extend the program to the dark web.