In this project, we aim to enhance the performance of our iCOP toolkit, a live investigative software package designed to flag new or previously unknown child sexual abuse material (CSAM) on peer-to-peer (P2P) networks for law enforcement, by extending the software to handle Southeast Asian CSAM and livestreaming environments.
The core functions and main components of the iCOP 2.0 toolkit will be as follows:
• a novel filename classification approach that utilises a combination of linguistic cues and the specialised vocabulary used by ASEAN offenders to share or stream child sexual exploitation and abuse (CSEA) content on P2P networks, in order to automatically identify suspicious media from their filenames or video titles;
• a new image and video classification module using multiple and, in the case of video, multi-modal (visual and audio) feature descriptors, leading to robust and highly accurate identification of Southeast Asian CSEA content;
• an innovative triage approach based on a synthesis of the above two models to flag the most pertinent CSEA content being uploaded or streamed on P2P networks (a minimal sketch of this pipeline follows the list).
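To make the pipeline concrete, the sketch below pairs a character n-gram filename classifier with a simple score-fusion triage rule. It is an illustrative approximation only: the feature choices, model, fusion weight, threshold and all variable names are assumptions for exposition, not the iCOP 2.0 implementation.

```python
# Illustrative sketch only: a character n-gram filename classifier plus a
# simple score-fusion triage rule. The features, model, threshold and
# mixing weight are assumptions for exposition, not the iCOP 2.0 design.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Character n-grams are robust to deliberate misspellings, transliteration
# and the mixed-language keyword obfuscation common in P2P filenames.
filename_clf = make_pipeline(
    TfidfVectorizer(analyzer="char_wb", ngram_range=(2, 5), lowercase=True),
    LogisticRegression(max_iter=1000, class_weight="balanced"),
)
# Training data (annotated filenames from police casework) is not shown:
# filename_clf.fit(train_filenames, train_labels)

def triage_score(p_filename: float, p_media: float, w: float = 0.4) -> float:
    """Fuse filename- and media-classifier probabilities into one triage
    score; w is a hypothetical mixing weight tuned on labelled casework."""
    return w * p_filename + (1.0 - w) * p_media

def flag_for_review(p_filename: float, p_media: float,
                    threshold: float = 0.8) -> bool:
    """Flag an upload for investigator review when the fused score exceeds
    an operating threshold chosen to keep false positives low."""
    return triage_score(p_filename, p_media) >= threshold
```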
A first issue with AI systems is that they inevitably yield some false positives, which could mean additional work for police investigators and potential reputational damage for the person sharing the media. However, evaluation of the current version of the toolkit yielded a false positive rate of only 7.9% for images and 4.3% for videos. A second issue is that the handling of any captured data must comply with existing regulations, which may vary across the jurisdictions in which the incident under investigation occurred. The toolkit will therefore be developed further according to a modular design, permitting flexibility in any operational application.
Peer-to-peer (P2P) networks make it easy for child sex offenders to share child sexual abuse images and videos. Every second there are hundreds of searches for child abuse images worldwide, and perpetrators share hundreds of thousands of items of CSAM every year. Intercepting such images and videos can help law enforcement officers apprehend child sex offenders, because the people who produce CSAM are often involved in recent or ongoing abuse themselves. Flagging new files early can therefore help stop offenders more quickly and safeguard their victims from further abuse. In reality, however, this is enormously challenging: the sheer volume of activity on P2P networks makes manual detection virtually impossible. Although a number of tools already exist to help police investigators monitor such networks for paedophile activity, they usually rely on re-identifying known media being uploaded and shared by offenders, as in the hash-lookup sketch below. As a result, these tools cannot adequately filter the thousands of results they retrieve, nor can they identify new CSAM being released onto a network.
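For illustration, the re-identification these existing tools perform amounts to looking up a file's digest in a database of known material; anything not already catalogued, however harmful, goes undetected. The snippet below is a simplified assumption: deployed systems typically use perceptual hashes that tolerate re-encoding (e.g. PhotoDNA) rather than the plain SHA-256 shown here.

```python
# Simplified sketch of hash-based re-identification (an assumption, not
# any specific tool): new, never-before-seen CSAM produces an unknown
# digest and therefore slips past this kind of check entirely.
import hashlib

known_hashes: set[str] = set()  # digests of previously identified media

def is_known_media(file_bytes: bytes) -> bool:
    """Return True only if this exact file has been catalogued before."""
    return hashlib.sha256(file_bytes).hexdigest() in known_hashes
```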
Researchers in the Cyber Security Research Group at the University of Bristol (UK) and at Lancaster University (UK) developed an approach that combines artificial intelligence and machine learning to flag new and previously unknown CSAM automatically, and packaged it in a toolkit called iCOP. The software combines automatic filename and media analysis techniques in an intelligent filtering module. The current version of the iCOP toolkit was developed and tested on real-life European police cases in collaboration with law enforcement agencies across Europe. As a result, the software can successfully identify new CSAM and distinguish it from other media being shared, such as adult pornography. One issue with automatic systems is that they can accidentally flag non-criminal content, resulting in additional work for police investigators and potential reputational damage for the person sharing the media. However, the iCOP toolkit is highly accurate, with a false positive rate of only 7.9% for images and 4.3% for videos.
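For clarity, the false positive rate quoted above is the share of benign media the classifier wrongly flags. A minimal sketch of the metric, with hypothetical variable names, is:

```python
# Minimal sketch of the false-positive-rate metric quoted above, assuming
# binary labels (1 = CSAM, 0 = benign). Names are illustrative.
def false_positive_rate(y_true: list[int], y_pred: list[int]) -> float:
    fp = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 1)
    tn = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 0)
    return fp / (fp + tn) if (fp + tn) else 0.0

# e.g. false_positive_rate([0, 0, 1, 0], [1, 0, 1, 0]) == 1/3
```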
In the iCOP 2.0 project, we aim to enhance the toolkit’s performance by extending the
software to work on non-European CSAM and in livestreaming environments. More
specifically, given the rapid growth of online child sexual exploitation in Southeast Asian countries, such as Indonesia, Thailand and the Philippines, we will focus our research on developing new automated techniques to assist ASEAN law enforcement investigations pertaining to online child protection.
Existing tools used by ASEAN law enforcement are inadequate for detecting local perpetrators and victims. There is a strong need, and enthusiasm, for the adoption of iCOP 2.0 in the region. Tool training and testing are now underway in Malaysia and Thailand, supported by UNODC.