Author ORCID Identifier

https://orcid.org/0000-0002-8793-0504

Document Type

Article

Disciplines

Computer Sciences, Sociology, Other Humanities

Publication Details

Submitted to the journal Child Abuse & Neglect.

Abstract

Background: The production, distribution, and discussion of child sexual abuse material (CSAM) often take place on the dark web, where perpetrators remain hidden from search engines and regular users. There, CSAM creators employ various techniques to avoid detection and conceal their activities. The large volume of CSAM on the dark web presents a global social problem and poses a significant challenge for helplines, hotlines, and law enforcement agencies.

Objective: To identify CSAM discussions on the dark web and to uncover, from the associated metadata, insights into the characteristics, behaviours, and motivations of CSAM creators.

Participants and Setting: We analysed more than 353,000 posts written by 35,400 distinct users in 118 different languages across eight dark web forums in 2022. Of these, approximately 221,000 posts were written in English, contributed by around 29,500 unique users.

Method: We propose a CSAM detection intelligence system. The system uses a manually labelled dataset to train, evaluate, and select an efficient CSAM classification model. Once we identify CSAM creators and victims through CSAM posts on the dark web, we analyse and visualise the post metadata to uncover information about the behaviours of CSAM creators and victims.
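
A minimal sketch of the kind of train-evaluate-select pipeline described above, assuming a scikit-learn setup; the dataset file, column names, and candidate models are illustrative assumptions, not the authors' exact configuration:

    # Train candidate classifiers on a manually labelled dataset of
    # dark web posts and compare them on held-out data.
    import pandas as pd
    from sklearn.feature_extraction.text import TfidfVectorizer
    from sklearn.metrics import accuracy_score, precision_score, recall_score
    from sklearn.model_selection import train_test_split
    from sklearn.naive_bayes import MultinomialNB
    from sklearn.pipeline import make_pipeline
    from sklearn.svm import LinearSVC

    # Hypothetical labelled data: one post per row, label 1 = CSAM, 0 = non-CSAM.
    df = pd.read_csv("labelled_posts.csv")
    X_train, X_test, y_train, y_test = train_test_split(
        df["post_text"], df["label"],
        test_size=0.2, stratify=df["label"], random_state=42,
    )

    candidates = {
        "svm": make_pipeline(TfidfVectorizer(), LinearSVC()),
        "naive_bayes": make_pipeline(TfidfVectorizer(), MultinomialNB()),
    }

    for name, model in candidates.items():
        model.fit(X_train, y_train)
        pred = model.predict(X_test)
        print(f"{name}: precision={precision_score(y_test, pred):.3f} "
              f"recall={recall_score(y_test, pred):.3f} "
              f"accuracy={accuracy_score(y_test, pred):.3f}")

The model with the best balance of precision, recall, accuracy, and prediction speed would then be selected for deployment.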

Result: The CSAM classifier based on a Support Vector Machine (SVM) model performed well, achieving the highest precision of 92.3%, accuracy of 87.6%, and recall of 84.2%. Its prediction time is fast, taking only 0.3 milliseconds per post on our laptop. The Naive Bayes combination achieved the best recall, at 89%, with a prediction time of just 0.1 microseconds per post. Across the eight forums in 2022, our SVM model detected around 63,000 English CSAM posts and identified nearly 10,500 English-language CSAM creators. Analysis of the metadata of CSAM posts revealed meaningful information about CSAM creators and their victims, such as: (1) the ages and nationalities of the victims typically mentioned by CSAM creators, (2) the forum topics to which CSAM creators assign their posts, and (3) the online platforms CSAM creators prefer for sharing or uploading CSAM.
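
The per-post latency figures quoted above could be estimated along the following lines; the toy training data and sample post are invented stand-ins for the real labelled corpus:

    # Time repeated single-post predictions and average them.
    import time
    from sklearn.feature_extraction.text import TfidfVectorizer
    from sklearn.pipeline import make_pipeline
    from sklearn.svm import LinearSVC

    # Toy stand-in for the trained classifier; real timings would use
    # the model selected in the Method step.
    model = make_pipeline(TfidfVectorizer(), LinearSVC())
    model.fit(["example positive post", "example negative post"], [1, 0])

    sample_post = ["example forum post text"]
    n_runs = 1_000
    start = time.perf_counter()
    for _ in range(n_runs):
        model.predict(sample_post)
    per_post_ms = (time.perf_counter() - start) * 1000 / n_runs
    print(f"avg prediction time: {per_post_ms:.3f} ms per post")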

Conclusion: Our CSAM detection system achieves high precision, recall, and accuracy when classifying CSAM and non-CSAM posts in real time. Additionally, it extracts and visualises valuable and unique insights about CSAM creators and victims using advanced statistical methods. These insights are beneficial to our partners, i.e., national hotlines and child agencies.
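
As an illustration of the kind of metadata aggregation involved, a hedged sketch that tallies victim ages mentioned in posts the classifier flags; the regular expression and example texts are assumptions for demonstration only:

    # Count victim ages mentioned in predicted-CSAM posts.
    import re
    import pandas as pd

    posts = pd.DataFrame({
        "text": ["... 12 yo ...", "... 8 yo ...", "... 12 yo ..."],  # invented
    })
    age_pattern = re.compile(r"\b(\d{1,2})\s*yo\b", re.IGNORECASE)
    ages = posts["text"].str.extract(age_pattern, expand=False).dropna().astype(int)
    print(ages.value_counts().sort_index())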

Funder

Safe Online Initiative of End Violence and the Tech Coalition

Creative Commons License

Creative Commons Attribution-NonCommercial-ShareAlike 4.0 International License
This work is licensed under a Creative Commons Attribution-NonCommercial-ShareAlike 4.0 International License.

