CSAM Offenders Use Complex Technical Techniques
Anti-Human Trafficking Intelligence Initiative (@TeamATII)
A non-profit focused on the sharing of intelligence, best practices and the development of tools to fight trafficking.
Written by Jennifer Moreau, Director of Marketing and ESG Advisory at Anti-Human Trafficking Intelligence Initiative
United States federal law defines child pornography as any visual depiction of sexually explicit conduct involving a minor (a person less than 18 years old). Outside of the legal system, we refer to these images as Child Sexual Abuse Material (CSAM) to most accurately reflect what is depicted – the sexual abuse and exploitation of children. Not only do these images and videos document victims’ exploitation and abuse, but when these files are shared across the internet, child victims suffer re-victimization each time the image of their sexual abuse is viewed. In a survey led by the Canadian Centre for Child Protection, 67% of CSAM survivors said the distribution of their images impacts them differently than the hands-on abuse they suffered because the distribution never ends and the images are permanent.
“The lifespan and distribution of CSAM is highly troubling and traumatizing for survivors of child sexual exploitation. It is important to raise awareness with survivors that new mechanisms exist to locate and remove CSAM images. By filing a report with NCMEC these new capacities can be deployed to eliminate legacy CSAM content from even the darkest expanses of the internet. While it used to be true that ‘once it’s there it’s always there’ this is not always the case any longer.” - Matt Richardson, Director of Intelligence and Child Safety, Anti-Human Trafficking Intelligence Initiative
This crime presents many challenges to law enforcement agencies, as emerging technology continuously creates new opportunities for CSAM perpetrators. The anonymity, accessibility and affordability of the internet also offer a unique environment for committing these offenses. Today, people with a sexual interest in children gather in large pedophile forums on the darknet; a single forum can have hundreds of thousands of users, both active and passive. More than two-thirds of the discussions on forums identified as CSAM forums were about technical tools for messaging, exchanging funds or storing content in the cloud.
Research on the spiral of abuse suggests that many people visit these forums on the path toward committing hands-on abuse: to be accepted by others, to learn from them and to find inspiration. Other visitors come only to chat and may be satisfied without ever going further. There are also administrators and technically skilled members who join because they have a sexual interest in children, but also to gather “friends” and status within the group.
The volume of forum users makes it difficult for law enforcement to prioritize and to find the hands-on abusers. There are the obvious ones, who are abusing children and sharing the material online, and there are the talkers, who lie about their exploits. One thing is consistent: almost all users in the forums are becoming more aware of how to evade law enforcement, and they try various methods to stay under the radar.
"Due to the volume of CSAM that is being uploaded daily, we need to adapt our tactics, techniques and procedures (TTP). Targeting the predators at scale allows us to get higher numbers of offenders in the least amount of time. We are able to utilize data in order to do bulk analysis on various breadcrumbs that are left behind like Bitcoin Transactions or Email Addresses. Training AI and running ML Models, comparing with other datasets and enriching the data with multiple API's allows us to identify hundreds or even thousands of predators, in minutes." - Larry Cameron, Chief Information Security Officer, Anti-Human Trafficking Intelligence Initiative
Most forum organizers know that a paywall-model forum is an easy way to keep out investigators. In 2019, one in five webpages assessed as displaying child sexual abuse images and videos included a paywall alongside “preview” images advertising premium access to further abusive content. Payment can take the form of a monetary transaction or of trading content (images and videos) to gain access to “the community” and/or to premium abusive content.
The discussions attempting to normalize, legitimize and justify the consumption and production of child sexual abuse material (CSAM) may be as unsettling as the content itself. Investigating online CSAM offenders at scale is burdensome for law enforcement, both because of the direct financial cost imposed by forum paywalls and because of the additional time required to investigate each individual case of suspected criminal activity.
The current models of CSAM intelligence are being put to the test by criminals who have made the dark web a place where they can be anonymously advised on how to deploy counter-intelligence techniques to protect themselves. Because CSAM producers often operate behind private messages, prioritizing investigations or referrals to law enforcement authorities based on the number of CSAM links an offender has shared is no longer sufficient. The most dangerous criminal may be the one who has shared no CSAM links – at least not under their own name.
Because the current model of dark- and open-web CSAM forums allows users to register multiple new accounts under different names, aliases alone are neither a sufficient nor an efficient basis for open-source intelligence. The identifiers must be restructured and a cross-platform model enacted in order to find CSAM abusers across forums; one possible shape of such a model is sketched below.
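A hedged sketch of a cross-platform identifier model: treat each (platform, alias) account and each hard identifier (email, PGP key, payment address) as a node, merge nodes that co-occur, and read off connected components as candidate "same person" clusters. All platform names and identifiers below are invented.

```python
from collections import defaultdict

# Union-find over accounts and identifiers: nodes that co-occur are
# merged, and connected components become candidate identity clusters.
parent = {}

def find(x):
    parent.setdefault(x, x)
    while parent[x] != x:
        parent[x] = parent[parent[x]]  # path halving
        x = parent[x]
    return x

def union(a, b):
    ra, rb = find(a), find(b)
    if ra != rb:
        parent[ra] = rb

# Fabricated observations: a (platform, alias) seen with an identifier.
observations = [
    (("forum1", "shadowfox"), "pgp:AB12CD"),
    (("forum2", "nightowl"), "pgp:AB12CD"),
    (("forum2", "nightowl"), "email:drop@example.org"),
    (("forum3", "quietman"), "email:drop@example.org"),
]

for account, identifier in observations:
    union(account, identifier)

clusters = defaultdict(set)
for account, _ in observations:
    clusters[find(account)].add(account)
print(list(clusters.values()))  # one cluster linking all three aliases
```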
Technology Trends Leading to Convictions
Anti-human trafficking technology increases the efficiency and accuracy of online CSAM detection. The proliferation of CSAM means digital forensic practitioners spend lengthy periods analyzing data, delaying investigations. Automatic detection assists with this workload, providing law enforcement with a time-efficient alternative to visually detecting CSAM. The risk of detection is a significant consideration in an individual’s decision to offend, and the perceived anonymity of the internet is a factor in CSAM offending, so any effort to increase the risk of detection (and reduce anonymity) may reduce this crime.
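A common form of automatic detection is hash matching against vetted lists of known material, such as those clearinghouses like NCMEC distribute to vetted partners. The minimal sketch below assumes investigators already hold such a list (the function names are ours); production systems typically use perceptual hashes such as PhotoDNA, which survive re-encoding, rather than the plain SHA-256 shown here.

```python
import hashlib
from pathlib import Path

def sha256_of(path: Path) -> str:
    """Stream a file through SHA-256 so large videos never sit in memory."""
    h = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            h.update(chunk)
    return h.hexdigest()

def triage(evidence_dir: str, known_hashes: set[str]) -> list[Path]:
    """Return files in a seized-media directory matching the known list."""
    return [p for p in Path(evidence_dir).rglob("*")
            if p.is_file() and sha256_of(p) in known_hashes]
```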
Thankfully, some companies are focused on developing anti-human trafficking technology platforms that see past anonymous aliases and paywalls by using AI models to cluster criminals, so law enforcement can focus on the right people in the forums. The same AI models could be used in research to reach out to people about seeking treatment. A person who is on the path toward abuse but goes unidentified will most likely move on to the next darknet forum and continue down that path. By finding these people as soon as possible, we can hopefully break the spiral.
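One concrete way models can cluster users hiding behind throwaway aliases is stylometry: comparing writing style across accounts. The toy sketch below uses scikit-learn with innocuous placeholder posts; character n-grams are chosen because they are relatively robust to topic changes. This is our illustrative assumption, not a description of any particular vendor's system.

```python
# Toy stylometric comparison: vectorize posts with character n-grams and
# compare aliases by cosine similarity. Purely illustrative; production
# systems use far richer features and careful validation.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

posts = {
    "alias_a": "thanks alot, will check it out later tonite",
    "alias_b": "thanks alot mate, checking it out tonite",
    "alias_c": "Thank you very much; I will review it this evening.",
}

vec = TfidfVectorizer(analyzer="char_wb", ngram_range=(2, 4))
matrix = vec.fit_transform(list(posts.values()))
sims = cosine_similarity(matrix)

names = list(posts)
for i in range(len(names)):
    for j in range(i + 1, len(names)):
        # High similarity flags alias pairs for closer human review.
        print(names[i], names[j], round(float(sims[i, j]), 2))
```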
Platforms that deploy web crawlers to detect CSAM are also beneficial to law enforcement, particularly because they require minimal active involvement from investigators. Websites and forums hosting CSAM are notoriously interlinked. Web crawlers are automated programs that traverse many websites, following the links on each site and tallying the volume of confirmed CSAM, giving investigators the actionable intelligence needed to remove the central sites and inhibit CSAM distribution.
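A simplified sketch of such a crawler, assuming the `requests` and `beautifulsoup4` libraries; `looks_like_known_material` is a hypothetical callback (for example, hash matching as sketched earlier) that an investigating agency would supply.

```python
from collections import deque
from urllib.parse import urljoin

import requests
from bs4 import BeautifulSoup

def looks_like_known_material(src: str) -> bool:
    # Stub: in practice, fetch the image and compare its (perceptual)
    # hash against a vetted known-material list.
    return False

def crawl(seed: str, max_pages: int = 100) -> dict[str, int]:
    """Breadth-first crawl from `seed`, tallying suspected hits per page."""
    seen, queue, hits = {seed}, deque([seed]), {}
    while queue and len(hits) < max_pages:
        url = queue.popleft()
        try:
            resp = requests.get(url, timeout=10)
        except requests.RequestException:
            continue
        soup = BeautifulSoup(resp.text, "html.parser")
        hits[url] = sum(looks_like_known_material(img.get("src", ""))
                        for img in soup.find_all("img"))
        # Follow every link so heavily interlinked hub sites surface.
        for a in soup.find_all("a", href=True):
            link = urljoin(url, a["href"])
            if link not in seen:
                seen.add(link)
                queue.append(link)
    return hits
```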
Taking down CSAM forums creates the potential for suspects to be identified and arrested, and for child victims to be identified and located. The images and videos on these forums depict actual crimes being committed against children. The human element – children at risk – must always be considered when discussing an offense rooted in a high-tech world.