Online Child Sexual Exploitation and Abuse in India: An Extended Abstract
Guillermo Galarza Abizaid
Leading Global Child Safety Advocate with a passion for safeguarding children and enhancing Industry/government collaboration
As of 2024, India's population stands at 1,409,128,296,[1] making it the most populous country in the world. Notably, India has the highest population of children globally, with over 480 million children, constituting more than one-third of its total population. Protecting these children poses a significant challenge, as India faces many of the same issues encountered by other nations but with limited resources.
This is an extended abstract of a paper created by a good friend, colleague, and true advocate of child protection in India. Although this person wishes to remain anonymous, they have permitted me to create an abstract of their work. I acknowledge their immense contribution and research, and I will also address some issues from my personal experiences, having worked in India since 2007. I also want to express my gratitude to all our partner agencies doing tremendous work in India, including the Kerala Police Cyberdome Counter Child Sexual Exploitation Centre, Child Rescue Coalition, Inc., Childlight - Global Child Safety Institute, WeProtect Global Alliance, International Justice Mission, the National Center for Missing & Exploited Children, the Internet Watch Foundation (IWF), the U.S. Department of State, and many others. Their continued work and support are making India a safer place for children.
India is an enchanting country, brimming with life, culture, history, and incredible food. It is one of my favorite countries, primarily due to its people, the diversity, and the cultural differences that vary from state to state. Despite its beauty and rich cultural heritage, India faces significant challenges in combating all forms of child exploitation, including trafficking, child labor, prostitution, and online child sexual exploitation (OCSE).
I thought I had seen it all during my years of traveling, but nothing prepared me for India. As I got off that United flight from Newark to Delhi in mid-February 2007, I had no idea what was coming my way. I was eager to travel to India to deliver training on OCSE at the Ghaziabad Central Bureau of Investigation (CBI) training academy. Words can’t describe my initial impressions: the bustling markets, the rich scent of food filled with spices, the rickshaws, and the overwhelming number of people. Now, years later, India has become a country that I truly admire and respect. India is not for everyone, but if you fall in love with it, you will desire to return time after time.
Introduction
This extended abstract summarizes some of the current issues, challenges, and opportunities in developing a national plan to combat Online Child Sexual Exploitation and Abuse (OCSEA) in India. It emphasizes the necessity of a coordinated national response and outlines strategic measures to effectively address this escalating threat.
Technological advancements, particularly during and after the COVID-19 pandemic, have significantly increased children's exposure to online sexual exploitation and abuse. India, with the largest population of children in the world, is no exception. Countering OCSEA and developing an appropriate policy framework is a challenging task, especially in a diverse country like India, with state and territory diversities and economic disadvantages. Despite India being a signatory to key international conventions on child safety and having enacted relevant legislation and policies, a comprehensive national strategy to counter OCSEA has yet to be formulated.
Technology Barriers
End-to-end encryption as a tool for abuse
The widespread introduction of End-to-End Encryption (E2EE) by service providers, intended as a safety feature, creates significant risks for children online: it makes it nearly impossible to detect child sexual abuse imagery, hinders law enforcement efforts, and fosters large-scale Child Sexual Abuse Material (CSAM) communities. Features like secret chats and bots on platforms such as Telegram further facilitate these offenses, with some groups enforcing rules that compel users to share CSAM or face expulsion, while allowing administrators to remain anonymous.
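To make concrete why E2EE frustrates detection, below is a minimal, illustrative Python sketch of the kind of server-side hash matching platforms run against lists of known abuse imagery supplied by hotlines. The hash value and file contents are placeholders, not real data, and this is not any specific platform's implementation; the point is simply that once content is end-to-end encrypted, the server can only hash ciphertext, so the check never produces a match.

```python
import hashlib

# Placeholder hash list. Real lists of known abuse imagery are supplied to
# platforms by hotlines such as NCMEC or the IWF; this value is invented.
KNOWN_HASHES = {"3f786850e387550fdab836ed7e6dc881de23001b"}

def matches_known_hash(content: bytes) -> bool:
    """Hash whatever bytes the server can see and compare against the list."""
    return hashlib.sha1(content).hexdigest() in KNOWN_HASHES

# Before E2EE, the platform can hash the actual media a user uploads.
media = b"<plaintext image bytes>"
print(matches_known_hash(media))

# Under E2EE the server only ever holds an opaque encrypted blob, so the
# same check can never match a known image; the detection signal is lost.
ciphertext = b"<opaque encrypted blob>"
print(matches_known_hash(ciphertext))
```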
Misleading age ratings of apps
In real life, parents wouldn't leave young children unsupervised with adult strangers, but this happens daily online, especially on gaming and live streaming platforms where children and adults interact, sometimes unknowingly. While some child-specific platforms have safety measures, they can create a false sense of security, as offenders may lure children to less secure spaces. This risk is heightened as children use more online platforms from a younger age, especially post-COVID-19, and parents may not realize the need for vigilance, given misleading age ratings on app stores.
Cloud Storage Platforms
Cloud storage platforms like MEGA.nz, Dropbox, OneDrive, MediaFire, and iCloud require only an email ID to start an account, making them popular among child sexual offenders for storing and distributing large amounts of CSAM. Offenders often transmit CSAM via links to cloud storage, which can offer up to 50GB of free storage. During recent operations in India, many offenders were found to use MEGA.nz, sharing accounts among users, both local and foreign, to distribute gigabytes of CSAM.
CSAM on the Dark Web
The dark web enables those with a sexual interest in children to discuss their thoughts and desires anonymously and access and exchange CSAM on a global scale. The rise of smartphones with high-quality cameras has blurred the line between consumers and producers, increasing CSAM circulation. Today, darknet CSAM forums are central to a global community serving hundreds of thousands of individuals with such interests.
Grooming and coercing children to produce ‘self-generated’ CSAM
The Global Threat Assessment Report by WeProtect Global Alliance[2] highlights alarming trends in online grooming, 'self-generated' CSAM, and livestreaming of child sexual abuse. Prevalence rates for online grooming range between 9-19%, and data from the National Society for the Prevention of Cruelty to Children (NSPCC) shows an 80% rise in such crimes over the past four years. Perpetrators often groom children on social media, chat rooms, and gaming platforms, moving conversations to private or encrypted messaging apps to avoid detection, a tactic known as 'off-platforming'. Online multiplayer games, with their social features, notably increase the risk of child exploitation. The Bracket Foundation's 2022 report[3] uses the 3C's framework (content, contact, and conduct risks) to categorize these dangers. Grooming in social gaming environments can escalate rapidly, with an average grooming time of just 45 minutes and, in extreme cases, as little as 19 seconds. The Internet Watch Foundation[4] (IWF) found that children aged 11-13 most frequently appear in 'self-generated' imagery, with a 65% increase in such content involving children aged 7-10 from 2022 to 2023. In 2023, 92% of the 275,652 web pages actioned by the IWF contained 'self-generated' imagery, marking a 27% increase from the previous year.
Live streaming of child sexual abuse
The scale of live-streamed child sexual abuse is difficult to measure for several reasons, including the difficulty of investigating and prosecuting these cases once the livestream ends and the lack of monitoring on private streams. Because the abuse often leaves no recorded trace, livestreaming remains one of the most challenging forms of OCSE to detect, evidence, and prosecute.
Artificial Intelligence (AI) CSAM
The public use of generative AI technologies has surged, leading to an increase in the creation of CSAM using these tools. Unlike traditional AI, which recognizes patterns and makes predictions, generative AI produces new content such as images, text, and audio. Since early 2023, there has been a rise in perpetrators using generative AI to produce CSAM. A 2023 IWF study[5] found 20,254 AI-generated images posted on a dark web CSAM forum within one month and discovered guides on generating AI CSAM widely shared in these forums. Increasingly, AI-generated CSAM features known victims and famous children. An investigation by the Stanford Internet Observatory identified hundreds of known CSAM images in an open dataset used to train popular AI models like Stable Diffusion, highlighting how difficult it is to clean open datasets or halt their distribution when no central authority controls them.
The Need
Ongoing Capacity Building and Training
Globally, law enforcement, prosecutors, and judiciary often fall behind in their understanding of OCSEA, and India is no exception. A comprehensive strategy is essential, including the creation of specialized training modules for specialist units, investigators, forensic scientists, prosecutors, and judges. This training should be centrally coordinated and funded, but executed by the states.
In the arena of digital forensics, the development of protocols and tools for digital analysis in child exploitation cases is complicated by the ever-changing variety of platforms used to commit offenses, the sheer volume of data to analyze, and the cost, time, and expertise required for development. The issue is global, and it is essential to develop a policy tailored to India's requirements.
While CyTrain, the National Cybercrime Training Centre, provides training for combating cybercrimes, it lacks a dedicated curriculum that addresses OCSEA. Southern states such as Kerala and Telangana have made notable progress through on-demand eLearning programs, such as those offered by ICMEC and others.
NCMEC CyberTips
The National Center for Missing & Exploited Children (NCMEC) operates the CyberTipline, a centralized mechanism for reporting instances of child sexual exploitation. While NCMEC is based in the United States, its CyberTipline has a global reach, including India. In 2023, the CyberTipline received over 36 million reports globally, with India accounting for the largest share at 8,923,738 reports.
The Indian Cyber Crime Coordination Centre (I4C) manages these reports through a national portal. The reports are then triaged and sent to the state-level Cyber Divisions or specialist units formed to counter OCSEA, which collaborate with local law enforcement to ensure thorough investigations. While this process is functional, several challenges persist.
Integrating NCMEC CyberTips into India's national strategy against OCSEA can significantly enhance detection, reporting, and response efforts.
This collaborative approach can help create a safer online environment for children in India and contribute to the global fight against child exploitation.
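As a purely illustrative sketch of the triage step described above, the snippet below groups a hypothetical export of CyberTipline referrals by state so that each specialist unit receives its own queue. The column names and sample rows are invented; the actual I4C/NCMEC referral schema is not reproduced here.

```python
import csv
from collections import defaultdict
from io import StringIO

# Hypothetical export format; the real referral schema will differ.
SAMPLE_EXPORT = """report_id,state,platform,priority
1001,Kerala,messaging_app,high
1002,Telangana,cloud_storage,medium
1003,Kerala,social_media,high
"""

def route_by_state(export_text: str) -> dict:
    """Bucket referral IDs by state so each state unit gets its own queue."""
    queues = defaultdict(list)
    for row in csv.DictReader(StringIO(export_text)):
        queues[row["state"]].append(row["report_id"])
    return dict(queues)

if __name__ == "__main__":
    for state, reports in route_by_state(SAMPLE_EXPORT).items():
        print(f"{state}: {len(reports)} referral(s) -> {reports}")
```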
Updating Legislative Framework
India enacted the Protection of Children from Sexual Offences (POCSO) Act in 2012, establishing a comprehensive legal framework for protecting children from sexual exploitation. The POCSO Act introduced child-friendly procedures for reporting, evidence recording, investigation, and speedy trials through designated Special Courts. While the Act has effectively addressed physical child sexual abuse, it requires significant revisions to tackle OCSEA, which has evolved rapidly due to increased internet prevalence and technological advancements since 2012.
The 2020 report by the Ad hoc Committee of the Rajya Sabha emphasized the need for explicit provisions on cyber-grooming and recommended aligning Indian law with international standards. Modern terminologies and offenses, including different types of CSAM, grooming methods, sextortion, self-generated CSAM, AI-generated CSAM, live-streaming of child sexual abuse, and offenses involving end-to-end encryption, peer-to-peer networks, the dark web, and cloud storage platforms, need to be incorporated into the POCSO Act. Additionally, enhanced penalties for the commercialization of CSAM and provisions for long-term support for victims are necessary.
India’s Information Technology Act,[6] which penalizes CSAM, is outdated and is being replaced by the Digital India Act, expected to be passed in 2024. The new Act includes provisions for adjudicating user harm against children, age-gating, protecting minors' data, and ensuring their safety and privacy on social media platforms. The existing Information Technology (Intermediary Guidelines and Digital Media Ethics Code) Rules, 2021, mandate that Significant Social Media Intermediaries (SSMIs) deploy technology-based measures to identify child sexual abuse imagery, appoint compliance officers, and cooperate with law enforcement agencies. Despite these stringent regulations, compliance has been inadequate, prompting the Ministry of Electronics and Information Technology (MeitY) to issue directives for proactive CSAM removal.
Expanding specialized units
Counter Child Sexual Exploitation Centre
The Counter Child Sexual Exploitation Centre (CCSE) in Kerala is the first fully operational and most effective state-level specialist unit and serves as a role model for other states. Its key functionalities and duties can be adopted by specialist units across states.
It is essential to form such a specialized unit in every state.
International Child Sexual Exploitation Database
Given the international nature of OCSEA, identifying victims and offenders remains one of the most challenging aspects of investigations. India needs a comprehensive database of images and videos of victims and offenders, leveraging modern AI capabilities. To address this need, a National Victim Identification Framework is proposed, which includes the creation of a National Victim Identification Database. This initiative aims to unify the Central Bureau of Investigation (CBI) and state specialist units for a consistent approach to identification.
India has joined Interpol’s International Child Sexual Exploitation (ICSE) database, which links victims, abusers, and crime scenes using audio-visual data. The CBI connected India to this database in July 2022, making it the 68th country to join. However, state police forces, including those with specialist units, currently lack access to the database, limiting its effectiveness. Granting access to state police would enhance victim identification by linking investigations and preventing duplication. Once established, the National Victim Identification Database can be integrated with the ICSE database.
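To illustrate how image fingerprints can link seizures across investigations, here is a minimal sketch of one common perceptual-hashing technique (difference hashing) using Pillow. This is not the method the ICSE database or any Indian system actually uses, and the file paths are placeholders; it simply shows how near-duplicate images can be matched even after re-encoding or resizing.

```python
from PIL import Image  # Pillow

def dhash(path: str, hash_size: int = 8) -> int:
    """Difference hash: a compact perceptual fingerprint of an image."""
    img = Image.open(path).convert("L").resize((hash_size + 1, hash_size))
    pixels = list(img.getdata())
    bits = 0
    for row in range(hash_size):
        for col in range(hash_size):
            left = pixels[row * (hash_size + 1) + col]
            right = pixels[row * (hash_size + 1) + col + 1]
            bits = (bits << 1) | (1 if left > right else 0)
    return bits

def hamming(a: int, b: int) -> int:
    """Bits that differ; a small distance suggests the same underlying image."""
    return bin(a ^ b).count("1")

# Seizures from two different states can be linked even if a file was
# re-encoded or resized (paths here are placeholders, not real evidence):
# if hamming(dhash("seizure_a.jpg"), dhash("seizure_b.jpg")) <= 10:
#     flag the two investigations for review as potentially linked
```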
Preventing Offending and Re-offending
India launched its National Database of Sexual Offenders (NDSO) in 2018, which includes information on child sexual offenders and is maintained by the National Crime Records Bureau (NCRB). However, its composition, effectiveness, and usage by law enforcement agencies (LEAs) and other stakeholders have yet to be thoroughly evaluated. Strengthening the NDSO, and ensuring it is actively used by LEAs and other stakeholders, is a necessary step in combating child sexual exploitation.
Conclusion
In conclusion, developing the capacity to combat online child sexual exploitation in India is crucial for protecting children and ensuring a safer digital environment. Implementing a robust monitoring system for child sexual offenders, enhancing inter-agency coordination, and promoting effective case and trial management will significantly deter offenders and expedite justice. Leveraging local volunteer organizations and adapting successful international initiatives will provide essential community-based support and prevention. Additionally, a comprehensive evaluation and improvement of existing systems like the National Database of Sexual Offenders will strengthen the overall response to OCSEA. Through these concerted efforts, India can build a resilient framework to safeguard children against online sexual exploitation and abuse.