Online Child Sexual Exploitation and Abuse in India: An Extended Abstract

As of 2024, India's population stands at 1,409,128,296,[1] making it the most populous country in the world. Notably, India has the largest child population in the world, with over 480 million children constituting more than one-third of its total population. Protecting these children poses a significant challenge, as India faces many of the same issues encountered by other nations but with limited resources.

This is an extended abstract of a paper created by a good friend, colleague, and true advocate of child protection in India. Although this person wishes to remain anonymous, they have permitted me to create an abstract of their work. I acknowledge their immense contribution and research, and I will also address some issues from my personal experience, having worked in India since 2007. I also want to express my gratitude to all our partner agencies doing tremendous work in India, including the Kerala Police Cyberdome Counter Child Sexual Exploitation Centre, Child Rescue Coalition, Inc., Childlight - Global Child Safety Institute, WeProtect Global Alliance, International Justice Mission, the National Center for Missing & Exploited Children, the Internet Watch Foundation (IWF), the U.S. Department of State, and many others. Their continued work and support are making India a safer place for children.

India is an enchanting country, brimming with life, culture, history, and incredible food. It is one of my favorite countries, primarily because of its people, its diversity, and the cultural differences from state to state. Despite its beauty and rich cultural heritage, India faces significant challenges in combating all forms of child exploitation, including trafficking, child labor, prostitution, and online child sexual exploitation (OCSE).

I thought I had seen it all during my years of traveling, but nothing prepared me for India. As I got off that United flight from Newark to Delhi in mid-February 2007, I had no idea what was coming my way. I was eager to travel to India to deliver training on OCSE at the Ghaziabad Central Bureau of Investigation (CBI) training academy. Words can’t describe my initial impressions: the bustling markets, the rich scent of food filled with spices, the rickshaws, and the overwhelming number of people. Now, years later, India has become a country that I truly admire and respect. India is not for everyone, but if you fall in love with it, you will desire to return time after time.

Introduction

This extended abstract summarizes some of the current issues, challenges, and opportunities in developing a national plan to combat Online Child Sexual Exploitation and Abuse (OCSEA) in India. It emphasizes the necessity of a coordinated national response and outlines strategic measures to effectively address this escalating threat.

Technological advancements, particularly during and after the COVID-19 pandemic, have significantly increased children's exposure to online sexual exploitation and abuse. India, with the largest population of children in the world, is no exception. Countering OCSEA and developing an appropriate policy framework is a challenging task, especially in a diverse country like India, with state and territory diversities and economic disadvantages. Despite India being a signatory to key international conventions on child safety and having enacted relevant legislation and policies, a comprehensive national strategy to counter OCSEA has yet to be formulated.


Technology Barriers


End-to-end encryption as a tool for abuse

The widespread introduction of end-to-end encryption (E2EE) by service providers, though intended as a safety feature, creates significant risks for children online: it makes child sexual abuse imagery nearly impossible to detect, hinders law enforcement efforts, and fosters large-scale Child Sexual Abuse Material (CSAM) communities. Features such as secret chats and bots on platforms like Telegram further facilitate these offenses; some groups enforce rules that compel users to share CSAM or face expulsion, while administrators remain anonymous.

Misleading age ratings of apps

In real life, parents wouldn't leave young children unsupervised with adult strangers, but this happens daily online, especially on gaming and live streaming platforms where children and adults interact, sometimes unknowingly. While some child-specific platforms have safety measures, they can create a false sense of security, as offenders may lure children to less secure spaces. This risk is heightened as children use more online platforms from a younger age, especially post-COVID-19, and parents may not realize the need for vigilance, given misleading age ratings on app stores.

Cloud Storage Platforms

Cloud storage platforms like MEGA.nz, Dropbox, OneDrive, MediaFire, and iCloud require only an email address to create an account, making them popular among child sexual offenders for storing and distributing large amounts of CSAM. Offenders often transmit CSAM via links to cloud storage, which can offer up to 50GB of free storage. During recent operations in India, many offenders were found to use MEGA.nz, sharing accounts among local and foreign users to distribute gigabytes of CSAM.


CSAM on the Dark Web

The dark web enables those with a sexual interest in children to discuss their thoughts and desires anonymously and access and exchange CSAM on a global scale. The rise of smartphones with high-quality cameras has blurred the line between consumers and producers, increasing CSAM circulation. Today, darknet CSAM forums are central to a global community serving hundreds of thousands of individuals with such interests.


Grooming and coercing children to produce ‘self-generated’ CSAM


The Global Threat Assessment Report by WeProtect Global Alliance[2] highlights alarming trends in online grooming, 'self-generated' CSAM, and livestreaming of child sexual abuse. Prevalence rates for online grooming range between 9% and 19%, and data from the National Society for the Prevention of Cruelty to Children show an 80% rise in such crimes over the past four years. Perpetrators often groom children on social media, chat rooms, and gaming platforms, moving conversations to private or encrypted messaging apps to avoid detection, a tactic known as 'off-platforming'. Online multiplayer games, with their social features, notably increase the risk of child exploitation. The Bracket Foundation's 2022 report[3] uses the 3Cs framework (content, contact, and conduct risks) to categorize these dangers. Grooming in social gaming environments can escalate rapidly, with an average grooming time of just 45 minutes and, in extreme cases, as little as 19 seconds. The Internet Watch Foundation (IWF)[4] found that children aged 11-13 appear most frequently in 'self-generated' imagery, with a 65% increase in such content involving children aged 7-10 from 2022 to 2023. In 2023, 92% of the 275,652 web pages actioned by the IWF contained 'self-generated' imagery, a 27% increase from the previous year.


Live streaming of child sexual abuse


The scale of live-streamed child sexual abuse is difficult to measure, partly because these cases are hard to investigate and prosecute once the livestream ends, and partly because private streams are largely unmonitored. Livestreaming poses several distinct challenges in the context of OCSE, including:


  • Anonymity: Livestreams enable real-time exploitation without leaving a digital trail, making it difficult for law enforcement to intervene promptly. Perpetrators often use anonymizing tools and platforms that do not require user verification, complicating identification and tracking.
  • Global Reach and Jurisdictional Issues: Livestreaming allows perpetrators and victims to be in different countries, leading to complex jurisdictional issues.
  • Detection: Automated tools for detecting abusive content in real time are still developing, and many existing tools struggle with the nuances of live-streamed content.
  • Platform Policies and Moderation: Different platforms have varying levels of commitment and capacity to monitor and take down abusive live streams.

Artificial Intelligence (AI) CSAM

The public use of generative AI technologies has surged, leading to an increase in the creation of CSAM using these tools. Unlike traditional AI, which recognizes patterns and makes predictions, generative AI produces new content such as images, text, and audio. Since early 2023, there has been a rise in perpetrators using generative AI to produce CSAM. A 2023 IWF study[5] found 20,254 AI-generated images posted on a dark web CSAM forum within one month and discovered guides on generating AI CSAM widely shared in these forums. Increasingly, AI-generated CSAM features known victims and famous children. An investigation by the Stanford Internet Observatory identified hundreds of known CSAM images in an open dataset used to train popular AI models like Stable Diffusion, highlighting the difficulty of cleaning open datasets, or stopping their distribution, when no central authority controls them.



[Photo: Dr. Geeta Sekhon delivering a victim-centric approach training]

The Need

Ongoing Capacity Building and Training

Globally, law enforcement, prosecutors, and judiciary often fall behind in their understanding of OCSEA, and India is no exception. A comprehensive strategy is essential, including the creation of specialized training modules for specialist units, investigators, forensic scientists, prosecutors, and judges. This training should be centrally coordinated and funded, but executed by the states.

In the arena of digital forensics, the development of protocols and tools for digital analysis in child exploitation cases is complicated by the ever-changing variety of platforms used to commit offenses, the sheer volume of data to analyze, and the cost, time, and expertise involved in development. The issue is global, and it is essential to develop a policy tailored to India’s requirements.

While CyTrain, the National Cybercrime Training Centre, provides training for combating cybercrimes, it lacks a dedicated curriculum that addresses OCSEA. Southern states such as Kerala and Telangana have made notable progress through on-demand eLearning programs, including those offered by ICMEC and others.

NCMEC CyberTips

The National Center for Missing & Exploited Children (NCMEC) operates the CyberTipline, a centralized mechanism for reporting instances of child sexual exploitation. While NCMEC is based in the United States, its CyberTipline has a global reach, including India. In 2023, NCMEC generated over 36 million reports globally, with India receiving the most at 8,923,738 reports.

The Indian Cyber Crime Coordination Centre (I4C) manages these reports through a national portal. The reports are then triaged and sent to the state-level Cyber Divisions or specialist units formed to counter OCSEA, which collaborate with local law enforcement to ensure thorough investigations. While this process is functional, several challenges persist:

  • Limited processing and analysis of reports
  • Gaps in coordination with Indian authorities
  • Inconsistent action and follow-up
  • Insufficient training

Integrating NCMEC CyberTips into India's national strategy against OCSEA can significantly enhance detection, reporting, and response efforts. Key components include:

  • Strengthening Legal Frameworks
  • Capacity Building
  • Cross-Border Collaboration
  • Public Engagement using the CyberTipline

This collaborative approach can help create a safer online environment for children in India and contribute to the global fight against child exploitation.

Updating Legislative Framework

India enacted the Protection of Children from Sexual Offences (POCSO) Act in 2012, establishing a comprehensive legal framework for protecting children from sexual exploitation. The POCSO Act introduced child-friendly procedures for reporting, evidence recording, investigation, and speedy trials through designated Special Courts. While the Act has effectively addressed physical child sexual abuse, it requires significant revisions to tackle OCSEA, which has evolved rapidly due to increased internet prevalence and technological advancements since 2012.

The 2020 report by the Ad hoc Committee of the Rajya Sabha emphasized the need for explicit provisions on cyber-grooming and recommended aligning Indian law with international standards. Modern terminologies and offenses, including different types of CSAM, grooming methods, sextortion, self-generated CSAM, AI-generated CSAM, live-streaming of child sexual abuse, and offenses involving end-to-end encryption, peer-to-peer networks, the dark web, and cloud storage platforms, need to be incorporated into the POCSO Act. Additionally, enhanced penalties for CSAM commercialization and provisions for long-term support for victims are necessary.

India’s Information Technology Act,[6] which penalizes CSAM, is outdated and is being replaced by the Digital India Act, expected to be passed in 2024. The new Act includes provisions for adjudicating user harm against children, age-gating, protecting minors' data, and ensuring their safety and privacy on social media platforms. The existing Information Technology (Intermediary Guidelines and Digital Media Ethics Code) Rules, 2021, mandate that Significant Social Media Intermediaries (SSMIs) deploy technology-based measures to identify child sexual abuse imagery, appoint compliance officers, and cooperate with law enforcement agencies. Despite these stringent regulations, compliance has been inadequate, prompting the Ministry of Electronics and Information Technology (MeitY) to issue directives for proactive CSAM removal.


Expanding specialized units

Counter Child Sexual Exploitation Centre


The Counter Child Sexual Exploitation Centre (CCSE) in Kerala is the first fully operational and most efficient state-level specialist unit and serves as a role model for other states. Key functionalities and duties of the CCSE that can be adopted by Specialist Units across States include:

  • Acting as the single "State-level Coordination Centre" for detecting and removing OCSE.
  • Operating 24/7 with a team of carefully selected personnel.
  • Efficiently triaging and processing CyberTipline reports: in 2023, the CCSE received 104,355 such reports.

It is essential to form such a specialized unit in every state.

International Child Sexual Exploitation Database

Given the international nature of OCSEA, identifying victims and offenders remains one of the most challenging aspects of investigations. India needs a comprehensive database of images and videos of victims and offenders, leveraging modern AI capabilities. To address this need, a National Victim Identification Framework is proposed, which includes the creation of a National Victim Identification Database. This initiative aims to unify the Central Bureau of Investigation (CBI) and state specialist units for a consistent approach to identification.

India has joined Interpol’s International Child Sexual Exploitation (ICSE) database, which links victims, abusers, and crime scenes using audio-visual data. The CBI connected India to this database in July 2022, making it the 68th country to join. However, state police forces, including those with specialist units, currently lack access to the database, limiting its effectiveness. Granting access to state police would enhance victim identification by linking investigations and preventing duplication. Once established, the National Victim Identification Database can be integrated with the ICSE database.

Preventing Offending and Re-offending

India launched its National Database of Sexual Offenders (NDSO) in 2018; maintained by the National Crime Records Bureau (NCRB), it includes information on child sexual offenders. However, its composition, effectiveness, and usage by law enforcement agencies (LEAs) and other stakeholders have yet to be thoroughly evaluated. To enhance the effectiveness of the NDSO and combat child sexual exploitation, the following measures are proposed:


  • Implement regular monitoring of child sexual offenders and suspects, both physically and electronically, and facilitate intelligence sharing among stakeholder LEAs.
  • Enhance deterrence through the publicity of proactive actions taken against offenders.
  • Ensure coordination between LEAs, prosecution, and judiciary for case and trial management of habitual online child sexual offenders, aiming for speedy investigations and trials to prevent bail.
  • Encourage and support local volunteer organizations in diverting potential offenders, with initiatives like the Lucy Faithfull Foundation’s Stop It Now! in the UK serving as models. Adapt these initiatives to local contexts and, if successful, consider empowering them through legislation.
  • Conduct a thorough evaluation of the current NDSO and enhance its effectiveness by re-engineering processes and significantly improving its usage within the criminal justice system.

Conclusion

Developing the capacity to combat online child sexual exploitation in India is crucial for protecting children and ensuring a safer digital environment. Implementing a robust monitoring system for child sexual offenders, enhancing inter-agency coordination, and promoting effective case and trial management will significantly deter offenders and expedite justice. Leveraging local volunteer organizations and adapting successful international initiatives will provide essential community-based support and prevention. Additionally, a comprehensive evaluation and improvement of existing systems like the National Database of Sexual Offenders will strengthen the overall response to OCSEA. Through these concerted efforts, India can build a resilient framework to safeguard children against online sexual exploitation and abuse.


[1] https://www.cia.gov/the-world-factbook/countries/india/#people-and-society

[2] https://www.weprotect.org/global-threat-assessment-23/

[3] https://static1.squarespace.com/static/5d7cd3b6974889646fce45c1/t/632f3344eacdbb108c8c356f/1664037701806/metaverse+%26+gaming.pdf

[4] https://annualreport2021.iwf.org.uk/trends/selfgenerated

[5] https://www.iwf.org.uk/about-us/why-we-exist/our-research/how-ai-is-being-abused-to-create-child-sexual-abuse-imagery/

[6] https://www.huntonak.com/privacy-and-information-security-law/india-passes-digital-personal-data-protection-act


Austin Berrier

Special Agent at Homeland Security Investigations - Phoenix

3 months ago

As we recently discussed, it isn’t enough to only arm police with tools and capacity. We need to educate the youth, parents, teachers, and anyone else that will listen. Making children hard targets in the first place is something we need to do better globally.

Geeta Sekhon

International Consultant with United Nations

3 months ago

Interesting write-up Guillermo Galarza Abizaid. Sharing for wider reach. Thank you to ICMEC for all the good work that is being done in India.
