Non-human in Human Trafficking Prevention: Impact of AI Chatbots
AusCam Freedom Project
Preventing sexual exploitation and trafficking of girls in Cambodia.
Writer: Dr Ian Weber
Discussions about the potential of Artificial Intelligence (AI) have recently dominated comments, posts, and feeds across the digital world. Most of these posts point to a transformational moment in history, with AI offering great potential to address human trafficking and prevent Online Child Sexual Exploitation (OCSE).
OCSE is a significant threat to children’s safety in Cambodia, with sextortion, grooming and child sexual abuse materials the most prevalent forms of abuse. With children spending more time online during the COVID-19 pandemic, offenders gained more access to vulnerable young people with poor digital literacy using the internet with little parental oversight.
A recent UNICEF report titled Disrupting Harm in Cambodia indicated that 11% of internet-using children aged 12-17 had experienced clear examples of online sexual exploitation and abuse in 2021. If scaled to Cambodia’s population within this age group, the figures represent 160,000 children subjected to online sexual exploitation and abuse. One of our partners, APLE, found that the rate of OCSE had doubled between 2020 and 2021, painting a terrifying and tragic picture for Cambodia.
AusCam Freedom Project intervenes to protect girls from Online Child Sexual Exploitation before it begins, before its impact and trauma mark their lives, providing safe spaces to build their skills and confidence. We seek to respond to the massively growing scale of the problem by assisting young women to become agents of change, reaching out and protecting others in a way that can multiply.
AusCam is working with partner organisations APLE and Terre des Hommes Netherlands to launch a chatbot and avatar on our website and Messenger platform: a virtual assistant that offers girls and young women assistance, information, and services without requiring them to interact with people at the initial point of contact.
The virtual assistant can help girls and young women navigate complex pathways, offering options for reporting and tips on collecting evidence and keeping safe. Incidents can be sent to regulators and social media platforms with requests for the images to be removed and the instigators blocked from the site.
AusCam’s Executive Director Nigel Goddard said implementing the chatbot system provides appropriate online and offline help and protection to girls and young women.
‘This chatbot will enable girls and young women to seek help and report exploitation,’ he said.
‘It is a key step in our new strategy to equip and mobilise young women to connect, campaign, and challenge one another to be safe online.’
‘We celebrate this technology as a way of building a scalable, cost-effective response to prevent the trauma and tragedy that is Online Child Sexual Exploitation.’
AusCam’s virtual assistant initiative follows current global research trends on the capacity of AI to support the protection and prevention of OCSE.
Australia’s Royal Melbourne Institute of Technology (RMIT) launched a new chatbot, Umibot, to fight image-based abuse. Image-based abuse – when someone takes, shares, or threatens to share nude, semi-nude or sexual images or video without consent – has become a growing problem, experienced by one in three Australians. Such figures are alarming given that many incidents go unreported for several reasons, including embarrassment, lack of support, and cultural issues.
Lead researcher for the Umibot project, Professor Nicola Henry of RMIT’s Social and Global Studies Centre, said that many people did not know where to go for help, and some did not know that what had happened to them was a crime.
‘The victim-survivors we interviewed said they were often blamed by friends, family members and others, which made them feel ashamed and even more reluctant to seek help,’ she said in a media release.
RMIT research fellow and co-researcher Dr Alice Witt said the chatbot was not a replacement for human support but was designed to help people navigate complex pathways, offering options for reporting and tips on collecting evidence and staying safe.
She said Umibot was not just for victims but also to help bystanders and even perpetrators as a potential tool to prevent this abuse from happening.
Other research raises the issue of non-standardised procedures for reporting such Non-consensual Intimate Images (NCII) abuses across messaging applications, social media, and video hosting websites. In some situations, it is not possible to report specific media content as NCII. Often, reporting abuse requires victims to write a free-text narrative of the crime, which can lead to blank-page anxiety and second victimisation, curtailing access to justice.
Italian researchers Mattia Falduti and Sergio Tessaris at the Free University of Bozen-Bolzano published research in 2022, On the Use of Chatbots to Report Non-consensual Intimate Images Abuse: The Legal Expert Perspective, demonstrating how chatbots could support users in accessing justice procedures.
AusCam will adopt a similar approach with our virtual assistant, offering a less confrontational point of contact for girls and young women to seek information and support in addressing Online Child Sexual Exploitation and abuse. We use a Risk Assessment & Response Tool to identify girls in vulnerable situations, intervene to reduce risk factors, and help them build their resilience for the future.
Executive Director Goddard said the opportunity to explore and learn, together with the emotional safety and anonymity that chatbots create, often makes them a very effective hotline and reporting tool.
‘At AusCam, we are gearing up to have social workers equipped and ready to respond to the potential influx of girls by combining chatbots, assessment tools, and case management systems to optimise the support and healing from trauma that we can provide.’
The virtual assistant will provide resources, such as media links and videos, to children, young people, parents, and the counsellors involved in child protection, giving girls, young women, parents, and support groups direct access to help.
Moving forward, we will develop the chatbot to promote safe ways for girls to research forms of exploitation, and to help child-led researchers connect and amplify their voices.
AusCam has established a Trusted Partner agreement with Meta to report cases involving safety and other concerns, as guided by the company’s community guidelines. Meta provides a unique channel for such reports in two ways: by email and through an in-app report mechanism. Online material contravening its community guidelines will be removed and the offending account deleted.
AusCam sees a lasting, protective change within Cambodia through equipping, empowering, and connecting girls and young women. As girls and young women speak out against these issues and share ways to protect themselves, they are building a core of leadership and resilience within their lives.
By appropriating the same technology that perpetuates OCSE abuses and coupling it with a growing sisterhood of change agents, AusCam is starting a change that will spread sideways, young woman to young woman. Hope, knowledge, and the community and identity of a sisterhood of everyday heroes will become a force capable of protecting girls and young women across Cambodia.
Go to www.auscamfreedomproject.org to find out more about our work.