#11 Capitol Hill Social Media Hearing Reinforces Need for Safeguarding Children from Online Sexual Abuse
Left to Right: Nate King, Sally Frank, Natasha Fernandes, Vanessa Bautista

On January 31st, five of the world’s biggest social media companies fronted the US Senate Judiciary Committee to answer questions about their failure to protect children online.

The CEOs of Meta, X, TikTok, Snap and Discord faced questions about what they are doing to address the online sexual exploitation of children on their platforms.

Australia’s eSafety Commissioner Julie Inman Grant has been asking these same questions of big tech over the past year, issuing transparency notices that require companies to explain how they are meeting Australia’s basic online safety expectations.

What eSafety uncovered was some companies’ neglect of basic safety features that could prevent the online sexual exploitation of children on their platforms, and in other cases, a refusal to even report how they deal with the problem.

Around the world, we are witnessing an arm wrestle between multinational tech corporations and national governments over online child safety.

On the eve of the big tech hearing in Washington DC, a US-Australian agreement came into force that strengthens the obligations on service providers holding electronic data to assist in countering serious crime, including the online sexual exploitation of children.

The agreement, announced by the Hon Mark Dreyfus KC MP and his US counterpart, Attorney General Merrick B. Garland, strengthens international cooperation by giving law enforcement in each country more timely access to data held by electronic service providers, better enabling agencies to prevent, detect, investigate and prosecute serious crime.

Online sexual exploitation of children is a rapidly growing crime that takes many forms. In the Philippines and elsewhere, children in poorer countries are sexually abused in person by an adult while the abuse is livestreamed over everyday social media and video-chat apps, at the direction of paying Western offenders, including in Australia.

The best way to protect children from this abhorrent abuse is to prevent it from occurring in the first place, and a key form of prevention is tech companies making their platforms safe by design, so that it is harder for offenders to exploit children online.

That’s why tech sector regulation that moves the entire industry, like Australia’s basic online safety expectations of tech companies, is essential in the fight against online sexual exploitation of children.

The Australian Government is currently strengthening Australia’s basic online safety expectations, with submissions open until 16 February 2024.

The proposed amendments to these industry expectations explicitly require tech companies to make the ‘best interests of the child’ a primary consideration in the design and operation of their services, which, clearly, they have been failing to do.

IJM is calling for a clear framework that sets out the obligations of tech companies to provide digital evidence to law enforcement during investigations related to child sexual abuse material as part of Australia’s basic online safety expectations.

Law enforcement must have timely access to electronic data to assist online child sexual abuse investigations in real time and bring children to safety, wherever they are in the world.

For more information about IJM Australia, visit: www.ijm.org.au
