Helping Aussie kids stay afloat online
eSafety Commissioner
Empowering all Australians to have safer, more positive experiences online.
Today, I had the opportunity to address the Joint Select Committee on Social Media and Australian Society, as part of its inquiry into the influence and impacts of social media across many areas of our community.
Lessons learned in the water
Cast your mind back to one of your favourite beach experiences. There’s nothing but blue sky over the ocean, the sand is warm under your feet, and you can hear the joyful sound of children excitedly frolicking in the water. And thankfully, they instinctively know they need to swim between the flags.
That is because basic water safety awareness has been built up over generations, with the perils of our expansive coastlines well known and understood, particularly through the tragic experiences of others.
We do not fence the ocean or keep children entirely out of the water, but we do create protected swimming environments that provide safeguards and teach important lessons from a young age.
This is why Australia is a world leader when it comes to water safety.
It’s built on many years of advocacy, collaboration, education and legislation.
It is a coordinated effort on many fronts.
This is how we keep our children safe in the water while building up their skills, resilience and confidence.
Putting the right guardrails in place
We need to take a similar approach to ensuring online safety for our children. And that is precisely what eSafety has sought to do through education and regulation.
Just like surfing in the ocean or swimming in a pool, we know social media has many benefits for young people. It can be a source of inclusion, social connection, belonging, stress mitigation – and fun.
These benefits are especially important for young people who experience difficulties with participation and social inclusion in other contexts.
But social media also brings risks that parents, carers and our young people are rightly concerned about: darker waters that may hide predators or algorithmic rips.
Our research shows that almost two-thirds of 14- to 17-year-olds have viewed extremely harmful content in the past year, including drug abuse, suicide or self-harm, as well as gory or violent material.
And more than a quarter were exposed to content promoting unhealthy eating habits.
Ensuring children’s digital environment is age-appropriate
eSafety has long advocated for age verification as part of a suite of measures to protect children from online content and conduct they are not ready for.
So, we’re pleased to see the age assurance trial being conducted by the Department of Infrastructure, and we’re poised to start the Phase 2 industry codes process to protect children from harmful content like pornography across the ecosystem.
Until these comprehensive age assurance systems are in place, social media age bans will be challenging to implement, measure and enforce.
We also need to be mindful of the many nuances this issue holds, and wary of unintended consequences.
For example, our research has shown maturity and resilience play an important role in determining children’s experiences on social media, both good and bad.
While more research is needed, expert bodies like the US National Academy of Sciences highlight the importance not only of training and education, but of holding industry to account for enforcing platform design features that create safer online spaces. This is why we believe Australia’s safety by design initiative has been so fundamental, and is taking off globally.
We must get the fundamental building blocks of these social media services right by demanding that tech companies enforce their own policies, use available technologies to tackle harms, and ensure that they are complying with Australian laws.
New standards for online safety
Our mandatory industry codes and standards are built on these ‘safety by design’ principles, setting out enforceable measures companies must take to protect children.
Today, mandatory standards have been registered that will require sectors of the AI industry to embed safeguards into open-source AI software such as "nudify apps" – powerful and freely available apps that can be used to create deepfakes, including of children – the first such prescriptive measures anywhere in the world.
We are entering a complex new era of technical regulation, but we need to continue focusing on keeping our children safer on the platforms and services they are using right now.
This is why a nationally coordinated and holistic approach is vital, ensuring our children swim between the flags and our lifeguards patrol the sands, letting our kids reap the benefits of connection while minimising the risks. Today, and into the future.
This is an edited version of eSafety Commissioner Julie Inman Grant’s opening statement to the Joint Select Committee on Social Media and Australian Society.