Push for stronger child online safety laws
A rural health group welcomes the under-16 social media ban but warns deeper tech reforms are needed to keep children safe online.
What’s happening?
The Australian Government has introduced legislation that will restrict social media access for under-16s.
The National Rural Health Alliance (NRHA) has welcomed the move as an early sign that the government is ready to act on digital harm.
The group also warns that the measure is only a small start and cannot stand alone.
NRHA Chief Executive Susi Tegen says families cannot be left to shoulder the burden of enforcing the ban.
“We all wish for young people to be safe online, but banning under-16s from social media risks pushing them into unregulated spaces, cutting them off from support networks and placing the burden of policing technology onto families that are already stretched.”
She adds that today’s step should be the beginning, not the end.
“Today’s legislative step is welcome, but it must not be the full stop, it must be the first comma. Real protection comes from regulating platforms, not punishing children. Build safety in, don’t legislate absence.”
Why it matters
The NRHA argues that teenagers are not the issue; the real problem sits with platforms whose design choices shape behaviour and fuel mental health risks.
“The real issue isn’t teenagers, it is the platforms designed to harvest attention, amplify harm and escape accountability,” Ms Tegen says.
The Alliance warns that rural young people, who often face geographic isolation, rely heavily on online spaces for connection, expression and support.
Local impact
Rural and remote families already juggle fewer services, higher pressure and limited offline options.
Cutting children off from major platforms without wider protections risks deepening that isolation and could make harmful content harder to track.
Young people are already moving to VPN tools and encrypted spaces that sit outside oversight.
Key points
Rural young people depend heavily on online platforms due to limited local networks and services.
Global child and youth advocates have warned for years that only wider reforms can curb harm online.
Tech platforms continue to evade accountability while children move into less visible digital spaces.
Zoom in
Ms Tegen says stronger safeguards are needed at a system level. She warns that current patterns of harmful content, commercial targeting and body-image pressure have already outpaced regulation.
“It is a problem that has already galloped ahead, unless stronger system responses follow in rapid succession.”
She also notes that public concern about exploitation and bullying is valid, but platform responsibility remains the missing piece.
Zoom out
Corporate responsibility for technology platforms is still not enforced in Australia in any broad or binding way.
Without standards that force platforms to limit harmful design features, the NRHA says laws will only shape where children go, not how safe they are.
The group argues that the reform risks being seen as political theatre if bigger changes do not follow.
What to look for next?
The Alliance says further reforms must be rapid and wide.
It calls for platform duty-of-care laws, mandatory safety-by-design settings, updated digital literacy programs that include parents, youth-friendly complaint pathways and a specialist model for rural and remote children.
“We acknowledge that this legislation shows political intent, but the horse has bolted,” Ms Tegen says.