Keeping Kids Safe
- Established the Australian Centre to Counter Child Exploitation
- Established the eSafety Commissioner
- Established the National Office for Child Safety
- Undertook Inquiry into Age Verification for Online Wagering and Pornography
- Designed and Implemented the inaugural National Strategy to Prevent & Respond to Child Sexual Abuse
- Cancelled Paedophiles’ Passports and Blocked Their Travel
- Equipped the Australian Federal Police in Operation Arkstone and Others
- Invested $3.4m into the Daniel Morcombe Foundation for Online Exploitation Prevention Programs
- Doubling the Size of the Australian Centre to Counter Child Exploitation
- Industrialising Law Enforcement Strike Capabilities to Counter Child Sex Offenders and Predators
- Implementing a $6.7m Trial of Age Verification and Assurance Technology for Online Porn, Wagering and Alcohol Sales
- Ban Sports Bet Advertising Before, During and After Sporting Broadcasts
- Establish a Royal Commission into Child Sexual Abuse in Indigenous Communities to seek justice and protect kids
- Outlaw Criminal Post-Boasting to stem the criminal trajectory of young thugs
Find out more at www.andrewwallacemp.com.au/plan
More Than a Tweet or Tweaking
Originally published in The Courier Mail, 28 March 2024
In a matter of weeks, my wife and I are set to become first-time grandparents. We could not be more excited.
But as I look at how our world is changing, particularly online, I’m more worried than ever about how my kids will be able to protect their kids.
The US Surgeon-General issued a public advisory, warning his country that social media is driving a youth mental health crisis.
And the number of children exploited and bullied online by both predators and peers is only getting worse.
That’s why I’ve fought so hard for age verification for online pornography and social media accountability.
We can’t expect kids to keep themselves safe online. Industry clearly has no interest in doing so either.
This is exacerbated by the Federal Government’s refusal to support age assurance legislation, despite the overwhelming support of parents, experts, the eSafety Commissioner, and even their own backbenchers.
Beyond big porn, we’re watching social media run roughshod over business, media, and democracy itself.
So how can we expect parents to take on these big tech platforms alone?
Government has a duty to equip parents and police with the mechanisms they need to protect our kids and hold social media companies to account.
The Coalition’s proposal to create a new Commonwealth offence to criminalise posting material that depicts violence, drug offences or property offences is welcome news.
I agree – it’s time to outlaw the act of promoting crime online. But I think we’re missing the bigger picture.
We need to ask ourselves why this kind of material gets attention in the first place.
Two answers: anonymity and algorithms.
We used to call social media a virtual town square, giving ordinary people the extraordinary opportunity to engage in a public-facing, worldwide setting.
But social media is becoming less a force for good than a facility for harm.
I’m not just talking about trolls and keyboard warriors who tweet mean things or post offensive memes.
Social media has become a labyrinth of sometimes untraceable channels, swamped with automated, anonymous figures who show little regard for, or accountability to, others.
Anonymous perpetrators of violence use social media to bully, ‘sextort’ and harass their victims.
Anonymous predators use social media to groom, exploit, and abuse children.
Anonymous parties from state-sponsored and organised crime gangs assemble armies of operatives and automated bots delivering targeted and harmful content, in an effort to radicalise the vulnerable, terrorise dissidents, and disrupt democracy.
Their greatest tool, beyond the unaccountability that anonymity affords, is the simple algorithm.
Algorithms are the complex rules which guide what we see online. They’re designed to keep us hooked.
The emphasis on engagement puts the pressure on content creators to publish increasingly extreme material for likes and shares.
Algorithms amplify our biases, desensitise us to borderline content, and remove the moderating influence and accountability afforded by peers, parents, and social norms.
As a result, our newsfeeds become saturated with harmful material intent on consuming every square inch of our attention.
And it’s not just interference by foreign actors, or non-state actors building extreme political silos which worries me.
It’s the insidious way algorithms entrap our vulnerable kids in cycles of harmful content.
Two in five Australian kids see porn on their newsfeed without seeking it.
“Thinspiration”, “hourglass abs”, and dangerous fad diet reels quickly roll into tips for purging and self-harm, driving an epidemic of eating disorders.
Foreign disinformation campaigns fade into extremist content for radicalisation and recruitment.
In the face of such evil, community notes, flags, and viewer discretion notices aren’t enough.
We need to address social media algorithms now.
Some US states are already looking at forcing tech companies to publicise their algorithms and to restrict the algorithms from promoting harmful content or products.
In my inquiry into family, domestic and sexual violence in 2021, I recommended that the Federal Government implement a mandatory ID verification regime for social media platforms, to address the problems stemming from anonymity.
In the 3 years since, I’m more convinced than ever that the time is now for social media ID verification.
We face a grave threat online for which parents and police are entirely unequipped.
With nearly 3 in 4 Australians already on social media, this issue needs more than a tweet and it needs more than tweaking.
Mandate age assurance for porn. Criminalise content which encourages crime. Regulate harmful social media algorithms. Verify ID for social media.
There’s a generation of our kids – our future adults – who depend on it.