BIG TECH HELD TO ACCOUNT AS VERDICTS EXPOSE FAILURES ON CHILD SAFETY

MEDIA STATEMENT 26 March 2026

Over the past 24 hours, two separate landmark court cases in the United States have delivered a clear and powerful message to Big Tech: enough is enough.

In New Mexico, a jury found Meta liable for failing to protect children from serious harm, including exposure to explicit content, exploitation and predatory behaviour.

At the same time, a separate jury in California found both Meta and YouTube liable in a social media addiction case, determining that their platforms were negligently designed in ways that harmed a young user’s mental health.

Let that sink in. Two different courts. Two different cases. Two different forms of harm. The same conclusion.

This is the chickens coming home to roost for Big Tech.

As someone who played a leading role in the Parliamentary inquiry that handed down the Protecting the Age of Innocence report in 2020, none of this comes as a surprise.

That inquiry heard confronting evidence about how easily children could access harmful material online, how platform design could amplify that harm, and how existing safeguards were simply not fit for purpose.

What we are now seeing internationally reflects exactly what we uncovered here in Australia.

On one hand, children are being exposed to deeply harmful and exploitative content. On the other, they are being drawn into addictive platforms that can have serious consequences for their mental health, including anxiety, depression and body image issues.

These cases make it clear that this is not a single issue. It is a systemic failure.

This is why the Coalition has long argued that protecting children online must come first, and why we have dragged this Labor Government kicking and screaming into taking any meaningful action.

This is not about politics. It is about responsibility and good policy: protecting our most vulnerable.

If a company builds and profits from a platform, it must take responsibility for the environment it creates.

The era of claiming to be a passive platform is over. Accountability has arrived.

Australia has taken important steps to strengthen online safety protections and raise the minimum age for social media access, but there is more to do.

We must ensure that age verification is strong and effective, not a box-ticking exercise.

We must ensure that harmful content and bad actors are identified and removed quickly.

And we must ensure that platform design itself does not deliberately exploit or harm young users.

These verdicts are a turning point, confirming what families, educators and child safety advocates have been saying for years.

I will continue to advocate for practical reforms that put the safety of Australian children ahead of the profits of multinational tech companies.

Because protecting our kids must always come first.

[ENDS]

Media Contact: Brendan West 0402 556 646  Brendan.west@aph.gov.au
