
Ban the App, Keep the Problem?
If you squint, the ban looks sensible. The harms are real; the stories are heartbreaking; the platforms have been slow to grow up. A hard age line feels like action.

But as public policy, it’s a band-aid on a fracture. We’re treating the symptom (screen time) while the underlying disease (unsafe design, predatory algorithms, surveillance-for-profit, patchy supports for kids and families) hums along, untouched.

Here’s the tension I want us to hold without flinching:

  • Social media can be harmful, especially for developing brains, sleep, body image, and attention.
  • Social media is also where young people live: friendships, identity work, communities of support, news, creativity, help-seeking; 73% of young people report using social media for mental-health support at some point. Cutting that off with a single stroke risks pushing distress underground and widening inequity (the resourced will route around it; others won’t).

Australia’s own child-rights and digital-childhood experts have already warned: an across-the-board ban is a blunt instrument. It shifts responsibility onto parents without fixing unsafe products, is technically hard to implement without fresh privacy risks, and can even disincentivise platforms from building protections for the under-16s who will inevitably slip through any age gate. It also clashes with the UN’s framing that children should have safe access to participate in the digital world, not be excluded from it altogether. 

So yes: protect kids. But let’s protect them well.

What a grown-up package would look like beyond the band-aid


1) Make the products safe by design, not just the users “older.” Move from individual policing to systemic duties of care: risk assessments, default-off virality for minors, friction on forwarding, no late-night push alerts, robust reporting and human moderation, and an enforceable kids’ privacy code with teeth. Experts are explicit: product standards beat parental consent theatre.

2) Regulate the feeds, not only the ages. Algorithmic accountability for under-18s: visibility into recommendation systems; the option to switch to chronological feeds; hard bans on amplifying self-harm, eating-disorder and hate content; independent audit and penalty pathways when platforms fail to down-rank known harms. Again: a ban doesn’t improve the platforms kids will still use.

3) Protect privacy while you verify. If government insists on age assurance, require privacy-preserving methods (no central ID honeypots; strict data minimisation; third-party escrow; deletion by default), and publish security audits. A recently published open letter by a network of Australian experts flags that current techniques are immature and risky; treat that as a red light, not a footnote.

4) Don’t outsource the hard bits to parents alone. Give families real help: plain-English guidance, funded digital-wellbeing coaching through schools and community health, and quick-connect pathways to youth mental-health supports. The experts are clear that responsibility-shifting without scaffolding is unfair and ineffective. 

5) Meet young people where they already are. If kids will be online anyway, build safer rooms in the house: moderated spaces for peer support, youth-controlled privacy, and partnerships with credible services (think: ReachOut-style content embedded in-platform), rather than pretending absence equals safety. The open letter again: focus on support and empowerment, not just restriction. 

6) Close the ad loopholes that target distress. Turn down the firehose: strong limits on profiling minors; ban targeted ads around gambling and extreme body-modification content; publish ad libraries for all youth-adjacent categories. A ban without ad reform just rearranges the deckchairs.

7) Make schools part of the circuit breaker. A national digital-literacy curriculum that teaches attention hygiene, reputation care, bystander skills and algorithm awareness, delivered like road safety: early, often, practical.

8) Fund enforcement, not just press conferences. Give the regulator enough engineers, investigators and lawyers to be scary. Publish enforcement dashboards and name-and-shame timelines so the public can see when platforms comply, or don’t.

Empathy first, optimism on purpose


A ban can feel like certainty in an uncertain space. But we don’t need certainty; we need competence.
Empathy says: “Kids aren’t the problem; designs are.” It also says: “Parents are exhausted; don’t hand them an impossible job and call it policy.” And it adds: “First Nations communities, culturally diverse families and rural kids each navigate online life differently; one big rule will land unevenly.”

Optimism, not the fluffy kind but the administrative kind, says we can raise the floor. Australia has done hard, technical, nation-scale reforms before. We can do them again here: regulate the product, preserve children’s rights to safe participation, and back families and schools with real scaffolding.

A ban may scratch the itch. A system will heal the wound. Let’s choose the latter.
