Will Australia’s under-16 social media ban influence other countries? What is India thinking? – Firstpost

Numerous countries are watching as Australia enforces the world’s first across-the-board prohibition on social media usage for children under 16.

The rule, which officially came into force on December 9, compels globally dominant platforms — including TikTok, YouTube, Instagram and Facebook — to prevent young users from accessing their services.

Because of its severity and wide-reaching consequences, governments worldwide are now examining whether similar frameworks could or should be tried elsewhere — and what the implications might be.

Several countries have explored age-based systems, while others rely on parental approvals, national firewalls or broad censorship. Still others are considering new pathways, such as stricter app-store obligations or device-level controls.

The motivation for such a firm policy in Australia stems from years of discussion about how digital ecosystems influence younger users.

Lawmakers cited the impact of algorithms, compulsive engagement, and the risks of harassment or exploitation as key reasons for elevating protection levels. They also noted that simply depending on companies’ voluntary age-verification systems or parental settings has proved inadequate.

Now, several other governments are closely observing how the rollout unfolds, as many face similar social pressures at home.

Early reactions suggest that Australia’s model may influence discussions in Europe, Asia and the United States, even if few countries are likely to replicate the framework exactly.

How countries are exploring age thresholds and parental controls

United Kingdom: The UK’s Online Safety Act compels platforms to prevent minors from encountering harmful material and mandates new safety standards. The law, however, does not set a specific minimum age for accessing social networks.

France: Under a national law passed in 2023, minors below 15 require parental permission to create digital accounts.

But practical implementation has proved difficult because platforms must verify the age of the child as well as the identity of the guardian — an area still fraught with technological issues.

Germany: Teenagers aged 13 to 16 are allowed to use social networks only if their guardians agree. Even with that system in place, child-protection advocates in Germany argue that existing measures still fall short, with younger children able to bypass safeguards.

Italy: Italian rules establish that children under 14 need parental approval before opening social media accounts, although users above that age can join freely.

Norway: Authorities have proposed raising the digital age of consent to 15 from 13, with parents permitted to sign on behalf of younger teens.

Discussions are also underway to introduce a statutory minimum age of 15 for social media access, which would make Norway one of the strictest countries in Europe.

At the broader EU level, with Denmark among the strongest proponents of tighter rules, legislators recently adopted a resolution urging member states to implement a minimum access age of 16 for social networks, arguing that it would ensure more age-appropriate digital participation.

The same resolution also suggested a harmonised minimum age of 13 for general digital services and video-sharing platforms. Although the guidance is non-binding, it highlights growing pressure across Europe for more rigorous age-based digital frameworks.

Some Asian nations already have long-standing systems that deeply regulate digital activity.

China: China has developed a heavily restricted internet environment over the past two decades through its “Great Firewall,” which blocks major international social networks altogether.

Separate from the firewall, Chinese regulators have established a “minor mode,” which includes device-level settings and app rules limiting screen time for young users across various age brackets.

Malaysia: Officials announced in November that Malaysia would introduce a blanket ban for users under 16 beginning next year, making it one of the few countries planning a rule similar in spirit to Australia’s.

Afghanistan: Since the Taliban regained control, internet access and social networks have faced substantial monitoring and uneven availability. In some parts of the country, authorities have suspended broadband and WiFi services, citing moral or religious reasons.

Iran: Iran uses a highly restrictive model, preventing access to platforms such as Facebook, YouTube, X, Telegram and WhatsApp. Instagram, one of the last widely reachable services, was also blocked in 2022 after nationwide protests.

Although millions of Iranians depend on VPNs to reach global platforms, they risk penalties or service disruptions while doing so.

North Korea: North Korea operates one of the world’s most isolated digital ecosystems. Citizens cannot access the global internet or foreign social networks, and rely on a state-controlled intranet called “Kwangmyong.” Any attempt to use external connections can result in harsh punishment.

Myanmar: Since the 2021 coup, Myanmar’s military government has imposed recurring blocks on social media and messaging services — Facebook, WhatsApp and Instagram among them — particularly during periods of public dissent.

Authorities have also targeted VPNs to prevent people from bypassing blocks. The official justification is the prevention of misinformation; however, critics argue these bans are largely political.

United States: In the United States, digital child safety is primarily regulated at the federal level through the Children’s Online Privacy Protection Act (COPPA).

COPPA prohibits any online service from gathering personal data from children under 13 without verifiable parental approval. However, COPPA does not regulate access to platforms, prompting states to take their own actions.

Several American states have passed legislation requiring stronger protections:

  • Nebraska: A new law requires age checks for all users and obliges platforms to obtain parental permission before minors create accounts. Enforcement begins in July 2026.

  • Utah, Texas and Louisiana: These states have approved rules requiring app stores to verify users’ ages before they download or update certain apps. Tech companies such as Meta support assigning verification responsibilities to app stores, saying it centralises the burden. However, Apple and Google oppose such rules, arguing they would compel them to collect more information from adult users.

This year, the US Supreme Court upheld a Texas requirement that pornographic websites verify users’ ages. Although the case involved adult content rather than social networks, the ruling indicated that the Court may not oppose all age-based measures.

Major social media companies typically set 13 as the minimum age for joining their services. These policies are not legally mandated in most jurisdictions but are part of self-regulation efforts designed to align with COPPA in the US and global expectations around child safety.

In practice, these systems rely heavily on users being honest about their age, which leaves significant gaps.

Surveys conducted across multiple European countries show widespread non-compliance, with substantial numbers of children under 13 using platforms regardless of official rules.

Child-safety advocates argue that without robust age-verification technologies — something still hotly contested for privacy reasons — these voluntary policies remain insufficient.

What this means for India

Australia’s decision has raised an important question for India: could a similar prohibition work here? Experts in the country remain divided, with many emphasising systemic and behavioural issues rather than legal ones.

Efforts within India have already leaned toward community engagement and counselling-driven strategies.

Instagram, TikTok, Snapchat, Kick, YouTube, Facebook, Twitch, Reddit, Threads and X applications are displayed on a mobile phone, in this picture illustration taken on December 9, 2025. File Image/Reuters

Over recent months, thousands of parents have taken part in digital-awareness programmes designed to improve communication between adults and young people.

Speaking to The Times of India (TOI), Dr Manoj Sharma, who leads the Service for Healthy Use of Technology (SHUT) Clinic at the National Institute of Mental Health and Neurosciences, highlighted the growing concern.

“This parent support group that we run has taken place online in the past four months. We covered 5,000 parents in three months, and then 40 new parents from across the country joined in such discussions the next fortnight.”

He noted that many parents feel overwhelmed by technological habits in their households.

“Parents want to know the dos and don’ts for families and how to manage escalation. They too are feeling stressed out with professional and family pressures and want to keep their mental health intact.”

Another major concern for Indian policymakers is the real-world behavioural impact on very young users.

Dr Monica Sudhir, a counselling psychologist who works directly with adolescents, described the severity of cases she encounters.

“I have seen really young children become extremely hyperactive and addicted in my clinic — kids who are not able to leave their phones or have huge, violent anger tantrums if the devices are taken away,” she told TOI, stressing that any purely legislative solution will address only part of the problem.

She added that classroom practices also need to change.

“We cannot stick to traditional methods of teaching in an era when attention spans and retention are declining. Students struggle to form coherent sentences with excessive use of technology, and it is a larger problem. We need to equip children with the ability to process information — give them assignments in the classroom instead of allowing them to do it at home where they will use AI end-to-end.”

These expert views suggest that while India faces the same challenges as Australia — screen addiction, exposure to harmful content, academic difficulty and declining attention spans — the preferred response may be broader educational reform rather than a sweeping block on social platforms.

India’s approach, at least for now, appears to lean toward education, awareness, and strengthening digital literacy, rather than adopting a strict prohibition.

With inputs from agencies
