Several European countries are calling for stricter rules to keep children off social media, amid growing concern over its impact on mental health, safety and development. The push is led by Greece, with support from France and Spain, and comes as the EU faces mounting pressure to strengthen protections for minors online.
European ministers were expected to discuss the proposal in Luxembourg on Friday. Among the key ideas is the introduction of a common “age of digital adulthood” across the 27-member bloc. This would require platforms to obtain parental consent before granting children access to social media.
France, Greece and Denmark support a ban for those under 15, while Spain wants the limit set at 16. The push follows Australia’s decision to restrict social media access for under-16s, a measure due to take effect later this year. New Zealand and Norway are also considering similar steps.
Backlash against Big Tech
The European Union already has some of the world’s toughest regulations for digital platforms. It is currently investigating several major companies, including Meta and TikTok, under the Digital Services Act (DSA) over alleged failures to shield minors from harmful content.
The European Commission recently published draft guidelines to help platforms protect children. These include setting underage accounts to private by default and providing easier ways to block or mute users. The guidelines are non-binding for now.
France has taken a lead on the issue. A 2023 law requiring parental consent for users under 15 is still awaiting EU approval before it can take effect. Separately, France this year ordered adult websites to implement strict age verification, prompting several platforms to shut down in protest.
TikTok has also come under pressure. The platform recently removed the “#SkinnyTok” hashtag — associated with promoting extreme thinness — following a push from French authorities.
“We have an opportunity that shouldn’t be missed,” said France’s Digital Minister Clara Chappaz. “Age verification is possible, and we want the same standards for social media as we now have for adult content.”
Algorithmic concerns
The proposal from France, Greece and Spain raises concerns about the design of social media algorithms that expose children to addictive and harmful content. Officials warn that such exposure contributes to anxiety, depression and low self-esteem among young users. Excessive screen time, they argue, also affects children’s ability to develop critical thinking and social skills.
One of the proposal’s key recommendations is to embed age verification directly into digital devices like smartphones. The European Commission plans to pilot an age-verification app as soon as next month, which it says can function without compromising users’ personal data.
The three countries also want an EU-wide system that enables parental controls and restricts the use of certain apps by minors. Danish Digital Minister Caroline Stage Olsen said children deserve the same level of protection online as they do offline.
“Social media platforms are designed for engagement, not for safeguarding minors,” she said. “We have to act now to ensure they’re safer than they are today.”
Growing scrutiny
Alongside the DSA probes into Meta and TikTok, the European Commission has also launched an investigation into four pornographic websites over alleged failures to block underage users.
While the bloc’s efforts are gaining momentum, officials admit that agreement on exact age limits remains a sticking point. However, there is broad consensus on the need to enforce proper age checks and limit children’s exposure to harmful content.