
EU Clears Path for National Social Media Bans for Minors in New Digital Guidelines


The European Commission published new guidelines on Monday under its powerful Digital Services Act (DSA), officially allowing Member States to impose national restrictions – or outright bans – on minors’ access to social media.

Although not binding, the move represents a significant policy shift that could radically reshape how digital platforms operate across the European Union, especially as several countries prepare to implement age-specific laws.

The guidelines come amid mounting public and political pressure on the EU to act decisively to protect children online. National governments in France, Denmark, Spain, Greece and the Netherlands have long criticized the Commission for dragging its feet on the issue, arguing that children are exposed to addictive features, harmful content and privacy risks without meaningful regulatory safeguards.


The Commission’s decision effectively gives Member States the green light to take the lead. Countries can now apply stricter rules – such as bans for users under a certain age or mandatory parental consent requirements – within the framework of the DSA, which already obliges major platforms to assess and mitigate systemic risks, particularly for vulnerable users such as children.

Denmark’s Digital Minister Caroline Stage Olsen, who presented the guidelines in Brussels alongside EU technology chief Henna Virkkunen, said: “Age verification is not a nice-to-have. It is absolutely essential.”

Major impact expected on platform user bases

With countries such as France and the Netherlands pushing for complete bans on social media use by children under 15, and others like Greece and Denmark promoting mandatory parental consent for minor users, platforms such as TikTok, Instagram, Snapchat and YouTube now face the prospect of losing millions of users.

Industry analysts say the guidelines could lead to a significant contraction of the under-18 user base in Europe over the next two years, depending on how national governments implement the policies.

According to some platforms’ internal estimates, minors account for as much as 25 to 30% of daily active users in certain EU countries. Losing this segment would upend everything from content algorithms to ad targeting and monetization.

Age verification app moves toward rollout

To help enforce these changes, the Commission has also published technical specifications for a new age verification app that will allow users to confirm their age using a government-issued ID or facial recognition technology. The app is expected to be tested in five EU countries – France, Greece, Spain, Italy and Denmark – which are actively developing or considering their own national restrictions.

The EU said that although the app is voluntary, it is designed to serve as common infrastructure that can be adapted by countries setting different minimum age thresholds, whether 13, 15 or 18.
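The “common infrastructure, national thresholds” idea can be pictured with a minimal sketch. The code below is purely illustrative and is not based on the Commission’s actual specification: the country codes, thresholds and function names are all hypothetical assumptions, showing only how one shared verification check could be configured per Member State.

```python
from datetime import date

# Hypothetical per-country minimum-age thresholds (illustrative only,
# NOT the Commission's actual specification or any country's real rule).
MIN_AGE = {"FR": 15, "DK": 15, "GR": 15}
DEFAULT_MIN_AGE = 13  # assumed fallback where no national rule applies

def age_on(birth_date: date, today: date) -> int:
    """Full years elapsed between birth_date and today."""
    years = today.year - birth_date.year
    # Subtract one year if this year's birthday has not yet occurred.
    if (today.month, today.day) < (birth_date.month, birth_date.day):
        years -= 1
    return years

def may_access(birth_date: date, country: str, today: date) -> bool:
    """Shared check: does the user meet this country's minimum age?"""
    threshold = MIN_AGE.get(country, DEFAULT_MIN_AGE)
    return age_on(birth_date, today) >= threshold
```

The point of the design is that the verification logic is written once, while each Member State supplies only its own threshold – exactly the kind of adaptation the guidelines describe for countries choosing 13, 15 or 18.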

The initiative comes as platforms face increased scrutiny for relying on self-declared age information, which has proven ineffective at preventing minors from accessing age-inappropriate content.

“It is difficult to imagine a world where children can enter a store to buy alcohol, or go to a nightclub, by simply declaring that they are old enough – no checks, no ID, just a simple ‘yes, I am over 18’,” said Stage Olsen, yet that is what “was the case online for many years.”

What platforms must do now

Beyond enforcing access restrictions, the Commission’s guidelines also describe best practices that platforms should follow to protect minors still allowed on their services. Key recommendations include:

  • Disabling addictive features such as “streaks”, “likes” and read receipts that pressure children into prolonged use.
  • Disabling camera and microphone access by default for minor users.
  • Making accounts private by default and restricting who can view or interact with them.
  • Eliminating behavioral tracking so platforms cannot use children’s browsing habits to personalize content or ads.
  • Deploying a risk-based approach, requiring platforms to assess their systems for potential harms to children and take tailored measures to mitigate them.

Although these guidelines are technically voluntary, DSA enforcement means that platforms that fail to demonstrate compliance efforts could face heavy fines of up to 6% of their global turnover, and even risk suspension in the EU.

Tech industry pushes back against fragmentation

In response, major tech companies have launched a lobbying campaign, arguing that the guidelines could lead to a fragmented regulatory landscape across the EU. Companies warn that if each Member State adopts its own age rules and enforcement tools, it will become increasingly difficult – and expensive – for platforms to comply.

Meta, which owns Instagram and Facebook, said in a statement that while it supports “a harmonized and transparent approach to age verification”, fragmentation at the national level could “undermine the efficiency of current digital frameworks” and “risk confusing users”.

Other companies, including TikTok and Snap, are also reportedly concerned about having to redesign user interfaces and backend systems country by country, especially since many minors lie about their age during registration.

A new era of regulated access

The Commission’s decision marks one of the most ambitious attempts to date by a major regulator to limit and reshape how young people access digital platforms. With the Digital Services Act now in force and Member States empowered to set national thresholds, the days of unrestricted social media access for minors in Europe may be coming to an end.

France is already expected to begin debating a bill that would bar all users under 15 from joining social platforms unless they have verified parental consent. Spain and Italy are expected to follow closely, while Denmark is currently revising its digital policy framework to require stricter protections for children.
