
August 2025
by Adela Nuță and Ana Crăciun
The use of online platforms among the younger population continues to grow, creating a digital environment in which minors are increasingly present. While minors’ engagement with the digital space brings undeniable benefits, such as easy access to educational resources, the facilitation of social interaction among peers with shared interests, and the development of cognitive and relational skills, it also generates significant risks that call for a coherent and integrated regulatory response.
Exposure to harmful content, the risk of cyberbullying, commercial pressures exerted through persuasive or addictive design, and the disruptive effects on the cognitive and emotional development of children are systemic challenges that require proactive, preventive, and contextually anchored measures. Moreover, the accelerated integration of artificial intelligence into digital platforms, as well as the proliferation of content altered through deepfake technologies, are likely to influence the way young people perceive and navigate the digital space.
In response to these challenges, on 14 July 2025, the European Commission published a practical guidance document dedicated to the protection of minors in the digital environment. This document was developed as a supporting tool for the implementation of Article 28(1) of Regulation (EU) 2022/2065 on Digital Services (the Digital Services Act – DSA). Although not legally binding, the guidance contains structured recommendations and concrete examples of good practices, aiming to assist online platform providers in identifying and implementing adequate and proportionate measures to ensure a high level of protection for underage users.
The guidance is accompanied by a technological prototype for age verification (the “age-verification blueprint”), developed on an open-source basis, which may be voluntarily adopted by Member States and economic operators. This prototype is built in accordance with the technical specifications of the forthcoming European Digital Identity Wallet (EUDI Wallet), which is expected to be implemented across the Union by the end of 2026. The application is designed to facilitate age-appropriate access to online content and services without collecting or disclosing users’ sensitive personal data, such as their exact date of birth or full identity. The underlying technology relies on “zero-knowledge proof” methods, which allow verification of a specific condition (e.g. being over 18) without transmitting additional personal information.
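For readers curious how a condition such as "over 18" can be verified without the underlying data ever being transmitted, the sketch below illustrates the general zero-knowledge idea with a classic Schnorr-style proof of knowledge, written in Python. It is a toy illustration with deliberately insecure parameters, not the blueprint's or the EUDI Wallet's actual protocol; all names, values, and parameters are assumptions made for this example.

```python
# Minimal non-interactive Schnorr proof (Fiat-Shamir heuristic).
# Toy group parameters -- hopelessly insecure, chosen only to show the
# mechanics; real systems use vetted elliptic-curve groups.
import hashlib
import secrets

P = 23   # toy safe prime (p = 2q + 1)
Q = 11   # prime order of the quadratic-residue subgroup
G = 4    # generator of that subgroup

def _challenge(public: int, commitment: int) -> int:
    # Fiat-Shamir: derive the challenge by hashing the transcript.
    digest = hashlib.sha256(f"{G}|{public}|{commitment}".encode()).digest()
    return int.from_bytes(digest, "big") % Q

def prove(secret: int) -> tuple[int, int, int]:
    """Prove knowledge of `secret` without revealing it."""
    public = pow(G, secret, P)           # y = g^x mod p
    k = secrets.randbelow(Q - 1) + 1     # fresh random nonce in [1, q)
    commitment = pow(G, k, P)            # t = g^k mod p
    c = _challenge(public, commitment)
    response = (k + c * secret) % Q      # s = k + c*x mod q
    return public, commitment, response

def verify(public: int, commitment: int, response: int) -> bool:
    """The verifier checks g^s == t * y^c without ever learning x."""
    c = _challenge(public, commitment)
    return pow(G, response, P) == (commitment * pow(public, c, P)) % P

public, commitment, response = prove(secret=7)
assert verify(public, commitment, response)
```

Production age-verification systems build attribute and range proofs (e.g. "age over N") on primitives of this kind; the threshold N would simply parameterise the attested predicate, consistent with the configurability described in the next paragraph.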
This initiative aims to facilitate lawful and secure access to digital content or services that are age-restricted, including those related to online pornography, gambling, alcohol, tobacco, nicotine products, or other regulated goods. The application is configurable and may be adapted to verify different age thresholds, depending on the national legislation of each Member State.
Although several European countries have called for the establishment of a common minimum age for access to social media across the EU, the European Commissioner responsible for the digital portfolio has underlined that reaching a consensus on this matter would be difficult, given the cultural differences among Member States. Instead of imposing a uniform age limit, it has been argued that it would be more effective for operators to identify and mitigate the risks associated with the design architecture of their platforms by implementing solutions tailored to each specific context.
The application will initially be piloted in France, Denmark, Greece, Italy, and Spain, in cooperation with national authorities, online platforms, technology providers, and organisations involved in the protection of minors. Feedback gathered from these stakeholders will be essential for optimising the solution and ensuring its seamless integration into existing digital ecosystems.
In parallel with the development of this technological solution, the Commission has published detailed guidelines on the protection of minors in the online environment, containing concrete recommendations for public authorities, digital platforms, and other relevant stakeholders. These guidelines specifically address four major categories of risks identified as priorities:
- Addictive design: Minors are particularly vulnerable to design practices that may stimulate compulsive behaviour. The guidelines recommend reducing minors’ exposure to such features and suggest the default deactivation of persuasive design elements that encourage excessive use of online services, such as the display of like and reaction counts, push notifications, or read receipts.
- Cyberbullying: It is recommended that minors be empowered to block users and to reject being added to group chats or communities without their explicit consent. Furthermore, the guidelines advise prohibiting other users from downloading or capturing screenshots of content posted by minors, in order to prevent the unwanted dissemination of intimate material or the risk of sexual extortion.
- Harmful content: The guidelines promote enhanced user control over the type of content to which young users are exposed. Platforms are encouraged to prioritise explicit user feedback over behavioural profiling. Consequently, if a young user indicates that they do not wish to be shown certain types of content, such content should no longer be recommended to them.
- Unwanted contact from strangers: The guidelines recommend that platforms set minor users’ accounts to private by default, so that such accounts are not visible to users outside their contact list unless prior consent has been granted. This measure aims to minimise the risk of unsolicited contact from strangers; a hypothetical sketch of such safe defaults follows this list.
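Taken together, these recommendations describe a "safe by default" configuration for minors' accounts. The sketch below is a purely hypothetical Python illustration of how a platform might encode such defaults; the class and field names are invented for this example and are not drawn from the guidelines or from any real platform's codebase.

```python
# Hypothetical "safe by default" settings for a minor's account, mirroring
# the four risk categories above. All names are illustrative only.
from dataclasses import dataclass

@dataclass(frozen=True)
class MinorAccountDefaults:
    account_private: bool = True                  # not visible outside the contact list
    push_notifications: bool = False              # persuasive-design features off by default
    show_like_and_reaction_counts: bool = False
    read_receipts: bool = False
    group_add_requires_consent: bool = True       # no silent addition to chats/communities
    allow_download_or_screenshot: bool = False    # guards against unwanted dissemination
    recommendations_use_profiling: bool = False   # prefer explicit user feedback signals

# Protective settings hold unless a deliberate, informed choice relaxes them.
defaults = MinorAccountDefaults()
assert defaults.account_private and not defaults.push_notifications
```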
While not establishing legally binding obligations, the guidelines published by the Commission will nevertheless serve as a reference point in the monitoring of platforms’ compliance with the DSA, including in the context of enforcement proceedings already initiated against major players such as Meta and TikTok. As such, the document reinforces the European Union’s institutional commitment to upholding the best interests of the child, as enshrined in Article 24 of the Charter of Fundamental Rights of the European Union.
Through this combination of regulatory measures, practical guidance and technological innovation, the European Commission puts forward a concrete and balanced response to the challenges associated with the protection of minors in the digital environment. The initiative forms part of a broader effort to establish a safe and accountable online ecosystem, while laying the groundwork for potential common standards at EU level regarding age verification and platform responsibility.
The article is available in Romanian HERE.
