European Commission Publishes Draft Guidelines on Minor Safety

The European Commission has now published its much-anticipated Draft Guidelines on the protection of minors under Article 28 of the Digital Services Act (DSA). Service providers will no doubt welcome the level of clarity the Draft Guidelines bring to what is generally considered one of the most important but vague obligations in the DSA. With the guidelines open for public consultation until 10 June 2025, our Technology team highlights some key takeaways for providers of online platforms.
On 13 May 2025, the European Commission published the eagerly-awaited Draft Guidelines on the protection of minors under Article 28 of the DSA.
To recap, Article 28(1) of the DSA requires that providers of online platforms ‘accessible to minors’, i.e. under-18s, put in place:
“appropriate and proportionate measures to ensure a high level of privacy, safety, and security of minors, on their service.”
The Draft Guidelines confirm that an online platform can be accessible to minors even where the relevant terms of service stipulate a minimum age of 18, if the online platform does not have effective measures in place to enforce that restriction. An online platform might also be accessible to minors where the provider is “otherwise aware” that minors are on the platform. This might occur, for example, because the service is known to appeal to minors or personal data processing reveals that some of the users are minors.
The Draft Guidelines also describe the measures the Commission considers that online platforms should take to ensure a high level of privacy, safety and security for minors online. These will be applied by the Commission as a “significant and meaningful benchmark” when assessing platforms’ compliance with Article 28(1) of the DSA. We explore some of the key measures.
Risk review
First, the Draft Guidelines emphasise that a risk review should be carried out to determine which measures are appropriate and proportionate to ensure a high level of privacy, safety and security for minors on the relevant platform. The risk review should, at a minimum, identify:
- The likelihood that minors will access the service
- The risks to the privacy, safety and security of minors potentially posed by, or arising from, the platform
- The risk prevention and mitigation measures the provider already has in place
- Any appropriate and proportionate additional measures needed to ensure a high level of privacy, safety and security for minors on the service, and
- The potential impact (positive or negative) of those measures on children’s rights
In all instances, the platform must give due consideration to the best interests of the child and ensure that its protective measures do not disproportionately or unduly restrict the rights of the child, such as the rights to participation, freedom of expression and access to information.
While the Draft Guidelines do not set timelines, a risk review should presumably be carried out as soon as possible following their finalisation. In addition, a subsequent risk review should be completed whenever an online platform makes “significant changes” to the platform. Online platforms that are also designated as very large online platforms should carry out the risk review as part of their “general assessment of systemic risks” under Article 34 DSA.
Service design – account settings
The Draft Guidelines provide that accounts for minors should be set to the highest level of privacy, safety and security by default. This means that default settings need to be designed in such a way that interaction is limited to approved contacts, and features like geolocation, tracking, autoplay, livestreams, and push notifications are turned off.
Features which promote excessive use, such as communication streaks, and those which can harm self-esteem, such as body filters, should also be turned off by default.
In addition, child-friendly controls, which adapt based on the child’s age, should be put in place. Age-appropriate warnings should also be used when a child attempts to change their default settings, and should clearly explain the impact of any change they make.
Interestingly, these requirements mirror the approach taken by the Irish Data Protection Commission in its Fundamentals for a Child-Oriented Approach to Data Processing.
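Purely by way of illustration, the default settings described above could be represented along the following lines. This is a minimal sketch under our own assumptions, not something drawn from the Draft Guidelines, and every name and structure in it is hypothetical.

```typescript
// Minimal illustrative sketch: hypothetical default configuration for an account
// identified as belonging to a minor, reflecting the defaults described above.
// All names and structures are assumptions, not terms used in the Draft Guidelines.

interface MinorAccountDefaults {
  interactionAudience: "approvedContactsOnly" | "everyone";
  geolocation: boolean;
  tracking: boolean;
  autoplay: boolean;
  livestreams: boolean;
  pushNotifications: boolean;
  communicationStreaks: boolean; // feature that can promote excessive use
  bodyFilters: boolean;          // feature that can harm self-esteem
}

// Highest level of privacy, safety and security by default.
const MINOR_DEFAULTS: MinorAccountDefaults = {
  interactionAudience: "approvedContactsOnly",
  geolocation: false,
  tracking: false,
  autoplay: false,
  livestreams: false,
  pushNotifications: false,
  communicationStreaks: false,
  bodyFilters: false,
};

// Any attempt by the minor to change one of these defaults would first trigger an
// age-appropriate warning explaining the impact of the change before it is applied.
```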
Recommender systems
The Draft Guidelines build on and expand the application of the DSA’s obligations regarding recommender systems.
- Technical operation: The Draft Guidelines make clear that the design of recommender systems should take into account the specific needs of minors. In determining the content shown to a minor, platforms should prioritise ‘explicit, user-provided signals’, such as survey responses, content-preference controls and user reports, over ‘implicit, engagement-based signals’, such as preferences inferred from the minor’s activity on the platform (see the illustrative sketch after this list).
- Non-profiling alternative: The Draft Guidelines effectively extend the application of Article 38 of the DSA to all online platforms. Article 38 currently applies only to very large online platforms and very large online search engines, and requires them to offer, for each recommender system, at least one option that is not based on profiling.
- User controls: Online platforms are expected to introduce controls to ‘reset’ recommended content feeds “completely and permanently”, along with controls to influence the parameters of the system by allowing a “most interested” and “least interested” functionality.
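By way of illustration only, the prioritisation of explicit over implicit signals, and the ‘reset’ control, might look something like the following sketch. The weights, names and data structures are assumptions made for the purpose of the example and are not drawn from the Draft Guidelines.

```typescript
// Illustrative only: a hypothetical ranking step that weights explicit, user-provided
// signals (survey responses, preference controls, user reports) above implicit,
// engagement-based signals, as the Draft Guidelines recommend for minors.
// All names, weights and structures are assumptions, not part of the Guidelines.

interface CandidateItem {
  id: string;
  explicitPreferenceScore: number; // derived from surveys, preference controls and reports
  implicitEngagementScore: number; // derived from signals inferred from platform activity
}

function rankForMinor(candidates: CandidateItem[]): CandidateItem[] {
  // Explicit signals dominate; implicit signals act only as a weak tie-breaker.
  const EXPLICIT_WEIGHT = 0.9;
  const IMPLICIT_WEIGHT = 0.1;
  return [...candidates].sort((a, b) => {
    const scoreA = EXPLICIT_WEIGHT * a.explicitPreferenceScore + IMPLICIT_WEIGHT * a.implicitEngagementScore;
    const scoreB = EXPLICIT_WEIGHT * b.explicitPreferenceScore + IMPLICIT_WEIGHT * b.implicitEngagementScore;
    return scoreB - scoreA;
  });
}

// A "reset" control would discard accumulated implicit signals entirely, so that
// future recommendations start again from explicit preferences only.
function resetFeed(candidates: CandidateItem[]): CandidateItem[] {
  return candidates.map((c) => ({ ...c, implicitEngagementScore: 0 }));
}
```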
Age assurance
The Draft Guidelines recommend that an assessment be carried out to determine whether a form of age assurance is appropriate for the service. They do not set out specific factors for determining whether an age assurance solution is required, but they do note that platforms are expected to assess the relevant risks to minors in accessing the platform or individual content and features.
If an age assurance solution is required, online platforms must develop a solution that is proportionate to, and effective against, the risks posed by the relevant content, feature(s) or platform. The Draft Guidelines differentiate between three forms of age assurance:
- Self-declaration: methods that rely on the individual to supply their age or confirm their age range, either by voluntarily providing their date of birth or age, or by declaring themselves to be above a certain age
- Age estimation: independent methods which allow a provider to establish that a user is likely to be of a certain age, to fall within a certain age range, or to be over or under a certain age, and
- Age verification: a system that relies on physical identifiers or verified sources of identification that provide a high degree of certainty in determining the age of a user
The Commission considers that the highest form of age assurance – age verification – is necessary in certain cases. This includes situations where applicable EU or Member State law sets a minimum age to access specific products or services, such as alcohol, pornographic content, or gambling. It is also required where the platform's terms of service specify that users must be over 18 due to identified risks to minors, or in any circumstances where the risk review identifies a high risk to minors which cannot otherwise be mitigated. Age estimation, on the other hand, is appropriate for medium-risk content or services.
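Again purely for illustration, the tiering described above could be expressed as a simple decision rule along the following lines. The names and logic are hypothetical assumptions; in practice the choice of method will turn on the provider's own risk review.

```typescript
// Illustrative sketch of the tiering described above. All names and the decision
// logic are assumptions; the Draft Guidelines do not prescribe code-level rules.

type AgeAssuranceMethod = "age-verification" | "age-estimation" | "provider-assessment";

interface RiskReviewOutcome {
  legalMinimumAgeApplies: boolean;    // e.g. alcohol, pornographic content, gambling
  termsRequireOver18ForRisk: boolean; // terms of service require users to be over 18 due to identified risks
  riskLevel: "low" | "medium" | "high";
  highRiskOtherwiseMitigable: boolean;
}

function selectAgeAssurance(outcome: RiskReviewOutcome): AgeAssuranceMethod {
  if (
    outcome.legalMinimumAgeApplies ||
    outcome.termsRequireOver18ForRisk ||
    (outcome.riskLevel === "high" && !outcome.highRiskOtherwiseMitigable)
  ) {
    return "age-verification";
  }
  if (outcome.riskLevel === "medium") {
    return "age-estimation";
  }
  // For lower-risk cases the Draft Guidelines leave the choice to the provider's
  // own assessment; no specific method is prescribed in the text summarised above.
  return "provider-assessment";
}
```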
User reporting tool
The measures set out in this portion of the Draft Guidelines will be largely familiar to providers, including making the reporting functionality accessible and child-friendly, and allowing minors to report breaches of the platform’s terms and conditions.
Where categories (i.e. reasons for the report) are presented in the reporting tool, these should be adapted to the youngest users and should avoid complex menu systems. The Draft Guidelines suggest that “poor practice” would be to present the user with a list of “15 different complaints categories” in order to report content. Interestingly, this arguably runs counter to the Implementing Regulation on DSA transparency reporting, which requires providers to categorise the content of user reports into 15 defined types. The Draft Guidelines seem to suggest that providers should present users with more high-level categories, which providers will then have to map to the more prescriptive categories set out in the Implementing Regulation for transparency reporting purposes.
Comment
As noted above, the Draft Guidelines bring a welcome measure of clarity to what is generally considered one of the most important but vague obligations in the DSA. However, they also include a number of rather burdensome recommendations, and it remains to be seen what level of complexity those recommendations will add to platforms’ approach to the protection of minors more generally. For example, the risk review, which all online platforms are expected to conduct, will undoubtedly overlap with similar risk assessments carried out by those platforms under other EU laws, such as the GDPR and the AI Act, and under regimes in other jurisdictions, such as the UK and Australia. Also, whilst guidelines prepared under the DSA are ostensibly non-binding, in practice regulators are likely to look unfavourably on platforms which ignore them. Of course, the Draft Guidelines are not finalised, and some of the recommendations may change in light of feedback from respondents.
Please do not hesitate to get in touch with a member of our Technology team should you have any questions on the Draft Guidelines, or on the next steps your organisation should take to comply.
The content of this article is provided for information purposes only and does not constitute legal or other advice.