
The European Data Protection Board (EDPB) published its draft Guidelines on the interplay between the EU Digital Services Act (DSA) and the GDPR in mid-September 2025. The Guidelines are open for public consultation until 31 October 2025.

In this article, we provide a high-level overview of some of the key takeaways from the Guidelines, which are likely to impact all manner of technology service providers.

Illegal content (Article 7 of the DSA)

The EDPB examines the GDPR’s impact on Article 7 of the DSA, which provides that intermediary service providers will not lose the benefits of the DSA’s various content safe harbours when they:

“…carry out voluntary own-initiative investigations into, or take other measures aimed at detecting, identifying and removing, or disabling access to illegal content, or take the necessary measures to comply with the requirements of Union law and national law…”.

Controllers should be able to demonstrate that any technologies deployed for these processes, like machine learning systems, comply with:

  • The principles regarding the processing of personal data, such as the minimisation principle, and
  • The data protection by design and by default obligations stemming from the GDPR.

In addition, in terms of GDPR legal bases, the EDPB notes that because controllers are not legally required to carry out this type of processing, legitimate interests under Article 6(1)(f) will likely be the most suitable legal basis to rely on. When considering the three-step legitimate interests test, it helpfully acknowledges that:

“…It is clear that the interest of detecting and addressing illegal content in intermediary services to protect the recipients of the service is legitimate, in particular where such content can be disseminated to the public via an online platform”.

In contrast to voluntary monitoring for illegal content, the EDPB acknowledges that a service provider could be obliged to process personal data under a requirement stemming from EU law, for example, when scanning for copyrighted content or when complying with data subject rights requests. In these instances, controllers can potentially rely on Article 6(1)(c) of the GDPR, the ‘legal obligation’ legal basis.

Deceptive design patterns

Article 25(1) of the DSA obliges providers of online platforms to:

“design, organise, and operate their online interfaces in a way that does not impair the ability of recipients of the service to make autonomous and informed decisions”.

The EDPB highlights the important distinction that, according to Article 25(2) of the DSA, the prohibition in Article 25(1) does not apply to deceptive design practices which are already covered by the GDPR, or indeed by the EU’s Unfair Commercial Practices Directive.

The EDPB says that when deciding whether a deceptive design pattern falls under the GDPR rather than the DSA, two points matter:

  1. Is personal data being processed?
  2. Is the behaviour being influenced linked to how that personal data is processed?

For example, patterns that try to push all recipients of a service to buy a product by (emotional) steering, e.g. “There are only a few products left in stock”, may not be covered by the GDPR. However, if the recipient of the service is manipulated into providing (additional) personal data, for example, “There are only a few products left in stock. Enter your email address now and make a reservation”, then the pattern is subject to the GDPR.

The EDPB lists common examples of deceptive design patterns that may cause addictive behaviours, including: “infinite scrolling, infinite streaming, autoplay, periodic rewards, status or reputation improvements, collection completion, gamification, countdown timers, among others”.

Advertising transparency

Article 26 of the DSA lays down transparency rules for providers of online platforms regarding their advertising. In particular, meaningful information regarding the “main parameters” used to determine the ad recipient and, where applicable, information about how to change those parameters must be directly accessible from the advertisement.

This gives rise to an important difference between Article 26 of the DSA and the transparency requirements under the GDPR. Under the GDPR, where personal data is collected directly from the data subject, the relevant information must be provided at the time the personal data is obtained. This can happen before the processing takes place, e.g. via a Privacy Policy presented in a New User Experience (NUX) flow when a user signs up. Under the DSA, because the information must be presented in real time alongside the ad, the data processing needed to generate and deliver the ad will already have taken place.

Advertising and profiling

The EDPB notably considers that the provisions of Article 26 of the DSA may:

“…depending upon the particular characteristics of the case, relate to data processing practices that might fall within the scope of automated individual decision-making and profiling that fulfil the criteria of Article 22(1) GDPR, if the profiling in question leads to a decision that produces legal effects or similarly significantly affects data subjects.”

To assess whether an automated decision to present a specific advertisement to an individual produces legal effects or similarly significantly affects them, several (non-exhaustive) characteristics of the personal data processing activity (including at the level of each individual advertisement delivery) should be considered. These include:

  • The intrusiveness of the profiling process
  • The tracking of individuals across different websites, devices and services
  • The expectations and wishes of the individuals concerned
  • The way the advert is delivered, or
  • The use of knowledge of the vulnerabilities of the data subjects targeted

Recommender systems (Articles 27 and 38 of the DSA)

The EDPB is of the view that recommender systems that involve the processing of personal data carry potential risks for individuals. Some of these risks relate to:

  • Processing of personal data on a large scale
  • Potential lack of accuracy and transparency concerning inferences and the combination of personal data
  • Evaluation, scoring or profiling of data subjects, and
  • The processing of special categories of data or data of a highly personal nature or data of vulnerable data subjects

The EDPB notes that, as with advertising, showing users specific content through a recommender system could constitute a ‘decision’ under Article 22(1) of the GDPR, meaning one that produces legal effects or similarly significantly affects the individual.

Particular attention is needed where algorithms suggest content, services and products that could have a lasting impact on individuals or strongly influence their behaviour or choices – for example, recommender systems for housing or job offers on an online platform.

In terms of options to provide recommender systems which are not based on profiling, the EDPB advises that providers of Very Large Online Platforms (VLOPs) should present both options equally when the recipient first uses the service. The EDPB notes that systems should not “nudge” recipients of the service to select the option for a recommender system that is based on profiling. Providers of VLOPs and Very Large Online Search Engines (VLOSEs) may only use a recommender system based on profiling after the recipient of the service has chosen this option.

In addition, the EDPB considers that, while the non-profiling-based option is active, the provider cannot lawfully continue to collect and process personal data to profile the user for the purposes of future recommendations. If a user switches between two versions of a recommender system, one based on profiling and one not, they must not be profiled while using the non-profiling version.

Protection of minors (Article 28 of the DSA)

Article 28 of the DSA outlines that:

“Providers of online platforms accessible to minors shall put in place appropriate and proportionate measures to ensure a high level of privacy, safety, and security of minors, on their service”.

It prohibits the presentation of personalised advertising to minors based on profiling.

The EDPB notably outlines that Articles 28(1) and (2) of the DSA can qualify as a legal basis for processing personal data under Article 6(1)(c) of the GDPR, subject to the condition that the controller is able to demonstrate, on a case-by-case basis, that the processing, e.g. in the context of age assurance, is necessary and proportionate to achieve the goals established by Articles 28(1) and (2) of the DSA.

Risk assessments

The EDPB highlights that providers of VLOPs and VLOSEs are obliged under Article 34 of the DSA to carry out an assessment of systemic risks, which include risks to the protection of personal data. Where such systemic risks are present, the EDPB considers that a Data Protection Impact Assessment (DPIA) under Article 35 of the GDPR is likely to be mandatory.

The EDPB highlights that Article 35(1)(a) of the DSA elaborates on the mitigating measure of “adapting the design, features or functioning of their services, including their online interfaces”. It also notes that this overlaps with the obligation of a controller to design a system for processing personal data in accordance with the requirements of data protection by design and by default under Article 25 of the GDPR.

Regarding the systemic risks that can stem, for example, from automated tools for detecting illegal content or illegal activities, from recommender systems and from advertising systems, adhering to GDPR principles and safeguards, such as data minimisation and pseudonymisation, can enable providers of VLOPs and VLOSEs to comply effectively with the DSA, including Article 35.

Comment

The EDPB’s new Guidelines reflect its interpretation of EU rules and should be carefully considered by impacted online services. The Guidelines also remain subject to potential change, as the EDPB is inviting comments on them until 31 October 2025.

For more information on the implications of these Guidelines, please contact a member of our Data & Technology team.

The content of this article is provided for information purposes only and does not constitute legal or other advice.


