Tech Law Blog

Profiling and Automated Decision-Making under GDPR

03 November 2017

In October 2017, the Article 29 Working Party, the collective group of European data protection authorities (WP29), adopted new Guidelines on profiling and automated decision-making under the GDPR.

The Guidelines state that profiling and automated decision-making can be useful for individuals and organisations. In particular, these forms of processing can deliver increased efficiencies and resource savings. However, WP29 also acknowledges that this processing may pose significant risks for individuals unless appropriate safeguards are put in place. The Guidelines serve to provide greater clarity surrounding the regulation of these forms of processing under the GDPR.

We identify the key takeaways from the Guidelines.

Profiling vs. ‘solely’ automated processing under GDPR

The Guidelines distinguish between the concepts of profiling and automated decision-making. Helpfully, WP29 describes three ways in which profiling can be used in practice:

  • general ‘profiling’, as defined in Article 4(4) GDPR
  • human decision-making based on profiling; and
  • solely automated decision-making under Article 22 GDPR, including profiling, which produces legal effects concerning, or similarly significantly affects, the data subject

Automated decisions can be made with or without profiling; profiling, in turn, can take place without any automated decisions being made. Only profiling that is based on solely automated processing, i.e. without meaningful human intervention, and which produces “legal” or “similarly significant” effects on a data subject is generally prohibited under Article 22 GDPR. In all other cases of profiling, the general provisions of the GDPR apply.

General prohibition on ‘solely’ automated processing producing ‘legal’ or ‘similarly significant’ effects

Recital 71 to the GDPR cites the “automatic refusal of an online credit application” or “e-recruiting practices without any human intervention” as typical examples of automated decision-making producing ‘legal’ or ‘similarly significant’ effects. However, beyond these examples, the GDPR is largely silent on the parameters of this processing.

In the Guidelines, WP29 helpfully explains that a “legal effect” usually suggests a processing activity that has an impact on someone’s legal rights, such as their legal status or rights under a contract, e.g. the freedom to associate with others, vote in an election, or take legal action. However, even where no legal or contractual rights or obligations are affected, data subjects could still be sufficiently impacted to attract the protections under this provision. According to WP29, a subjective analysis must be conducted to determine whether certain processing activities fall within the scope of Article 22 GDPR.

In many typical cases, targeted advertising does not have a significant effect on individuals, e.g. an advertisement for a mainstream online fashion outlet based on a simple demographic profile for ‘women in the Brussels region’. However, WP29 recognises that it is possible for targeted advertising to have a significant effect on an individual depending on their specific characteristics, e.g. someone in financial difficulties who is regularly shown advertisements for online gambling may sign up for these offers and potentially incur further debt. Therefore, WP29 states that consideration must be given to:

  • the intrusiveness of the profiling process
  • the expectations and wishes of the individuals concerned
  • the way the advertisement is delivered; and
  • the specific vulnerabilities of the data subjects targeted, including minority groups, vulnerable adults and children

Additionally, WP29 clarifies that automated decision-making that results in differential pricing could also have a significant effect if, for instance, prohibitively high prices effectively bar individuals from certain goods or services.

Exceptions to the prohibition

Automated decision-making under Article 22 GDPR is permitted only where at least one of three exceptions applies:

  • the automated decision-making is necessary, under Article 22(2)(a), for entering into, or the performance of, a contract;
  • the data subject has given their explicit consent; or
  • Union or Member State law provides a legal basis.

WP29 reiterates its view that “necessity” should be interpreted narrowly. Also, the onus is on the data controller to demonstrate that the profiling is necessary and that no less privacy-intrusive methods could be adopted.

While “explicit consent” is not defined in detail in the GDPR, WP29 has announced that it will be addressed further in its forthcoming guidelines on consent.

Transparency obligations

WP29 states that data controllers should be particularly mindful of their transparency obligations, given the potential risks to, and interference with, the rights of data subjects that automated processing may pose. In particular, the profiling process is often invisible to the data subject and involves the creation of derived or inferred ‘new’ personal data about the data subjects, which they themselves have not directly provided. Therefore, if the data controller is making automated decisions under Article 22(1) GDPR, they must:

  • tell the data subject that they are engaging in this type of activity
  • provide meaningful information about the logic involved; and
  • explain the significance and envisaged consequences of the processing

The information given should be meaningful, rather than a complex explanation of the algorithms used.

Appropriate safeguards and recommendations for best practice

Given that errors or bias in collected or shared data can result in incorrect classifications and assumptions in the automated decision-making process, WP29 recommends that controllers carry out assessments of the data sets they process in order to check for bias. These reviews should occur on a regular basis and should form part of both product design and post-design implementation.

What’s next?

The Guidelines are not yet final, and WP29 is accepting comments until 28 November 2017. Businesses should consider proactive methods of ensuring that the algorithms or any automated processing they use do not produce discriminatory, erroneous or unjustified decisions and results for data subjects. These measures should be documented to demonstrate compliance with the accountability principle under the GDPR. Businesses should also stay up to date on newly issued guidance and guidelines from WP29, which should provide more insight into obligations under the GDPR and the expectations of regulators.
