NOYB is tackling automated decisions: two cases



  1. Introduction

In December 2021, None Of Your Business (NOYB), the NGO founded by Max Schrems, filed two complaints against two tech companies on the basis of art. 22 GDPR, which regulates the use of automated decision-making.

One complaint was filed against Airbnb,[1] because it downgraded the complainant’s rating as a host solely through an automated decision, without complying with the safeguards of art. 22. The second complaint was against Amazon (Amazon Mechanical Turk),[2] due to its use of automated decision-making to accept or reject workers without providing individuals with the information and safeguards required by artt. 13, 14 and 22 GDPR.

This article will evaluate whether the decisions taken by the companies mentioned should fall within the remit of art. 22 GDPR. This determination is important because, if the decisions are covered by art. 22 GDPR, several significant consequences follow:

  a) only certain legal bases for processing personal data can be relied upon, namely explicit consent, necessity for a contract, or authorisation by Union or Member State law, to the exclusion of legitimate interests;
  b) individuals must be informed that they will be subject to this kind of decision and, in particular, must receive meaningful information about the logic involved as well as the significance and the envisaged consequences of the processing;
  c) controllers must implement safeguards to protect data subjects’ rights, including at least the right to obtain human intervention, to express one’s point of view and to contest the decision.

A decision taken via automated decision-making (ADM) falls within the scope of art. 22 GDPR if:

  a) it is a decision: for the purposes of this provision, a decision is any type of determination concerning a data subject, which may also include measures (rec. 71 GDPR);
  b) it is based solely on automated processing, including profiling: the decision must be based only on the processing of personal data carried out by automated means. Minimal, non-critical human involvement in the decision-making process does not render the decision non-automated: to take the decision outside art. 22, the human reviewer must have the authority and competence to evaluate the underlying facts and change the outcome. A decision that a person merely rubber-stamps after an algorithm has taken it therefore still falls under this provision;
  c) it produces legal effects concerning the individual or similarly significantly affects him or her: while there is consensus on the concept of solely automated processing, there is room for debate on what can be considered as producing legal effects and, in particular, which decisions ‘similarly significantly affect’ individuals.


  2. The complaint against Airbnb

According to NOYB “the decision to delete the 5-stars Review has had the effect of reducing the overall rating of the complainant as a host, which directly influences the Superhost status of the complainant and the contractual advantages that it provides (…). In other words, the complainant can lose her Superhost status and the substantial advantages that it confers on her”.

The benefits of being a Superhost include: (a) greater visibility and trust from guests; (b) a Superhost badge that can make the host’s listing more attractive to guests; (c) an extra 20% on top of the usual bonus when the host refers new hosts; (d) a travel coupon (currently worth $100) after 4 consecutive quarters as a Superhost.

This raises the following question: does the decision to downgrade an Airbnb host from Superhost to ‘normal host’ have an effect similarly significant to a legal effect? It is doubtful.

The decision in question must produce legal effects or similarly significant effects. A decision produces legal effects when it creates, modifies or extinguishes somebody’s legal rights, status or obligations.

Alternatively, the decision falls under the scope of art. 22 GDPR if it affects data subjects in a similarly significant manner. The Article 29 Working Party (now the European Data Protection Board) considered that the effects of the processing must be sufficiently great or important to be worthy of attention.[3]

Additionally, there have been several cases where this provision was interpreted strictly, opting for a narrow concept of what similarly significant effects mean. Courts and national data protection authorities considered that the following decisions fall outside the remit of art. 22 GDPR: (a) targeted advertising (except where it targets vulnerable persons, like gamblers);[4] (b) Uber’s batched matching system and upfront pricing system, even where the systems had an effect on drivers’ earnings;[5] (c) the temporary automatic blocking of the app for Uber drivers following a fraud signal;[6] (d) an earning profile or the decision to allocate a passenger to an available driver;[7] (e) pre-granted loans, prices adjusted to the customer’s profile, benefits and discounts.[8]

As these examples show, the threshold is high, and trivial consequences are excluded. It therefore seems that the automatic decision to downgrade from Superhost to normal host, while surely detrimental for the host, cannot be considered as producing a similarly significant effect on the individual and, hence, is not covered by art. 22 GDPR.


  3. The complaint against Amazon Mechanical Turk

The complaint against Amazon Mechanical Turk is different. Amazon Mechanical Turk is a crowdsourcing marketplace that makes it easier for individuals and businesses to outsource their processes and jobs to a distributed workforce who can perform these tasks virtually.[9] Crucially, Amazon Mechanical Turk reserves the right to automatically deny access to its platform.

Since the company uses automated decision-making to accept or reject workers, it can be covered by art. 22 GDPR. Not only are “e-recruiting practices without any human intervention” expressly mentioned in rec. 71 GDPR as an example of decisions having similarly significant effects, but there have also been recent decisions to this effect. For instance, the Italian Data Protection Authority considered that Glovo’s[10] and Deliveroo’s[11] booking systems, through which riders book time slots predetermined by the company until they are saturated, fall within art. 22 GDPR: they produce significant effects because they allow or deny access to job opportunities.


  4. Conclusion

In conclusion, the mere fact that a company uses an algorithm to make decisions concerning individuals does not mean that those decisions are covered by art. 22 GDPR. It is crucial to evaluate, in particular, whether their effects meet the threshold established by the law (a similarly significant effect to that of a legal decision), as complemented and illustrated by the EDPB’s guidelines and by the decisions of courts and national data protection authorities.





[1] NOYB, “GDPR Complaint: Airbnb hosts at the mercy of algorithms”.

[2] NOYB, “Complaint filed: Help! My recruiter is an algorithm!”.

[3] Article 29 Working Party, Guidelines on Automated individual decision-making and Profiling for the purposes of Regulation 2016/679.

[4] EDPB, Guidelines 08/2020 on the targeting of social media users.

[5] District Court of Amsterdam in ‘Uber B.V’, C/13/696010 / HA ZA 21-81.

[6] Ibid.

[7] District Court of Amsterdam in ‘Ola Netherlands B.V’, C/13/689705/HA RK 20-258.

[8] Agencia Española de Protección de Datos, Caixabank SA, Procedimiento nº: PS/00477/2019.


[10] Ordinanza ingiunzione nei confronti di Foodinho s.r.l. – 10 giugno 2021 [9675440].

[11] Ordinanza ingiunzione nei confronti di Deliveroo Italy s.r.l. – 22 luglio 2021 [9685994].
