By Oreste Pollicino, Giovanni De Gregorio, Pietro Dunn
In September 2022, Advocate General (AG) Rantos delivered his opinion in the case of Meta Platforms, Inc. v. Bundeskartellamt. The German Federal Cartel Office had issued an order prohibiting Meta from processing personal data obtained from non-Facebook sources for micro-targeting and advertising purposes, as foreseen in Facebook’s terms and conditions. According to the Bundeskartellamt, Meta was not compliant with the GDPR, and these violations amounted to an abuse of dominance by a service provider with Meta’s market position. Following this decision, Meta brought an action before the Higher Regional Court in Düsseldorf, which made the preliminary reference to the CJEU.
Among the questions raised in this case, the referring court seeks guidance on the legal bases for data processing, particularly necessity for the performance of a contract and legitimate interests. First, regarding the contractual legal basis, the AG underlined that “[a]s far as the personalised content is concerned, it seems to me that, although that activity may, to some extent, be in the user’s interest, since it makes it possible to display content, particularly in the ‘newsfeed’, which, on the basis of an automated evaluation, matches the user’s interests, it is not apparent that it is also necessary in order to provide the service of the social network at issue, such that the processing of personal data to that end does not require the user’s consent”. Second, the AG addressed the referring court’s question whether the need to respond to a legitimate request to provide certain data, the need to combat harmful behaviour and promote security, or the need to conduct research in the public interest and to promote safety, integrity and security justify the practice at issue. According to the AG, certain clauses characterising the practice at issue may be justified by legitimate interests in the circumstances described by the referring court. The AG has therefore not excluded the possibility of relying on legitimate interests in this case, although this assessment falls to the referring court. Moreover, even a dominant position does not preclude reliance on different legal bases. According to the AG, “the mere fact that an undertaking providing a social network enjoys a dominant position in the domestic market for online social networks for private users cannot, on its own, render invalid the consent of the user”, although such an aspect should be taken into account.
The AG also noted that the processing of such data does not necessarily require the consent of the data subject but may instead be founded on the legal basis of legitimate interests, where the processing is necessary for the provision of the platform’s service.
The Opinion of AG Rantos is part of a wider debate on the legal bases for collecting and processing personal data for the purposes of micro-targeting and advertising on online platforms. In June 2022, TikTok had announced a revision of its terms and conditions with respect to privacy and data protection policies in the European Economic Area, the UK and Switzerland, according to which it would move, as of July 2022, from consent to legitimate interests as its legal basis under the GDPR for the collection and processing of data for such purposes. The data concerned covered both users’ on-TikTok activity and their off-TikTok activity (notably, “information that businesses share with us in order to reach potential customers on TikTok”). This opaque change in the terms and conditions met with widespread criticism and concern about the consistency of TikTok’s decision with the GDPR.
The primary issue was the legitimacy of grounding data processing for advertising purposes on legitimate interests instead of consent, also in the light of the EDPB Guidelines 8/2020 on the targeting of social media users, updated in April 2021, and of the CJEU’s decision in Fashion ID. It has been noted, first of all, that it is not clear why TikTok’s reliance on the legal basis of legitimate interests should be recognized as necessary for the platform and proportionate to the pursuit of its business interests, with respect to the data provided by users when using the social network. Second, it has been underscored that the exercise of users’ right to object to the processing of their personal data before the initiation of the processing itself is cumbersome and counter-intuitive, in contrast with the EDPB’s guidelines. Third, with respect to observed data (i.e., data that is not actively made available to the social media provider but is provided through the simple use of the platform), Article 5(3) of Directive 2002/58/EC (e-Privacy Directive) requires free and informed consent – to the extent this is based on cookies or other information stored on the user’s device – so that legitimate interest, in the words of the EDPB, does not represent the appropriate legal basis.
In July 2022, the Italian Data Protection Authority issued a decision warning the social network that this move would likely infringe the GDPR as well as the national legislation transposing the e-Privacy Directive. According to the Italian DPA, the processing of data collected from users’ personal devices for advertising purposes without their consent would be illegal. Additionally, the decision found that, given the difficulties admittedly encountered by TikTok in implementing age verification techniques, the new policy would likely affect minors as well as adults. The Italian DPA’s decision, however, was not the only one: the Irish and Spanish Authorities also intervened swiftly to warn TikTok, leading to the platform’s ultimate decision to suspend the adoption of the new privacy policy shifting its legal basis from consent to legitimate interests.
Both the Meta and the TikTok cases showcase the central role played by privacy and data protection rights within the legal framework of the European Union. Indeed, the last decade has seen a rapid increase in the concerns of European and Member States’ institutions, from the European Court of Justice to the European Parliament, the Commission and domestic authorities, which has led to the creation of a European fortress of personal data, one that raises questions about the sustainability of online platforms’ business models. Although these concerns, and the institutional reaction to them, find their rationale in the protection of constitutional values, particularly fundamental rights, the European approach in this area raises significant questions with respect to the long-run economic and technological sustainability of the European data protection legal framework, as well as with respect to the protection of other fundamental rights and values such as the freedom to conduct a business, which is equally protected by the Charter of Fundamental Rights of the EU under Article 16.
Such an absolute approach risks resulting in excessively restrictive enforcement, thus neglecting the central role of balancing in European constitutionalism. If the business model of social media platforms such as TikTok is based on the promotion of advertising, generally personalized and customized on the basis of users’ interests and preferences, disproportionate enforcement may ultimately upset that balance. The European Court of Justice itself has underlined that privacy and data protection are not absolute rights in the European constitutional framework. Moreover, restrictive choices may affect not only online platforms, social media, and social networks, but also other companies more generally: Apple, for instance, relies on a variety of legal bases, including legitimate interests, for a range of data processing activities, most notably the distribution of personalized advertising.
From a regulatory point of view, the challenge is thus to find a balance in which the protection of individual rights does not translate into a compression of market freedoms. In other words, it is essential that privacy and data protection are not turned into absolute rights capable of quashing any other fundamental right protected, amongst others, by the Charter itself and recognised by the GDPR in recital 4. The mistake for Europe would be to grant privacy and data protection an absolute protection that leaves no space for the flexibility needed to address the transformations of the digital age.
Within the European landscape, the GDPR sets a legal framework that is highly protective of individuals’ privacy and data protection rights across the continent. The GDPR already provides for a range of duties, obligations, and data subject rights aimed at making data controllers accountable for the data processing activities they put in place and at mitigating the risks connected to them. A disproportionate enforcement of these rules, particularly as regards legal bases, risks overlooking the important role played by the rules governing the practical and technical deployment of data processing itself and disregarding the accountability framework set out in the GDPR.
It is against this backdrop that the principle of proportionality, understood as a guiding parameter of European constitutionalism, comes to play a critical role. Proportionality is a key requirement and criterion of legitimacy for any intervention imposing restrictions upon a fundamental right, both within the framework of the ECHR and under the Charter. Thus, when dealing with a conflict between opposing fundamental rights, both European and domestic constitutional courts are required to perform a balancing test to ensure that none of those rights is subjected to excessive compression. This approach is also reflected in the EU regulatory framework, as shown by the increasing resort to so-called “risk-based regulation”, whereby risk assessment becomes the proxy for defining the correct degree of duties and obligations to be imposed on market actors, thus providing another example of the role of proportionality in the digital age.
The cases analyzed provide a telling example of the extensive approach to European data protection. Currently, the constitutional scale between fundamental rights in Europe appears to be increasingly tipped in favor of privacy and data protection rights, creating a legal framework which, if not carefully tended to, particularly at the enforcement stage, risks being at odds with the principle of proportionality. This situation should prompt efforts to build a system that is sustainable with regard to all fundamental rights and freedoms, particularly so as to mitigate disproportionate balancing in the internal market.
 CJEU, Case C-252/21, Meta Platforms, Inc. v. Bundeskartellamt, Opinion of AG Rantos of 20 September 2022, ECLI. Cf. https://curia.europa.eu/jcms/upload/docs/application/pdf/2022-09/cp220158en.pdf.
 Ibid, 56.
 Ibid, 75.
 Cf. Article 6(1)(f) GDPR.
 See, ex multis, https://www.euroconsumers.org/opinions/tiktoks-new-policy-of-advertising-without-consent-must-stop; https://www.accessnow.org/immediately-no-tiktoks-personalised-ads-europe/.
 CJEU, Case C-40/17, Fashion ID, 29 July 2019.
 Cf. CJEU, Case C-131/12, Google Spain, 13 May 2014; Case C-293/12, Digital Rights Ireland and others, 8 April 2014; Case C-498/16, Schrems I, 25 January 2018; Case C-311/18, Schrems II, 16 July 2020. Cf. Oreste Pollicino, Judicial protection of fundamental rights on the Internet: A road towards digital constitutionalism? (Hart 2021).
 Cf. CJEU, Joined cases C-92/09 and C-93/09, 9 November 2010.
 “We may also process your personal data where we believe it is in our or others’ legitimate interests, taking into consideration your interests, rights, and expectations” https://www.apple.com/legal/privacy/en-ww/.
 See, for instance, Art. 32 on the Security of processing.
 Cf. Art. 52 CFREU.
 Cf. Giovanni De Gregorio and Pietro Dunn, ‘The European Risk-Based Approaches: Connecting Constitutional Dots in the Digital Age’ (2022) 59 Common Market Law Review 473.