Google and the Right to be Forgotten: brief summary of a never-ending story

Over the past few years there has been a natural tendency to associate Google specifically with the evolution of the Right to be Forgotten and its consequences. This tendency rests on a variety of circumstances that have influenced both the world’s biggest tech giant and the new balance between the free dissemination of information and individual self-determination.

The premise is that any processing of personal data may entail advantages and disadvantages with regard to individual and social interests protected by the law[1]. In simple terms, the Right to be Forgotten entitles individuals to request that search engines such as Google, Bing, and Yahoo delist URLs containing “inaccurate, inadequate, irrelevant or excessive” information surfaced by queries containing the requester’s name. Critically, the ruling requires search engine operators themselves to determine whether an individual’s right to privacy outweighs the public’s right to access lawful information when delisting URLs.

It is clear that, although there is no genuine hierarchy of rights, depending on the context some may prevail over others. The importance attached to a specific right, and the importance of preventing its limitation, is determined in accordance with the fundamental understandings of that society[2]. In this regard, although it cannot be – and will never be – qualified as an absolute right, the Right to be Forgotten has found fertile ground in European legislation and has come to carry the same weight in the EU framework as the right to information.

As a matter of fact, with the entry into force of Regulation (EU) 2016/679[3], the Right to be Forgotten has gained weight in the balancing against other conflicting rights, such as the right to freedom of expression. The “right to erasure” is not, in fact, a new concept in EU data protection law. Under Article 12(b) of Directive 95/46/EC[4], individuals were given the right to ask controllers to erase or block their personal data where its processing did not comply with the provisions of the Directive, in particular where the data were incomplete or inaccurate. However, the fact that this right is now further legitimized by a specific provision[5] within one of the most significant pieces of legislation in recent history has made its importance far more perceptible.

What all this means is that the rights of individuals in the GDPR can come into conflict with the rights to freedom of expression and access to information. Beyond the GDPR, these rights are granted by the European Convention on Human Rights and the EU Charter of Fundamental Rights.

Considering the other side of this coin, it is common knowledge that Google has unintentionally assumed an ever-growing role in this setting. Especially since the 2014 European Court of Justice ruling in the Costeja González case[6], the evolution of the Right to be Forgotten has been inextricably linked to Google. From then on, the American multinational found itself unwillingly handling an enormous number of requests invoking this right and seeking the de-indexing of certain content.

To get an idea of the numbers involved: in a company research paper[7] released in February 2018, Google said it had received requests to remove more than 2.43 million URLs since the end of May 2014, and had removed about 43% of them. Over those three and a half years, until the end of 2017, almost 400,000 requesting entities were recorded. In the two years preceding the report, celebrities requested more than 41,000 delistings, while politicians and government officials requested almost 34,000.

The processing of each delisting request is not automated: it involves individual consideration of every request and requires human judgement. Without such an individual assessment, the procedure put in place by Google would be open to substantial abuse, with individuals, or indeed businesses, seeking to suppress search results for illegitimate reasons. Moreover, full automation would violate Article 22 of the GDPR, under which the data subject has the right not to be subject to a decision based solely on automated processing, including profiling, which produces legal effects concerning him or her or similarly significantly affects him or her.

In view of all the above, Google has invested money and devoted human resources to guarantee that each and every request is duly evaluated. Factors the tech giant takes into account include “the nature of the offence, the sentence imposed, the time which has passed since the conviction, and the relevance of the information to the requestor’s business or professional life”[8].

Notwithstanding the importance of Google Spain, which automatically comes to mind when discussing these issues, it must be stressed that the jurisprudence of the Court of Justice of the European Union has also evolved considerably in recent years and has managed to better define the vague boundaries drawn in Google Spain. On that occasion the Court in fact had to stretch the existing legislation in order to establish the search engine operator’s duty to consider the requests of interested parties and, further, to balance the public interest against the individual one. Most recently, in September 2019, the CJEU had the chance to clarify the situation with two extremely significant decisions, immediately recognized as landmark cases.

The first, case C-507/17[9] between the French data protection authority (Commission Nationale de l’Informatique et des Libertés – CNIL) and Google Inc., concerns the territorial scope of de-indexing. The interpretative question that the French Council of State referred to the CJEU was whether, when a search engine operator accepts a request for de-indexing, it is required to carry out that de-indexing on all versions of its search engine, only on the version corresponding to the Member State of residence of the beneficiary (in this case, only on Google.fr), or on the versions corresponding to all Member States (Google.it, Google.de, etc.).

In this regard, the CJEU clarified that, as EU law currently stands, a search engine operator that accepts a delisting request submitted by the interested party is under no obligation to carry out the de-indexing on all versions of its engine. However, the delisting must be carried out on the versions of the search engine corresponding to all EU Member States. In addition, the operator must adopt measures to prevent, or at least seriously discourage, users from accessing the removed links through a non-EU version of the engine.

In essence, the decision established that a search engine is under no obligation to apply the Right to be Forgotten globally. Therefore, while EU citizens are indeed entitled to request erasure under Article 17 of the GDPR, that right applies only within the boundaries of the European Union, that is, the bloc composed of the 27 Member States.

Naturally, this decision has also attracted criticism, particularly from those who note that when information is available elsewhere, one need not travel to access it. For example, when content is blocked in a specific EU Member State under GDPR rules, a user could simply switch his or her VPN location to a non-EU country and gain immediate access. On this view, given current VPN technology for starters, letting geography determine access and privacy rights makes a mockery of legal data protections[10].

The truth is that the ruling can also be seen as a victory for global freedom of expression. Courts or data regulators in the UK, France or Germany should not be able to determine the search results that Internet users in America, India or Argentina get to see. It would arguably not be right for one country’s data protection authorities to impose their interpretation on Internet users around the world.

The Court stated that the balance between privacy and free speech should be taken into account when deciding whether websites should be delisted, and recognized that this balance may vary around the world. Further, the CJEU underlined the limits of a “global approach”: since the rights to privacy and data protection are not absolute, they must be balanced against other fundamental rights, among which is the right to freedom of expression; moreover, the GDPR does not aim to strike a fair balance between fundamental rights outside the territory of the Union[11].

The second case (C-136/17[12]), which seems to have received less media attention, arises from another reference by the French Council of State concerning four appeals that involved, in one way or another, the processing of sensitive data by search engines. The Court reiterated that the indexing activity carried out by search engine operators must be considered “processing of personal data” and is therefore subject to the limitations currently provided for by the GDPR, and previously by Directive 95/46/EC, as regards sensitive data.

Consequently, a search engine operator must, in principle, accept requests for delisting regarding web pages where sensitive personal data appear, unless such links prove to be strictly necessary to protect the freedom of information of Internet users potentially interested in having access to web pages containing sensitive data.

As regards judicial data, the search engine operator is required to accept a request for de-indexing of web pages containing information that is no longer current (for example, reporting a first-instance conviction of a person later acquitted on appeal), unless the public role played by the person, the seriousness of the crime and other circumstances make the public interest in that information prevail. Even where the operator considers the public interest prevalent – and therefore does not remove the links – it must still “arrange the list of results in such a way that the resulting overall image for the Internet user reflects the current legal situation”. In other words, the algorithm must sort the results not by popularity but by the topicality of the information.

These are the most recent, but certainly not the last, steps in a story destined to last a long time and whose ending, to date, still seems far away.

[1] G. Sartor, The right to be forgotten: balancing interests in the flux of time, International Journal of Law and Information Technology (Nov. 2016)

[2] A. Barak, Proportionality and Principled Balancing, Law & Ethics of Human Rights, Vol. IV (2010)

[3] Regulation (EU) 2016/679 of the European Parliament and of the Council of 27 April 2016 on the protection of natural persons with regard to the processing of personal data and on the free movement of such data, and repealing Directive 95/46/EC (General Data Protection Regulation)

[4] Directive 95/46/EC of the European Parliament and of the Council of 24 October 1995 on the protection of individuals with regard to the processing of personal data and on the free movement of such data

[5] Regulation (EU) 2016/679, Article 17 – Right to erasure (‘right to be forgotten’)

[6] C‑131/12, Google Spain SL and Google Inc. v Agencia Española de Protección de Datos (AEPD) and Mario Costeja González.

[7] T. Bertram, Three years of the Right to be Forgotten, (Feb. 2018)

[8] G. Corfield, Here is how Google handles Right To Be Forgotten requests, The Register (May 2018).

[9] C-507/17, Google LLC, successor in law to Google Inc., v Commission nationale de l’informatique et des libertés (CNIL)

[10] J. Vigo, Google and the Right to Be Forgotten, Forbes (Oct. 2019).

[11] G. De Gregorio, Google v. CNIL and Glawischnig-Piesczek v. Facebook: content and data in the algorithmic society, MediaLaws, (Mar. 2020)

[12] C-136/17, GC, AF, BH, ED v Commission nationale de l’informatique et des libertés (CNIL), interveners: Premier ministre, Google LLC, successor to Google Inc.
