Could ISPs Be Forced to Take Action to Guarantee Individuals’ “Right to be Forgotten”?


From the “Right to be Let Alone” to Control over Personal Data

In recent years, technological innovations and the expanding range of services available on the internet have given rise to an ever-increasing variety of new issues for data protection.

While social networks allow personal data to be continuously shared, a simple query on a search engine may return a wealth of information concerning an individual’s present and past life.

At the very beginning, the right to privacy was mainly conceived as a “right to be let alone”,[i] i.e. freedom from surveillance and interference by others. Individuals were subsequently entrusted with powers of active control over their personal data, giving rise to a “right to protection” of one’s personal data.[ii]

In 2010, in the context of the revision of the Personal Data Directive (95/46/EC), the European Commission envisaged the introduction of a “right to be forgotten”, meant as the right of individuals to have their data no longer processed, and deleted, when the data are no longer needed for legitimate purposes.[iii]

This is not a wholly “new right”, as it specifies the rights, already laid down in the Personal Data Directive, to obtain erasure of one’s personal data and to object to their processing in the absence of legitimate purposes.[iv]

Now, the European Commission’s recent proposal for an EU data protection regulation expressly provides for a “right to be forgotten”, entitling data subjects to obtain from the data controller the erasure of their personal data and the abstention from further dissemination of such data, where:

  • the data are no longer necessary in relation to the purposes for which the data were collected or otherwise processed; or
  • data subjects have withdrawn their consent to the processing; or
  • data subjects object to the processing of personal data concerning them; or
  • processing of data subjects’ personal data otherwise does not comply with the proposed EU regulation.[v]

Pursuant to the proposed EU regulation, supervisory authorities shall impose a fine of up to EUR 500,000 or, in the case of an enterprise, up to 1% of its annual worldwide turnover, on anyone who, intentionally or negligently, does not comply with the “right to be forgotten” or to erasure.[vi]

The balance to be struck between the “right to be forgotten” and the right of the press to inform, which also encompasses the right of individuals to be informed, has already been addressed in a previous article.

Discussions on the “right to be forgotten” in the online environment also involve an assessment of the interplay between individuals’ requests to delete data available on the internet and the liability of hosting platforms and other intermediaries – such as search engines – that merely host or link to third-party content containing such data.

Two recent cases, one Spanish and one French, show how this interplay may occur.

Spanish Data Protection Authority (Agencia Española de Protección de Datos – AEPD) vs. Google

In 2010 the AEPD ordered Google Spain SL and Google Inc. to delete from Google’s search engine links to any website containing out-of-date or inaccurate information about individuals, which allegedly harmed their “right to be forgotten”, and to preclude future access to the same.

Google challenged the AEPD’s order before a Madrid court, claiming that only the publishers of an article, rather than search engines and news aggregators, should be forced to take action in order to guarantee users’ privacy and their “right to be forgotten”.

In the course of the proceedings, on March 2 2012 the Madrid court asked the Court of Justice of the European Union (ECJ) to clarify several issues relating to search engines’ liability with respect to the protection of personal data, including:

  • whether the indexing of information by search engines qualifies as “processing of personal data” under the Personal Data Directive[vii] and, if so, whether a search engine can be deemed the “data controller” with respect to said data;[viii]
  • if the previous question is answered in the affirmative, whether national data protection authorities, on grounds of the “right to be forgotten”, may directly require the search engine to remove data from its index, without first or simultaneously requiring such removal from the website on which the information is published;
  • whether search engines have a duty to remove the data even where such data have been lawfully published and are maintained on the linked website;
  • whether the protection of personal data includes data subjects’ right to prevent search engines from indexing information which they consider harmful to their legitimate interests, even though such information is lawful and accurate at its origin.

The main issue the ECJ will need to resolve is whether the indexing of information by search engines qualifies as “processing of personal data” and, consequently, whether the search engine acts as a “data controller” (alone or jointly with others) pursuant to the applicable law.

In this regard, Opinion no. 1/2008 of the Article 29 Data Protection Working Party on data protection issues related to search engines[ix] clarifies that in some circumstances search engine providers may be considered “data controllers”, and thus fully responsible under data protection laws for the content-related processing of personal data they carry out.

This may happen when search engine providers do not limit themselves strictly to an intermediary role and perform additional operations (e.g. storing parts of the content available on the internet – including the personal data in that content – on their own servers; using facial recognition technologies in the context of image processing and image search; selling advertising triggered by personal data; etc.).

Conversely, the same Opinion no. 1/2008 stresses that “The principle of proportionality requires that to the extent that a search engine provider acts purely as an intermediary, it should not be considered to be the principal controller with regard to the content related processing of personal data that is taking place.”[x] In such cases, the publishers of the relevant information should be considered the “data controllers”.

In any case, the Article 29 Data Protection Working Party suggests that the assessment of whether a search engine should be considered a “data controller” under data protection laws is separate from the issue of the search engine’s liability for the third-party content it links to under the E-commerce Directive (2000/31/EC), as the French case described below shows.

Tribunal de Grande Instance de Paris (TGI): Diana Z. vs. Google

On February 15 2012 the TGI ordered Google to remove from its index any search results linking to pornographic videos that a woman had made in the past, which allegedly harmed her “right to be forgotten” (droit à l’oubli).[xi]

The TGI’s order relied on Article 6 of the Law of July 21 2004 (Loi sur la confiance dans l’économie numérique – LCEN),[xii] whereby hosting providers are not liable for infringement unless, upon obtaining knowledge of an unlawful activity or information, they fail to promptly remove or disable access to said information.[xiii] In the TGI’s view, Google could not rely on this exemption from liability since, once notified of the presence in its search listings of content which allegedly harmed the woman’s “right to be forgotten”, it failed to promptly remove it.

Unlike in the Spanish case, the TGI did not raise the issue of whether an intermediary should be considered a “data controller” (alone or jointly with others) with regard to the processing of personal data. It simply applied the ISP liability rules set forth in the E-commerce Directive.

The reasoning that led the TGI to qualify Google’s search engine as a “hosting provider”, i.e. a provider carrying out the storage of information supplied by a recipient of the service,[xiv] is not entirely clear.

Indeed, the E-commerce Directive does not address search engines’ liability, while the European Commission, in its first report on the application of the E-commerce Directive released in 2003, stressed the need for additional limitations on liability for search engine activities.[xv]

Moreover, from a factual standpoint, it is debatable whether search engines effectively “store” the information they link to.

Comment

As a general rule, responsibility for deleting information about individuals upon request should lie with the person or entity that posted the information, not with search engine providers carrying out mere automatic searches. Nevertheless, intermediaries might be required to delete data that are prejudicial to an individual pursuant to the liability rules set forth in the E-commerce Directive.

The new proposal for an EU data protection regulation stresses this point, as it clarifies that the new rules “shall be without prejudice to the application of Directive 2000/31/EC, in particular of the liability rules of intermediary service providers in Articles 12 to 15 of that Directive.”[xvi]

In any case, a careful analysis is recommended to assess the role a search engine actually performs with respect to the information to which it directs users and, in particular, its awareness of that information and its ability to control it.

It remains to be seen how the ECJ will address the questions referred by the Madrid court in the context of the above-mentioned lawsuit. It is certainly desirable that the ECJ strike a fair balance among all the interests involved.


[i] Warren, Brandeis, The Right to Privacy, Harvard Law Review, Vol. IV, no. 5, December 15, 1890.

[ii] Article 8(1) of the European Convention on Human Rights (ECHR); Article 8 of the European Union Charter of Fundamental Rights; Section 1 of the Italian Data Protection Code (Legislative Decree no. 196/2003).

[iii] Communication from the Commission to the European Parliament, the Council, the Economic and Social Committee and the Committee of the Regions, A comprehensive approach on personal data protection in the European Union, November 11 2010, COM(2010) 609 final.

[iv] Articles 12(b) and 14(a) of the Personal Data Directive.

[v] Article 17 of the Proposal for a regulation of the European Parliament and of the Council on the protection of individuals with regard to the processing of personal data and on the free movement of such data, January 25 2012, COM(2012) 11 final.

[vi] Article 79(5)(c) of the proposed EU regulation.

[vii] Article 2(b) of the Personal Data Directive, which defines “processing of personal data” as “any operation or set of operations which is performed upon personal data, whether or not by automatic means, such as collection, recording, organization, storage, adaptation or alteration, retrieval, consultation, use, disclosure by transmission, dissemination or otherwise making available, alignment or combination, blocking, erasure or destruction”.

[viii] Article 2(d) of the Personal Data Directive, which defines the “data controller” as “the natural or legal person, public authority, agency or any other body which alone or jointly with others determines the purposes and means of the processing of personal data”.

[ix] Adopted on April 4 2008.

[x] Opinion no. 1/2008, page 14.

[xi] The TGI’s injunction concerned the websites www.google.com and www.google.fr.

[xii] Implementing the E-commerce Directive (2000/31/EC) in France.

[xiii] Article 6(2) of the LCEN, implementing Article 14 of the E-commerce Directive.

[xiv] Article 14 of the E-commerce Directive.

[xv] First Report on the application of Directive 2000/31/EC of the European Parliament and of the Council of 8 June 2000 on certain legal aspects of information society services, in particular electronic commerce, in the Internal Market (Directive on electronic commerce), November 21 2003, COM(2003) 702 final, page 14.

[xvi] Article 2(3) of the proposed EU regulation.
