Online application of the right to be forgotten and the new EU data regulation

19 June 2023
Knowledge Base

by František Nonnemann & Vladan Rámiš

The right to be forgotten (right to erasure) is one of the well-known, and somewhat feared, legal institutions introduced by the GDPR. The right to erasure is not absolute, however: it does not always apply and must be balanced against the interests and rights of other persons. This does not mean that the scope of the right to erasure is clear in practice or that it raises no issues and uncertainties. How should a data controller, especially an internet search engine, proceed if a data subject objects that the data displayed or otherwise processed is untrue, misleading, offensive, or factually incorrect? The search engine operator usually acts as the controller of the personal data it displays, but it is not the one who placed the data on the Internet. It therefore has no independent means of assessing the accuracy of the data published on the source page.

The question of how an internet search engine, or more generally a data controller, should proceed in such a situation was considered in the “new” Google case of the Court of Justice of the European Union (CJEU) (*1).

In addition to setting out the basic framework and the roles of the parties involved (data subject, controller, etc.), the CJEU also applied the Safe Harbor principle known mostly from other areas of law, especially from the e-commerce directive.

Legitimate investment business or just another “Ponzi scheme”?

What facts, circumstances and preliminary questions did the CJEU consider in this case?

The owner of a German investment company and his spouse asked Google not to display specific search results associated with their names. These results related to a series of newspaper articles which critically questioned their business model and suggested its illegal or fraudulent nature.

Google refused their request for two reasons:

  1. The articles concerned the professional lives of the affected persons; and
  2. Google did not have sufficient evidence that the articles contained false or inaccurate information.

The national court hearing the dispute referred two questions to the CJEU for a preliminary ruling. For the purposes of this article, the first question is relevant:

“Is it compatible with the data subject’s right to respect for private life and to protection of personal data if, within the context of the weighing-up of conflicting rights and interests … when the link, the de-referencing of which [that person] is requesting, leads to content that includes factual claims and value judgments based on factual claims the truth of which is denied by the data subject, and the lawfulness of which depends on the question of the extent to which the factual claims contained in that content are true, the national court also concentrates conclusively on the issue of whether the data subject could reasonably seek legal protection against the content provider, for instance by means of interim relief, and thus at least provisional clarification on the question of the truth of the content displayed by the search engine data controller could be provided?”

In other words, what information or documents must the data subject provide when applying for the right to erasure on the grounds of data inaccuracy? Is a judicial decision, at least a preliminary one, necessary? There is also the question of how the search engine operator, or more generally the data controller, should proceed. Can it refuse such a request and refer the applicant to a court? Or should it verify the truthfulness, or the alleged falsity, of the displayed data itself?

Neither the data subject nor the operator of an internet search engine is a court

The CJEU stated that it is necessary to weigh all the rights concerned so that the right to erasure can be exercised in an effective way, but at the same time so that the search engine does not interfere with the right to freedom of expression. The Court added that, for public figures, the right to freedom of expression and information may in certain circumstances override their right to privacy and protection of personal data, but this is not the case where false information is published. Dissemination of such information is not protected.

The CJEU then considered the roles and obligations of the two main actors: the person seeking the erasure or removal of the search result and the internet search engine operator. It addressed the question of whether and, where applicable, to what extent the person who has requested the removal of a search result must prove his or her allegation that the linked text is false or inaccurate, and whether the search engine operator must itself seek to clarify the contested facts.

The CJEU stated that the data subject asking for the removal, or de-indexing, of a search result must prove the manifest incorrectness of the information displayed. However, this requirement must not be interpreted too broadly, and the person must not be required to produce evidence or documents which cannot reasonably be required of him or her. Among the documents which do not have to be provided is a judicial decision against the publisher of the website in question; such a decision cannot be required even in the form of an interim ruling.

And the internet search engine operator?

The operator is then obliged to assess whether the facts presented clearly show that the information displayed is false. If so, the request to remove the search result should be approved.

However, according to the Court, the search engine operator cannot be obliged to investigate the facts, obtain additional information or, to that end, organise an adversarial debate with the content provider in order to obtain missing information concerning the accuracy of the referenced content. Such an interpretation would pose a serious risk of removing content which corresponds to a legitimate and overriding public need for information.
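The resulting division of roles can be summarised as a simple decision flow. The sketch below is purely illustrative, with hypothetical names and simplified logic; the actual assessment is a human legal judgment, not anything the CJEU prescribes in code.

```typescript
// Illustrative sketch only: hypothetical names, simplified logic.
// It merely mirrors the burden allocation described in the judgment.

interface ErasureRequest {
  contestedUrl: string;
  // Evidence the data subject can reasonably be expected to provide.
  // Per the judgment, a court decision (even an interim one) is NOT required.
  evidenceOfInaccuracy: string[];
}

type Decision = "de-reference" | "refuse";

function assessErasureRequest(
  request: ErasureRequest,
  // Stand-in for the operator's human review: does the submitted evidence
  // make the inaccuracy of the referenced content *manifest*?
  reviewFindsManifestInaccuracy: (evidence: string[]) => boolean
): Decision {
  // The operator only evaluates what the data subject submitted; it is NOT
  // obliged to investigate the facts itself or to contact the content provider.
  if (
    request.evidenceOfInaccuracy.length > 0 &&
    reviewFindsManifestInaccuracy(request.evidenceOfInaccuracy)
  ) {
    return "de-reference";
  }
  // Absent manifest inaccuracy, the operator may refuse; the data subject
  // can still seek redress before a court or supervisory authority.
  return "refuse";
}
```

The review callback makes explicit that the only input to the decision is the evidence the requester supplied, mirroring the Court's refusal to impose an investigative duty on the operator.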

How does the CJEU ruling apply to the new EU data regulation?

The importance of the current Google decision goes beyond EU data protection regulation (GDPR) and practice. Indeed, the approach taken by the CJEU to the obligation of search engines to verify the complaints of persons affected by “source” articles, or the manifest falsity of those articles, may also be relevant to the interpretation of the obligations of online platforms under the new Digital Services Act (EU Regulation No 2022/2065, DSA).

The DSA replaces Articles 12 to 15 of the original Directive on electronic commerce (*2). These articles provided for the so-called Safe Harbor exemptions from the liability of providers of intermediary services for content distributed by them but generated by their users.

An intermediary service means, according to Art. 3 letter (g) of the DSA, one of the following information society services:

  1. a ‘mere conduit’ service, consisting of the transmission in a communication network of information provided by a recipient of the service, or the provision of access to a communication network;
  2. a ‘caching’ service, consisting of the transmission in a communication network of information provided by a recipient of the service, involving the automatic, intermediate and temporary storage of that information, performed for the sole purpose of making more efficient the information’s onward transmission to other recipients upon their request;
  3. a ‘hosting’ service, consisting of the storage of information provided by, and at the request of, a recipient of the service.

An internet search engine as an intermediary service

It is important to mention that under the “old” e-commerce directive it was not entirely clear whether search engines could be considered intermediary services.

Search engines are explicitly included in the scope of the DSA (Art. 3 letter (j)), although they were only added at a later stage of the legislative process. So while the DSA is quite clear that a search engine is an intermediary service, it is not entirely clear which of the above categories it is supposed to fall under. Given the nature of its operations, however, it is likely to be a caching service or a hosting service.

A caching service provider is not liable for the automatic, intermediate and temporary storage of information, performed for the sole purpose of making the information’s onward transmission to other recipients of the service upon their request more efficient or more secure. There are several conditions for this exemption; one of them is that the caching service provider “acts expeditiously to remove or to disable access to the information it has stored upon obtaining actual knowledge of the fact that the information at the initial source of the transmission has been removed from the network, or access to it has been disabled, or that a judicial or an administrative authority has ordered such removal or disablement.” (Art. 5 par. 1 letter (e) DSA).

Similarly, under Art. 6 par. 1 of the DSA, the hosting service provider is not liable for information stored at the request of the recipient of the service if the provider:

  1. does not have actual knowledge of illegal activity or illegal content and, as regards claims for damages, is not aware of facts or circumstances from which the illegal activity or illegal content is apparent; or
  2. upon obtaining such knowledge or awareness, acts expeditiously to remove or to disable access to the illegal content.

A search engine would, by the nature of its services, more likely qualify as a caching service. On the other hand, it appears from the commented CJEU decision, and from the obligations the CJEU imposed on the search engine operator in that decision (albeit under the legal situation prior to the adoption of the DSA), that the CJEU is more likely to interpret the DSA in a manner that applies the rules for hosting services to the search engine. This is so even though the search engine operator does not store information provided to it directly by the recipient of the service.

Whether the search engine is ultimately assessed as a caching service provider or a hosting service provider, the CJEU has indicated in the recent Google judgment what requirements it will impose. The CJEU’s conclusions correspond to the text of Article 16 of the DSA, which obliges hosting service providers to implement a so-called “notice & action” mechanism. This mechanism allows affected persons to notify providers of the appearance of specific information within their service that the person or entity concerned considers to be illegal content.

How can the user defend himself under the DSA?

The DSA requires hosting service providers to allow service recipients to submit notifications of the presence on their service of specific items of information that the individual or entity considers to be illegal content (Art. 16 par. 1 DSA). Providers ensure this right, for example, by designing their interface to “guide” the complainant to provide all relevant information and by publishing instructions on what the notification should contain.

Art. 16 par. 3 of the DSA introduces an important rebuttable presumption: notifications are considered to give rise to actual knowledge or awareness in respect of the specific item of information concerned where they allow a diligent provider of hosting services to identify the illegality of the relevant activity or information without a detailed legal examination.

For this presumption to apply, the notice must contain (a minimal data-structure sketch follows the list):

  1. a sufficiently substantiated explanation of the reasons why the individual or entity alleges the information in question to be illegal content;
  2. a clear indication of the exact electronic location of that information, such as the exact URL or URLs, and, where necessary, additional information enabling the identification of the illegal content adapted to the type of content and to the specific type of hosting service;
  3. the name and email address of the individual or entity submitting the notice;
  4. a statement confirming the bona fide belief of the individual or entity submitting the notice that the information and allegations contained therein are accurate and complete.
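To make the four required elements concrete, here is a hypothetical sketch of how a provider’s intake form might model and validate such a notice. The field names and the isComplete check are our own illustration, not terminology or logic prescribed by the DSA.

```typescript
// Hypothetical illustration of the four Art. 16 par. 3 DSA elements as a
// data structure; field names are our own, not taken from the regulation.

interface Article16Notice {
  explanation: string;        // sufficiently substantiated reasons why the content is allegedly illegal
  exactLocations: string[];   // exact URL(s) identifying the allegedly illegal content
  submitterName: string;      // name of the individual or entity submitting the notice
  submitterEmail: string;     // email address of the submitter
  bonaFideStatement: boolean; // good-faith confirmation that the notice is accurate and complete
}

// A minimal completeness check an intake form might run before treating the
// notice as capable of triggering the presumption of actual knowledge.
function isComplete(notice: Article16Notice): boolean {
  return (
    notice.explanation.trim().length > 0 &&
    notice.exactLocations.length > 0 &&
    notice.exactLocations.every((url) => url.startsWith("http")) &&
    notice.submitterName.trim().length > 0 &&
    notice.submitterEmail.includes("@") &&
    notice.bonaFideStatement
  );
}
```

Of course, formal completeness is only the threshold question; whether the notice actually allows a diligent provider to identify the illegality without a detailed legal examination remains a substantive assessment.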

Paragraphs 71 to 73 of the current CJEU Google judgment correspond to the requirement of a “sufficiently substantiated explanation of the reasons” under the DSA. The judgment supplements this with an express statement that the search engine operator is not obliged to carry out its own investigation to support the complainant’s claim that the search results should be removed, nor is it obliged to search for evidence and facts that would refute such a claim. However, if the search engine operator does so voluntarily, it cannot be held liable and does not lose the benefit of the Safe Harbor.

Users’ protection: From the GDPR to the DSA

The current CJEU decision in the Google case clarifies how to apply the right to be forgotten, or the right to de-indexing of a specific search result, in the online environment.

The practical impact of this judgment goes beyond data privacy regulation. We believe it will also be highly relevant for the future interpretation of the DSA. Although the judgment interprets the legal situation prior to the adoption of the DSA, there is no reason not to apply its conclusions to the interpretation of the DSA.

While the adoption of the DSA might also lead to some modification of the previous CJEU case law on search engines, the commented judgment shows that in practice such modification is unlikely to be necessary. The previous case law, including the commented decision, will remain applicable even with this new regulation.

An extended version of this analysis was originally published on the Czech/Slovak website epravo.cz, which can be found here: https://www.epravo.cz/top/clanky/pravo-na-vymaz-osobnich-udaju-v-ramci-internetoveho-vyhledavace-115974.html

(*1) Judgment of the Court of Justice of 8 December 2022 in Case C-460/20, TU and RE v Google LLC: https://curia.europa.eu/juris/document/document.jsf?text=&docid=268429&pageIndex=0&doclang=EN&mode=lst&dir=&occ=first&part=1&cid=2258211.

(*2) Directive 2000/31/EC of the European Parliament and of the Council of 8 June 2000 on certain legal aspects of information society services, in particular electronic commerce, in the Internal Market (‘Directive on electronic commerce’)

The author, František Nonnemann, worked for 10 years at the Office for Personal Data Protection, among other roles as director of the analytical department and head of the legal department.

Vladan Rámiš is the director of the legal department at Mafra Media Group.
