Have you ever 'googled' your name? Many people do, and some find search results about themselves that they would rather not have publicly available on the internet. The question is: what do you do when that happens?
When Spanish citizen Mr Mario Costeja González found links to two pages of the Spanish newspaper La Vanguardia from 1998 that he regarded as unflattering, he requested that the newspaper remove the personal information about him contained in those pages. He also requested that Google Spain and Google Inc remove or conceal the personal data relating to him, so that it no longer appeared in the search results and in the links to La Vanguardia.
The matter ended up before the Spanish data protection authority Agencia Española de Protección de Datos (AEPD). The AEPD rejected the complaint against La Vanguardia. At the same time, it upheld the complaint against Google.
Google brought the matter before the Spanish National High Court (Audiencia Nacional), and that court referred the matter to the Court of Justice of the European Union (CJEU).
The CJEU yesterday handed down its judgment in this case (Case C-131/12).
The decision is legally technical and many of the legal questions dealt with are specific to the European Union. However, the consequences of the decision are global. For example, the Court discussed in detail whether the functions carried out by Google Search amounted to data "processing", and whether Google was a data "controller" under the relevant EU law.
Because the Court answered both of these questions in the affirmative, Google is responsible for its search results entirely independently of any liability on the part of the publishers, such as the newspaper in this case.
This means that even if certain content, such as the newspaper reporting relating to Mr Mario Costeja González, can lawfully be uploaded to the internet, it may be unlawful for Google to list such content in its search results.
For the EU, there are practical advantages in such an approach. It means that, by controlling the search engines, it can affect at least the likelihood of personal information being found online even where the information is provided by a party located outside the EU.
This approach is out of line with that taken in, for example, the United States, and its long-term implications for the internet may be severely limiting.
Perhaps the most serious aspect of the judgment relates to the so-called "right to be forgotten". The Court concluded that, where search results appear to be inadequate, irrelevant or no longer relevant, or excessive, the information and links concerned in the list of results must be erased. This applies even where the information is true and was published lawfully by third parties. In other words, the Court places on Google the burden of deciding whether search results have become outdated.
The practical difficulties with this are obvious. First of all, there is the risk of search engines erring on the side of caution and removing any content complained of. After all, the risks of not removing the content may easily outweigh any perceived advantage of keeping the content accessible. Second, content may be seen to be outdated and irrelevant on one date only to become highly relevant again at a later date.
For example, information about a person's conduct may be seen to be outdated one day but become relevant again at a later date if that conduct is repeated. In other words, the relevance of information is not static — it is constantly changing and is always dependent on context.
In any case, the Court's conclusion on the right to be forgotten will no doubt reverberate across the world. Indeed, it forces the creation of a more forgetful internet.
From a privacy perspective this must be seen as a victory. But at the same time, privacy interests must always be balanced against competing interests such as freedom of information. The Court acknowledged this and stated that, while the right to be forgotten ordinarily trumps competing interests such as the economic interest of the search engine operator and the interest of the general public in finding information upon a search relating to the data subject's name:
"That would not be the case if it appeared, for particular reasons, such as the role played by the data subject in public life, that the interference with his fundamental rights is justified by the preponderant interest of the general public in having, on account of inclusion in the list of results, access to the information in question."
The question is of course how this assessment will work in practice.
Dan Jerker B. Svantesson is co-director of the Centre for Commercial Law at Bond University. He is an ARC Future Fellow (project number FT120100583) and receives funding from the Australian Research Council. The views expressed herein are those of the author and are not necessarily those of the Australian Research Council.