Privacy but at what price?
23 May 2014
An EU court has backed the “right to be forgotten”, but what does this mean for freedom of expression?
In today’s highly networked age, few can doubt the power of the internet to shape our public identities and biographies.
We may delude ourselves that we control what is said about us online, but the truth is that our online personas are heavily dictated by third-party data-creators, whose multifarious web-creations often lie firmly beyond our grasp. However, it is to a large extent internet search engines (ISEs) which bring this data to a mass audience.
Through the application of their sophisticated indexing algorithms, they enable all manner of web-pages containing information about us to be presented to the individual searcher, and in an order which is predetermined not by any human hand but rather by the algorithms themselves.
Of course, for those of us with fairly anodyne histories, the indexing function that the Googles of this world perform is of no great consequence. However, for some, the ability of ISEs to ensure the endless online reiteration of dark or embarrassing moments from their past is a painful problem.
It is precisely this problem which the CJEU has sought to address in its recent judgment in Google Spain SL, Google Inc. v Agencia Española de Protección de Datos, Mario Costeja González (C-131/12). There were two core questions posed in Google Spain.
First, can ISEs properly be characterised as entities which ‘control’ the personal data contained in the source web-pages which they index, such that they fall within the scope of the European Data Protection Directive?
Second, if ISEs are ‘data controllers’ for the purposes of the Directive, can individuals who are identified in some way in indexed web-pages invoke the so-called ‘right to be forgotten’ as against those ISEs, effectively requiring them to consign the offending web-pages to e-oblivion?
Strikingly and controversially, the European Court has answered both of these questions in the affirmative.
So far as the latter question is concerned, the Court has gone so far as to say that, even if the information contained in the relevant web-page is both true and lawfully present on the internet, the data subject’s right to be forgotten still generally operates as a trump card.
Only if there is a ‘preponderant interest of the general public’ in facilitating access to the information can the relevant web-pages continue to be subject to indexation. The effect of the judgment is therefore that individuals are now afforded a significant power to rewrite their e-history to suit their own interests.
At first blush, the judgment seems to embody a perfectly sensible solution to a serious problem, namely how to protect individual privacy in a mass data world. However, closer analysis reveals its fundamentally flawed nature.
To begin with, as Advocate General Jääskinen rightly observed in his Opinion to the Court, it is wholly unrealistic and exorbitant to regard ISEs as ‘controllers’ of personal data.
Unlike the generators of the source webpages, ISEs have no authorial insight into or say over the content of those webpages. Rather they act as a bridge between those who make the data publicly available on the internet and those who wish to access it.
In so doing, ISEs provide the core architecture for our highly valuable information society. In no meaningful sense can ISEs be said to be data controllers for the purposes of the Directive. The proposition can be tested in this way: if ISEs are data controllers, then they are subject to the wide-ranging obligations imposed under the Directive, including the obligation to ensure that any data being processed are ‘adequate’ and ‘relevant’.
But adequate in what sense? Relevant to what? How are ISEs to make these kinds of intensely value-laden judgments when they are likely to have no immediate familiarity with the data subject, the content of the web-page in question or the overall factual context? Undertaking such assessments is likely to prove an Augean task for any ISE.
Perhaps even more significantly, nowhere in the judgment is there any careful balancing of other fundamental rights, including the right to freedom of expression enshrined in both Article 10 of the European Convention on Human Rights and Article 11 of the Charter of Fundamental Rights of the European Union.
Importantly, the right to freedom of expression incorporates not only the right to express ideas but also the right to seek and obtain information. It is this right which members of the public, including journalists, researchers and the man and woman on the street, exercise every time they search the internet for information.
This right is no less fundamental than the right to privacy and yet it is given scant consideration in the judgment. This asymmetrical treatment of rights in the judgment yields an unbalanced result, as important Article 10 rights are in effect left to play second fiddle whilst Article 8 rights dominate the stage. This is a result which sits most uncomfortably with existing Strasbourg jurisprudence. It inevitably invites litigation on the question of whether the issues raised in Google Spain should now be refracted directly through the legal prism of the Convention.
Of course, it may well be that these issues will be resolved in the context of the new Data Protection Regulation which is still being debated in Europe. However, in the meantime, the judgment in Google Spain means we may well find ourselves exposed to a degree of data impoverishment which augurs ill for the development of our information society.
11KBW barrister Anya Proops