As a digital privacy enthusiast, I am one of many people who advocate for the right to be forgotten. After all, being able to invoke that right is vital if people are to truly control their digital footprint.
On the other hand, freedom of access to information is also incredibly important. For this reason, people’s ability to invoke the “right to be forgotten” remains hotly contested: the general public needs access to information that could help them make different, better-informed decisions.
Sometimes politicians may try to hide past information about their character that could cause voters to question whether they are fit for office. Other times, crooked businessmen may attempt to bury fraud – or other dodgy dealings – in order to give themselves a clean slate. It is for this reason that Google sometimes denies people the right to be forgotten.
This creates a complex dilemma about who should be permitted to be forgotten, and who should not. And, just as importantly, who should decide. At the moment Google has been gifted the job of chief mediator – but what gives the tech giant the legitimacy to make those life-altering decisions?
In my opinion, Google deserves praise for standing firm when it comes to refusing to delist search results that it feels are in the public interest. The big question, however, is whether Google should be making those decisions unaided in the first place. And when Google is found to have made the wrong call – as it recently was in the UK’s courts – does that demonstrate that Google is doing a bad job?
Forget Me – Forget Me Not
The British cases in question involve two businessmen who appealed Google’s decision to deny their request to be forgotten. The ex-convicts felt that they deserved to be delisted from the search engine – Google felt differently.
One of the men (referred to as NT1 for anonymity) wanted details of a conviction from the late 1990s to be delisted. At the time, he was found guilty of conspiracy to account falsely. He was ultimately denied the right to be forgotten by the High Court because the judge felt he still posed a considerable risk to the general public.
NT2, on the other hand, was granted the right to be forgotten. The court ruled that because his conviction did not involve actions taken in relation to “consumers, customers or investors,” Google should grant his request. Explaining the judgment in more detail, Justice Warby commented:
“There is not [a] plausible suggestion … that there is a risk that this wrongdoing will be repeated by the claimant. The information is of scant if any apparent relevance to any business activities that he seems likely to engage in.”
A Failure on Google’s Part?
For some people, the court’s decision has been taken as evidence that Google did something wrong. Following the ruling, for example, a spokesperson for Carter-Ruck – the law firm that acted on behalf of both NT1 and NT2 – commented that:
“The decision should cause Google to put in place improved reviewing processes of delisting requests.”
This seems unfair because Google was attempting to act in the public’s best interest. So, how does Google decide, and is it fair to criticize that process?
In 2014, David Drummond, senior vice president of corporate development and chief legal officer at Alphabet (Google’s parent company) wrote that Google had “set up an advisory council of experts” from “the worlds of academia, the media, data protection, civil society, and the tech sector” to “act as independent advisors.”
Even with this team in place, Drummond explains that it is tough to pass judgment on whether search results that include people’s names meet the stipulations set out by the European Court of Justice (ECJ). Namely, that a takedown request should be granted if the data has become “inadequate, irrelevant or no longer relevant, or excessive.”
Drummond writes that when deciding what to delist it can be hard to balance public interest against the interests of the individual – referring to the process as “vague and subjective.”
Peter Fleischer, Google’s global privacy counsel, says that most of the time “a large majority [are] in favor of one decision or the other.” However, he admits that on some occasions they “are violently split.” He also says that around 50 percent of requests are rejected, and even goes as far as to admit that 15 percent of cases are left in limbo, with no decision made at all.
Better Safe Than Sorry
At the end of the day, despite being a digital privacy advocate, I can sympathize with Drummond, Fleischer, and Google. When the European Court decided that people had the right to be forgotten, Google was suddenly lumped with a very difficult job. In my opinion, the public sector has thus far inadequately supported the tech giant in that decision-making process.
Although I cannot adequately comment on whether Mr. Justice Warby made the right decision last Friday (because I have no way of knowing exactly what NT2’s crimes were in the first place), what I do like about these latest verdicts is that they appear to have been decided fairly, on a case-by-case basis, by a judge with the legal authority to do so.
Using that verdict to then somehow condemn Google and insinuate that it needs to do a better job seems unjust. I tend to feel that Google is doing the right thing by denying difficult requests. After all, as far as the general public and Google are concerned it is better to be safe than sorry.
These cases show that Google is not always the right body to make these kinds of decisions. After all, it is a tech company, not a court of law. For this reason, I would prefer Google to continue denying borderline cases, leaving the final decision to public sector bodies that are better qualified to make it.
Governments need to start facing the fact that processing right to be forgotten requests should not be Google’s responsibility alone. Yes, Google can probably handle the vast majority of cases. However, when it comes to more critical cases – where the public could be put at risk – some kind of coherent approach between the public sector and Google ought to be put in place.
The courts correctly decided that the right to be forgotten should exist, so perhaps those courts ought to take more responsibility when actually enforcing those rights too.