UK’s Proposed DP Bill Could Outlaw Important Research

Ray Walsh

August 15, 2017

Cybersecurity experts are heavily criticizing the UK's implementation of the forthcoming EU General Data Protection Regulation (GDPR). The law is intended to prepare the UK for Brexit by writing the GDPR's requirements into British law. The first insights into the proposal, which closely mirrors the EU version, were released last week by the UK's digital minister, Matt Hancock.

Cybersecurity experts are expressing concerns over a specific clause contained within the draft Data Protection Bill (DP Bill). They are concerned that specifications within the British legislation could criminalize security research aimed at improving digital privacy for British consumers.


Essential Research

The part of the proposed DP Bill that is drawing criticism states that it would become a criminal offence to “intentionally or recklessly re-identify individuals from anonymised or pseudonymised data.” At first glance, this seems positive. After all, researchers have previously proven that it is possible to re-identify individuals from anonymized credit card data. That research shows that firms need to work harder to anonymize data in order to protect consumers.

However, now researchers who do that kind of work have come forward to point out that the wording of the clause would, in fact, make it a crime for them to attempt to prove that data is not being thoroughly anonymized. That research, they argue, is an essential part of uncovering flaws in current systems. Intentionally re-identifying anonymized data is the only way to carry out that work.
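The kind of linkage attack these researchers carry out can be sketched in a few lines of Python. This is a toy illustration, not any specific study's method: a “pseudonymised” dataset that retains quasi-identifiers (postcode, birth year, sex) is joined against a public register sharing those same fields. All names and records below are invented.

```python
# Toy linkage attack: re-identifying "pseudonymised" records by joining
# them with a public dataset on shared quasi-identifiers.
# All data here is invented for illustration.

# Pseudonymised records: direct identifiers removed, but quasi-identifiers kept.
pseudonymised = [
    {"id": "p01", "postcode": "SW1A", "birth_year": 1944, "sex": "F", "diagnosis": "asthma"},
    {"id": "p02", "postcode": "E1",   "birth_year": 1990, "sex": "M", "diagnosis": "diabetes"},
]

# A public register (e.g. an electoral roll) carrying the same quasi-identifiers.
public_register = [
    {"name": "Thelma Example", "postcode": "SW1A", "birth_year": 1944, "sex": "F"},
    {"name": "John Example",   "postcode": "E1",   "birth_year": 1990, "sex": "M"},
]

def reidentify(pseudo_rows, public_rows):
    """Link records whose quasi-identifiers match exactly, mapping
    pseudonymous IDs back to real names."""
    key = lambda r: (r["postcode"], r["birth_year"], r["sex"])
    names_by_key = {key(r): r["name"] for r in public_rows}
    return {r["id"]: names_by_key.get(key(r)) for r in pseudo_rows}

print(reidentify(pseudonymised, public_register))
# {'p01': 'Thelma Example', 'p02': 'John Example'}
```

When a quasi-identifier combination is unique in both datasets, the join recovers the individual's identity outright, which is precisely why researchers argue that simply stripping names is not genuine anonymization.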

Massive Fines

Under the proposals, carrying out this kind of digital privacy research on behalf of consumers would be punishable by fines of up to £17 million. That prospect, the researchers argue, would make the work too dangerous to undertake.

History has already shown how serious the consequences of re-identification can be for consumers. In 2006, a 62-year-old woman from Lilburn in the US was identified from supposedly anonymized search data released by AOL. According to the firm, it had hoped that the anonymized data would “help academic researchers.”

Instead, AOL was forced to take the data offline after a reporter was able to single out user No. 4417749. The reporter followed search queries including “numb fingers,” “60 single men,” “dog that urinates on everything,” “landscapers in Lilburn, Ga,” searches for several people with the surname “Arnold,” and “homes sold in shadow lake subdivision Gwinnett County Georgia.” That trail led him to a widow named Thelma Arnold, who confirmed that “those are my searches.”

Netflix’s 2006 release of anonymized data likewise led to a lawsuit from a woman whose sexual orientation was revealed through re-identification. In 2016, a browser add-on called Web of Trust was caught selling anonymized web searches to third parties. An investigative journalist used those searches to re-identify a German judge’s porn habits, a politician’s drug prescriptions, and even the details of ongoing criminal cases. In all, the journalist was able to re-identify around 50 people.


Wrong Target

Despite the embarrassment this kind of journalism and research can cause, without such revelations firms will continue to anonymize data in insecure ways that leave it open to re-identification. The concern, of course, is that those records could fall into the hands of cybercriminals or other nefarious actors.

This is the view of Lukasz Olejnik, a researcher at Princeton’s Center for Information Technology Policy. Olejnik feels that the UK’s DP Bill will outlaw important research while doing nothing to quell the growth of poorly anonymized databases.

“Any re-identification ban would need strong provisions guaranteeing that researchers acting in good faith are on the safe side.

“I worry that if re-identification is simply banned, there might be no incentive for sane security and privacy engineering designs. It’s a paradox, but re-identification ban might end up leading into overall weaker systems.”

This is a real danger. The UK government ought to be outlawing the accumulation of badly anonymized data, rather than the research that reveals the lax practices of corporations.

Opinions are the writer’s own.

