In 2013 Edward Snowden revealed that the world’s top technology companies cooperated with the United States National Security Agency (NSA) to sell out their customers and hand over their data to the secretive, unaccountable surveillance organisation (and what was not handed over was tapped anyway by the NSA and its UK sidekick, GCHQ).
Ever since, these tech companies have struggled to regain consumer confidence in their services and products. As part of this drive, companies such as Apple and Google have started to offer default end-to-end encryption. This means that data is encrypted on users’ devices, and that users (not the companies) are the sole holders of their own private encryption keys.
This is great for users, as it means they do not have to trust the companies involved. With end-to-end encryption the tech companies are simply unable to access their users’ data. Of course, this also means that government agencies cannot ask the tech companies to decrypt and hand over their customers’ data (they can’t), or access the encrypted data directly – unless the owner of that data hands over their private keys (in the UK it is a crime to refuse to hand over your encryption keys).
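The principle described above can be illustrated with a short sketch. This is a deliberately toy scheme (a one-time-pad-style XOR, not real cryptography – a genuine end-to-end system would use vetted primitives such as those in libsodium); its only purpose is to show where the key lives: the provider stores ciphertext, and only the holder of the device key can recover the message.

```python
# Toy illustration of end-to-end encryption: the provider only ever
# sees ciphertext, and only the key holder can read the message.
# NOT real cryptography -- for demonstration of the key's location only.
import secrets

def xor_bytes(data: bytes, key: bytes) -> bytes:
    """XOR each byte of data with the corresponding key byte."""
    return bytes(d ^ k for d, k in zip(data, key))

message = b"meet at noon"
device_key = secrets.token_bytes(len(message))  # generated and kept on the user's device

ciphertext = xor_bytes(message, device_key)     # this is all the provider stores
recovered = xor_bytes(ciphertext, device_key)   # only the key holder can do this

assert recovered == message
```

Because the provider holds `ciphertext` but never `device_key`, there is nothing it could decrypt and hand over even if ordered to – which is precisely what alarms the governments discussed below.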
Perhaps predictably, this has hugely alarmed the data-hungry US and the UK governments and their spy agencies, leading to ever more urgent demands that tech firms allow government access to their users’ encrypted data.
In the US, this debate has mainly centered on demands for some kind of ‘backdoor’ or ‘golden key’ that would allow the security services to decrypt encrypted data. In the UK, however, Prime Minister David Cameron has gone so far as to plan legislation that would ban products which use strong encryption:
‘We have always been able, on the authority of the home secretary, to sign a warrant and intercept a phone call, a mobile phone call or other media communications, but the question we must ask ourselves is whether, as technology develops, we are content to leave a safe space—a new means of communication—for terrorists to communicate with each other.
‘My answer is no, we should not be, which means that we must look at all the new media being produced and ensure that, in every case, we are able, in extremis and on the signature of a warrant, to get to the bottom of what is going on.’
From the word go such plans have been heavily criticised by security experts (in addition to being slammed by privacy campaigners and stoutly resisted by the tech companies) on the grounds that any weakening of encryption would make the internet less safe for all. A damning new 26-page report from 14 of the world’s most respected cryptographers and computer scientists, however, may represent the biggest blow yet to government plans on both sides of the pond.
It was this same group, after all, that in 1997 published a similarly damning report on the Clinton administration’s proposed ‘Clipper Chip’, which was designed to allow the NSA to decrypt all communications. That report played a key role in the backlash against the government’s proposals, which led to their being abandoned.
Paul Kocher, president of the Rambus Cryptography Research Division (and not a contributor to the new research), explained to the New York Times that instead of focusing on the contentious issue of how much power governments should have to invade citizens’ privacy, the paper focuses on the practical and technical reasons why such access would be a security disaster:
‘[The paper] details multiple technological reasons why mandatory government back doors are technically unworkable, and how encryption regulations would be disastrous for computer security. This report ought to put to rest any technical questions about “Would this work?”’
According to the report:
‘Demands for exceptional access to private communications and data show that such access will open doors through which criminals and malicious nation-states can attack the very individuals law enforcement seeks to defend. The costs would be substantial, the damage to innovation severe, and the consequences to economic growth difficult to predict. The costs to developed countries’ soft power and to our moral authority would also be considerable.’
Strong words indeed! With reference to the UK government’s plan to ban strong encryption altogether, the report asks whether, if the UK requires ‘exceptional access’ for its law enforcement and security services, it is also acceptable for all other governments (such as that of China) to implement similar backdoors into their products.
Similarly, if a Chinese citizen were to use a British product backdoored by the UK government, should the Chinese government also be given similar access? Ross Anderson, professor of security engineering at Cambridge University and co-author of the report, told the Guardian:
‘There are three tests for exceptional access to be compatible with human rights. The required access must be set out in law sufficiently clearly for its effects to be foreseeable, it must be proportionate and it must be necessary in a democratic society. The government demands for access to everything fail all these tests by a mile. A point I would like to make to the prime minister and his circle is: whoever put the prime minister up to this should get a complete bollocking. The proposals are wrong in principle and unworkable in practice.’
Whether either government will pay the slightest bit of attention to the report remains to be seen, but with expert opinion so critical of their plans, it may prove difficult for them to push through the legislation they hope for.