Brave New Data World

“Individual responsibility is important, but it is not enough: it’s a policy-making issue. It’s so easy to bypass the traditional tools of notice and consent, or transparency and choice,” said Acquisti. In fact, there is also evidence that the more control people have over their data, the more risks they take, “like driving faster when you have seat belts,” he says. Privacy-enhancing technologies, such as homomorphic encryption and blind signatures, will play a big role, according to Acquisti. “Don’t believe those who say that privacy is antithetical to big data: privacy is not about stopping information flows, it is about managing them,” he concluded.
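Acquisti’s point that privacy means managing information flows rather than stopping them is exactly what homomorphic encryption makes possible: computing on data without ever seeing it. As a minimal sketch, here is the textbook additively homomorphic Paillier scheme with toy parameters (the primes and messages are made up for illustration; real deployments use keys of 2048 bits or more):

```python
import math, secrets

# Toy Paillier keypair: two small primes, illustration only.
p, q = 1000003, 1000033
n, n2 = p * q, (p * q) ** 2
lam = math.lcm(p - 1, q - 1)
mu = pow(lam, -1, n)               # valid because we fix the generator g = n + 1

def encrypt(m):
    # Random blinding factor r must be coprime to n.
    r = secrets.randbelow(n - 1) + 1
    while math.gcd(r, n) != 1:
        r = secrets.randbelow(n - 1) + 1
    return (pow(n + 1, m, n2) * pow(r, n, n2)) % n2

def decrypt(c):
    # Standard Paillier decryption: L(c^lam mod n^2) * mu mod n.
    return (pow(c, lam, n2) - 1) // n * mu % n

c1, c2 = encrypt(12345), encrypt(67890)
# Multiplying ciphertexts adds the plaintexts: whoever performs this
# computation never sees 12345 or 67890, yet produces their sum.
assert decrypt((c1 * c2) % n2) == 12345 + 67890
```

The homomorphic property (ciphertext multiplication equals plaintext addition) is what lets a third party aggregate encrypted data, e.g. tally encrypted votes or sum encrypted salaries, without accessing any individual value.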


Back doors, trap doors, and crypto wars

Not content with passive surveillance, intelligence agencies have sought to incorporate both front and back doors into encryption algorithms and implementations. But these are easily exploitable by others. Scientists are looking for solutions that would allow intelligence agencies to protect society without weakening encryption’s security or slipping into mass surveillance.

Peter Y A Ryan - © HLFF / Flemming – 2015


Peter Y A Ryan
Professor of Applied Security at the University of Luxembourg. Ryan is a pioneer in applying process algebras to the modelling and analysis of secure systems. He also worked for a period with the UK GCHQ intelligence service.

In the ‘90s, many claimed that the WWW would “go dark”: that is, become so well encrypted that it would escape all surveillance and provide a safe haven for pedophiles, terrorists and criminals. “However, this has not happened: in fact, surveillance is much more effective now than it was then,” said Peter Ryan in his workshop.

He mentioned this issue to put into perspective the recurring requests of governments to introduce legal backdoors into encryption (i.e. free access to citizens’ data), especially as an anti-terrorist measure. “Intelligence agencies protect us, but if they are not constrained they can be really damaging to society, producing a chilling effect on democratic discussion. Privacy is not only an individual right, but it is also an essential ingredient for a healthy, vibrant society,” Ryan pointed out.

“How can we reconcile the right to privacy, and the requirements of intelligence to access private information in order to protect society?” he asked. Ryan has experience both in academia and in intelligence. “So I have worked on both sides,” he remarked.

Before Edward Snowden’s revelations about the NSA’s mass surveillance program, the balance was supposedly struck through regulation and oversight, which bound intelligence agencies to targeted, necessary and proportionate access to private information. But in the post-Snowden era everybody is aware that intelligence agencies are carrying out mass surveillance and active eavesdropping: from tapping optical fibres to spreading malware on computers. A less reported aspect is that intelligence agencies have also deliberately undermined the security of the Internet itself, by influencing how standards for security algorithms are established.

The first major attempt to do this was carried out by the Clinton administration. In the ‘90s, the administration was growing worried about the spread of encryption on the Internet, so it proposed that law enforcement should be able to legally obtain encryption keys in order to read encrypted information on demand. In practice, it attempted to force “key escrow” mechanisms into cryptographic products, through devices like the so-called “Clipper chip”.
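The structural weakness of key escrow can be sketched in a few lines. This toy (a XOR “cipher” and a bare escrowed key, nothing like the real Skipjack-based Clipper design) only illustrates the shape of the scheme: every message carries a copy of its session key wrapped under an escrow key, so whoever holds the escrow key can read all traffic:

```python
import secrets

# Escrow key held by the authorities -- a single point of failure.
ESCROW_KEY = secrets.randbits(128)

def xor_encrypt(key: int, msg: bytes) -> bytes:
    # Toy symmetric "cipher": XOR with a repeated key (insecure, demo only).
    keystream = key.to_bytes(16, "big") * (len(msg) // 16 + 1)
    return bytes(a ^ b for a, b in zip(msg, keystream))

def send(msg: bytes):
    session_key = secrets.randbits(128)
    # Escrowed copy of the session key, analogous in spirit to Clipper's
    # "Law Enforcement Access Field" shipped alongside each message.
    leaf = session_key ^ ESCROW_KEY
    return xor_encrypt(session_key, msg), leaf

ciphertext, leaf = send(b"attack at dawn")

# Law enforcement -- or anyone who steals ESCROW_KEY -- unwraps the
# session key from the LEAF and decrypts the message.
recovered = xor_encrypt(leaf ^ ESCROW_KEY, ciphertext)
assert recovered == b"attack at dawn"
```

The last two lines are the whole problem: the decryption works for any party with the escrow key, legitimate or not, for every message ever sent.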

Information security experts fiercely attacked these proposals. The resulting controversy was dubbed the “crypto wars”. Experts like Matt Blaze showed that the implementation was technically flawed. They also pointed out the pitfalls of abandoning good design principles like “forward secrecy”. “Attempts by law enforcement to weaken the security of the Internet almost certainly result in vulnerabilities exploitable by others. Crypto that can be broken only by the ‘good guys’ just isn’t going to work,” explained Ryan. In the end, the idea was abandoned, but it has re-emerged repeatedly, with ministers and intelligence agencies calling for bans on strong cryptography.
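Forward secrecy, the design principle the experts defended, means that compromising a long-term key later must not expose past sessions. The standard way to get it is ephemeral Diffie-Hellman: each session uses fresh secrets that are discarded afterwards. A minimal sketch with toy parameters (a Mersenne prime and generator 3 chosen for illustration; real protocols use vetted groups or elliptic curves):

```python
import secrets

p = 2**127 - 1   # Mersenne prime, illustrative only
g = 3

def session_key() -> int:
    # Each party draws a FRESH ephemeral secret for this session...
    a = secrets.randbelow(p - 2) + 1   # Alice's ephemeral secret
    b = secrets.randbelow(p - 2) + 1   # Bob's ephemeral secret
    A, B = pow(g, a, p), pow(g, b, p)  # ...and exchanges only g^a, g^b.
    k_alice = pow(B, a, p)             # Alice computes (g^b)^a
    k_bob = pow(A, b, p)               # Bob computes (g^a)^b
    assert k_alice == k_bob            # both hold g^(ab): the shared key
    # a and b are now discarded; nothing stored can re-derive the key.
    return k_alice

# Two sessions yield independent keys: seizing a device (or a long-term
# signing key) afterwards reveals nothing about earlier conversations.
k1, k2 = session_key(), session_key()
assert k1 != k2   # equal only with negligible probability
```

Escrowing keys does the opposite: it deliberately keeps a long-lived copy around, so one later compromise retroactively breaks every recorded session.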

“The Clipper chip style approach asks for open and legal access to keys, so it is a kind of ‘front door’ to encryption. But recently, attempts to create secret ‘back doors’ in Internet security standards have been revealed,” said Ryan. In particular, the NSA reportedly attempted to insert a back door into the standard for the Dual EC pseudo-random number generator. Good randomness is vital for strong cryptography, but the NSA allegedly managed to influence the standard so that it knows a secret algebraic relation between the generator’s constants. “Front doors and back doors represent attempts to weaken the security of the Internet in ways that should only be exploitable by law enforcement. But in practice they will certainly result in vulnerabilities exploitable by others,” argued Ryan.
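The Dual EC backdoor can be illustrated with a toy analogue that swaps elliptic-curve point multiplication for ordinary modular exponentiation (all constants are made up, and the output truncation of the real generator is omitted). The generator publishes two “standard” constants P and Q; the hidden relation P = Q^d, known only to whoever chose the constants, lets that party recover the internal state from a single output:

```python
# Toy analogue of the alleged Dual EC backdoor; illustration only.
p = 2**61 - 1            # a Mersenne prime; we work in the group Z_p*

d = 123456789            # the designer's secret backdoor value
Q = 3                    # public "standard" constant
P = pow(Q, d, p)         # second public constant; the relation P = Q^d
                         # is exactly what the designer keeps secret

def dual_ec_step(state):
    """One generator step: emit Q^state, advance the state to P^state."""
    return pow(Q, state, p), pow(P, state, p)

# An honest user seeds the generator and draws two outputs.
state = 987654321
out1, state = dual_ec_step(state)
out2, state = dual_ec_step(state)

# Whoever knows d recovers the internal state from a SINGLE output:
# out1^d = (Q^s)^d = (Q^d)^s = P^s, which is precisely the next state.
recovered = pow(out1, d, p)

# ...and from the state, predicts every future output of the generator.
predicted_out2, _ = dual_ec_step(recovered)
assert predicted_out2 == out2
```

To everyone without d the outputs look unpredictable (recovering the state would require solving a discrete logarithm); with d, the “randomness” collapses, and so do all keys derived from it.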

“Can we find mechanisms to ensure that surveillance is constrained strictly to what is legal, necessary and proportionate, without undermining its efficacy?” Ryan asked. A set of proposals to achieve this objective can be found in a Dagstuhl Manifesto that Ryan coordinated. This research agenda includes approaches like zero-knowledge proofs. “These techniques allow you to prove a fact without revealing anything about the fact itself,” Ryan explained. “They would allow intelligence agencies to prove that they are working according to certain rules, without revealing their sensitive findings.” Another approach would be to allow just a certain percentage of surveillance. “If intelligence can only monitor, say, 5% of traffic, and this is publicly verifiable, then it will focus their minds on the right 5%,” Ryan explained.
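The flavour of zero-knowledge proof Ryan describes can be shown with the classic interactive Schnorr protocol: the prover convinces a verifier that she knows a secret x with y = g^x mod p, while the transcript reveals nothing about x. This is a generic textbook sketch with toy parameters, not any specific proposal from the manifesto:

```python
import secrets

# Toy Schnorr identification protocol; parameters are illustrative,
# not secure choices (real systems use prime-order groups of ~256 bits).
p = 2**127 - 1                  # Mersenne prime
g = 3
x = secrets.randbelow(p - 1)    # prover's secret
y = pow(g, x, p)                # public value: "I know the log of y"

# 1. Prover commits to a fresh random nonce r.
r = secrets.randbelow(p - 1)
t = pow(g, r, p)

# 2. Verifier replies with a random challenge c.
c = secrets.randbelow(p - 1)

# 3. Prover answers; without r, the response s reveals nothing about x.
s = (r + c * x) % (p - 1)

# 4. Verifier checks g^s == t * y^c (mod p). This holds exactly when
#    the prover really knew x, yet x itself never left the prover.
assert pow(g, s, p) == (t * pow(y, c, p)) % p
```

In Ryan’s scenario, the secret would not be a single exponent but an agency’s sensitive operational facts, with the proof establishing only the compliance statement: “our accesses satisfy the mandated rules.”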
