The myth of Responsible Encryption: why it would never work – CNET



Government officials call for a way to both protect consumers and allow law enforcement to see criminal data. But experts say it’s a fantasy.


Governments want tech companies to create a master key that only law enforcement can use. Security experts say it’s a fantasy.

James Martin/CNET

Governments want to have their cake and eat it too.

Many support a concept called “responsible encryption,” which promises complete privacy and security for people while still letting law enforcement read encrypted messages to better protect you.

Sounds fantastic, right? Unfortunately, security experts say it’s a paradox.

Yet the concept continues to rear its head. The most recent “responsible encryption” advocate is US Deputy Attorney General Rod Rosenstein, who called out tech companies for refusing to help uncover private messages in a speech at the US Naval Academy on Tuesday.

“Responsible encryption can protect privacy and promote security without forfeiting access for legitimate law enforcement needs supported by judicial approval,” he said, according to a transcript.

Rosenstein isn’t alone. Officials in Australia and the UK have called for it too, even though both governments have suffered major breaches that undercut the very premise of “responsible encryption.”

“Responsible encryption,” according to the lawmakers who demand it, would have companies create a secret key, or backdoor, that only the government could use, letting investigators read messages with a warrant or a court order. The key would be kept secret, unless it gets stolen in a breach.
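A toy sketch can make that arrangement concrete. This is not a real cipher (the SHA-256 XOR stream below is for illustration only, and every name in it is hypothetical); the point is simply that an escrowed copy of a key decrypts exactly what the original does, so a breach of the escrow is a breach of every message.

```python
# Toy sketch of key escrow (illustration only, NOT a real cipher):
# a stream cipher improvised from SHA-256, used to show that a
# "government copy" of a key is just another key that can leak.
import hashlib

def keystream(key: bytes, length: int) -> bytes:
    """Derive a deterministic byte stream from the key (toy construction)."""
    out = b""
    counter = 0
    while len(out) < length:
        out += hashlib.sha256(key + counter.to_bytes(8, "big")).digest()
        counter += 1
    return out[:length]

def encrypt(key: bytes, data: bytes) -> bytes:
    """XOR the data with the keystream; applying it twice decrypts."""
    return bytes(a ^ b for a, b in zip(data, keystream(key, len(data))))

decrypt = encrypt  # XOR is its own inverse

user_key = b"alice-and-bob-only"
escrow_copy = user_key  # "responsible encryption": a second copy held elsewhere

ciphertext = encrypt(user_key, b"meet at noon")

# The escrowed copy unlocks everything the user's key does --
# steal the escrow and you can read every message.
print(decrypt(escrow_copy, ciphertext))  # b'meet at noon'
```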

Companies like Apple, WhatsApp and Signal provide end-to-end encryption, meaning people can chat privately, hidden even from the companies themselves. Only you and the person you’re messaging can read the conversation, because no one else holds a key that can unlock it.
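That design can be sketched in a few lines. This toy example is illustration only, not a real protocol: real messaging apps negotiate keys with asymmetric cryptography, and the pre-shared key here is a hypothetical stand-in. It shows the essential property, though: the relay server handles only ciphertext and holds no key at all.

```python
# Toy sketch of end-to-end encryption (illustration only, NOT a real protocol).
import hashlib

def keystream(key: bytes, length: int) -> bytes:
    """Deterministic byte stream derived from the key (toy construction)."""
    out = b""
    counter = 0
    while len(out) < length:
        out += hashlib.sha256(key + counter.to_bytes(8, "big")).digest()
        counter += 1
    return out[:length]

def xor_crypt(key: bytes, data: bytes) -> bytes:
    """Symmetric toy cipher: the same call encrypts and decrypts."""
    return bytes(a ^ b for a, b in zip(data, keystream(key, len(data))))

# Hypothetical pre-shared key known only to the two endpoints.
shared_key = b"known-to-alice-and-bob-only"

ciphertext = xor_crypt(shared_key, b"hello bob")

# The server stores and forwards ciphertext, but it has no key:
server_view = ciphertext
assert server_view != b"hello bob"  # the relay cannot read the message

# Only Bob, holding the shared key, recovers the plaintext:
assert xor_crypt(shared_key, ciphertext) == b"hello bob"
```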

It provides security and privacy for people who want to make sure that no one is spying on their messages — a modest request in the age of mass surveillance. But governments around the world have a problem with that.

Rosenstein instead sees a future where companies keep their data encrypted unless the government needs it to investigate a crime or a potential terrorist attack. It’s the same rallying cry UK Prime Minister Theresa May made after the June 2017 terrorist attack at London Bridge, when she blamed encryption for providing a safe space for extremists.

Rosenstein uses password recovery and email scanning as examples of responsible encryption, but neither involves end-to-end encryption. He also references an unnamed “major hardware provider,” which “maintains private keys it can use to sign software updates for each of its devices.”
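The signing-key arrangement Rosenstein describes can be sketched with a toy stand-in. Real vendors use asymmetric signatures; the HMAC below is a stdlib-only simplification, and every name in it is hypothetical. It captures both the mechanism he cites and the risk that comes with it: whoever holds the vendor key can sign anything.

```python
# Toy sketch of signed software updates (illustration only): a vendor-held
# key vouches for updates, and devices reject anything it didn't sign.
# An HMAC stands in for a real asymmetric signature scheme.
import hashlib
import hmac

vendor_key = b"vendor-private-signing-key"  # the key the whole scheme hinges on

def sign(update: bytes) -> bytes:
    """Vendor side: produce a tag vouching for this update."""
    return hmac.new(vendor_key, update, hashlib.sha256).digest()

def device_accepts(update: bytes, signature: bytes) -> bool:
    """Device side: install only updates carrying a valid tag."""
    return hmac.compare_digest(sign(update), signature)

update = b"firmware v2.0"
assert device_accepts(update, sign(update))            # genuine update installs
assert not device_accepts(b"malicious", sign(update))  # tampered update rejected
# But anyone who steals vendor_key can sign anything: a single point of failure.
```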

Then the deputy attorney general brings up the key flaw with “responsible encryption”: creating a backdoor for police also means creating an opening for hackers.

“That would present a huge potential security problem, if those keys were to leak,” Rosenstein said. “But they do not leak, because the company knows how to protect what is important.”

Except these important files have leaked on multiple occasions, including from the US government itself.

Adobe accidentally published a private key on its security blog in September. In 2011, the secrets behind RSA’s SecurID authentication tokens were stolen. Stuxnet, one of the most notorious pieces of malware ever found, installed itself using stolen signing keys. And the NSA has suffered multiple breaches, from Russian spies stealing its secrets to the Shadow Brokers hacker group selling the agency’s tools.

“When the companies have the keys, they can be stolen,” Jake Williams, a security analyst and founder of RenditionSec, said. “Law enforcement calls [end-to-end encryption] ‘warrant proof crypto,’ but many companies will tell you they’re not trying to dodge a warrant, they’re just doing what’s right for security.”

It’s why Apple refused to create a backdoor for the FBI in 2016, even when the agency demanded information from the San Bernardino terrorist’s iPhone. Apple CEO Tim Cook called the backdoor “the equivalent of cancer,” arguing that a master key could be stolen and abused by hackers, just as keys have been in all the previous cases.

It’s unclear why Rosenstein believes these encryption keys can’t be stolen. The Justice Department confirmed Rosenstein’s comments and declined to comment further.

The call for encryption loopholes has alarmed the security community, which is feeling a sense of deja vu: it has been rebutting the same argument for years.

“I think it’s extremely concerning that the man responsible for prosecuting crimes on the federal level would expect the invasion of everyone’s privacy simply to make law enforcement’s job easier,” said Mike Spicer, a security expert and founder of the security company Initec.

The myth resurfaces nearly every year, said Eva Galperin, cybersecurity director at the Electronic Frontier Foundation. And every time, the EFF slams the demand, calling it a “zombie argument.”

“Calling it ‘responsible encryption’ is hypocritical,” Galperin said. “Building insecurity in your encryption is irresponsible.”
