Nextcloud signs public letter, opposing German plan to force decryption of chat

Originally published at: https://nextcloud.com/blog/a-bad-idea-nextcloud-signs-public-letter-opposing-german-plan-to-force-decryption-of-chat/

It is said that the only thing we learn from history is that we don’t learn from history. The Clipper chip is an example of this, and history is repeating itself today.

It is April 16, 1993. The White House announces the ‘Clipper chip’, officially known as the MYK-78. It is meant for use in secure communication devices like phones, protecting calls from interception by encrypting them.

Each Clipper chip came with a pre-baked, unique secret key, but the chip also had one extra ‘feature’: the cryptographic key was known not just to the recipient, but also to law enforcement agencies like the CIA and FBI. In a slight nod to privacy, the ‘backdoor key’ was to be split in two and shared between two federal agencies, so that neither could use it alone.

The tech world protested: weakening encryption by building in a back door was a bad idea. Only one device was ever produced, by AT&T Bell. It took just one year for a major design flaw to break the encryption, putting a final nail in the coffin of the project.

It is May 24, 2019. The German news magazine ‘Der Spiegel’ reports that the German Minister of the Interior is working on a plan to force communication apps to break their encryption and give surveillance agencies access to their content.

Today, June 11, Nextcloud, together with over 100 other organizations, signed a public letter against this plan-in-development. It was a bad idea in 1993, and it is a bad idea today.

Five issues with a crypto backdoor

The criticism, as laid out in the letter, covers five main points.

First, it goes against over 20 years of successful crypto-related policy that has made Germany one of the most secure countries in the cyber economy.

Second, on a more technical level, a vulnerability built into messenger software can of course be used by anyone, not just the government. That means access to the data by criminals, but also, of course, by employees of the companies behind the apps. And while the government has said it will block any service that keeps using encryption, there are many other ways of encrypting data. The data of ‘normal’ users won’t be encrypted anymore, while criminals and terrorists are, of course, highly motivated to keep their communication safe.

The third point extends the last element of the second: the supposed benefits for law enforcement are dubious at best. There is no evidence that surveillance has become more difficult; if anything, its use has increased. Most surveillance is done with ‘Trojan horses’: apps that infect the device of a target and share its data with law enforcement before the data is encrypted. This targeted approach works well and strikes a more balanced trade-off between law enforcement and privacy.

Fourth, Germany does not operate in a vacuum. The international community is watching, and this move will be used by authoritarian states to justify their own mass surveillance. Germany’s credibility as an international proponent of freedom, a leader of the free world perhaps, will be tarnished.

Last, but not least, this will have big consequences for industry in Germany, putting it at a serious disadvantage. When people know that their digital communication in Germany isn’t entirely safe, financial services, healthcare and other sectors will be negatively impacted. The letter notes that in 2016 and 2017, the total cost of sabotage and cyber espionage exceeded 43 billion euro, and with weakened encryption the cost of breaches will only go up. Innovation will suffer too, as technology theft becomes easier and Germany becomes a less attractive place to start a business or do R&D.

All in all, it was a bad idea in 1993. It hasn’t gotten any better today. Let’s keep our data ours!


I appreciate that Nextcloud takes a clear position! :+1:

That said, for me the headline is a little inartfully worded and misleading:

A bad idea: Nextcloud signs public letter, …


While I fully agree with the post, I have to say that your point three is invalid, because point two also extends to devices. To install trojans on devices and computers, law enforcement has to maintain backdoor exploits and keep them open as long as possible, so the devices stay vulnerable to anyone. These backdoors can also be used by anyone else to infect the devices, and law enforcement does not have the ability to find such exploits by themselves; they need to buy zero-days from dubious sources, without knowing who else those sources sell them to…


I suggest changing the title of this thread, since right now it gives the impression that it was a bad idea for Nextcloud to sign the public letter…


Agree. The first thought I had when reading the title was the same as sciurius’. Try just dropping the words “A bad idea:” and leaving the rest. Just a thought.


You’re not the only one to say that, changed it :wink:


That is true. However, these trojans are only installed on targeted devices - devices of people who are the subject of a law enforcement action. And we don’t mind law enforcement targeting people, that’s their job… What we object to is making EVERYBODY unsafe just so law enforcement can more easily target people (and it doesn’t even do that, as the letter discusses).

WRT the buying of zero day exploits and such, that is indeed a dirty, nasty market and I’m not happy with that either. Then again, I’m not really sure how to fix that part - law enforcement always had to get their tools somewhere… But it is a separate issue, I think.

Yes, but the backdoor is open on all devices as well and can also be used by anyone… it is just the same thing and both should just not be the case.

Maybe I am missing something, but in the letter we were talking about backdooring all encryption - that hurts everyone. The tools used by law enforcement to ‘break’ into targets’ devices and plant a secret monitoring tool are, of course, not found on all devices. They are only on the devices of targeted suspects.

It is true that this entire process (gov’t buying zero-days to hack the devices of targets) doesn’t get these vulnerabilities fixed as soon as possible; that does create a bad dynamic - as I said, it is not ideal. But imho it isn’t anywhere near as bad as breaking encryption for everybody…

You miss the point that the software needs to get onto the devices somehow. This happens by exploiting vulnerabilities bought on the black market and kept open on all devices to retain the ability to install the software, instead of reporting them to the vendors so they can be fixed, while the shady sources may sell the vulnerabilities to other shady interested parties who might use them in whatever way…
So effectively it is the same thing: both expose all users to inadequate risks just to catch some criminals, who most likely know how to work around it anyway.