Fighting the wrong fight - Legislative proposals against end-to-end encryption put people and their personal information in danger and do not belong in democratic societies

Originally: February 12th
Website Up: March 8th, 2023

In 1948, the United Nations' Universal Declaration of Human Rights established the right to privacy in Article 12: "No one shall be subjected to arbitrary interference with his privacy, family, home or correspondence, nor to attacks upon his honour and reputation. Everyone has the right to the protection of the law against such interference or attacks." In light of this declaration, it is alarming how governments around the world are proposing bills that actively weaken people's fundamental right to privacy instead of respecting it. The EARN IT Act (2020, 2022) in the United States, the Online Safety Bill (2021) in the United Kingdom, and the European Commission's legislative proposal to combat child sexual abuse (2022) are all recently proposed bills that threaten this right by taking specific aim at end-to-end encryption, a way for people to communicate privately such that no one but the intended recipients can see the contents of their messages. Popular apps we use every day, including messaging apps such as Signal and WhatsApp, have relied on this technology for years. Even Apple has recently given its users the option to protect their iCloud data with end-to-end encryption, so that their personal information would remain secure even if it were exposed in a data breach. Not everyone has been pleased with this development, however. Opponents of end-to-end encryption argue that when companies make encryption easier and more convenient to use, people can also use it for criminal purposes. Nevertheless, bills that aim to put backdoors into encryption will only weaken the protection of people's personal information and thus make it more likely to be abused. Moreover, a mass surveillance system has no place in democratic societies in the first place, no matter how noble its aim.
To keep this article's scope manageable, I will focus on the European Commission's proposal, as it is the most relevant of the three for people living in EU countries.


The European Commission argues that new legislation on child sexual abuse is needed because of the massive scale of the problem: in 2021, at least 85 million pictures and videos of child sexual abuse material (CSAM) were circulating on the web. Furthermore, the Commission claims that detection has not been effective enough because it has relied on companies voluntarily reporting cases to the relevant authorities. The Commission's proposal goes beyond detecting CSAM and additionally tries to identify "grooming". The goal is to prevent child exploitation as early as possible, but as the Commission acknowledges, this "requires automatically scanning through texts in interpersonal communications." Simply put, people's private communications would no longer be confidential, as an algorithm would scan them for potential grooming. Child protection organizations, however, have seen value in this form of scanning and support the bill. ECPAT, a global network of organizations working to end child sexual abuse, has defended the bill by arguing it would advance children's rights.
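The tension between detection and end-to-end encryption can be illustrated with a toy sketch. This is not the Commission's proposed mechanism, and real systems use perceptual hashes (such as PhotoDNA) rather than exact cryptographic hashes; the blocklist contents, the XOR "cipher", and all byte strings below are placeholders for illustration only.

```python
import hashlib

def sha256_hex(data: bytes) -> str:
    return hashlib.sha256(data).hexdigest()

# Hypothetical blocklist: hashes of already-known illegal images
# (toy placeholder bytes, not real material).
KNOWN_HASHES = {sha256_hex(b"bytes-of-a-known-image")}

def server_side_scan(content: bytes) -> bool:
    """Hash matching, roughly how detection works on an unencrypted service."""
    return sha256_hex(content) in KNOWN_HASHES

def toy_encrypt(plaintext: bytes, key: bytes) -> bytes:
    """Toy XOR stream cipher -- NOT real cryptography, illustration only."""
    stream = hashlib.sha256(key).digest()
    while len(stream) < len(plaintext):
        stream += hashlib.sha256(stream).digest()
    return bytes(p ^ s for p, s in zip(plaintext, stream))

# On an unencrypted platform, the uploaded bytes match the blocklist:
print(server_side_scan(b"bytes-of-a-known-image"))   # True

# Under end-to-end encryption, the server only ever sees ciphertext,
# whose hash matches nothing in the blocklist:
ciphertext = toy_encrypt(b"bytes-of-a-known-image", b"shared-secret")
print(server_side_scan(ciphertext))                  # False
```

The second check fails because encryption changes every byte the server sees, so any scanning would have to move onto the user's device, before encryption, which is why critics say the proposal cannot coexist with end-to-end encryption as it exists today.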


On the other hand, many civil liberties groups have voiced concern over the European Commission's bill and its consequences for people's privacy and security. Many people, such as human rights activists, journalists, and lawyers, rely on end-to-end encryption to keep their communication secure and private. If the bill were to pass, these people with higher threat models, along with everyone else, could no longer trust that they can communicate safely. One of the most fundamental problems with the bill is that scanning end-to-end encrypted messages in the way it proposes would break the whole idea of end-to-end encryption, whose entire premise is that no outsider can read the message. Breaking end-to-end encryption would therefore put everyone at risk, including the very children the bill aims to protect.


Now, defenders of the bill could argue that if you are not a criminal, you have nothing to worry about, as the scanning would look specifically for CSAM and nothing else. However, scanning for child sexual abuse material has proven error-prone, and innocent people have been wrongly accused of holding CSAM. Some have even had their Google accounts disabled because Google mistakenly flagged their content as CSAM. Furthermore, I would argue that this kind of "nothing to hide" argument is questionable to begin with. Valuing the right to privacy is often less about having something to hide and more about having something to protect, such as sensitive personal information like medical records. The American whistleblower Edward Snowden put it superbly:

“Arguing that you don’t care about the right to privacy because you have nothing to hide is no different than saying you don’t care about free speech because you have nothing to say.”


The last argument against this bill concerns mass surveillance in general. I argue that a mass surveillance system that scans everyone's personal communication has no place in democratic societies. Furthermore, mass surveillance has proven to be a vastly ineffective way to combat crime. For example, there is no proof that the Patriot Act, a wide-reaching surveillance law enacted after 9/11 to scan people's personal communication, prevented a single terrorist attack. Matthew D. Green, an associate professor of computer science at Johns Hopkins University and a prominent security and cryptography expert, has described the European Commission's bill as "the most sophisticated mass surveillance machinery ever deployed outside of China and the USSR." In the end, establishing a mass surveillance system is no different from having the government install cameras in our homes just because we could be doing something illegal. If we are not okay with that, why should we accept the scanning of all our communications?


Even though this article argues against the European Commission's proposal, it is still critical to emphasize how serious the problem of child sexual abuse is, and governments and the EU should keep looking for ways to combat it. However, a mass surveillance system that treats everyone as a potential criminal cannot be the right approach in a democratic society. The significant risks such a system would pose to people's privacy and security are unacceptable, and the European Commission should seriously consider alternatives to the methods it has proposed. For example, the European digital rights network EDRi has suggested better alternatives that would fight child sexual abuse while respecting people's fundamental right to privacy. At the center of their proposal is the requirement that interventions remain targeted at individuals and rest on a proper legal basis and warrant. Furthermore, EDRi wants to direct more resources into prevention and other methods that address the root causes better than solutions relying on technology alone.


To conclude, this is an article I never thought I would have to write, at least in a European context. The Snowden revelations in 2013 disclosed that a far-reaching mass surveillance system could in fact be built, even if it was probably unconstitutional in the first place. Back then, the key justification for such a system was combating terrorism, a response triggered by the 9/11 attacks. These days, when surveillance laws are proposed, the primary argument has shifted from combating terrorism to fighting child sexual abuse, a noble cause that virtually everyone agrees is a crucial problem to solve. However, as horrible as the problem is, it is no reason to start treating everyone as a potential perpetrator by violating everyone's fundamental right to privacy with a mass surveillance system. Do we really want to follow in China's footsteps and become the next surveillance state? Even if the bill is ultimately rejected, the Commission has undoubtedly set sail into very dangerous waters.