Messaging services: Proposed EU regulation would open the way to mass surveillance
Reporters Without Borders (RSF) is alarmed by a European Union proposal for combatting child sexual abuse under which the content of instant messaging apps, including encrypted ones, would be subjected to constant scanning. If adopted, such a radical measure would jeopardise journalists’ work, RSF says.
Unveiled by the European Commission on 11 May, the proposed regulation “laying down rules to prevent and combat child sexual abuse” would complement the Digital Services Act, which is due to be adopted by the end of June.
To protect minors, the Commission recommends systematic surveillance of the content of chats on messaging apps, including encrypted messaging apps. RSF calls for encrypted messaging apps to be specifically excluded from systematic surveillance because journalists rely on them to protect their sources.
“This proposal has a very laudable goal but betrays a complete misunderstanding of encryption,” said Vincent Berthier, the head of RSF’s Tech Desk. “It’s simple: scanning end-to-end encrypted messaging services would render them useless and would be tantamount to mass surveillance! Such a demand from the European Commission is unacceptable and dangerous, both for press freedom and democracies.”
End-to-end encrypted messaging services are based on a simple principle: even the platforms that provide these services cannot decipher the content of chats. Only the sender and recipient – the two ends – can read a chat. If the content is accessible to a third party at any time, the security is irreparably compromised. But this is exactly what the European Commission is proposing.
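The principle can be illustrated with a minimal sketch, assuming the PyNaCl library and public-key authenticated encryption; it is not the protocol of any particular messaging app, only a demonstration that the relay server never holds the keys needed to read the message.

```python
from nacl.public import PrivateKey, Box

# Each party generates a key pair; the private keys never leave their devices.
journalist_key = PrivateKey.generate()
source_key = PrivateKey.generate()

# The source encrypts with their private key and the journalist's public key.
sender_box = Box(source_key, journalist_key.public_key)
ciphertext = sender_box.encrypt(b"Confidential tip for the newsroom")

# The platform relaying `ciphertext` sees only unintelligible bytes:
# without a private key, it cannot decrypt, so it cannot scan the content.
receiver_box = Box(journalist_key, source_key.public_key)
plaintext = receiver_box.decrypt(ciphertext)
assert plaintext == b"Confidential tip for the newsroom"
```

Any mechanism that lets a third party inspect the message before delivery has to intervene at one of the two ends, outside the encryption itself, which is what turns such scanning into a backdoor.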
Unrealistic proposal
When presenting the proposed regulation, European commissioner for home affairs Ylva Johansson said the aim was “about detecting child abuse material”, not undermining data encryption. The proposal says that, to preserve users’ right to privacy, only problematic content should be monitored and reported by service providers. The intention is commendable but technically unworkable, because every chat would have to be scanned in order to identify those that pose a problem.
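The logical difficulty can be seen in a simplified sketch of hash-based detection, the kind of technique usually cited for identifying known abuse images (real systems use perceptual hashing rather than the plain cryptographic hash and hypothetical hash list assumed here). Whatever the matching method, the scanner must read every attachment in clear to decide whether it is problematic:

```python
import hashlib

# Hypothetical set of hashes of known illegal images (illustration only).
KNOWN_HASHES = {"3a7bd3e2360a3d29eea436fcfb7e44c735d117c42d1c1835420b6b9942dd4f1b"}

def scan_attachment(attachment_bytes: bytes) -> bool:
    """Return True if the attachment matches a known hash.

    The check only works on the unencrypted bytes, i.e. before encryption
    on the sender's device or after decryption on the recipient's. In other
    words, 'monitoring only problematic content' still requires inspecting
    every message to find out which ones are problematic.
    """
    digest = hashlib.sha256(attachment_bytes).hexdigest()
    return digest in KNOWN_HASHES
```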
RSF urges the Commission to recognise that it is asking platforms to open a backdoor by default – secret access that allows interference with the service being provided – without considering the consequences for the protection of journalists’ sources and the confidentiality of their work. A targeted approach, focusing on documented cases and based on prior investigation, would be far more effective and would not undermine a principle as fundamental as the protection of sources.
The importance of encrypted communications was stressed in a 2020 report by the RSF-initiated Forum on Information and Democracy’s working group on “infodemics” – a report with 250 proposals for governments and platforms on managing digital spaces democratically. “It is important to note that creating vulnerabilities or constraints on encryption is problematic and inconsistent with human rights standards,” the report said, telling governments they should under no circumstances compromise end-to-end encryption.