EU Plan to Mandate Messaging Apps to Detect CSAM Faces Criticism
A European Union (EU) proposal that would require messaging platforms to scan private communications for child sexual abuse material (CSAM) could generate millions of false positives daily, according to an open letter published by hundreds of security and privacy professionals. The controversial plan has been under scrutiny since its inception two years ago, drawing objections from independent experts, European Parliament lawmakers, and the EU’s own Data Protection Supervisor.
Concerns Over Unspecified Detection Scanning Technologies
The EU proposal not only requires messaging platforms to scan for known CSAM but also demands the use of unspecified detection technologies to identify previously unknown CSAM and grooming activity. Critics argue that this expectation borders on the technologically impossible and will not serve its intended purpose of safeguarding children from abuse. They warn that such a regulation could severely compromise internet security and users’ privacy by obligating platforms to carry out sweeping surveillance with unproven and risky technologies such as client-side scanning.
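To make that concern concrete, the sketch below illustrates the basic idea behind client-side scanning: the user’s device checks content against a list of known-bad hashes before the message is end-to-end encrypted. The hash list, function names, and use of SHA-256 are illustrative assumptions only; real proposals involve perceptual hashing (such as Microsoft’s PhotoDNA) and machine-learning classifiers rather than exact cryptographic matches.

```python
import hashlib

# Hypothetical on-device hash list. Real deployments would use perceptual
# hashes (e.g. Microsoft's PhotoDNA) so that re-encoded or slightly altered
# images still match; plain SHA-256 is used here only to keep the sketch
# self-contained and runnable.
KNOWN_HASHES = {
    hashlib.sha256(b"example of previously flagged content").hexdigest(),
}

def matches_known_content(attachment: bytes) -> bool:
    """Check a plaintext attachment against the on-device hash list.

    In a client-side scanning design this check runs on the user's device
    *before* the message is end-to-end encrypted, which is why critics argue
    it bypasses the confidentiality that E2EE is meant to provide.
    """
    return hashlib.sha256(attachment).hexdigest() in KNOWN_HASHES

if __name__ == "__main__":
    print(matches_known_content(b"example of previously flagged content"))  # True
    print(matches_known_content(b"an ordinary holiday photo"))              # False
```

Because the check happens on the plaintext, the content is inspected regardless of how strong the encryption applied afterwards is, which is the core of the critics’ objection.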
Doubts About Technological Feasibility
Experts doubt that any technology exists that can fulfill the law’s requirements without causing significant harm. Despite these warnings, the EU appears to be staying the course. The recent open letter addresses amendments proposed by the European Council to the draft CSAM-scanning regulation, arguing that the changes fail to fix the fundamental problems with the plan.
Signatories of the Open Letter
Among the 270 signatories of the letter at the time of writing are hundreds of academics, including renowned security experts such as Professor Bruce Schneier of the Harvard Kennedy School and Dr. Matthew D. Green of Johns Hopkins University. The letter also includes a handful of researchers working at tech giants such as IBM, Intel, and Microsoft.
Pushback from European Parliament MEPs
Last fall, MEPs in the European Parliament united to propose a substantially revised approach. This approach would limit scanning to individuals and groups suspected of child sexual abuse and to known and unknown CSAM, eliminating the need to scan for grooming. It also seeks to reduce risks to end-to-end encrypted (E2EE) communications by limiting scanning to platforms that are not end-to-end encrypted. The European Council, another key player in EU lawmaking, has yet to take a position on this matter.
Experts Warn of Security and Privacy Disaster
The experts caution that the amendments proposed by the Belgian Council presidency still do not address the fundamental flaws in the Commission’s approach. They argue that the revisions would still create “unprecedented capabilities for surveillance and control of Internet users” and could undermine a secure digital future for society, with massive implications for democratic processes in Europe and beyond.
Implications of False Positives
The experts also warned that the Council’s proposal to limit the risk of false positives by defining a “person of interest” would still result in a large number of false alarms. Given the vast number of messages sent on these platforms, even a detector with a very low false positive rate would generate millions of false positives every day, particularly on platforms with hundreds of millions or even billions of users, such as WhatsApp.
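The arithmetic behind this warning is straightforward. The sketch below multiplies an assumed daily message volume by an assumed per-message false positive rate; both figures are illustrative stand-ins, not numbers taken from the letter.

```python
# Back-of-the-envelope estimate of daily false positives. Both figures below
# are illustrative assumptions, not numbers from the open letter.
messages_per_day = 100_000_000_000   # ~100 billion messages/day, roughly WhatsApp scale
false_positive_rate = 1 / 10_000     # an optimistic 0.01% error rate per scanned message

expected_false_positives = messages_per_day * false_positive_rate
print(f"{expected_false_positives:,.0f} false alarms per day")  # -> 10,000,000
```

Even with a per-message error rate far better than any published classifier achieves on unknown material, the sheer volume of traffic produces millions of flagged messages a day, each of which would need human review.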
Increasing Adoption of E2EE
As E2EE becomes more widely adopted, more services are likely to be categorized as high risk. The result could be that almost every service ends up classified as high risk, indiscriminately affecting an enormous number of people. This is a particular concern because the interoperability requirements introduced by the Digital Markets Act could result in messages flowing between low-risk and high-risk services.
Challenges to Safeguarding Encryption
The experts argue that detection in end-to-end encrypted services fundamentally undermines the protection encryption provides. They warn that introducing detection capabilities, whether applied to encrypted data or to data before it is encrypted, breaks the confidentiality that end-to-end encryption is meant to guarantee.
Looking Forward
Should the EU persist with its current course, the experts warn of catastrophic consequences, potentially affecting how digital services are used worldwide and negatively impacting democracies globally. As the proposal for a regulation to combat child sexual abuse is set to be discussed on May 8, the world will be watching for the Council’s stance on this critical issue.