Experts Warn EU's Plan to Mandate CSAM Scanning in Messaging Apps Could Lead to Millions of False Positives

A contentious push by European Union lawmakers to require messaging platforms to scan private communications for child sexual abuse material (CSAM) has drawn significant concern, with hundreds of security and privacy experts voicing their objections in an open letter published Thursday. Criticism has mounted since the European Commission proposed the CSAM-scanning initiative two years ago, with alarm raised by independent experts, members of the European Parliament, and the EU’s own Data Protection Supervisor.

Under the proposed legislation, messaging platforms that receive a detection order would be required not only to scan for known CSAM but also to deploy unspecified scanning technologies to detect previously unknown CSAM and to identify grooming as it happens. Critics call this approach naive technosolutionism, arguing that it is technically unfeasible and unlikely to achieve its stated goal of protecting children; instead, they contend, it would destabilize internet security and erode user privacy by imposing blanket surveillance built on unproven technologies such as client-side scanning.
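
To make the client-side scanning concept concrete, the sketch below shows the basic shape of the technique: the user's device computes a fingerprint of an outgoing attachment and compares it against a blocklist of known material before the message is encrypted. This is purely illustrative; the blocklist values, the use of a standard cryptographic hash in place of the perceptual hashes real systems rely on, and the flagging step are all assumptions made for this example, not details of the EU proposal.

```python
# Purely illustrative sketch of the client-side scanning concept.
# The blocklist entries, the cryptographic hash used as a stand-in for a
# perceptual hash, and the flagging step are all simplifying assumptions.
import hashlib

# Hypothetical fingerprints of known illegal images (placeholder values).
KNOWN_BAD_FINGERPRINTS = {
    "3a7bd3e2360a3d29eea436fcfb7e44c735d117c42d1c1835420b6b9942dd4f1b",
}

def fingerprint(image_bytes: bytes) -> str:
    """Stand-in fingerprint. Real systems use perceptual hashes so that
    re-encoded or lightly edited copies of an image still match."""
    return hashlib.sha256(image_bytes).hexdigest()

def scan_before_encrypting(image_bytes: bytes) -> bool:
    """Return True if the attachment matches the blocklist.

    The defining feature of client-side scanning is that this check runs
    on the user's device *before* end-to-end encryption is applied, which
    is why critics argue it sidesteps the protection E2EE provides."""
    return fingerprint(image_bytes) in KNOWN_BAD_FINGERPRINTS

attachment = b"example image bytes"
if scan_before_encrypting(attachment):
    print("Match: the message would be flagged before it is encrypted.")
else:
    print("No match: the message is encrypted and sent as normal.")
```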

Experts assert that no existing technology can meet the law's demands without causing substantial harm. Despite this, the EU continues to advance the proposal. The latest open letter addresses amendments to the CSAM-scanning regulation recently put forward by the Council of the European Union, with signatories arguing that the changes do not remedy the proposal’s core flaws.

At the time of writing, the letter has been endorsed by 270 individuals, including noted academics and security professionals like Bruce Schneier of Harvard Kennedy School and Matthew D. Green from Johns Hopkins University, along with researchers affiliated with tech giants such as IBM, Intel, and Microsoft. An earlier letter from July, signed by 465 academics, criticized the detection technologies embedded in the proposal as inherently flawed and susceptible to attacks, potentially jeopardizing the vital protections offered by end-to-end encryption (E2EE).

The proposal has attracted little support in its current form. Last fall, Members of the European Parliament pushed back with a substantially revised approach that would limit scanning to individuals and groups already suspected of abuse, exclude grooming detection, and protect E2EE. The Council, the other co-legislative body in the EU law-making process, has yet to settle on its position, and where it lands will shape the law's final form.

The latest amendment, introduced by the Belgian Council presidency in March, seeks to make detection orders more targeted through risk categorization and risk mitigation measures, and to add safeguards for cybersecurity and encryption. Nevertheless, the 270 signatories caution that these changes only tinker at the edges of what they see as a looming security and privacy disaster.

From a technical standpoint, the experts warn, the proposal would undermine the security of communications and of the systems that carry them. Making detection orders more targeted does not change this, they argue, because detection would still depend on unreliable technologies, and deploying them across messaging platforms still risks ushering in an era of mass surveillance of users' messages.

The open letter also critiques the Council’s plan to use automated detection to flag “persons of interest”, meaning users who have allegedly shared CSAM or attempted to groom a child, noting that the underlying technologies are prone to false positives. The signatories argue that false alarms are unlikely to be reduced significantly unless a user must be flagged on an impractically large number of separate occasions. Given the billions of messages exchanged daily across platforms like WhatsApp, even a well-tuned detection system could still generate millions of erroneous alerts.
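
To illustrate the scale behind that warning, the back-of-envelope calculation below uses assumed figures; neither the message volume nor the error rate comes from the open letter or the proposal. It simply shows how even a seemingly accurate classifier produces an enormous absolute number of false alarms at messaging-platform volume.

```python
# Back-of-envelope estimate with assumed figures; neither number is
# taken from the open letter or the EU proposal.
daily_messages = 100_000_000_000   # assume roughly 100 billion messages sent per day
false_positive_rate = 0.0001       # assume the detector wrongly flags 0.01% of benign messages

false_alerts_per_day = daily_messages * false_positive_rate
print(f"{false_alerts_per_day:,.0f} false alerts per day")  # -> 10,000,000
```

Requiring a user to be flagged on several separate occasions lowers that figure, but the signatories' point is that it cannot lower it enough to make the resulting alerts manageable at this scale.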

Experts also dismiss a proposal to restrict detection orders to “high-risk” messaging apps as ineffective, arguing that such a designation would still sweep in a vast number of users. They note that the features used to share CSAM, such as image sharing and text chat, are standard across a wide range of services. And as adoption of E2EE grows, more services are likely to be classified as high-risk, widening the measure's reach further.

The letter also reiterates critical points made by security experts over recent years, asserting that detection within E2EE services inherently undermines the protection encryption provides. The new proposal claims to safeguard cybersecurity while maintaining detection within encrypted environments—a contradictory stance, according to the authors.

In recent weeks, police chiefs across Europe have also voiced concern about the expansion of E2EE, urging platforms to design security systems in ways that still allow illegal activity to be identified. The intervention appears intended to put pressure on lawmakers to pass measures like the CSAM-scanning regulation; the police chiefs deny calling for encryption to be backdoored, yet offer no clear technical alternative for achieving the lawful access they seek.

Should the EU proceed without rethinking its current approach, the signatories warn, the consequences could be dire. They argue the legislation would set a precedent for filtering internet communications, undermine people's right to private communication online, and harm young internet users who rely on digital platforms to communicate. They also warn it could change how digital services are used around the world, with potentially destabilizing effects on democratic processes everywhere.

An EU source close to the Council declined to share details of current discussions among Member States, but noted that a meeting is scheduled for May 8 at which the CSAM regulation proposal is due to be discussed.

