Regulation to Prevent and Combat Child Sexual Abuse

The Regulation to Prevent and Combat Child Sexual Abuse (Child Sexual Abuse Regulation, or CSAR) is a European Union regulation proposed by the European Commissioner for Home Affairs, Ylva Johansson, on 11 May 2022. The stated aim of the legislation is to prevent child sexual abuse online through a number of measures, including the establishment of a framework that would make the detection and reporting of child sexual abuse material (CSAM) by digital platforms a legal requirement within the European Union, a mandate known by its critics as "chat control".

Support for the proposal
Supporters of the regulation include dozens of campaign groups, activists and MEPs, along with bodies within the European Commission and the European Parliament themselves. Opponents include civil society organisations and privacy rights activists.

The European Commission's Directorate-General for Migration and Home Affairs argues that voluntary actions by online service providers to detect online child sexual abuse are insufficient. They emphasize that some service providers are less involved in combating such abuse, leading to gaps where abuse can go undetected. Moreover, they highlight that companies can change their policies, making it challenging for authorities to prevent and combat child sexual abuse effectively. The EU currently relies on other countries, primarily the United States, to launch investigations into abuse occurring within the EU, resulting in delays and inefficiencies.

Several bodies within the EU claim the establishment of a centralized organization, the EU Centre on Child Sexual Abuse, would create a single point of contact for receiving reports of child sexual abuse. It is claimed this centralization would streamline the process by eliminating the need to send reports to multiple entities and would enable more efficient allocation of resources for investigation and response.

Proponents also argue for the need to improve the transparency of the process of finding, reporting, and removing online child sexual abuse material. They claim that there is currently limited oversight of voluntary efforts in this regard. The EU Centre would collect data for transparency reports, provide clear information about the use of tools, and support audits of data and processes. It aims to prevent the unintended removal of legitimate content and address concerns about potential abuse or misuse of search tools.

Another aspect highlighted by supporters is the necessity for improved cooperation between online service providers, civil society organizations, and public authorities. The EU Centre is envisioned as a facilitator, enhancing communication efficiency between service providers and EU countries. By minimizing the risk of data leaks, the Centre aims to ensure the secure exchange of sensitive information. This cooperation is crucial for sharing best practices, information, and research across different countries, thereby strengthening prevention efforts and victim support.

Criticism of the proposal
Groups opposed to the proposal often highlight that it would impose mandatory scanning of all private digital communications, and as such commonly refer to the proposed legislation as "Chat Control". Civil society organisations and activists have argued that the proposal is incompatible with fundamental rights, infringing on the right to privacy. The proposal has also been criticised as technically infeasible. In Ireland, only 20.3% of the reports received by the Irish police turned out to be actual exploitation material; of a total of 4,192 reports received, 471 (more than 10%) were false positives.

The European Parliament commissioned an additional impact assessment on the proposed regulation, which was presented in the Committee on Civil Liberties, Justice, and Home Affairs. The Parliament's study heavily critiqued the Commission's proposal. According to the study, no current technological solution can detect child sexual abuse material without a high error rate that would affect all messages, files and data on a given platform. In addition, the study concluded that the proposal would undermine end-to-end encryption and the security of digital communications. Lastly, it highlighted that the proposed regulation would make teenagers "feel uncomfortable when consensually shared images could be classified as CSAM".

The Council of the European Union's Legal Service also criticised the impact of the Commission's proposal on the right to privacy. The Council's legal opinion emphasized that the screening of interpersonal communications of all citizens affects the fundamental right to respect for private life as well as the right to the protection of personal data. The Council's legal experts also referenced the jurisprudence of the EU Court of Justice, which has ruled against generalised data retention.

The European Data Protection Supervisor (EDPS) together with the European Data Protection Board (EDPB) stated, in a joint opinion, that "the Proposal could become the basis for de facto generalized and indiscriminate scanning of the content of virtually all types of electronic communications", which could have chilling effects on sharing legal content.

In March 2023, a revised version of the proposal was introduced, which Germany's Digital Affairs Committee noted drew strong opposition from several groups. The new scheme, referred to as "Chat Control 2.0", proposed to implement scanning of encrypted communications. In April 2023, the European Parliament confirmed that it had received messages calling on MEPs to vote against the European Commission's chat control proposal. Citizens expressed concerns that the new legislation would breach data protection and privacy rights.

EU Commissioner Ylva Johansson has also been heavily criticised over the process by which the proposal was drafted and promoted. A transnational investigation by European media outlets revealed the close involvement of foreign technology and law enforcement lobbyists in the preparation of the proposal. This was also highlighted by digital rights organisations, which Johansson declined to meet on three occasions. Commissioner Johansson was further criticised for the use of micro-targeting techniques to promote her controversial draft proposal, which violated the EU's data protection and privacy rules.

Legislative process
On 14 November 2023, the European Parliament's Committee on Civil Liberties, Justice, and Home Affairs (LIBE) voted to remove indiscriminate chat control and to allow only the targeted surveillance of specific individuals and groups under reasonable suspicion. Moreover, Members of the European Parliament voted in favour of protecting encrypted communications.

In February 2024, the European Court of Human Rights ruled, in an unrelated case, that a requirement to weaken end-to-end encryption "cannot be regarded as necessary in a democratic society". This reinforced the European Parliament's decision to protect encrypted communications.

In May 2024, Patrick Breyer reported that moves were again being made to restore indiscriminate message scanning to the legislation, under the name of "upload moderation".

On 21 June 2024, it was reported that a vote on the legislation had been temporarily withdrawn by the EU Council, in a move believed to be the result of pushback by critics of the proposal, including software vendors.