EU moves closer to adopting “Chat Control 2” online child abuse regulation

The European Union is preparing to adopt a regulation on preventing and combating child sexual abuse online. Informally referred to as Chat Control 2, the proposal has been under discussion since May 2022 and is scheduled for a vote on 14 October 2025 under the Danish Presidency of the Council, a Kazinform News Agency correspondent reports.

Photo: EU flag. Credit: Unsplash.com

On 11 May 2022, the European Commission presented its proposal for a Regulation laying down rules to prevent and combat child sexual abuse. The Commission stated that current voluntary measures by online service providers to detect and report such material were insufficient, leading to uneven practices and gaps in reporting.

The proposal covers providers of messaging services, web-based email, hosting services, and other services where interpersonal communication occurs. It sets out obligations for risk assessment, mitigation measures, and, when ordered by competent authorities, the detection, reporting, and removal of child sexual abuse material (CSAM) and, potentially, grooming attempts.

Detection technologies could be applied directly on the user’s device before content is transmitted, including in services using end-to-end encryption.

Compromise text

The Danish Presidency’s July 2025 compromise text on the proposed regulation keeps the main framework of the original Commission proposal but adds several significant provisions.

One of the most notable changes is the introduction of a risk categorisation system. Under this approach, online services would be classified as low, medium, or high risk based on a set of objective criteria. If significant risks remain after a provider has implemented mitigation measures, authorities could apply detection orders to services deemed high risk.

The scope of detection orders is defined more narrowly than in some earlier drafts. At this stage, they would apply only to known and new child sexual abuse material in visual formats and to URLs. Text and audio content are excluded, although the regulation includes a review clause that would allow lawmakers to consider adding grooming detection in the future.

To address concerns about oversight and privacy, the compromise text sets out several safeguards. Detection orders would require authorisation from either a judicial authority or an independent administrative body, and any content flagged under these orders would have to be pseudonymised before it could be reviewed by a human.

The proposal also establishes strict technology vetting requirements. Before approval, any detection tools would have to be assessed for their effectiveness, their impact on cybersecurity, and their compliance with fundamental rights.

Finally, the compromise text clarifies exemptions. Communications related to state functions, such as national security or military activities, would be excluded from the regulation’s scope.

Earlier, Kazinform News Agency reported that Belgium issued 12 guidelines for AI use in advertising.
