EU fails to extend rules on child abuse content detection by online platforms
The article reports that the European Union failed to reach an agreement on extending temporary rules that allow online platforms to detect and remove child sexual abuse material (CSAM).
These rules, in place since 2021, permitted companies like Google and Meta to voluntarily scan content, including messages, despite strict EU privacy laws. They are set to expire on April 3, 2026, creating a legal vacuum.
The disagreement stems mainly from conflicts between EU member states and the European Parliament over the scope of the rules. Lawmakers pushed to limit or exclude end-to-end encrypted communications from scanning, arguing for stronger privacy protections, while many governments said this would make the measures ineffective.
The issue reflects a broader tension between child safety and privacy rights. Supporters of the rules argue that detection tools are essential for identifying victims and removing illegal material, while critics warn against mass surveillance and intrusion into private communications.
Because no extension was agreed, platforms may lose the legal certainty they need to continue detecting such content, potentially weakening efforts to combat online child abuse until a permanent law is finalized.
Meanwhile, a long-term EU regulation on child sexual abuse content has been stalled since 2022 due to similar disagreements, leaving the issue unresolved.