How the EU is drifting toward authoritarian control
By Tomaz Lovsin
The European Union is at a crossroads in its digital policy. A draft regulation to prevent and combat child sexual abuse (CSA) — often called “Chat Control” — would allow authorities to order private messaging and hosting services to scan for prohibited content inside people’s communications. Supporters present this as a child-protection measure. Critics, including Europe’s own legal experts, warn it risks normalizing suspicionless surveillance and undermining core rights.
At stake is not just a single law but the future architecture of private communication in Europe.
The proposal: how “Chat Control” works
The Commission’s CSA Regulation would create a three-step system:
1. Providers of messaging and hosting services must conduct risk assessments and apply mitigation measures.
2. If authorities find a “significant risk,” they can issue detection orders.
3. Providers must then scan content and report findings to a new EU Centre on Child Sexual Abuse, which distributes “indicators” such as hashes and patterns.
Detection orders are the linchpin. They can compel providers to look for three categories: known CSAM (via hash-matching), “new” CSAM, and grooming. For encrypted services, compliance would require client-side scanning — inspection on the user’s device before encryption. This is why privacy regulators warn that the system amounts to general monitoring of everyone’s communications.
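To make the mechanics concrete, here is a deliberately minimal sketch of client-side hash-matching, the simplest of the three detection categories. Everything in it is an assumption for illustration: real deployments would use perceptual image fingerprints (PhotoDNA-style) distributed as indicators by the EU Centre rather than plain SHA-256 digests, and a real messenger would use an actual end-to-end encryption protocol rather than the toy stand-in below. What matters is the ordering of the steps.

import hashlib

# Hypothetical indicator list; a real system would receive perceptual hashes
# from the EU Centre, not SHA-256 digests. Illustration only.
KNOWN_INDICATORS = {
    hashlib.sha256(b"example known image bytes").hexdigest(),
}

def matches_known_indicator(attachment: bytes) -> bool:
    # Runs on the user's device, on the plaintext, *before* end-to-end
    # encryption is applied. That ordering is the whole controversy.
    return hashlib.sha256(attachment).hexdigest() in KNOWN_INDICATORS

def send(attachment: bytes) -> bytes:
    if matches_known_indicator(attachment):
        print("match: under a detection order, this would be reported")
    # Stand-in for a real E2EE step (e.g. the Signal protocol). The XOR
    # below only marks where encryption would happen; it is NOT secure.
    return bytes(b ^ 0x5A for b in attachment)

send(b"example known image bytes")   # flagged before encryption ever starts
send(b"an ordinary holiday photo")   # passes through untouched

Even in this toy form the structural point is visible: the provider’s code inspects the plaintext of every message, and encryption only begins after that inspection. The encryption itself is never broken, which is why proponents say E2EE is preserved; critics answer that it is simply bypassed.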
The legal guardrails
Two major institutions have already raised red flags.
Council Legal Service (April 2023): It concluded that detection orders for private messaging would amount to general and indiscriminate surveillance, likely violating the EU Charter’s privacy and data-protection rights. Because interpersonal services cover nearly the entire population, such orders would effectively surveil everyone.
European Court of Human Rights (Podchasov v. Russia, February 2024): The Court ruled that requiring providers to weaken or bypass end-to-end encryption for all users amounts to indiscriminate surveillance, violating the right to respect for private life under Article 8 of the Convention.
Together, these opinions underline that mandatory scanning of encrypted communications is legally difficult, if not impossible, to justify under European rights law.
Parliament vs. Council
The legislative fight reflects these concerns.
The European Parliament has tried to limit the scope: rejecting indiscriminate scanning, exempting end-to-end encrypted services, and narrowing detection mainly to known CSAM. Regulators welcomed these changes as more rights-compatible.
The Council of the EU, most recently under the Danish Presidency, continues to push compromise drafts that preserve broad detection orders, including for encrypted messengers. Civil-society trackers describe these texts as “recycled” attempts to bring back client-side scanning.
This deadlock explains why the current temporary derogation — which merely permits voluntary detection — has been extended until April 2026.
False positives and real-world risks
Supporters often argue that “platforms already scan.” That is misleading. Today’s practices are voluntary and limited. Under the CSA Regulation, scanning would become mandatory, including for “new” material and grooming, which rely on AI classifiers.
Such systems are error-prone. At population scale, even tiny error rates would generate thousands of false reports. Examples include teenagers’ consensual sexts or parents’ baby photos wrongly flagged. These mistakes are not rare anomalies; they are structural to automated detection. The result: innocents drawn into police investigations, intimacy bundled into suspicion, and trust in digital platforms eroded.
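The scale problem is simple base-rate arithmetic. Every number below is an assumption chosen only to illustrate the effect, and the error rate is deliberately generous to the classifier:

messages_scanned_per_day = 1_000_000_000  # assumed EU-wide volume (illustrative)
false_positive_rate = 0.00001             # 0.001%: optimistic for AI classifiers
prevalence = 0.000001                     # assumed share of genuinely illegal content
recall = 0.9                              # assumed share of real cases caught

false_alarms = messages_scanned_per_day * (1 - prevalence) * false_positive_rate
true_hits = messages_scanned_per_day * prevalence * recall

print(f"False alarms per day:    {false_alarms:,.0f}")  # ~10,000
print(f"True detections per day: {true_hits:,.0f}")     # 900
print(f"Precision: {true_hits / (true_hits + false_alarms):.1%}")  # ~8.3%

Under these assumptions, more than nine in ten flagged messages are innocent. That is the base-rate effect: when the target is rare, even a near-perfect classifier produces mostly false alarms, and every one of them is a private conversation forwarded for human review.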
A democratic deficit
Beyond technical flaws lies a political concern. The Commission has been accused of micro-targeting political ads to promote Chat Control, a controversial tactic for an institution tasked with regulating such practices. Critics argue this short-circuits open debate, pushing sweeping surveillance powers under the banner of child protection while sidestepping democratic scrutiny.
The pattern echoes earlier EU initiatives. Laws framed around terrorism, disinformation, or child protection often ratchet surveillance powers, narrowing spaces for dissent and private life. With the CSA proposal, the Commission risks entrenching this drift.
Comparison with the US Patriot Act
The United States offers a cautionary tale. After 9/11, Section 215 of the Patriot Act was used to justify bulk collection of Americans’ phone metadata. Years later, courts found the interpretation unlawful. Congress replaced it with a narrower model, which itself collapsed due to compliance problems. By 2020, the program was abandoned.
The lesson is clear: emergency powers expand quickly, are hard to roll back, and often prove unlawful or ineffective. Europe faces the same risk if it normalizes scanning infrastructure that can be easily repurposed from child protection to other aims, such as “serious crime” or “public order.”
Professions and relationships at risk
Confidential communication is essential not just for personal life but for democratic institutions:
Journalists rely on source protection to expose corruption or abuses. Mandatory scanning would chill whistleblowing.
Lawyers require secure exchanges with clients. Pre-screening threatens the right to a fair trial.
Doctors and therapists depend on confidentiality for patients to seek help. Inspection would drive self-censorship.
The European Court has consistently upheld these protections — from journalists’ sources (Sanoma v. Netherlands, 2010) to lawyer-client secrecy (Niemietz v. Germany, 1992) and medical confidentiality (Z v. Finland, 1997). A CSA regime mandating scanning would undercut these foundations of democratic life.
The bigger picture: From DSA to CSA
The Digital Services Act (DSA) introduced risk assessments, audits, and compliance duties for platforms, especially around minors’ safety. The CSA Regulation builds on that scaffolding but shifts from platform governance to private communications surveillance. Critics warn this is a qualitative leap: from managing online platforms to pre-screening citizens’ messages.
This is why regulators stress that once client-side scanning infrastructure exists, repurposing it becomes trivial. History shows mission creep is not hypothetical — it is inevitable.
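The earlier sketch makes this tangible. The hypothetical continuation below changes no scanning logic at all; it only adds entries to the indicator list, which is precisely the repurposing path critics describe:

# Hypothetical continuation of the earlier sketch. The scanning code is
# untouched; only the distributed indicator list has grown.
KNOWN_INDICATORS.add(hashlib.sha256(b"leaked government memo").hexdigest())
KNOWN_INDICATORS.add(hashlib.sha256(b"banned protest flyer").hexdigest())

send(b"leaked government memo")  # now flagged, though it has nothing to do with CSAM

The device-side architecture is indifferent to what it searches for; the only real safeguard is who controls the list.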
What happens next
The Justice and Home Affairs Council will meet October 13–14, 2025, under the Danish Presidency. As of early September, no formal agenda includes a CSA vote. Still, campaigners warn this may be the critical window for compromise. Decisions made then could shape European communications for decades.
Stopping the drift requires clear red lines:
No client-side scanning of private communications.
No orders that weaken or bypass encryption.
No “temporary” exceptions that become permanent surveillance gateways.
Instead, critics propose alternatives: better resourcing for targeted investigations, faster takedowns of abusive material, survivor support, and child-safety design that does not conscript entire populations into suspicionless scans.
Conclusion
The choice is not between protecting children and protecting privacy. It is between safety with rights and safety instead of rights. Europe’s legal guardrails are already clear: mass detection of private communications and weakening of encryption are incompatible with fundamental rights.
If the Union crosses that line, it will not be out of ignorance but out of political will. The time to decide is now — before infrastructure makes surveillance the default.
Tomaz Lovsin is a Slovenian national residing in Cyprus, with an LL.M. degree in law.
Read full article here:
https://oblivion777.substack.com/p/from-child-safety-to-chat-surveillance