EU Chat Control Advances as Privacy Experts Warn of Hidden Backdoor Risks

  • The EU has introduced a new Chat Control proposal: Mandatory scanning is gone, but Article 4’s ‘risk mitigation’ clause could still push services toward scanning private and encrypted messages.
  • Anonymity could be severely limited: Age-verification rules would make anonymous accounts difficult, affecting journalists, whistleblowers, and users who rely on privacy for safety.
  • The scope of scanning is expanding: The proposal allows detection of chat text and metadata, raising concerns about large-scale monitoring across the EU’s 450M citizens.
  • The technology behind it still isn’t viable: Experts say safe CSAM detection in encrypted apps doesn’t exist yet; even Apple abandoned its own client-side scanning system after backlash.

The Chat Control proposal is back in Brussels. Again. 

Lawmakers are treating it like a familiar guest who keeps showing up at the door wearing a slightly different jacket. Privacy experts say the jacket is hiding something sharp. 

A revised version of the EU’s Child Sexual Abuse Regulation (CSAR) has now moved from the Law Enforcement Working Party to the Coreper (Committee of Permanent Representatives).

Coreper is the group of permanent representatives from all EU member states. If Coreper approves the text, the Council will adopt its position. After that, the proposal jumps straight into a fast-tracked trilogue.

On paper, the new version looks softer. Mandatory scanning of private chats, photos, and URLs was removed. Scanning is now voluntary. Lawmakers seem happy. They might even feel relieved. 

Privacy experts, however, are staring at one line in Article 4 like it’s a hidden knife taped under the table.

Let’s break down what actually changed, what didn’t, and why critics say this version may be even worse than the old one. 

The ‘Voluntary’ Scanning That Doesn’t Feel Very Voluntary

The Danish Presidency produced the new compromise after negotiations stalled for over three years.

When the Law Enforcement Working Party met on November 12, the group accepted it with broad support. 

No dissenting votes. No further changes needed. A rare moment of harmony inside the EU Council meeting room.

It sounds like the pressure is gone.

But then Article 4 happens. It includes something vague, flexible, and extremely powerful. It’s called a ‘risk mitigation measure.’ High-risk services may need to apply ‘all appropriate risk mitigation measures.’ The phrase feels harmless until you imagine how governments could interpret it.

Article 4 extract from the new Chat Control proposal.
Source: Proposal for a Regulation of the European Parliament and of the Council

Germany has publicly reaffirmed opposition to a version of the proposal that mandates scanning of encrypted chats. Whether it will maintain that firm stance throughout the negotiations remains uncertain.

Patrick Breyer, a digital rights jurist, former MEP, and longtime critic of Chat Control, says this line reintroduces mandatory scanning through the back door. His argument is simple.

If a service is labeled ‘high-risk,’ it might be obliged to scan everything anyway. Even private, end-to-end encrypted content.

Patrick Breyer’s post on X.

Breyer says this could make client-side scanning mandatory. That is when your phone or laptop scans your messages before encryption kicks in. It essentially turns your device into a small police assistant. 

You never asked for that. It’s like buying headphones and discovering they also whisper everything you say back to a security office. 
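To make the mechanism concrete, here is a minimal, hypothetical Python sketch of what a client-side check could look like: each outgoing attachment is hashed on the device and compared against a blocklist before it ever reaches the encryption layer. The blocklist entry, function names, and use of SHA-256 are illustrative placeholders; real proposals envisage perceptual hashing and machine-learning classifiers rather than exact hashes, and none of this reflects any actual vendor implementation.

```python
import hashlib

# Hypothetical sketch only. Real client-side scanning designs use perceptual
# hashes and ML classifiers rather than exact SHA-256 matches, and the
# blocklist would be distributed by an authority, not hard-coded.
BLOCKLIST = {"0" * 64}  # placeholder digest, not a real entry


def passes_client_side_scan(attachment: bytes) -> bool:
    """Hash the attachment on the device and check it against the blocklist."""
    digest = hashlib.sha256(attachment).hexdigest()
    return digest not in BLOCKLIST


def send_message(attachment: bytes) -> None:
    # The crucial detail is the ordering: the scan runs BEFORE encryption,
    # so the device inspects exactly the content encryption was meant to protect.
    if passes_client_side_scan(attachment):
        pass  # hand the attachment to the end-to-end encryption layer and send
    else:
        pass  # block, and depending on the rules, report the match
```

The point of the sketch is the ordering: the check runs on your device, before encryption, which is exactly what critics mean when they say the device stops working solely for its owner.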

Encryption Isn’t Just a Tech Feature – It’s How Modern Life Works

The biggest concern is the effect on end-to-end encryption. This is the shield that protects private communication on WhatsApp, Signal, and other messengers.

It’s the same shield used by journalists, doctors, activists, lawyers, and everyone who occasionally sends a photo of their passport to a friend for a hotel booking.

How does end-to-end encryption work?
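For readers who want the mechanics, here is a minimal sketch of that model using the PyNaCl library (an illustrative choice; Signal and WhatsApp use their own protocols built on similar primitives). The key point is that private keys live only on the endpoints, so the server in the middle relays ciphertext it cannot read.

```python
from nacl.public import PrivateKey, Box  # pip install pynacl

# Each party generates a keypair on their own device; private keys never leave it.
alice_secret = PrivateKey.generate()
bob_secret = PrivateKey.generate()

# Alice encrypts with her private key and Bob's public key.
ciphertext = Box(alice_secret, bob_secret.public_key).encrypt(b"See you at 8?")

# The messaging server only ever handles 'ciphertext'. Without a private key,
# it cannot recover the plaintext.

# Bob decrypts with his private key and Alice's public key.
plaintext = Box(bob_secret, alice_secret.public_key).decrypt(ciphertext)
assert plaintext == b"See you at 8?"
```

The important property is that the operator of the service never holds a key that decrypts the conversation.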

Breaking encryption has always been the red line. No government has found a safe way to weaken encryption for criminals without also weakening it for everyone else. 

It’s like removing the doors from all apartments in a building because one person is suspected of wrongdoing. 

Everyone becomes vulnerable, and burglars get a Black Friday sale they didn’t expect.

The new compromise avoids saying ‘break encryption.’ It uses vague language. However, privacy specialists argue that the outcome remains the same. 

If scanning becomes a mandatory risk mitigation measure, encrypted platforms will need to scan content before encryption is applied. That collapses the entire security model. 

Anonymous Communication May Also Be on the Line

The Fight Chat Control group published a summary of the new text. They highlight another major change. Anonymous communication becomes nearly impossible.

Fight Chat Control initiative extract.
Source: fightchatcontrol.eu

The proposal requires every user to verify their age before accessing communication services. This eliminates the option to create anonymous accounts. 

That affects whistleblowers and journalists. It affects people escaping abusive households. It affects people living under repressive governments who rely on anonymity for safety.

Article 6 also includes restrictions that critics call a ‘digital house arrest’ for minors. It bans children from installing many apps associated with grooming risk. The list includes WhatsApp, Instagram, and even online games like Roblox. 

Imagine a 15-year-old today without messaging apps or online games. They would end up communicating solely through school assignments and fridge magnets. 

Why This Version Worries Experts Even More

The original proposal already concerned privacy advocates. It focused on scanning photos, videos, and URLs for CSAM content.

The new version goes further. Breyer notes it includes scanning of private chat text and metadata. Metadata can reveal who you talk to, how often, and from where.

It turns the communication graph of the entire EU population into a map available for inspection.
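As a rough illustration of why metadata alone is sensitive, here is a toy Python sketch over invented records: with no message content at all, a few rows of who contacted whom, when, and from which cell tower are enough to reconstruct relationships and rough whereabouts. All names and records below are fabricated.

```python
from collections import Counter, defaultdict

# Fabricated metadata records: (sender, recipient, timestamp, cell tower).
# Note that no message content appears anywhere in this data.
records = [
    ("alice", "journalist_42", "2025-11-12T22:10", "tower_berlin_07"),
    ("alice", "journalist_42", "2025-11-13T23:05", "tower_berlin_07"),
    ("alice", "clinic_helpline", "2025-11-14T09:30", "tower_berlin_02"),
]

contacts = defaultdict(Counter)   # who talks to whom, and how often
locations = defaultdict(set)      # roughly where they were when they did

for sender, recipient, timestamp, tower in records:
    contacts[sender][recipient] += 1
    locations[sender].add(tower)

print(contacts["alice"])   # Counter({'journalist_42': 2, 'clinic_helpline': 1})
print(locations["alice"])  # e.g. {'tower_berlin_07', 'tower_berlin_02'}
```

Scale that bookkeeping to 450 million people and you get the communication map described above.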

This shift from media scanning to text scanning is a significant development. It expands what authorities can request. It expands the scope of what companies must monitor to avoid being labeled ‘high-risk.’ And it expands the potential for abuse.

Critics also point out that voluntary scanning does not guarantee privacy. 

If one major app decides to comply, others may feel pressure to follow. Competition might turn into a race where the winner is the one who scans the most. 

A Political Win, A Technical Minefield

Politically, lawmakers are celebrating. After years of deadlock, they finally have a text that appears less aggressive. Removing mandatory scanning looks like a concession. 

It’s easy to present this as a victory for privacy.

But critics point to the catch: the vague wording of Article 4. It gives enormous discretion to authorities. It lets governments later argue that scanning is essential for safety.

That is why privacy groups call this version a political trick. It removes the scary parts from the front of the bill. Then it grows them back under a different name. 

The EU Parliament Will Have Its Say – But History Is Complicated 

The next step is Coreper. If it approves the text on November 19 or soon after, the Council will adopt its official position.

Then a trilogue begins between the Council, the Commission, and the European Parliament. 

Fight Chat Control extract on what’s next.
Source: fightchatcontrol.eu

In theory, Parliament could oppose it. 

In practice, Parliament has often compromised on surveillance laws after political pressure. Privacy groups fear a rushed trilogue where Parliament gives in to urgency.

The Council and the Commission are now aligned. Both want stronger online monitoring. This alignment alone makes many observers nervous. 

The Bigger Story: Europe Keeps Trying to Build Scanning Systems That Don’t Exist 

There is a broader theme here. The EU continues to propose scanning systems that experts say cannot operate safely. 

The automatic detection of CSAM in encrypted environments remains technically unsolved. Client-side scanning has accuracy issues, privacy concerns, and a potential for misuse. 

Even Apple backed away from its own client-side scanning feature after heavy criticism from researchers. 

The EU is once again attempting to regulate technology that does not yet exist in a safe form. It is similar to writing a law that requires cars to fly by next summer. 

The idea might be noble. The engineering reality is not ready. 

Governments want a system that detects serious crimes. Researchers seek a system that malicious actors cannot exploit. Companies want a system that doesn’t destroy trust. 

So far, no system satisfies all three. 

The Real Test Is About to Begin

The next few days will decide how far the EU is willing to push this plan. 

Coreper will review the text, and if nobody objects, the Council will lock in its position fast. Privacy groups and security experts are raising alarms again because the new compromise still creates a path to mass scanning, even if the language looks softer. 

The proposal also threatens anonymity and introduces new monitoring routes that could reshape private communication for 450M people in the EU. 

Lawmakers call it progress. Experts call it a warning sign. 

Everything now depends on how Article 4 is interpreted and how much power it quietly hands over. The final battle will happen in trilogue, and the tech community is already bracing for impact. 

