Key Takeaways:
- The EU has introduced a new Chat Control proposal: Mandatory scanning is gone, but Article 4’s ‘risk mitigation’ clause could still push services toward scanning private and encrypted messages.
- Anonymity could be severely limited: Age-verification rules would make anonymous accounts difficult, affecting journalists, whistleblowers, and users who rely on privacy for safety.
- The scope of scanning is expanding: The proposal allows detection of chat text and metadata, raising concerns about large-scale monitoring across the EU’s 450M citizens.
- The technology behind it still isn’t viable: Experts say safe CSAM detection in encrypted apps doesn’t exist yet; even Apple abandoned its own client-side scanning system after backlash.
The Chat Control proposal is back in Brussels. Again.
Lawmakers are treating it like a familiar guest who keeps showing up at the door wearing a slightly different jacket. Privacy experts say the jacket is hiding something sharp.
A revised version of the EU’s Child Sexual Abuse Regulation (CSAR) has now moved from the Law Enforcement Working Party to the Coreper (Committee of Permanent Representatives).
Coreper is the group of permanent representatives from all EU member states. If Coreper approves the text, the Council will adopt its position. After that, the proposal jumps straight into a fast-tracked trilogue.
On paper, the new version looks softer. Mandatory scanning of private chats, photos, and URLs was removed. Scanning is now voluntary. Lawmakers seem happy. They might even feel relieved.
Privacy experts, however, are staring at one line in Article 4 like it’s a hidden knife taped under the table.
Let’s break down what actually changed, what didn’t, and why critics say this version may be even worse than the old one.
The ‘Voluntary’ Scanning That Doesn’t Feel Very Voluntary
The Danish Presidency produced the new compromise after negotiations stalled for over three years.
When the Law Enforcement Working Party met on November 12, the group accepted it with broad support.
No dissenting votes. No further changes needed. A rare moment of harmony inside the EU Council meeting room.
The key change is the removal of mandatory scanning. Messaging apps will not be forced to scan shared pictures, videos, or URLs. Providers like WhatsApp, Signal, Telegram, and email services can choose to scan for CSAM.
It sounds like the pressure is gone.
But then Article 4 happens. It includes something vague, flexible, and extremely powerful. It’s called a ‘risk mitigation measure.’ High-risk services may need to apply ‘all appropriate risk mitigation measures.’ The phrase feels harmless until you imagine how governments could interpret it.
Germany has publicly reaffirmed opposition to a version of the proposal that mandates scanning of encrypted chats. Whether it will maintain that firm stance throughout the negotiations remains uncertain.
Patrick Breyer, digital rights jurist and longtime critic of Chat Control, says this line reintroduces mandatory scanning through the back door. His argument is simple.
If a service is labeled ‘high-risk,’ it might be obliged to scan everything anyway. Even private, end-to-end encrypted content.
Breyer says this could make client-side scanning mandatory. That is when your phone or laptop scans your messages before encryption kicks in. It essentially turns your device into a small police assistant.
You never asked for that. It’s like buying headphones and discovering they also whisper everything you say back to a security office.
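To make the mechanics concrete, here is a minimal sketch of what client-side scanning amounts to in code. It is illustrative only: the blocklist, the `report_to_authority` hook, and the exact-hash matching are hypothetical stand-ins (real proposals contemplate perceptual hashing and classifiers, not SHA-256 lookups).

```python
import hashlib

# Hypothetical blocklist of known-bad digests, pushed to the device.
# Real systems would use perceptual hashes, not exact SHA-256 matches.
KNOWN_BAD_DIGESTS: set[str] = set()

def report_to_authority(message: bytes) -> None:
    # Hypothetical hook: in a deployed system the flagged content
    # would leave the device before encryption ever happens.
    print("match found; content would be forwarded for review")

def send_message(message: bytes, encrypt) -> bytes:
    # The contested step: the device inspects the plaintext *before*
    # the end-to-end encryption layer ever sees it.
    if hashlib.sha256(message).hexdigest() in KNOWN_BAD_DIGESTS:
        report_to_authority(message)
    return encrypt(message)  # encryption happens only after the scan
```

The ordering is the whole point: the scan runs on plaintext, so the encryption that follows no longer guarantees that only sender and recipient can read the message.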
Encryption Isn’t Just a Tech Feature – It’s How Modern Life Works
The biggest concern is the effect on end-to-end encryption. This is the shield that protects private communication on WhatsApp, Signal, and other messengers.
It’s the same shield used by journalists, doctors, activists, lawyers, and everyone who occasionally sends a photo of their passport to a friend for a hotel booking.
Breaking encryption has always been the red line. No government has found a safe way to weaken encryption for criminals without also weakening it for everyone else.
It’s like removing the doors from all apartments in a building because one person is suspected of wrongdoing.
Everyone becomes vulnerable, and burglars get a Black Friday sale they didn’t expect.
The new compromise avoids saying ‘break encryption.’ It uses vague language. However, privacy specialists argue that the outcome remains the same.
If scanning becomes a mandatory risk mitigation measure, encrypted platforms will need to scan content before encryption is applied. That collapses the entire security model.
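To see why, here is a minimal end-to-end encryption sketch using the PyNaCl library, an assumed, simplified stand-in (Signal and WhatsApp use the more elaborate Signal Protocol), but the property it shows is the same: the provider only ever handles ciphertext.

```python
# pip install pynacl  (assumed dependency for this sketch)
from nacl.public import PrivateKey, Box

# Each party generates a keypair on their own device.
alice_key = PrivateKey.generate()
bob_key = PrivateKey.generate()

# Alice encrypts on her device with Bob's public key.
plaintext = b"passport photo for the hotel booking"
ciphertext = Box(alice_key, bob_key.public_key).encrypt(plaintext)

# The ciphertext is all the provider's servers ever see or relay.
# Only Bob's device, holding his private key, can recover the message.
assert Box(bob_key, alice_key.public_key).decrypt(ciphertext) == plaintext
```

Any mandated scan would therefore have to run before the `encrypt()` call or after `decrypt()`, on the user’s own device, which is precisely the client-side scanning scenario described above.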
Anonymous Communication May Also Be on the Line
The Fight Chat Control group published a summary of the new text. They highlight another major change. Anonymous communication becomes nearly impossible.
The proposal requires every user to verify their age before accessing communication services. This eliminates the option to create anonymous accounts.
That affects whistleblowers and journalists. It affects people escaping abusive households. It affects people living under repressive governments who rely on anonymity for safety.
Requiring age verification for every single user is like asking everyone to show their passport before entering a grocery store. It may solve one problem. It creates many more.
Article 6 also includes restrictions that critics call a ‘digital house arrest’ for minors. It bans children from installing many apps associated with grooming risk. The list includes WhatsApp, Instagram, and even online games like Roblox.
Imagine a 15-year-old today without messaging apps or online games. They would end up communicating solely through school assignments and fridge magnets.
Why This Version Worries Experts Even More
The original proposal already concerned privacy advocates. It focused on scanning photos, videos, and URLs for CSAM.
The new version goes further. Breyer notes it includes scanning of private chat text and metadata. Metadata can reveal who you talk to, how often, and from where.
It turns the communication graph of the entire EU population into a map available for inspection.
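A few lines of Python show how little it takes to draw that map. The records below are invented, but the aggregation is the standard traffic-analysis pattern; note that no message content appears anywhere.

```python
from collections import Counter, defaultdict

# Invented metadata records: (sender, recipient, timestamp, cell tower).
records = [
    ("alice", "journalist", "2025-11-12T08:03", "tower_17"),
    ("alice", "journalist", "2025-11-13T22:41", "tower_17"),
    ("alice", "clinic",     "2025-11-14T09:15", "tower_04"),
]

contacts = defaultdict(Counter)   # who talks to whom, and how often
locations = defaultdict(set)      # where they are when they do it

for sender, recipient, _timestamp, tower in records:
    contacts[sender][recipient] += 1
    locations[sender].add(tower)

print(dict(contacts["alice"]))     # {'journalist': 2, 'clinic': 1}
print(sorted(locations["alice"]))  # ['tower_04', 'tower_17']
```

Three rows of logs already expose a repeated journalist contact and a clinic visit. Scale that to 450M users and the ‘map’ stops being a metaphor.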
This shift from media scanning to text scanning is a significant development. It expands what authorities can request. It expands the scope of what companies must monitor to avoid being labeled ‘high-risk.’ And it expands the potential for abuse.
Critics also point out that voluntary scanning does not guarantee privacy.
If one major app decides to comply, others may feel pressure to follow. Competition might turn into a race where the winner is the one who scans the most.
A Political Win, A Technical Minefield
Politically, lawmakers are celebrating. After years of deadlock, they finally have a text that appears less aggressive. Removing mandatory scanning looks like a concession.
It’s easy to present this as a victory for privacy.
Technically, the situation is far from reassuring. The proposal now relies heavily on interpretation. The phrase ‘all appropriate risk mitigation measures’ could mean anything.
It gives enormous discretion to authorities. It lets governments later argue that scanning is essential for safety.
That is why privacy groups call this version a political trick. It removes the scary parts from the front of the bill. Then it grows them back under a different name.
The EU Parliament Will Have Its Say – But History Is Complicated
The next step is Coreper. If they approve the text on November 19 or soon after, the Council will adopt its official position.
Then a trilogue begins between the Council, the Commission, and the European Parliament.
In theory, Parliament could oppose it.
In practice, Parliament has often compromised on surveillance laws after political pressure. Privacy groups fear a rushed trilogue where Parliament gives in to urgency.
The Council and the Commission are now aligned. Both want stronger online monitoring. This alignment alone makes many observers nervous.
The Bigger Story: Europe Keeps Trying to Build Scanning Systems That Don’t Exist
There is a broader theme here. The EU continues to propose scanning systems that experts say cannot operate safely.
The automatic detection of CSAM in encrypted environments remains technically unsolved. Client-side scanning has accuracy issues, privacy concerns, and a potential for misuse.
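The accuracy problem is easy to illustrate. Perceptual hashes are compared by similarity rather than equality, so that re-encoded or cropped copies still match, and the match threshold is a tunable guess. The hash values and threshold below are invented for illustration.

```python
def hamming(a: int, b: int) -> int:
    # Perceptual hashes match on similarity: count differing bits.
    return bin(a ^ b).count("1")

THRESHOLD = 10  # tunable: looser catches more copies, and more innocents

blocklisted   = 0xB6C392A714F05D8E  # invented 64-bit perceptual hash
holiday_photo = 0xB6C392A714F05D81  # unrelated image with a similar hash

if hamming(blocklisted, holiday_photo) <= THRESHOLD:
    print("flagged")  # false positive: an innocent photo gets reported
```

Because hash similarity does not imply semantic similarity, researchers have also demonstrated harmless images engineered to collide with blocklisted hashes, which is exactly the potential for misuse experts keep citing.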
Even Apple backed away from its own client-side scanning feature after heavy criticism from researchers.
The EU is once again attempting to regulate technology that does not yet exist in a safe form. It is similar to writing a law that requires cars to fly by next summer.
The idea might be noble. The engineering reality is not ready.
Governments want a system that detects serious crimes. Researchers want a system that malicious actors cannot exploit. Companies want a system that doesn’t destroy trust.
So far, no system satisfies all three.
The Real Test Is About to Begin
The next few days will decide how far the EU is willing to push this plan.
Coreper will review the text, and if nobody objects, the Council will lock in its position fast. Privacy groups and security experts are raising alarms again because the new compromise still creates a path to mass scanning, even if the language looks softer.
The proposal also threatens anonymity and introduces new monitoring routes that could reshape private communication for 450M people in the EU.
Lawmakers call it progress. Experts call it a warning sign.
Everything now depends on how Article 4 is interpreted and how much power it quietly hands over. The final battle will happen in trilogue, and the tech community is already bracing for impact.