Key Findings
- Chat control divides Europe: Supporters say it's important for children's safety, while opponents warn it's a “monster” that will undermine privacy.
- Encryption is at risk: Mandatory client-side scanning effectively creates backdoors, weakening end-to-end encryption and exposing users to cyber threats.
- Unproven and invasive: Automated detection systems run the risk of false positives and may violate Article 7 of the EU Charter of Fundamental Rights.
- Precedent outside Europe: Weakening privacy in the EU could embolden less democratic regimes around the world by turning child safety laws into tools of mass surveillance.
The EU's long-running battle for “Chat Control” has returned to the spotlight, and this time it has gained new momentum.
Under the Danish Council Presidency, a proposal to require automatic scanning of private chats – even those protected by end-to-end encryption – is back on the agenda. Its supporters say it is a crucial step in combating the spread of child sexual abuse material (CSAM) online.
Not everyone agrees. While some EU member states such as France, Spain and Italy strongly support the plan, others such as Belgium, Austria and Poland have called it a “privacy monster.”
Civil liberties advocates warn the law could turn Europe's most private spaces, from WhatsApp groups to encrypted emails, into monitoring zones.
Essentially, the debate is forcing Europe to confront a difficult question: is chat monitoring a necessary tool to protect children in the digital age? Or is this just the beginning of mass surveillance under the guise of security?
What does Chat Control propose, and why is it controversial?
At its core, the EU's Chat Control proposal would require digital platforms to scan users' private communications for harmful content. This covers not only traditional social networks, but also encrypted messaging services such as WhatsApp, Signal and Telegram, and possibly email providers.
Under the current draft proposal, service providers would be required to implement client-side scanning tools that identify and flag suspected child sexual abuse material (CSAM) or grooming behavior before messages are encrypted and sent.
Critics argue that for this to happen, platforms would need to weaken end-to-end encryption (E2EE), the feature that ensures only the sender and recipient can read a message.
Stated goal: protecting children online
Lawmakers behind Chat Control stress that the law's goal is to combat large-scale child exploitation. By requiring proactive scanning, they aim to stop the spread of CSAM and intervene in grooming conversations before harm occurs.
However, the reality is not as simple as it seems: there are some notable risks and trade-offs.
Hidden cost: encryption is at risk
Security experts warn that creating a backdoor for one target effectively creates a backdoor for everyone.
Compromising end-to-end encryption (E2EE) in the name of security creates a potential point of attack for hackers, hostile governments and cybercriminals to exploit.
As Edward Snowden said:
“Claiming that you don’t care about privacy because you have nothing to hide is no different than saying you don’t care about free speech because you have nothing to say.”
The law's goals may be noble, but the tradeoff is unfortunately significant. In trying to protect children, they risk eroding the digital foundations of privacy and security.
Divided Europe: who supports and who resists?
The Chat Control proposal splits Europe. On one side, France, Italy, Spain, Sweden, Ireland, Lithuania and Latvia support it, arguing that it is an important step toward protecting children online.
These governments have effectively concluded that the need to act against child exploitation outweighs concerns about increased surveillance.
On the other side, Belgium, the Czech Republic, Austria, the Netherlands and Poland have reacted sharply, calling the proposal a dangerous abuse of power. Belgium even went so far as to describe it as “a monster that invades your privacy.”
Poland's opposition is particularly noteworthy. The country has often been criticized for backsliding on the rule of law and for expanding government surveillance, so its resistance here may look like a sudden shift toward civil liberties.
More likely, however, it reflects discomfort with handing EU institutions surveillance powers that would exceed national control.
Denmark, currently serving as Council President, has made Chat Control a top political priority. The proposal is consistent with the Danish tradition of high trust in governance, where citizens generally assume that institutions act with integrity.
However, critics warn that this model does not necessarily translate to other EU countries. In states such as Spain, Italy, Slovakia, Romania or Bulgaria, where political scandals and corruption are more common, citizens are far less likely to trust that such broad oversight powers will not be abused.
To further complicate matters, Austria has already begun implementing similar encryption-breaking measures at the national level, raising concerns that even if Chat Control fails at the EU level, fragmented versions of the proposal could still evolve and spread across member states.
Technical and ethical shortcomings
If adopted, Chat Control poses significant risks. Let's look at the three biggest.
Encryption's Achilles' heel
The main problem with Chat Control is its impact on end-to-end encryption (E2EE). Services such as Signal or WhatsApp depend on E2EE to ensure that only the sender and recipient can read the message.
Forcing providers to scan content before encrypting it effectively creates a backdoor. Lawmakers insist the backdoor would be used solely to detect child abuse material. But let's be realistic: if such a backdoor exists, it is only a matter of time before a skilled attacker finds a way to exploit it.
False positives and untested technology
The technology behind Chat Control remains largely untested. Automated scanning tools designed to detect CSAM are prone to false positives, where harmless images or messages may be flagged as illegal.
In other words, innocent users could come under investigation on the basis of faulty, unproven algorithms. Worse, there is no solid evidence that such detection systems prevent abuse at scale, which would leave Europe with a heavy surveillance infrastructure and no proven benefit.
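The false-positive mechanics can be sketched with a toy fuzzy matcher. All values below are made up; real systems (PhotoDNA-style perceptual hashing) are proprietary, but they share the same structural weakness: matching is done within a distance threshold, so an unrelated item can land inside it by chance.

```python
def hamming(a: int, b: int) -> int:
    """Number of differing bits between two fingerprints."""
    return bin(a ^ b).count("1")

# Hypothetical 16-bit perceptual fingerprint of a flagged image
# (real fingerprints are far longer; this value is invented).
flagged_fingerprint = 0b1010_1100_1110_0001

# Fingerprints within this many bits are treated as a "match".
THRESHOLD = 4

def is_flagged(fingerprint: int) -> bool:
    return hamming(fingerprint, flagged_fingerprint) <= THRESHOLD

# An unrelated image whose fingerprint happens to differ by only
# 2 bits is flagged as well: a false positive.
innocent_fingerprint = flagged_fingerprint ^ 0b0110
```

Raising the threshold catches more genuine re-encodes of known material, but also sweeps in more innocent near-misses; lowering it does the reverse. There is no setting that eliminates both error types, which is the core of the false-positive concern.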
Rights at stake
Make no mistake, the technical risks of Chat Control are significant. But beyond these risks, there is also a fundamental legal and ethical issue.
The European Parliament has warned that such blanket scanning may violate Article 7 of the EU Charter of Fundamental Rights, which protects private and family life. In other words, what is presented as child protection could amount to mass surveillance, undermining the EU's own system of rights.
The Big Question: Security or Surveillance in Europe's Future?
No one on either side of the debate opposes the goal of protecting children online. This is the emotional core of the chat control proposal and the main reason it continues to gain traction in the EU.
But the real question is not whether protecting children is important, but what price society is willing to pay for it.
Mandatory mass scanning of private conversations threatens to undermine digital freedoms in a way that is difficult to reverse. History shows that once the infrastructure for accessing encrypted messages is created, it is rarely limited to a single purpose.
What begins as a tool against child abuse can quickly spread to other areas: dissent, political speech, or even everyday personal interactions.
As George Orwell warned in 1984:
“Nothing belonged to you except the few cubic centimeters inside your skull.”
The current debate is focused on the EU, but the implications go far beyond it. If the EU weakens encryption, it risks setting a precedent that less democratic governments will happily follow. This could lead to a troubling new norm in which private communication is no longer truly private.
As Edward Snowden once said, ignoring privacy because you have “nothing to hide” is like ignoring free speech because you have “nothing to say.” Privacy is not a luxury; it is a fundamental democratic right. And once it's gone, it rarely comes back.