The EU wants to read your emails and messages. What is Chat Control and why is it being discussed again?

  • The European Commission wants to push through a regulation to combat child abuse
  • Under it, there is a risk of automatic monitoring of all messages, including end-to-end encrypted ones
  • Denmark actively wants to promote it, the Czech Republic has not yet taken a specific stance

Adam Kurfürst
19. 8. 2025 08:30

How far are we willing to go when our goal is to protect children and young people from the dangers lurking in the online space? Each of us must find the answer to this question for ourselves – ideally after studying the sources and spending some time thinking critically, because this is certainly not a topic with one trivial solution that would satisfy all sides. Some public officials, however, already have a clear idea, and their decisions in the coming months could significantly affect every one of us. A highly controversial regulation is indeed taking shape within the Council of the European Union.

A far from new regulation with good intentions

In the media and on social networks, you may have encountered the term **Chat Control** in recent years. Its opponents, in particular, use this term to refer to the European Commission’s 2022 proposal, which aims to establish rules for preventing and combating child sexual abuse. In practice, you may also **encounter it under the acronym CSA** (Child Sexual Abuse) **or CSAR** (Child Sexual Abuse Regulation).

EU officials are primarily trying to protect children from dangers in the online space. They cite statistics according to which at least one in five children becomes a victim of sexual violence in childhood, and a 2021 study in which half of the respondents reported having experienced some form of child sexual abuse online. The proposal focuses primarily on the need to detect and report child sexual abuse material (child pornography) in order to prevent its further creation and dissemination.

“It is clear that the EU is currently still not protecting children from becoming victims of sexual abuse, and that the online dimension of this problem poses a particular challenge,” the proposal states.

The intention of the regulation, as presented by the more than three-year-old proposal, can therefore understandably be described as good. All the more so when we consider that children with mental or developmental disabilities, who are also present in the online world, face an even higher risk of sexual violence.

Why Chat Control, and what steps does the EU want to take?

The EU wants to actively combat child abuse in the online space. **Currently, a regulation known as Chat Control 1.0 is in force**, which allows online communication service providers to scan communications voluntarily. So far, only some platforms, such as Gmail, Facebook/Instagram, or Snapchat, make use of this option.

However, this is apparently not considered sufficient, and the European Commission is therefore trying to introduce a far more drastic solution. It is no coincidence that opponents refer to it as Chat Control 2.0 – the new regulation would impose an **obligation on communication application providers to automatically scan all private conversations**. That means yours too. **On WhatsApp, Telegram, Signal, in email. Everywhere.** Even messages **still sitting unsent in the draft stage**. The goal would be a single one: to search for suspicious content and report it to the European Centre for the Prevention and Combating of Child Sexual Abuse.

**End-to-end encryption would not be exempt either.** Specifically, the introduction of client-side scanning is being considered, i.e., inspecting content directly on the user’s phone before it is encrypted. This would essentially require backdoors to be built into applications, which raises concerns about further security risks (attacks by hackers, by the service providers themselves, or by foreign powers).
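To illustrate the principle only, here is a minimal sketch of where client-side scanning would sit in the message flow: content is checked on the device before it is encrypted and handed off to the network. The function names, the plain SHA-256 fingerprint check, and the stand-in “encryption” are illustrative assumptions, not a description of any real app’s implementation.

```python
import hashlib

# Illustrative set of digital fingerprints (hashes) of already-known material.
# Assumption: real detection systems rely on perceptual hashes and large
# curated databases, not a plain SHA-256 set like this one.
KNOWN_FINGERPRINTS = {
    hashlib.sha256(b"example of already-known prohibited content").hexdigest(),
}

def matches_known_material(payload: bytes) -> bool:
    """Hypothetical on-device check that runs *before* encryption."""
    return hashlib.sha256(payload).hexdigest() in KNOWN_FINGERPRINTS

def send_message(payload: bytes) -> str:
    """Sketch of a message pipeline with a client-side scan inserted up front."""
    if matches_known_material(payload):
        # Under the proposed model, a hit would be reported to the EU centre.
        return "flagged and reported"
    # Stand-in for the app's real end-to-end encryption and delivery.
    ciphertext = bytes(reversed(payload))  # NOT real encryption, just a placeholder
    return f"encrypted and sent ({len(ciphertext)} bytes)"

print(send_message(b"hello"))  # -> encrypted and sent (5 bytes)
print(send_message(b"example of already-known prohibited content"))  # -> flagged and reported
```

Even in this toy version, the crux of the criticism is visible: the check necessarily sees the plaintext, so the confidentiality that end-to-end encryption is meant to guarantee is undermined before any encryption takes place.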

European Union documents specifically refer to **three types of harmful content**. The first is “Known CSAM,” which is material previously identified as meeting the definition of child sexual abuse. It is technically the easiest to detect because a digital fingerprint (hash) exists for it. “New CSAM” represents the exact opposite, as it is previously undetected content that can only be machine-detected using more sophisticated tools, such as those employing machine learning. Finally, grooming is discussed – a technique where the perpetrator tries to gain a child’s trust and lure them into a meeting or persuade them to send intimate materials. To detect it, text conversations would have to be automatically scanned and the language and communication patterns within them analyzed.
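As a small, hedged illustration of why the three categories call for different tools: an exact digital fingerprint only matches content that is bit-for-bit identical, so even a trivially edited file escapes a plain hash list, which is why new CSAM and grooming would require machine-learning analysis rather than hash matching. (SHA-256 is used below for simplicity; deployed systems typically rely on perceptual hashes.)

```python
import hashlib

original = b"previously identified image bytes"
edited = b"previously identified image bytes!"  # a single byte appended

print(hashlib.sha256(original).hexdigest())
print(hashlib.sha256(edited).hexdigest())
# The two fingerprints have nothing in common, so an exact hash list only
# catches *known* material; new or modified content would need perceptual
# hashing or machine-learning classifiers instead.
```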

Denmark revives the discussion

Pushing through something like the so-called Chat Control requires a considerable amount of courage. A public consultation by the European Commission, to which former German Pirate Party MEP Patrick Breyer refers, revealed, among other things, that **more than 80% of respondents opposed the introduction of mandatory chat control.**

**The one not afraid to revive the discussion, however, is Denmark.** It **holds the presidency of the Council of the EU** from July 1st until the end of this year and has made pushing CSA through one of the key points of its program, in which it **explicitly states that it has assigned the task a high priority.**

“Law enforcement authorities must have the necessary tools, including access to data, to effectively investigate and prosecute crimes,” is the message coming from Copenhagen.

The Danish Minister of Justice then stated less than a month ago at a press conference in Copenhagen: “We must ask ourselves, whose privacy are we actually most concerned about? Is it the privacy of thousands of children who are sexually abused? Or is it the privacy of ordinary people who may or may not be abused? If they share content related to child sexual abuse, a protective order can be issued against what they share. We must agree on a compromise between these differing views.”

In the EU Council, unless anything changes by then, the **vote on the adoption of the new law will take place on October 14**. At least 15 of the 27 member states, together representing at least 65% of the EU population, must vote in favor.

Currently, only three states are against; the Czech Republic is undecided for now

**So far, only three member states have openly opposed the new regulation: Poland, the Netherlands, and Austria** (although its new government coalition may well change its mind, writes Euronews).

According to a leaked transcript of a July 30 meeting of the EU Council with its lawyers, the Czech Republic is among the countries that have not yet taken a specific stance, reportedly because of the upcoming elections. France and Germany will hold absolutely key positions in the vote, but neither country’s delegation has yet clearly sided with either option.

Some domestic politicians also commented on the situation. For example, Pirate MEP Markéta Gregorová strongly objects to the adoption of the proposal, emphasizing the importance of so-called trilogues in a thread on X. These are informal meetings of representatives of the European Parliament, the EU Council, and the European Commission. The Council and Parliament are currently at opposite ends of the spectrum in terms of opinion, which, **according to Gregorová, could ultimately mean that we will not have Chat Control 2.0 here for at least another four years.**

MEP **Kateřina Konečná (STAČILO!)** also strongly opposes it. In a post on X, she **called on the Czech government to “take a negative stance and protect the rights of its citizens.”**

**The legal service of the EU Council has repeatedly stated that the proposal is contrary to European law**. In its view, it is not in line with Articles 7 and 8 of the EU Charter of Fundamental Rights or with Article 8 of the European Convention on Human Rights. The client-side scanning method mentioned above would reportedly violate the principle of confidentiality of communications.

Let’s discuss: Is such a regulation appropriate?

The European Union faces a fundamental dilemma. On one hand, there is a legitimate effort to protect children from severe forms of abuse, which increasingly occur in the online environment. On the other hand, the Chat Control 2.0 proposal raises concerns that widespread scanning of messages – including encrypted ones – could mean the end of confidential electronic communication, weakening of cybersecurity, and an infringement on the fundamental rights of all citizens.

The question therefore is: is it possible to find a balance between protecting children and protecting privacy, or are these two goals in direct conflict? And if the regulation were to truly lead to weakened encryption and widespread monitoring, are we as a society prepared to accept such a high price for crime prevention?

What is your view on the issue surrounding Chat Control?

Sources: European Commission (1, 2, 3, 4), Patrick-Breyer.de, Euronews, Netzpolitik, TechRadar, Danish Presidency of the Council of the EU

About the author

Adam Kurfürst

Adam is studying at a grammar school and has been involved in technology journalism since he was 14. Setting aside his passion for smartphones, tablets, and accessories, he likes to…
