Loophole in Chat Control 2.0 compromises information security

Mass surveillance

Published October 14, 2023 – By Karl Emil Nikka

The controversial mass surveillance proposal, Chat Control 2.0, is plagued by several technical and information security-related issues. The biggest problem is the requirement that even end-to-end encrypted communication services be included. This requirement exists despite it being technically impossible for service providers to scan the contents of properly end-to-end encrypted conversations. It has always been this way, and it always will be.

The face of the mass surveillance proposal, EU Commissioner Ylva Johansson, initially believed that such scanning was possible (see for instance Many inaccuracies around Chat Control 2.0 in the 'Aktuellt' interview). She likened the process to how a drug-sniffing dog can sniff for drugs in closed bags. This analogy is completely incorrect because properly end-to-end encrypted conversations never leak any sniffable traces of their content. It doesn't matter how advanced future scanning technology becomes because there are never any traces to scan ("sniff") for.

Proponents of the proposal, therefore, want to bypass the function of end-to-end encryption by implementing a technology called client-side scanning. This means that service providers have to equip their apps with backdoors, allowing them to scan the content before it is sent (before it's encrypted) and after it has been received (after it has been decrypted). This is the technology that the UN's Human Rights Commissioner explicitly advises against, partly due to the dangers it poses for vulnerable children and adults in totalitarian states. (The imminent risk of data leaks and the obvious risk of self-censorship are two other reasons highlighted by the UN's Human Rights Commissioner.)

The loophole in the definition

From a strictly technical perspective, client-side scanning could be implemented without either prohibiting end-to-end encryption or weakening the encryption. Technically speaking, the client-side scanning itself doesn't affect the encryption. Client-side scanning merely causes the encryption to cease serving its purpose. With implemented client-side scanning, conversation participants continue to send messages end-to-end encrypted to each other, but both parties simultaneously have a spy looking over their shoulder, seeing everything they write and hearing everything they say.

This definitional loophole is now being exploited by several parties. The parties and their EU Parliamentarians claim that they want to allow end-to-end encryption, yet at the same time, they demand that the content in end-to-end encrypted services can be scanned. In this way, their permission of end-to-end encryption becomes irrelevant. This loophole argument was, incidentally, precisely what I feared when I expressed my skepticism in an interview with Dagens Nyheter at the end of April (see comment in Possible EU turnaround on chat control law – must not weaken encryption).

On our theme website, chatcontrol.se, we have a monitoring database with over 400 Swedish articles written about the proposal. I've reviewed these articles as well as the amendment proposals that Swedish parties' EU Parliamentarians have put forward. Based on this, I've been able to identify which proposal advocates are trying to mislead the public by allowing end-to-end encryption while simultaneously demanding that end-to-end encryption be bypassed.

The Tidö agreement parties and the Green Party

The governing parties have presented a proposal for Sweden's position in the Council of Ministers. The proposal contains the following passage, which paradoxically demands that encrypted messages be protected while also requiring that they be scanned:

"A tracing order must ultimately be executed without being impeded by a service being encrypted, for example, through machine scanning before the message is encrypted and sent. At the same time, information security must not be jeopardized; encrypted messages should be protected against unauthorized access".

(From an appendix to a document from the EU Committee 2023/24:4F1902, 2023-09-18)

In the European Parliament, neither the Moderates nor the Christian Democrats share the stance of the Swedish government. Both the Moderates and the Christian Democrats are clear that the function of end-to-end encryption must never be undermined. This is evident in amendment 389 signed by all EU Parliamentarians from the Moderates and the Christian Democrats (Arba Kokalari, Jessica Polfjärd, Tomas Tobé, Jörgen Warborn, David Lega, and Sara Skyttedal).

"End-to-end encryption is an essential tool to guarantee the security, privacy, and confidentiality of the communications between users, including those of children. Any weakening of the end-to-end encryption’s effect could potentially be abused by malicious third parties. Nothing in this Regulation should therefore be interpreted as prohibiting or compromising the integrity and confidentiality of end-to-end encrypted content and communications. As compromising the integrity of end-to-end encrypted content and communications shall be understood the processing of any data, that would compromise or put at risk the integrity and confidentiality of the aforementioned end-to-end encrypted content. Nothing in this regulation shall thus be interpreted as justifying client-side scanning with side-channel leaks or other measures by which the provider of a hosting service or a provider of interpersonal communication services provide third party actors access to the end-to-end encrypted content and communications".

(Amendment 389, 2023-07-28)

The Sweden Democrats have not criticized the government's line domestically. However, in the European Parliament, the Sweden Democrats have clarified that they are opposed to the proposal. SD Parliamentarian Johan Nissinen has signed the same amendment as the Moderates and the Christian Democrats (amendment 389).

The Green Party, which was previously opposed to the proposal, has now chosen to support the government's line, even though the Green Party initially said they did not want to support "the parts that involve mandatory scanning of private communication as it is formulated in the Commission's proposal right now" (2023-04-18). The change is evident from the minutes of the Justice Committee's meeting on 2023-09-14 and is confirmed by Rasmus Ling in an interview with Syre (2023-09-22).

The Social Democrats

The Social Democrats in Sweden support the Presidency's (Spain) compromise proposal. This is reflected in the minutes of the Justice Committee meeting on September 14, 2023.

In addition, in the European Parliament, three Social Democratic MEPs are trying to use the same loophole to advocate for scanning of end-to-end encrypted services without banning end-to-end encryption.

Heléne Fritzon and Carina Ohlsson first want to introduce an amendment to allow for end-to-end encryption. They want to add the following point to Article 10's list of technologies and safeguards.

"[The technologies shall be] not able to prohibit or make end-to-end encryption impossible".

(From Amendment 1161, 2023-07-28)

In the introductory recitals, they also stress, together with S-Parliamentarian Evin Incir, that nothing in the proposal should be interpreted as prohibiting end-to-end encryption.

"Nothing in this Regulation should therefore be interpreted as prohibiting end-to-end encryption or making it impossible".

(From Amendment 385, 2023-07-28)

However, Heléne Fritzon and Carina Ohlsson also want the following addition to Article 7 (issuance of detection orders).

"For the scope of this Regulation and for the sole purpose to prevent and combat child sexual abuse, providers of interpersonal communications services shall be subjected to obligations to prevent, detect, report and remove online child sexual abuse on all their services, which may include as well those covered by end-to-end encryption, when there is a significant risk that their specific service".

(From Amendment 1049, 2023-07-28)

Other parties

The Left Party and the Center Party have, unlike the other parliamentary parties, chosen not to use the definitional loophole. Both the Left Party and the Center Party instead side with the children and distance themselves from the mass surveillance proposal that violates the Convention on the Rights of the Child.

 


This article is published by Nikka Systems under the CC BY 4.0 license, except for quotes and images where another source is indicated.

The position of the Swedish parties

More information on the positions of all parties and MEPs can be found on the thematic website chatcontrol.se. The information on these positions is also updated on a weekly basis.


Email was never built for privacy

Mass surveillance

How Proton makes email privacy simple.

Published November 8, 2025 – By Naomi Brockwell

Email was never built for privacy. It’s closer to a digital postcard than a sealed letter, bouncing through and sitting on servers you don’t control, and mainstream providers like Gmail read and analyze everything inside.

Email isn’t going anywhere; it’s baked into how the digital world communicates. Luckily, there are ways to make your emails more private. One tool you can use is PGP, which stands for “Pretty Good Privacy”.

PGP is one of the oldest and most powerful tools for email privacy. It takes your message and locks it with the recipient’s public key, so only they can unlock it with their private key. That means even if someone intercepts the email, whether it’s a hacker, your ISP, or a government agency, they see only scrambled text.
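The public/private key idea behind PGP can be illustrated with a toy example. To be clear, this is a sketch for intuition only: it uses textbook RSA with tiny primes, whereas real PGP implementations use vetted crypto libraries, far larger keys, and a hybrid scheme in which a symmetric session key is encrypted with the recipient's public key.

```python
# Toy illustration of the public/private key idea behind PGP.
# Textbook RSA with tiny primes -- for intuition only, never for real use.

def make_keypair():
    p, q = 61, 53                # two small primes (insecure, demo only)
    n = p * q                    # modulus, shared by both keys
    phi = (p - 1) * (q - 1)
    e = 17                       # public exponent
    d = pow(e, -1, phi)          # private exponent: e * d == 1 (mod phi)
    return (e, n), (d, n)        # (public key, private key)

def encrypt(msg, pub):
    e, n = pub
    return [pow(ord(ch), e, n) for ch in msg]

def decrypt(cipher, priv):
    d, n = priv
    return "".join(chr(pow(c, d, n)) for c in cipher)

public, private = make_keypair()
ciphertext = encrypt("hi", public)   # anyone holding the public key can do this
print(decrypt(ciphertext, private))  # only the private key holder recovers "hi"
```

The asymmetry is the point: the key used to lock the message cannot unlock it, which is why an intercepted email is just scrambled numbers to everyone except the intended recipient.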

Unfortunately, PGP is notoriously complicated to use on its own. Normally, you’d have to install command-line tools, generate keys manually, and run cryptic commands just to send an encrypted email.

But Proton Mail makes all of that easy, and builds PGP right into your inbox.

How Proton makes PGP simple

Proton is a great, privacy-focused email provider (and no they’re not sponsoring this newsletter, they’re simply an email provider that I like to use).

If you email someone within the Proton ecosystem (i.e. send an email from one Proton user to another Proton user), your email is automatically end-to-end encrypted using PGP.

But what if you email someone outside of the Proton ecosystem?

Here’s where it would usually get tricky.

First, you’d need to install a PGP client, which is a program that lets you generate and manage your encryption keys.

Then you’d run command-line prompts, choosing the key type, size, expiration, associating the email you want to use the key with, and you’d export your public key. It’s complicated.

But if you use Proton, they make using PGP super easy.

Let’s go through how to use it.

Automatic search for public PGP key

First of all, when you type an email address into the “To” field in Proton Mail, it automatically searches for a public PGP key associated with that address. Proton checks its own network, your contact list, and Web Key Directory (WKD) on the associated email domain.

WKD is a small web standard that lets someone publish their public key on their own domain in a way that email apps can easily find it.

If Proton finds a key, you’ll see a green lock next to the recipient in the “To” field, indicating the message will be encrypted.

You don’t need to copy, paste, or import anything. It just works.

Great, your email has been automatically encrypted using PGP, and only the recipient of the email will be able to use their private key to decrypt it.
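For the curious, the key lookup described above can be approximated in a few lines. This sketch implements the "direct method" from the OpenPGP Web Key Directory draft specification: the local part of the address is lowercased, SHA-1 hashed, encoded with z-base-32, and looked up under a well-known path on the recipient's domain. It is a simplification (real clients, Proton included, also try other discovery methods, such as an "advanced" WKD variant on an openpgpkey subdomain).

```python
# Sketch of a WKD "direct method" lookup URL, per the OpenPGP Web Key
# Directory draft spec. Simplified: assumes a plain ASCII local part.
import hashlib

ZB32 = "ybndrfg8ejkmcpqxot1uwisza345h769"  # z-base-32 alphabet

def zbase32(data: bytes) -> str:
    """Encode bytes in z-base-32 (5 bits per character, MSB first)."""
    bits = "".join(f"{b:08b}" for b in data)
    return "".join(ZB32[int(bits[i:i + 5].ljust(5, "0"), 2)]
                   for i in range(0, len(bits), 5))

def wkd_direct_url(address: str) -> str:
    """Build the URL where a mail app would look for this address's key."""
    local, domain = address.lower().split("@")
    digest = hashlib.sha1(local.encode()).digest()  # 20 bytes -> 32 chars
    return (f"https://{domain}/.well-known/openpgpkey/hu/"
            f"{zbase32(digest)}?l={local}")

print(wkd_direct_url("alice@example.org"))
```

If an HTTPS GET of that URL returns a key, the mail app can encrypt to it immediately, which is exactly the "it just works" behavior you see in Proton's compose window.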

Manually uploading someone’s PGP key

What if Proton doesn’t automatically find someone’s PGP key? You can hunt down the key manually and import it. Some people will have their key available on their website, either in plain text, or as a .asc file. Proton allows you to save this PGP key in your contacts.

To add one manually:

  • Type the contact’s email address in the “To” field.
  • Right-click the address and select “View contact details”.
  • Click the settings wheel to open the email settings, and select “Show advanced PGP settings”.
  • Under “Public keys”, select “Upload” and upload their public key as an .asc file.

Once the key is uploaded, the “encrypt emails” toggle will automatically switch on, and all future emails to that contact will automatically be protected with PGP. You can turn that off at any time, and also remove or replace the public key.

How do others secure emails to you using PGP?

Super! So you’ve sent an encrypted email to someone using their PGP key. What if they want to send you an email back? Will that be automatically end-to-end encrypted (E2EE) using PGP? Not necessarily.

In order for someone to send you an end-to-end encrypted email, they need your public PGP key.

Download your public-private key pair inside Proton

Proton automatically generates a public-private key pair for each address that you have configured inside Proton Mail, and manages encryption inside its own network.

If you want people outside Proton to be able to encrypt messages to you, the first step is to export your public key from your Proton account so you can share it with them.

To do this:

  • Go to Settings
  • Click “All settings”
  • Select “encryption and keys”
  • Under “email encryption keys” you’ll have a dropdown menu of all your email addresses associated with your Proton account. Select the address that you want to export the public key for.
  • Under the “action” column, click “export public key”

The key will download as an .asc file, and your browser will ask where you want to save it.

Normally a PGP key is raw binary data. The .asc file is an ASCII-armored version of the key: the binary data is encoded as readable text and wrapped between the lines “-----BEGIN PGP PUBLIC KEY BLOCK-----” and “-----END PGP PUBLIC KEY BLOCK-----”, so it can be pasted anywhere plain text is accepted.

Sharing your public key

Now that you’ve downloaded the public key, how do you share it with people so that they can contact you privately? There are several ways.

For @proton.me and @protonmail.com addresses, Proton publishes your public key in its WKD automatically. You don’t have to do anything.

For custom domains configured in Proton Mail, Proton doesn’t host WKD for you. You can publish WKD yourself on your own domain by serving it at a special path on your website. Or you can delegate WKD to a managed service. Or if you don’t want to use WKD at all, you can upload your key to a public keyserver like keys.openpgp.org, which provides another way for mail apps to discover it.

We’re not going to cover those setups in this article. Instead here are simpler ways to share your public key:

1) You can send people your .asc file directly if you want them to be able to encrypt emails to you (be sure to let them know which email address is associated with this key), or you can host this .asc file on your website for people to download.

2) You can open the .asc file in a text editor, copy the key, and then send people this text or publish it on your website. This is what I have done on my own website.

This way if anyone wants to send me an email more privately, they can do so.

But Proton makes it even easier to share your PGP key: you can opt to automatically attach your public key to every email.

To turn this on:

  1. Go to Settings → Encryption & keys → External PGP settings
  2. Enable
    • Sign external messages
    • Attach public key

Once this is on, every email you send will automatically include your public key as a small .asc attachment.

This means anyone using a PGP-capable mail client (like Thunderbird, Mailvelope, etc.) can import it immediately, with no manual steps required.

Password-protected emails

Proton also lets you send password-protected emails, so even if the other person doesn’t use PGP you can still keep the contents private. This isn’t PGP: Proton encrypts the message and attachments in your browser, and the recipient gets a link to a secure viewing page. They enter a password you share separately to open it. Their provider (like Gmail) only sees a notification email with a link, not the message itself. You can add a password hint, and the message expires after a set time (28 days by default).

The bottom line

Email privacy doesn’t have to be painful. Proton hides the complexity by adding a password option, or automating a lot of the PGP process for you: it automatically looks up recipients’ keys, encrypts your messages, and makes your key easy for others to use when they reply.

As Phil Zimmermann, the creator of PGP, explained in Why I Wrote PGP:

“PGP empowers people to take their privacy into their own hands. There has been a growing social need for it. That’s why I wrote it.”

We’re honored to have Mr. Zimmermann on our board of advisors at Ludlow Institute.

Pioneers like him fought hard so we could protect our privacy. It’s on us to use the tools they gave us.

 

Yours in privacy,
Naomi

Naomi Brockwell is a privacy advocate and professional speaker, MC, interviewer, producer, and podcaster, specialising in blockchain, cryptocurrency, and economics. She runs the NBTV channel on Rumble.

Swedish police secretly using Palantir’s surveillance system for years

Mass surveillance

Published November 4, 2025 – By Editorial staff
Palantir Technologies headquarters in Silicon Valley.

The Swedish Police Authority has for at least five years been using an AI-based analysis tool from the notorious American security company Palantir.

The program, which has been specially adapted for Swedish conditions, can within seconds compile comprehensive profiles of individuals by combining data from various registers.

Behind the system stands the American tech company Palantir, which is internationally controversial and has been accused of involvement in surveillance activities. This summer, the company was identified in a UN report as complicit in genocide in Gaza.

The Swedish version of Palantir's Gotham platform is called Acus and uses artificial intelligence to compile, analyze and visualize large amounts of information. According to an investigation by the left-wing newspaper Dagens ETC, investigators using the system can quickly obtain detailed personal profiles that combine data from surveillance and criminal registers with information from Bank-id (Sweden's national digital identification system), mobile operators and social media.

A former analyst employed by the police, who chooses to remain anonymous, describes to the newspaper how the system was surrounded by great secrecy:

— There was a lot of hush-hush around that program.

Rejection of document requests

When the newspaper requested information about the system and how it is used, it was met with a rejection. The Swedish Police Authority cited confidentiality, stating that it can neither "confirm nor deny relationships with Palantir" due to "danger to national security".

This is not the first time Palantir's tools have been used in Swedish law enforcement. In the high-profile Operation Trojan Shield, the FBI, with support from Palantir's technology, managed to infiltrate and intercept the encrypted messaging app Anom.

The operation led to the arrest of a large number of people connected to serious crime, both in Sweden and internationally. The FBI called the operation "a shining example of innovative law enforcement".

But the method has also received criticism. Attorney Johan Grahn, who has represented defendants in several Anom-related cases, is critical of the approach.

— In these cases, it has been indiscriminate mass surveillance, he states.

Mapping dissidents

Palantir has long sparked debate due to its assignments and methods. The company works with both American agencies and foreign security services.

In the United States, the surveillance company's systems are used to map undocumented immigrants. In the United Kingdom, British police have been criticized for using the company's technology to build registers of citizens' sex lives, political views, religious affiliation, ethnicity and union involvement – information that according to observers violates fundamental privacy principles.

This summer, a UN report also identified Palantir as co-responsible for acts of genocide in Gaza, after the company's analysis tools were allegedly used in attacks where Palestinian civilians were killed.

How extensive the Swedish police's use of the system is, and what legal frameworks govern the handling of Swedish citizens' personal data in the platform, remains unclear as long as the Swedish Police Authority chooses to keep the information classified.

IT expert warns: ID requirements online bring us closer to totalitarian surveillance

Mass surveillance

Published November 3, 2025 – By Editorial staff
Swedish Liberal Party politician Nina Larsson wants to introduce age verification – but IT experts warn of serious consequences

IT security specialist Karl Emil Nikka advises Sweden against following the UK's example of mandatory age verification on pornographic websites. The risk of data breaches and increased surveillance is too great, he argues.

Swedish Gender Equality Minister Nina Larsson wants Sweden to introduce technical barriers requiring age verification on pornographic websites to protect children from explicit sexual content.

The proposal is based on the British model where websites must verify users' age or identity, for example through authentication with ID cards or credit cards.

But Karl Emil Nikka, an IT security specialist, is strongly critical of the proposal. He points to serious flaws in the British solution, not least the risk of data breaches.

As an example, he mentions the leak from the messaging platform Discord, where photos of 70,000 users ended up in the wrong hands after a cyberattack in connection with the law change. Additionally, the barriers are easy to circumvent using VPN services, which caused the use of such services to skyrocket when the British law came into effect.

Risks surveillance

Nikka also warns that requirements for online identification bring Sweden closer to a type of surveillance that otherwise only exists in totalitarian states.

— It's a small problem as long as we live in a democracy, but it's damn dangerous to believe we always will, he says.

Instead, parents should be encouraged to use the controls already built into phones and other devices, where one can easily choose which sites to block.

— From a security perspective, it's the only reasonable solution, Nikka states.

Foreign sites attract

An additional risk with technical barriers is that young users turn to lesser-known foreign sites that don't care about legal requirements, Nikka argues. Jannike Tillå, head of communications and social benefit at the Swedish Internet Foundation, confirms this picture.

— According to experts in various countries, it seems that people have turned to other lesser-known websites abroad, she says.

However, Tillå believes that technical solutions can have a place, provided they are more anonymous than the British ones and combined with other measures.

— It can help raise thresholds and reduce exposure.

Conversations crucial

At the same time, she emphasizes the importance of complementing any technical solutions with investments in digital literacy and, above all, conversations between parents and children.

— That's where real protection begins. We know that many parents find it difficult to have the porn conversation, but you should do it early, says Jannike Tillå.

She stresses that the question of privacy and freedom online must not be set against child protection.

— We must find that balance and manage both things, she concludes.

Safety apps normalize surveillance of children

Mass surveillance

Published October 15, 2025 – By Editorial staff
Swedish researcher Katarina Winter warns that surveillance of children has become normalized when technology is packaged as care rather than control.

Apps promised to increase safety are often used for everyday logistics – and normalize secret surveillance.

Researchers at Stockholm University have examined 48 Swedish safety apps and warn that the technology is packaged as care while ethical questions disappear.

In two research projects at Stockholm University in Sweden, researchers are investigating various safety technologies in Sweden – everything from digital safety maps and security sensors to apps marketed as tools for creating safer communities. But instead of measuring whether the technology works, the researchers critically examine its consequences.

— It's important to ask what kind of safety we're after, and for whom? What is worth calling safety? Which actors and interests determine what constitutes safety in a society? The project on safety apps shows, among other things, how surveillance becomes normalized when we use this technology, says Katarina Winter, associate professor and senior lecturer in criminology and doctor in sociology at Stockholm University.

She leads the projects, which are conducted in collaboration with researchers from the University of Gävle and Södertörn University. The researchers have mapped 48 Swedish safety apps and interviewed both developers and users, including parents who use apps to keep track of their children.

"The technology is so kindly framed"

A central finding is how normalized it has become to monitor children, often without their knowledge.

— One example is how normalized it has become to monitor your children even though they don't know about it, although some have an agreement with their children. Because the technology is so kindly framed – that it's about protecting the children – it doesn't become something you have to stand up for as a parent. The normalization can therefore happen under the radar. When technology is packaged as care, we easily lose sight of the ethical questions, she explains.

The surveillance also affects family relationships.

— Many use the apps to avoid nagging their children, and in the short term it may be convenient and simplify family logistics. But something happens on an interpersonal level, we cut off part of the interaction between each other. It's seen as deviant behavior if you don't want to share your location, which I think is negative.

Confusing messages during adult education center shooting

The researchers see a clear discrepancy between developers' ideals about a safer society and how the apps are actually used. For private individuals, it's often about completely different things than safety.

— In a way, these parents reproduce an insecurity in society related to crime and vulnerability when they justify why they use an app. But in reality, it's often extremely connected to everyday logistics – when should I start cooking the pasta depending on where my child is? explains the criminologist.

The researchers have also examined the school safety app CoSafe, which was used during the shooting at Campus Risbergska, an adult education center in Örebro, Sweden. The app was criticized for sending contradictory alerts about both evacuation (leaving the building) and lockdown (staying inside and seeking shelter). Of the total eleven people killed, two were students who followed the instruction to evacuate instead of seeking shelter indoors.

— The Risbergska case demonstrates the complexity of technical solutions for crisis situations. While the app may have helped some seek shelter, the incident raises important questions about responsibility distribution and technical reliability when it comes to life and death, Winter notes.

Private actors profit from insecurity

The researcher also sees how private companies use the public debate about insecurity to sell their solutions, particularly to municipalities.

— We have both a political landscape that focuses on insecurity and a market that takes it on because it's in focus. It's logical that opportunities for entrepreneurship are found in the societal debate we're in, but it becomes more brutal when it comes to safety than with other phenomena. Partly because actors profit from portraying society as unsafe, and partly because companies are generally interested in specific user groups that may not have many safety problems.

She calls for a critical attitude toward technological optimism.

— It's important to pause on these questions that otherwise tend to rush ahead in a kind of faith that 'now everything will be better because we have new technology'. When the overarching word is safety, questions about surveillance and privacy risk being deprioritized.