“Many misleading claims about Chat Control 2.0”

Mass surveillance

Ylva Johansson chooses to ignore the fact that a mass surveillance proposal requires mass surveillance, writes IT security expert Karl Emil Nikka.

Updated March 6, 2024, Published September 28, 2023
IT security expert Karl Emil Nikka. EU Commissioner Ylva Johansson.
This is an opinion piece. The author is responsible for the views expressed in the article.

One of the topics discussed in last week's episode of Medierna i P1 was the European Commission's controversial mass surveillance proposal Chat Control 2.0 and its consequences for journalists. The episode featured EU Commissioner Ylva Johansson, IT and media lawyer Daniel Westman, and Anne Lagercrantz, President of the Swedish Publishers Association.

Westman and Lagercrantz were critical of the mass surveillance proposal, partly because of the consequences for the protection of sources. The Swedish Association of Journalists and the Swedish Newspaper Publishers have previously warned about the consequences of the proposal for the same reasons.

Comically enough, the pre-recorded interview began with Johansson asking if she could call Martina Pierrou, the interviewing journalist, via Signal or WhatsApp instead.

At the time of the interview, Johansson and Pierrou were able to talk via Signal, but if the mass surveillance proposal goes through, that possibility will disappear. In a response to me on X (Twitter), Signal's CEO announced that they will leave the EU if they are forced to build backdoors into their app.

This is a very wise decision on Signal's part as such backdoors undermine the safety and security of children and adults around the world. The rest of the world should not have to suffer because we in Europe are unable to stop EU proposals that violate human rights, the Convention on the Rights of the Child and our own EU Charter.

Below is an analysis of all the statements made by Johansson in the interview. The quotes are printed in full. The time codes link directly to the paragraphs in the section where the claims were made.

Incorrect claim that a court decision is required

When asked about what the bill means in practice (18:55), Johansson repeated her recurring lie that a court order would be required to scan communications. She explained the practical implications of the proposal with the following sentence.

"To force the companies to make risk assessments, to take measures to ensure that their services are not used for this terrible crime and ultimately to make it possible, by court order, to also allow the scanning of communications to find these abuses." - Ylva Johansson (2023-09-23)

Pierrou followed up with a remark that the proposal may require scanning without suspicion of crime against any individual (19:24). Ylva Johansson responded as follows.

"No, scanning will take place when there is a risk that a certain service is being used extensively to spread these criminal offenses. Then a court can decide that scanning is permitted and necessary." - Ylva Johansson (2023-09-23)

The suggestion that a court decision would be required is incorrect. Johansson made the same claim in the debate against me in Svenska Dagbladet in April this year (the only debate in the Swedish media that Johansson has participated in). I then offered to correct her claim, in order to find out whether she knew that her proposal did not require a court decision. The proposal also accepts decisions from administrative authorities. Johansson knew this. Nevertheless, she repeated the lie in the interview in SVT Aktuellt (April 2023), in Ekot's Saturday interview (June 2023), and now in Medierna i P1.

Omitted consequence

In the answer to the same question, Johansson omitted the most crucial point, namely that backdoors are a prerequisite for the scanning of end-to-end encrypted conversations to be done at all. Once these backdoors are in place, they can be abused and cause data leaks. Other states, such as the US where most of the affected services are based, can use the backdoors to scan for content they are interested in.

The proposal states that service providers may only use their position to scan for child abuse material and grooming attempts. But even if we ignore the likely purpose creep, this does not matter. Today, we have technical protections that make our end-to-end encrypted conversations impossible to intercept. The European Commission wants to replace these technical protections with legal restrictions on what the new backdoors may (and may not) be used for.

This naivety is unprecedented. It is incomprehensible to me how the EU can believe that the US would allow American companies to install backdoors that are limited to the EU's prescribed use. As a thought experiment, consider how the EU would react if the US tried to do the same to our companies.

If we take into account the highly likely purpose creep, the situation gets even worse. We only have to go back to 2008 to demonstrate this. At that time, the FRA debate was in full swing and FRA Director General Ingvar Åkesson wrote a debate article in Svenska Dagbladet with the following memorable words.

"FRA cannot spy on domestic phenomena. /.../ Yet the idea is being cultivated that FRA should listen to all Swedes' phone calls, read their e-mails and text messages. A disgusting idea. How can so many people believe that a democratically elected parliament would wish its people so ill?" - Ingvar Åkesson (2008-06-29)

Fifteen years later, Åkesson can hopefully understand how so many people could believe that a democratically elected parliament would wish its people so ill. Right now, exactly this "disgusting idea" (the Director General's choice of words) is being proposed.

Belief in the existence of non-existent technologies

Pierrou then asked how the solution would actually work. Pierrou pointed out that "according to an opinion from the European Data Protection Board, the technology required by the proposal does not exist today" (19:55).

Johansson responded with a quote that will go down in history.

"I believe that there is. But my bill is technology-neutral and that means that we set standards for what the technology must be able to do and what high standards of integrity the technology must meet." - Ylva Johansson (2023-09-23)

Here Johansson again shows that she based her proposal on incorrect assumptions about how technology works. After having been refuted by the world's experts, she is now forced to switch to opinion arguments such as "I believe it exists".

Whether technology exists (or can exist) is of course not a matter of opinion. It is, always has been, and always will be technically impossible to scan the content of properly end-to-end encrypted conversations.

To smooth over the embarrassment, Johansson pointed out that the bill is technology-neutral. This may sound good, but in this context it means nothing. Setting standards for what technology must be able to do is only embarrassing when it is done without first examining what is practically possible.

If service providers of end-to-end encrypted services are to be able to scan the content of conversations, they must build in backdoors. The backdoors allow them to scan the content before it is encrypted and after it has been decrypted. Without backdoors, scanning is and remains technically impossible.

Opinion on mass surveillance in mass surveillance proposals

Pierrou concluded the interview by asking what Johansson thought about the proposal being portrayed as a mass surveillance proposal (20:19). Johansson answered as follows.

"Yes, that is a completely wrong picture. It is not about anyone monitoring at all." - Ylva Johansson (2023-09-23)

The definition of mass surveillance should reasonably be that the masses are monitored (as opposed to targeted surveillance of selected suspects). As Pierrou highlighted in a previous question, Chat Control 2.0 scanning does not require any suspicion of crime against individuals. Service providers are to monitor what the masses write and say on their platforms, and report suspicious conversations to the new EU centre to be set up in The Hague.

The proposal is thus, by definition, a mass surveillance proposal.

However, Johansson chose to ignore the fact that a mass surveillance proposal requires mass surveillance. Instead, she tried to dismiss the criticism with the following argument, patting herself on the back (20:34).

"It is obvious that when you are a bit of a pioneer, as I am in this case, you have to expect that you will also be questioned." - Ylva Johansson (2023-09-23)

Unfortunately, I must crush Commissioner Johansson's self-image and state that she has never been questioned for being a pioneer. Johansson is not even a pioneer in the field, something she herself should know.

It has barely been 30 years since the Stasi was disbanded.

 

Karl Emil Nikka

 


This article is republished from nikkasystems.com under CC BY 4.0.

About the author

Karl Emil Nikka is the founder of Nikka Systems, Security Profile of the Year 2021, an author and an IT security expert.


Swedish government proposes real-time AI facial recognition

Mass surveillance

Published November 28, 2025 – By Editorial staff
The Swedish government's press conference where new tools for crime prevention were presented.

The Swedish government is presenting a legislative proposal that would give the police the ability to identify individuals using artificial intelligence. The technology is intended to be used to more quickly locate suspects, wanted persons, and crime victims.

Swedish Justice Minister Gunnar Strömmer (Moderate Party) announced at a press conference that the government has decided on a legislative proposal that would allow police to use AI-based facial recognition in real time.

"We are presenting a powerful new tool", said Strömmer, who also emphasized the importance of camera surveillance in stopping violence and investigating crimes.

Swedish Minister for Public Administration Erik Slottner (Christian Democrats) stressed that the technology could dramatically transform police work. What previously took several weeks can now be done "in a matter of seconds", according to the minister.

"Through real-time facial recognition, we can find criminals, abducted children or wanted terrorists", Slottner explained.

Currently, AI-based facial recognition in public spaces is essentially prohibited in Sweden. The government's proposal would give police broader exemptions from the ban in order to combat serious crime.

The Liberal Party's Martin Melin specified that the technology would be used to locate victims, prevent serious violent crimes, investigate offenses such as murder and rape, and enforce sentences.

EU countries agree on Chat Control – opens door to supranational mass surveillance

Mass surveillance

Published November 26, 2025 – By Editorial staff
Swedish Social Democrat Ylva Johansson has been a strong advocate of the supranational mass surveillance directive that is now partially gaining traction.

EU member state governments have agreed on their position regarding the controversial Chat Control legislation. The proposal, which officially aims to combat child sexual abuse, opens the door to extensive surveillance of all citizens' digital communication, according to critics.

Sweden has approved it through the government and the Social Democrats, while the Sweden Democrats reject the proposal.

EU ambassadors approved a compromise proposal on Wednesday for the so-called CSAM regulation (Child Sexual Abuse Material), originally developed by Swedish EU Commissioner Ylva Johansson. The decision paves the way for final negotiations with the European Parliament on a permanent framework for digital surveillance, reports Samnytt.

The new negotiating mandate means that the most controversial parts of the Commission's original proposal are removed. Mandatory "detection orders" that would give authorities the right to require tech companies to scan citizens' chats, emails and messages – even in encrypted services – are struck from the text.

Instead, platforms' obligations to conduct risk assessments and implement "risk-reducing measures" are strengthened. Voluntary scanning of messages is highlighted as a possible tool. At the same time, a new EU agency is proposed, a special CSAM center, to coordinate the law's implementation.

From mandatory to "voluntary" surveillance

The removal of mandatory detection orders is presented by EU representatives as a balanced compromise. Critics argue, however, that the change is more cosmetic than real.

The new Council proposal emphasizes that encryption should be protected, but simultaneously lists message scanning as a possible risk-reducing measure. If a company is deemed to have excessively high risks, pressure from supervisory authorities can in practice turn voluntary scanning into a requirement.

The proposal also opens the door to extensive age verification. To determine which users are children, systems can be introduced where everyone must identify themselves with ID documents or biometric methods to use email, chat apps and other communication services.

Warnings of totalitarian surveillance model

Criticism has been massive from privacy experts, researchers and rights organizations. In its original form, the proposal would, according to critics, mean that all EU citizens would have their communication monitored – every phone call, video call, text message, app message, email and file in cloud services could be filtered in real time.

Chat Control has been compared to surveillance systems in totalitarian states. Critics warn of mission creep: once the infrastructure is in place, the filters can quickly be reconfigured for other content, such as political opinions or journalistic sources.

AI filters with massive false positives

AI is intended to detect suspected sexual content or grooming. But the technology already functions poorly on social media, where algorithms flag ironic comments, historical images or harmless material.

When the technology has been tested on known abuse images, up to 80-90 percent of hits have been false positives. The result is that thousands of people risk being identified as suspects for one of the most abhorrent crimes, only to be forced to prove their innocence while their most private images and conversations are examined.

Sweden says yes – SD dissents

The Swedish government – the Moderate Party, Christian Democrats and Liberals – along with the Social Democrats have approved the proposal. When Sweden's position was determined in autumn 2024, these parties voted together for approval, despite the fact that their cooperation partner, the Sweden Democrats, rejected the proposal.

Sweden Democrat politician Adam Marttinen warned that the proposal goes too far, that encryption is broken in practice and that it opens the door to mass surveillance on a slippery slope.

IT expert: "Politicians have been deceived"

IT security expert Karl Emil Nikka has sharply criticized both the EU Commission and supporting politicians. He argues that the technology described – where systems only search for child pornography without "seeing" anything else – does not exist.

"That technology obviously does not exist. It has never existed and by definition cannot exist", Nikka explained.

He warned that Chat Control means "insecurity by design," where all communication apps are forced to build in vulnerabilities that can be exploited by hostile states or criminal actors.

Nikka also pointed out that UNICEF's principles regarding children's right to private communication are violated by the proposal. He believes politicians have been deceived by the EU Commission's campaigns that have downplayed the privacy consequences.

The UN Human Rights Commissioner has warned that surveillance of digital communication is a primary tool for authoritarian regimes to persecute opposition groups and religious minorities. That the EU is now taking the lead with a model that, according to critics, normalizes mass surveillance is described as a historic step in the wrong direction.

GrapheneOS exits France after threats and smear campaign

Totalitarianism

Published November 25, 2025 – By Editorial staff
GrapheneOS is considered the world's most secure mobile operating system while being nearly identical to Android, making it very user-friendly and popular.

The Canadian open-source organization behind the security-focused mobile operating system GrapheneOS announces it is ending all operations in France.

The background is an escalating conflict with French authorities, who according to the GrapheneOS team are spreading false accusations in the media and threatening arrests and server seizures.

GrapheneOS, a non-profit project that develops an operating system for Android phones with extra focus on privacy and security, has in recent days published a series of posts on the X platform about what they describe as a coordinated campaign by French police. According to the team, authorities have sent out internal messages to the country's police forces where all Google Pixel phones with GrapheneOS are labeled as suspicious. This has led to a wave of articles in French media, where claims that the system is used for criminal purposes are repeated without fact-checking or opportunity for GrapheneOS to respond.

"France's law enforcement are making outrageously false and unsubstantiated claims about GrapheneOS, which are being printed by both state and corporate media as facts when they're not", GrapheneOS writes in a post on X on November 23. The team emphasizes that they were not given any chance to review or respond to the accusations before publication. Instead, they have been forced into a defensive position, where they now plan to exercise their right of reply in French media.

Threats of intervention and demands for concessions

The conflict has escalated to direct threats, according to GrapheneOS. In contacts with French authorities, the team has been urged to assist with decryption of devices, something they technically cannot or will not do due to the system's design.

"They have made several quite direct threats of arrests and seizures of servers, just as they did with SkyECC and Encrochat", GrapheneOS emphasizes in an update on November 25. The reference refers to previous cases where French authorities intervened against encrypted communication networks.

The authorities are in practice demanding that GrapheneOS stop distributing functioning disk encryption; otherwise the project risks legal action. This is likened to the famous dispute between Apple and the FBI in the United States, but with a twist: Google's hardware in Pixel phones is designed to resist such demands, and GrapheneOS builds further on that security. "They don't demand the same thing from Google for standard Pixel despite nearly identical encryption, because they are much less secure and can be exploited in advance", the team explains.

GrapheneOS emphasizes that their work is legal in countries like Canada, Germany, the United States, and the Netherlands. France, however, is pushing for laws that force backdoors in encryption, a policy that has not yet been implemented but which police are acting as if it already applies.

Dismantling of infrastructure and future plans

In response to the threats, GrapheneOS has initiated a rapid withdrawal of its presence in France. They are leaving the server provider OVH, a French company, and migrating their 15 servers – spread across Canada, Singapore, Germany, and the United States – to alternative locations.

"We are leaving France as a server location and OVH as a provider before they do anything", they announce in a post on Tuesday. Ten servers have already been replaced, including those used for standard updates. The remaining five, which handle email, forums, and other services in Beauharnois, Canada, are planned to be moved to colocation servers in Toronto.

For European users, GrapheneOS promises maintained performance through servers in Switzerland, Luxembourg, and the Netherlands – countries that do not support the EU's controversial "Chat Control" proposal on mass surveillance. "We can offer low latency and high throughput to users in France without servers there", the team assures. They also intend to avoid travel to France, including conferences, and discourage employees from working from the country.

The incident raises questions about the EU's future for open-source privacy projects. GrapheneOS, which is financed through donations and sponsors, has built its reputation on open source and robust protection against exploits (attacks that abuse security vulnerabilities). Now they see France as an "unsafe place for open-source privacy projects."

A spokesperson for the French Ministry of the Interior has not commented on the accusations, but previous statements from the government point to a harder line against encryption in the fight against organized crime.

Swedish company continues to invest in GrapheneOS despite conflict

In the midst of the ongoing conflict, the Swedish technology company Teuton Systems shows continued confidence in GrapheneOS. The company works exclusively with the system in its privacy-secure mobile phone, the Matrix phone, which is one of the first such products on the Nordic market. Teuton Systems emphasizes that the installation of GrapheneOS occurs only via the official source and with open-source tools like Aurora Store and F-Droid, to ensure transparency and maximum privacy without dependence on Google services.

The Matrix phone, based on Google Pixel phone hardware, is delivered with GrapheneOS pre-installed and prepared secure apps for everyday tasks. The product offers advanced features such as granular control over app permissions, sensor blocking, and automatic security updates. "GrapheneOS gives users full control over their data in a time of increasing surveillance, without compromising user-friendliness", Teuton Systems emphasizes on its website.

The company, which focuses on Nordic users, underscores the system's independent review and absence of backdoors, making it a reliable choice for privacy-conscious users.

Email was never built for privacy

Mass surveillance

How Proton makes email privacy simple.

Published November 8, 2025 – By Naomi Brockwell

Email was never built for privacy. It’s closer to a digital postcard than a sealed letter, bouncing through and sitting on servers you don’t control, and mainstream providers like Gmail read and analyze everything that is inside.

Email isn’t going anywhere; it’s baked into how the digital world communicates. But luckily there are ways to make your emails more private. One tool you can use is PGP, which stands for “Pretty Good Privacy”.

PGP is one of the oldest and most powerful tools for email privacy. It takes your message and locks it with the recipient’s public key, so only they can unlock it with their private key. That means even if someone intercepts the email, whether it’s a hacker, your ISP, or a government agency, they see only scrambled text.

Unfortunately it is notoriously complicated. Normally, you’d have to install command-line tools, generate keys manually, and run cryptic commands just to send an encrypted email.

But Proton Mail makes all of that easy, and builds PGP right into your inbox.

How Proton makes PGP simple

Proton is a great, privacy-focused email provider (and no, they’re not sponsoring this newsletter; they’re simply an email provider that I like to use).

If you email someone within the Proton ecosystem (i.e. send an email from one Proton user to another), your email is automatically end-to-end encrypted using PGP.

But what if you email someone outside of the Proton ecosystem?

Here’s where it would usually get tricky.

First, you’d need to install a PGP client, which is a program that lets you generate and manage your encryption keys.

Then you’d run through a series of command-line prompts: choosing the key type, size and expiration, associating the key with the email address you want to use it with, and exporting your public key. It’s complicated.

But if you use Proton, they make using PGP super easy.

Let’s go through how to use it.

Automatic search for public PGP key

First of all, when you type an email address into the “To” field in Proton Mail, it automatically searches for a public PGP key associated with that address. Proton checks its own network, your contact list, and Web Key Directory (WKD) on the associated email domain.

WKD is a small web standard that lets someone publish their public key on their own domain in a way that makes it easily findable by an email app. If Proton finds a key for an address at the associated domain, it will automatically encrypt your message with it.
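As an illustration of what that lookup involves under the hood, here is a minimal Python sketch of the WKD “direct method” URL described in the draft WKD specification: the local part of the address is lowercased, SHA-1 hashed and z-base-32 encoded, then used as a path component on the key owner’s domain (the address below is a placeholder):

```python
import hashlib

# z-base-32 alphabet used by the Web Key Directory (WKD) draft spec
ZB32 = "ybndrfg8ejkmcpqxot1uwisza345h769"

def zbase32(data: bytes) -> str:
    """Encode bytes as z-base-32: MSB-first, 5 bits per character."""
    bits = "".join(f"{b:08b}" for b in data)
    bits += "0" * (-len(bits) % 5)  # pad to a multiple of 5 bits
    return "".join(ZB32[int(bits[i:i + 5], 2)] for i in range(0, len(bits), 5))

def wkd_direct_url(address: str) -> str:
    """Build the WKD 'direct method' URL for an email address."""
    local, domain = address.rsplit("@", 1)
    digest = hashlib.sha1(local.lower().encode("utf-8")).digest()
    return (f"https://{domain}/.well-known/openpgpkey/hu/"
            f"{zbase32(digest)}?l={local}")

print(wkd_direct_url("alice@example.org"))
```

A WKD-capable mail client simply fetches that URL; if a key is served there, it can encrypt to the address without any manual key exchange. (The spec also defines an “advanced method” variant hosted on an openpgpkey subdomain.)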

If they find a key, you’ll see a green lock next to the recipient in the ‘To’ field, indicating the message will be encrypted.

You don’t need to copy, paste, or import anything. It just works.

Great, your email has been automatically encrypted using PGP, and only the recipient of the email will be able to use their private key to decrypt it.

Manually uploading someone’s PGP key

What if Proton doesn’t automatically find someone’s PGP key? You can hunt down the key manually and import it. Some people will have their key available on their website, either in plain text, or as a .asc file. Proton allows you to save this PGP key in your contacts.

To add one manually:

  1. Type their email address in the “To” field.
  2. Right-click on that address and select “view contact details”.
  3. Click the settings wheel to go to email settings and select “show advanced PGP settings”.
  4. Under “public keys”, select “upload” and upload their public key in an .asc format.

Once the key is uploaded, the “encrypt emails” toggle will automatically switch on, and all future emails to that contact will automatically be protected with PGP. You can turn that off at any time, and also remove or replace the public key.

How do others secure emails to you using PGP?

Super! So you’ve sent an encrypted email to someone using their PGP key. What if they want to send you an email back, will that be automatically end-to-end encrypted (E2EE) using PGP? Not necessarily.

In order for someone to send you an end-to-end encrypted email, they need your public PGP key.

Download your public-private key pair inside Proton

Proton automatically generates a public-private key pair for each address that you have configured inside Proton Mail, and manages encryption inside its own network.

If you want people outside Proton to be able to encrypt messages to you, the first step is to export your public key from your Proton account so you can share it with them.

To do this:

  • Go to Settings
  • Click “All settings”
  • Select “encryption and keys”
  • Under “email encryption keys” you’ll have a dropdown menu of all your email addresses associated with your Proton account. Select the address that you want to export the public key for.
  • Under the “action” column, click “export public key”

It will download as an .asc file, and ask you where you want to save the file.

Normally a PGP key is binary data that only your computer can read. The .asc file takes that key and wraps it in readable (base64) characters, framed by “-----BEGIN PGP PUBLIC KEY BLOCK-----” and “-----END PGP PUBLIC KEY BLOCK-----” marker lines.
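To make that format concrete, here is a simplified Python sketch (not Proton’s actual code) of how an armored .asc block is assembled per RFC 4880: the binary key material is base64-encoded, wrapped at 64 characters per line, given a CRC-24 checksum line, and framed with the BEGIN/END markers:

```python
import base64
import textwrap

def crc24(data: bytes) -> bytes:
    """OpenPGP CRC-24 checksum (RFC 4880, section 6.1)."""
    crc = 0xB704CE
    for byte in data:
        crc ^= byte << 16
        for _ in range(8):
            crc <<= 1
            if crc & 0x1000000:
                crc ^= 0x1864CFB
    return (crc & 0xFFFFFF).to_bytes(3, "big")

def armor(key_material: bytes) -> str:
    """Wrap binary key material the way a .asc public key file does."""
    b64 = base64.b64encode(key_material).decode()
    body = "\n".join(textwrap.wrap(b64, 64))
    checksum = base64.b64encode(crc24(key_material)).decode()
    return ("-----BEGIN PGP PUBLIC KEY BLOCK-----\n\n"
            f"{body}\n={checksum}\n"
            "-----END PGP PUBLIC KEY BLOCK-----")

# Stand-in bytes for the demo; a real key is produced by your PGP client.
print(armor(b"not a real key, just demo bytes"))
```

The armor adds nothing cryptographic; it only makes the key safe to paste into emails and web pages, with the checksum line (starting with “=”) guarding against copy-paste corruption.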

Sharing your public key

Now that you’ve downloaded the public key, how do you share it with people so that they can contact you privately? There are several ways.

For @proton.me and @protonmail.com addresses, Proton publishes your public key in its WKD automatically. You don’t have to do anything.

For custom domains configured in Proton Mail, Proton doesn’t host WKD for you. You can publish WKD yourself on your own domain by serving it at a special path on your website. Or you can delegate WKD to a managed service. Or if you don’t want to use WKD at all, you can upload your key to a public keyserver like keys.openpgp.org, which provides another way for mail apps to discover it.

We’re not going to cover those setups in this article. Instead here are simpler ways to share your public key:

1) You can send people your .asc file directly if you want them to be able to encrypt emails to you (be sure to let them know which email address is associated with this key), or you can host this .asc file on your website for people to download.

2) You can open the .asc file in a text editor and copy and paste the key, and then send people this text or publish it on your website, which is what I have done.

This way, if anyone wants to send me an email more privately, they can do so.

But Proton makes it even easier to share your PGP key: you can opt to automatically attach your public key to every email.

To turn this on:

  1. Go to Settings → Encryption & keys → External PGP settings
  2. Enable
    • Sign external messages
    • Attach public key

Once this is on, every email you send will automatically include your public key file, as a small .asc text file.

This means anyone using a PGP-capable mail client (like Thunderbird, Mailvelope, etc.) can import it immediately, with no manual steps required.

Password-protected emails

Proton also lets you send password-protected emails, so even if the other person doesn’t use PGP you can still keep the contents private. This isn’t PGP: Proton encrypts the message and attachments in your browser, and the recipient gets a link to a secure viewing page. They enter a password you share separately to open it. Their provider (like Gmail) only sees a notification email with a link, not the message itself. You can add a password hint, and the message expires after a set time (28 days by default).

The bottom line

Email privacy doesn’t have to be painful. Proton hides the complexity by adding a password option, or automating a lot of the PGP process for you: it automatically looks up recipients’ keys, encrypts your messages, and makes your key easy for others to use when they reply.

As Phil Zimmermann, the creator of PGP, explained in Why I Wrote PGP:

“PGP empowers people to take their privacy into their own hands. There has been a growing social need for it. That’s why I wrote it.”

We’re honored to have Mr. Zimmermann on our board of advisors at Ludlow Institute.

Pioneers like him fought hard so we could protect our privacy. It’s on us to use the tools they gave us.

 

Yours in privacy,
Naomi

Naomi Brockwell is a privacy advocate and professional speaker, MC, interviewer, producer and podcaster, specialising in blockchain, cryptocurrency and economics. She runs the NBTV channel on Rumble.