
“Many misleading claims about Chat Control 2.0”

Mass surveillance

Ylva Johansson chooses to ignore the fact that a mass surveillance proposal requires mass surveillance, Karl Emil Nikka, IT security expert, writes.

Published 28 September 2023
IT security expert Karl Emil Nikka. EU Commissioner Ylva Johansson.
This is an opinion piece. The author is responsible for the views expressed in the article.

One of the topics discussed in last week’s episode of Medierna i P1 was the European Commission’s controversial mass surveillance proposal Chat Control 2.0 and its consequences for journalists. The episode featured EU Commissioner Ylva Johansson, IT and media lawyer Daniel Westman and Anne Lagercrantz, President of the Swedish Publishers Association.

Westman and Lagercrantz were critical of the mass surveillance proposal, partly because of the consequences for the protection of sources. The Swedish Association of Journalists and the Swedish Newspaper Publishers have previously warned about the consequences of the proposal for the same reasons.

Comically, the pre-recorded interview began with Johansson asking if she could call Martina Pierrou, the interviewing journalist, via Signal or WhatsApp instead.

At the time of the interview, Johansson and Pierrou were able to talk via Signal, but if the mass surveillance proposal goes through, that possibility will disappear. In a response to me on X (Twitter), Signal’s CEO announced that they will leave the EU if they are forced to build backdoors into their app.

This is a very wise decision on Signal’s part as such backdoors undermine the safety and security of children and adults around the world. The rest of the world should not have to suffer because we in Europe are unable to stop EU proposals that violate human rights, the Convention on the Rights of the Child and our own EU Charter.

Below is an analysis of all the statements made by Johansson in the interview. The quotes are printed in full. The time codes link directly to the paragraphs in the section where the claims were made.

Incorrect claim that a court decision is required

When asked about what the bill means in practice (18:55), Johansson repeated her recurring lie that a court order would be required to scan communications. She explained the practical implications of the proposal with the following sentence.

“To force the companies to make risk assessments, to take measures to ensure that their services are not used for this terrible crime and ultimately to make it possible, by court order, to also allow the scanning of communications to find these abuses.” – Ylva Johansson (2023-09-23)

Pierrou followed up with a remark that the proposal may require scanning without suspicion of crime against any individual (19:24). Ylva Johansson responded as follows.

“No, scanning will take place when there is a risk that a certain service is being used extensively to spread these criminal offenses. Then a court can decide that scanning is permitted and necessary.” – Ylva Johansson (2023-09-23)

The suggestion that a court decision would be required is incorrect. Johansson made the same claim in the debate against me in Svenska Dagbladet in April this year (the only debate in the Swedish media that Johansson has participated in). I then offered her the opportunity to correct the claim herself, in order to find out whether she knew that her proposal did not require a court decision. The proposal also accepts decisions from administrative authorities. Johansson knew this. Nevertheless, she repeated the lie in the interview in SVT Aktuellt (April 2023), in Ekot’s Saturday interview (June 2023) and now in Medierna i P1.

Omitted consequence

In the answer to the same question, Johansson omitted the most crucial point, namely that backdoors are a prerequisite for the scanning of end-to-end encrypted conversations to be done at all. Once these backdoors are in place, they can be abused and cause data leaks. Other states, such as the US where most of the affected services are based, can use the backdoors to scan for content they are interested in.

The proposal states that service providers may only use their position to scan for child abuse material and grooming attempts. Even if we ignore the likely purpose creep, this offers little comfort. Today, we have technical protections that ensure that our end-to-end encrypted conversations are impossible to intercept. The European Commission wants to replace these technical protections with legal restrictions on what the new backdoors may (and may not) be used for.

This naivety is unprecedented. It is incomprehensible to me how the EU can believe that the US would allow American companies to install backdoors that are limited to the EU’s prescribed use. As a thought experiment, we can consider how the EU would react if the US tried to do the same to our companies.

If we take into account the highly likely purpose creep, the situation gets even worse. We only have to go back to 2008 to demonstrate this. At that time, the FRA debate was in full swing and FRA Director General Ingvar Åkesson wrote a debate article in Svenska Dagbladet with the following memorable words.

“FRA cannot spy on domestic phenomena. /…/ Yet the idea is being cultivated that FRA should listen to all Swedes’ phone calls, read their e-mails and text messages. A disgusting idea. How can so many people believe that a democratically elected parliament would wish its people so ill?” – Ingvar Åkesson (2008-06-29)

15 years later, Åkesson can hopefully understand why we thought that a democratically elected parliament could wish its people so ill. Right now, exactly this “disgusting idea” (the Director General’s own choice of words) is being proposed.

Belief in the existence of non-existent technologies

Pierrou then asked how the solution would actually work. Pierrou pointed out that “according to an opinion from the European Data Protection Board, the technology required by the proposal does not exist today” (19:55).

Johansson responded with a quote that will go down in history.

“I believe that there is. But my bill is technology-neutral and that means that we set standards for what the technology must be able to do and what high standards of integrity the technology must meet.” – Ylva Johansson (2023-09-23)

Here Johansson again shows that she has based her proposal on incorrect assumptions about how the technology works. After being refuted by the world’s experts, she is now forced to fall back on opinion arguments such as “I believe it exists”.

Whether technology exists (or can exist) is of course not a matter of opinion. It is, always has been, and always will be technically impossible to scan the content of properly end-to-end encrypted conversations.

To smooth over the embarrassment, Johansson pointed out that the bill is technology-neutral. This may sound good, but it means nothing in this context. Setting standards for what technology must be able to do is only embarrassing when it is done without first examining what is practically possible.

If service providers of end-to-end encrypted services are to be able to scan the content of conversations, they must build in backdoors. The backdoors allow them to scan the content before it is encrypted and after it has been decrypted. Without backdoors, scanning is and remains technically impossible.
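The point can be illustrated with a minimal sketch (hypothetical names, and a toy XOR cipher standing in for real end-to-end encryption): a provider who holds only ciphertext learns nothing from scanning it, so any scanning hook must run on the endpoint, where the plaintext exists.

```python
import hashlib

# Hypothetical database of hashes of known prohibited material.
BLOCKED_HASHES = {hashlib.sha256(b"known-abuse-sample").hexdigest()}

def client_side_scan(data: bytes) -> bool:
    """The 'backdoor': checks data against the hash database."""
    return hashlib.sha256(data).hexdigest() in BLOCKED_HASHES

def toy_encrypt(plaintext: bytes, key: bytes) -> bytes:
    # Toy XOR cipher as a stand-in for real E2E encryption
    # (illustrative only; never use XOR as a cipher in practice).
    return bytes(p ^ key[i % len(key)] for i, p in enumerate(plaintext))

key = bytes(range(1, 33))
message = b"known-abuse-sample"
ciphertext = toy_encrypt(message, key)

# The provider only ever sees ciphertext, so scanning it finds nothing:
assert client_side_scan(ciphertext) is False
# Scanning works only where plaintext exists: on the device, before
# encryption or after decryption. That access point is the backdoor.
assert client_side_scan(message) is True
```

The same asymmetry holds for any real cipher: matching against known material requires the plaintext, which by design only the endpoints have.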

Opinion on mass surveillance in mass surveillance proposals

Pierrou concluded the interview by asking what Johansson thought about the image of the proposal being painted as a mass surveillance proposal (20:19). Johansson then answered the following.

“Yes, that is a completely wrong picture. It is not about anyone monitoring at all.” – Ylva Johansson (2023-09-23)

The definition of mass surveillance should reasonably be that the masses are monitored (as opposed to targeted surveillance of selected suspects). As Pierrou highlighted in a previous question, Chat Control 2.0 scanning requires no suspicion of crime against any individual. Service providers are to monitor what the masses write and say on their platforms, and report suspicious conversations to the new EU centre to be set up in The Hague.

The proposal is thus, by definition, a mass surveillance proposal.

However, Johansson chose to ignore the fact that a mass surveillance proposal requires mass surveillance. Instead, she tried to dismiss the criticism with the following argument and a pat on her own back (20:34).

“It is obvious that when you are a bit of a pioneer, as I am in this case, you have to expect that you will also be questioned.” – Ylva Johansson (2023-09-23)

Unfortunately, I must crush Commissioner Johansson’s self-image and state that she has never been questioned for being a pioneer. Johansson is not even a pioneer in the field, something she herself should know.

It has barely been 30 years since the Stasi was disbanded.

 

Karl Emil Nikka

 


This article is republished from nikkasystems.com under CC BY 4.0.

About the author

Karl Emil Nikka is the founder of Nikka Systems, Security Profile of the Year 2021, author and an IT security expert.


Amazon updates privacy settings – all voice data to be stored in the cloud

Mass surveillance

Published 26 March 2025
– By Editorial Staff
Amazon itself states that it saves users’ voice recordings in order to improve the service.

As of March 28, some Echo devices will no longer be able to process voice data locally – all voice recordings will be sent to Amazon’s cloud service, regardless of the user’s wishes.

Echo is a series of smart devices, including speakers, developed by Amazon. The device records what you say and sends it to Amazon’s servers to be stored and analyzed, allegedly to improve the service. Privacy settings have previously allowed some devices to process voice data locally without sending it to Amazon.

In an email to Echo users, shared on Reddit, Amazon announced that the ability to process voice commands locally is being removed. Instead, all recordings will be sent to the cloud for processing, as Sweclockers has reported.

If the user doesn’t actively change their settings before March 28, they will automatically be set to “do not save data”. This means that Amazon will still collect and process voice data, but delete it after Alexa has handled the request. However, it is unclear how long the information is stored before it is actually deleted.

Amazon states that voice data is needed to train the company’s AI model, Alexa Plus. At the same time, the company promises that all previously saved voice data will be deleted if the user has the “do not save data” feature enabled.

The tech mogul on the future of AI: Constant mass surveillance

Mass surveillance

Published 24 January 2025
– By Editorial Staff
With the help of AI, Ellison believes that in the future, those in power will be able to follow citizens' every move.

Tech giant Oracle’s CEO Larry Ellison believes in a future where artificial intelligence becomes an integral part of a borderless mass surveillance society where privacy no longer exists and where everything citizens do is mapped and recorded.

Oracle and Larry Ellison will play a key role in Trump’s AI venture “Stargate”, which is expected to cost upwards of $500 billion and has been described by the President himself as “by far the largest AI infrastructure project in history”.

There is no doubt that Ellison is one of the world’s most successful tech moguls; just last fall he overtook Amazon founder Jeff Bezos to become the world’s second-richest man, after Elon Musk. But how does he see the future of artificial intelligence and how it will affect our lives?

During a meeting with financial analysts last fall, he predicted a future that critics say is reminiscent of dark dystopian novels like George Orwell’s 1984, where humans are subject to constant mass surveillance and AI is used to map citizens’ every move.

According to Ellison, it is highly likely that AI models will in the future be used to analyze, in real time, material not only from surveillance cameras and police body cameras, but also from car cameras and doorbells.

– Citizens will be on their best behavior because we are constantly recording and reporting everything that’s going on.

– Every police officer is going to be supervised at all times, and if there’s a problem, AI will report the problem and report it to the appropriate person, he continued.

“Big brother is watching you”

The multi-billionaire also believes that AI-controlled drones will replace real police officers during car chases and other types of crime and disorder.

– If something happens in a shopping center, a drone goes out there and reaches the scene way faster than a police car.

Technology website Ars Technica’s writer Benji Edwards is one of many who have reacted strongly to Ellison’s vision of AI surveillance, saying his comments raise questions about the future of citizens’ privacy.

“Ellison’s vision bears more than a passing resemblance to the cautionary world portrayed in George Orwell’s prescient novel 1984. In Orwell’s fiction, the totalitarian government of Oceania uses ubiquitous ‘telescreens’ to monitor citizens constantly, creating a society where privacy no longer exists and independent thought becomes nearly impossible”, Edwards notes.

“But Orwell’s famous phrase ‘Big Brother is watching you’ would take on new meaning in Ellison’s tech-driven scenario, where AI systems, rather than human watchers, would serve as the ever-vigilant eyes of authority. Once considered a sci-fi trope, automated systems are already becoming a reality: similar automated CCTV surveillance systems have already been trialed in the London Underground and at the 2024 Olympics”, he continues.

“A slave obeys”

He points out that automated surveillance systems have already been implemented in Chinese cities, among others, and that AI software is already available that can sort and organize the data collected on residents using a network of deployed surveillance cameras.

According to many observers, similar and even more advanced solutions may soon become part of everyday life in the United States and other countries, and there are warnings that a “digital dictatorship” is emerging where the surveillance state is so all-encompassing that it is impossible for anyone to escape.

“‘Good behavior’ as defined by the billionaires who own and control everything. Otherwise known as blind obedience and willful subservience to their every whim and want. Because a slave obeys”, writes one of many worried voices.

“I have nothing to hide”

Mass surveillance

Ten reasons privacy matters for everyone.

Published 8 January 2025
– By Naomi Brockwell
Is there nothing in your life that is actually private and concerns you and only you?

Challenging the myth

“I have nothing to hide”. It’s a phrase we’ve all heard, and perhaps even said ourselves, when privacy comes up. But it reveals a dangerous misunderstanding of what privacy is and why it matters.

Privacy isn’t about hiding—it’s about control. It’s about having the freedom to decide who gets access to your data and how it’s used. Over the last decade, that freedom has eroded. Today, governments, corporations, and hackers routinely collect and exploit our personal information, often without our consent.

Worse still, the narrative around privacy has shifted. Those who value it are seen as secretive, even criminal, while surveillance is sold to us as a tool for safety and transparency. This mindset benefits only those who profit from our data.

It’s time to push back. Here are 10 arguments you can use the next time someone says, “I have nothing to hide”.

1. Privacy is about consent, not secrecy

Privacy isn’t about hiding secrets—it’s about having control over your information. It’s the ability to decide for yourself who gets access to your data.

We don’t have to hand over all our personal information just because it’s requested. Tools like email aliases, VoIP numbers, and masked credit cards allow us to protect our data while still using online services. Privacy-focused companies like ProtonMail or Signal respect this principle, giving you more control over your information.

2. Nothing to hide, everything to protect

Even if you think you have nothing to hide, you have everything to protect. Oversharing data makes you vulnerable to hackers, scammers, and malicious actors.

For example:

  • Hackers can use personal details like your home address or purchase history to commit fraud or even locate you.
  • Data brokers can manipulate you with targeted content and even influence your political beliefs, as seen in the Cambridge Analytica scandal.

Protecting your data is about safeguarding yourself from these threats and protecting your autonomy.

3. Your data is forever

Data collected about you today will still exist decades from now. Governments change, laws evolve, and what’s harmless now could be used against you or your children in the future.

Surveillance infrastructure rarely disappears once it’s built. Limiting the data collected about you now is essential for protecting yourself from unknown risks down the line.

4. It’s not about you

Privacy isn’t just a personal issue—it’s about protecting others. Activists, journalists, and whistleblowers rely on privacy to do their work safely. By dismissing privacy, you’re ignoring the people for whom it’s a matter of life and death.

For example, Pegasus spyware has been used to track and silence journalists and activists. We should be leaning in to privacy tools, supporting the privacy ecosystem, and ensuring that those helping to keep our society free and safe are protected, whether we personally feel like we need privacy or not.

5. Surveillance isn’t about criminals

The claim that surveillance is “only for catching bad guys” is a myth. Once surveillance tools are deployed, they almost always expand beyond their original purpose.

History has shown how governments use surveillance to target dissenters, minorities, and anyone challenging the status quo. Privacy isn’t just for criminals—it’s a safeguard against abuse of power.

6. Your choices put others at risk

When you disregard privacy, you expose not just yourself but also the people around you.

For example:

  • Using apps that access your contact list can leak your friends’ and family’s phone numbers and addresses without their consent.
  • Insisting on non-private communication tools can expose sensitive conversations to surveillance or data breaches.
  • Uploading your photos to a non-private cloud like Google Drive allows those in your photos to be identified using facial recognition, and profiled based on information Google AI sees in your photos.

Respecting privacy isn’t just about protecting yourself—it’s about respecting the privacy boundaries of others.

7. Privacy is not dead

For some people, “I have nothing to hide” is a coping mechanism.
“Privacy is dead, so why bother?”

This defeatist attitude is both false and harmful. Privacy is alive—it’s a choice we can make every day. Let’s stop disempowering others by convincing them they shouldn’t even try.

There are countless privacy tools you can incorporate into your life. By choosing these tools, you take back control over your information and send a clear message that privacy matters.

8. Your data can be weaponized

All it takes is one bad actor—a rogue employee, an ex-partner, or a hacker—to turn your data against you. From revenge hacking to identity theft, the consequences of oversharing are real and dangerous.

Limiting the amount of data collected about you reduces your vulnerability and makes it harder for others to exploit your information.

9. Surveillance stifles creativity and dissent

Surveillance doesn’t just invade your privacy—it affects how you think and behave. Studies show that people censor themselves when they know they’re being watched.

This “chilling effect” stifles creativity, innovation, and dissent. Without privacy, we lose the ability to think freely, explore controversial ideas, and push back against authority.

10. Your choices send a signal

Every decision you make about technology sends a message. Choosing privacy-focused companies tells the market, “This matters”. It encourages innovation and creates demand for tools that protect individual freedom.

Conversely, supporting data-harvesting companies reinforces the status quo and pushes privacy-focused alternatives out of the market. When people say “I have nothing to hide” instead of leaning into the privacy tools around them, they ignore the role we all play in shaping the future of society.

Takeaways: Why privacy matters

  1. Privacy is about consent, not secrecy. It’s your right to control who accesses your data.
  2. You have everything to protect. Data breaches and scams are real threats.
  3. Data is forever. What’s collected today could harm you tomorrow.
  4. Privacy protects others. Journalists and activists depend on it to do their work safely.
  5. Surveillance tools expand. They rarely stop at targeting criminals.
  6. Your choices matter. Privacy tools send a message to the market and inspire change.
  7. Privacy isn’t dead. We have tools to protect ourselves—it’s up to us to use them.

A fight we can’t afford to lose

Privacy isn’t about hiding—it’s about protecting your rights, your choices, and your future. Surveillance is a weapon that can silence opposition, suppress individuality, and enforce conformity. Without privacy, we lose the freedom to dissent, innovate, and live without fear.

The next time someone says, “I have nothing to hide”, remind them: privacy is normal. It’s necessary. And it’s a fight we can’t afford to lose.

 

Yours in privacy,
Naomi

Naomi Brockwell is a privacy advocate and professional speaker, MC, interviewer, producer and podcaster, specialising in blockchain, cryptocurrency and economics. She runs the NBTV channel on YouTube.

Police used Tesla driver data: “A double-edged sword”

Mass surveillance

Published 6 January 2025
– By Editorial Staff
The Tesla Cybertruck that exploded outside the Trump Hotel in Las Vegas, January 1 this year.

The explosion of a Tesla Cybertruck in Las Vegas on New Year’s Day has highlighted how much information modern cars collect about their drivers and events around them. Tesla CEO Elon Musk quickly provided police with data and video footage, which helped the investigation determine that it was a suicide rather than an accident or terrorism.

The data collected has been praised by police for helping to quickly clarify the circumstances. At the same time, the collection has raised questions about privacy and potential abuse.

– It’s a double-edged sword, David Choffnes of the Cybersecurity and Privacy Institute in Boston told the Washington Post.

– The companies collecting the data could misuse it.

Others, like Tesla enthusiast Justin Demaree, agree on the dual aspect. He emphasizes the importance of helping in the event of a serious incident, but also the concern about how much personal information is being stored:

– We want our privacy and we don’t want our data shared … but you want to help in a situation where terrorism could be a factor.

Tesla and other car companies have access to extensive data that includes camera recordings and location information, among other things. According to a 2023 Mozilla Foundation report, over 75 percent of automakers say they may share or sell driver data, often without drivers being aware of this. Only two brands, Renault and Dacia, offer drivers the option to delete their personal data.

Cars, often associated with freedom and autonomy, risk becoming one of the most monitored spaces in people’s lives, experts warn.

– There’s something deeply ironic that this emblem of personal autonomy might be one of the most heavily surveilled places in many of our lives, said Albert Fox Cahn of the Surveillance Technology Oversight Project.
