
“Many misleading claims about Chat Control 2.0”

Mass surveillance

Ylva Johansson chooses to ignore the fact that a mass surveillance proposal requires mass surveillance, Karl Emil Nikka, IT security expert, writes.

Published 28 September 2023
IT security expert Karl Emil Nikka. EU Commissioner Ylva Johansson.
This is an opinion piece. The author is responsible for the views expressed in the article.

One of the topics discussed in last week’s episode of Medierna i P1 was the European Commission’s controversial mass surveillance proposal Chat Control 2.0 and its consequences for journalists. The episode featured EU Commissioner Ylva Johansson, IT and media lawyer Daniel Westman and Anne Lagercrantz, President of the Swedish Publishers Association.

Westman and Lagercrantz were critical of the mass surveillance proposal, partly because of the consequences for the protection of sources. The Swedish Association of Journalists and the Swedish Newspaper Publishers have previously warned about the consequences of the proposal for the same reasons.

Comically enough, the pre-recorded interview began with Johansson asking if she could call Martina Pierrou, the interviewing journalist, via Signal or WhatsApp instead.

At the time of the interview, Johansson and Pierrou were able to talk via Signal, but if the mass surveillance proposal goes through, that possibility will disappear. In a response to me on X (Twitter), Signal’s CEO announced that they will leave the EU if they are forced to build backdoors into their app.

This is a very wise decision on Signal’s part as such backdoors undermine the safety and security of children and adults around the world. The rest of the world should not have to suffer because we in Europe are unable to stop EU proposals that violate human rights, the Convention on the Rights of the Child and our own EU Charter.

Below is an analysis of all the statements made by Johansson in the interview. The quotes are printed in full. The time codes link directly to the paragraphs in the section where the claims were made.

Incorrect suggestion of a requirement for a court decision

When asked about what the bill means in practice (18:55), Johansson repeated her recurring lie that a court order would be required to scan communications. She explained the practical implications of the proposal with the following sentence.

“To force the companies to make risk assessments, to take measures to ensure that their services are not used for this terrible crime and ultimately to make it possible, by court order, to also allow the scanning of communications to find these abuses.” – Ylva Johansson (2023-09-23)

Pierrou followed up with a remark that the proposal may require scanning without suspicion of crime against any individual (19:24). Ylva Johansson responded as follows.

“No, scanning will take place when there is a risk that a certain service is being used extensively to spread these criminal offenses. Then a court can decide that scanning is permitted and necessary.” – Ylva Johansson (2023-09-23)

The suggestion that a court decision would be required is incorrect. Johansson made the same claim in her debate with me in Svenska Dagbladet in April this year (the only debate in the Swedish media that Johansson has participated in). I then gave her the opportunity to correct the claim, in order to find out whether she knew that her proposal did not require a court decision. The proposal also accepts decisions from administrative authorities. Johansson knew this. Nevertheless, she repeated the lie in the interview in SVT Aktuellt (April 2023), in Ekot’s Saturday interview (June 2023) and now in Medierna i P1.

Omitted consequence

In the answer to the same question, Johansson omitted the most crucial point, namely that backdoors are a prerequisite for the scanning of end-to-end encrypted conversations to be done at all. Once these backdoors are in place, they can be abused and cause data leaks. Other states, such as the US where most of the affected services are based, can use the backdoors to scan for content they are interested in.

The proposal states that service providers may only use their position to scan for child abuse material and grooming attempts. But even if we ignore the likely purpose creep, this makes no difference. Today, we have technical protections that make our end-to-end encrypted conversations impossible to intercept. The European Commission wants to replace these technical protections with mere legal restrictions on what the new backdoors may (and may not) be used for.

This naivety is unprecedented. It is incomprehensible to me how the EU can believe that the US would allow American companies to install backdoors that are limited to the EU’s prescribed use. As a thought experiment, consider how the EU would react if the US tried to do the same to our companies.

If we take into account the highly likely purpose creep, the situation gets even worse. We only have to go back to 2008 to demonstrate this. At that time, the FRA debate was in full swing and FRA Director General Ingvar Åkesson wrote a debate article in Svenska Dagbladet with the following memorable words.

“FRA cannot spy on domestic phenomena. /…/ Yet the idea is being cultivated that FRA should listen to all Swedes’ phone calls, read their e-mails and text messages. A disgusting idea. How can so many people believe that a democratically elected parliament would wish its people so ill?” – Ingvar Åkesson (2008-06-29)

Fifteen years later, Åkesson can hopefully understand why we believed that a democratically elected parliament could wish its people so ill. Right now, exactly this “disgusting idea” (the Director General’s own choice of words) is being proposed.

Belief in the existence of non-existent technologies

Pierrou then asked how the solution would actually work. Pierrou pointed out that “according to an opinion from the European Data Protection Board, the technology required by the proposal does not exist today” (19:55).

Johansson responded with a quote that will go down in history.

“I believe that there is. But my bill is technology-neutral and that means that we set standards for what the technology must be able to do and what high standards of integrity the technology must meet.” – Ylva Johansson (2023-09-23)

Here Johansson again shows that she based her proposal on incorrect assumptions about how technology works. After having been refuted by the world’s experts, she is now forced to resort to opinion arguments such as “I believe it exists”.

Whether technology exists (or can exist) is of course not a matter of opinion. It is, always has been, and always will be technically impossible to scan the content of properly end-to-end encrypted conversations.

To smooth over the embarrassment, Johansson pointed out that the bill is technology-neutral. This may sound good, but it says nothing in the context. Setting standards for what technology must do is only embarrassing when it is done without first examining what is practically possible.

If service providers of end-to-end encrypted services are to be able to scan the content of conversations, they must build in backdoors. The backdoors allow them to scan the content before it is encrypted and after it has been decrypted. Without backdoors, scanning is and remains technically impossible.
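The point above can be made concrete with a toy sketch (illustrative code only, not any real messenger’s implementation; all names are invented for the example). The scanning hook can only see a message before it is encrypted; once encryption has been applied, the content is unreadable to everyone but the recipient, which is precisely why client-side scanning amounts to a backdoor:

```python
# Toy illustration: client-side scanning must run on plaintext,
# BEFORE end-to-end encryption. The "encryption" below is a trivial
# XOR stand-in (NOT secure) just to show where the hook sits.

def encrypt(plaintext: str, key: int) -> bytes:
    """Stand-in for real end-to-end encryption."""
    return bytes(b ^ key for b in plaintext.encode())

def decrypt(ciphertext: bytes, key: int) -> str:
    """Only the holder of the key can recover the content."""
    return bytes(b ^ key for b in ciphertext).decode()

# Hypothetical blocklist the provider is ordered to scan for.
BLOCKLIST = {"forbidden"}

def scan(plaintext: str) -> bool:
    """The backdoor hook: inspects the message content."""
    return any(word in plaintext.lower() for word in BLOCKLIST)

reports: list[str] = []

def report_to_authority(message: str) -> None:
    """Stand-in for forwarding a flagged message to an EU centre."""
    reports.append(message)

def send_message(plaintext: str, key: int) -> bytes:
    # Scanning happens HERE, on the plaintext. After encrypt() runs,
    # the provider can no longer read the content at all.
    if scan(plaintext):
        report_to_authority(plaintext)
    return encrypt(plaintext, key)
```

Note that `scan()` would be useless if applied to the ciphertext: the provider either reads the message before encryption (a backdoor) or cannot read it at all.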

Opinion on mass surveillance in mass surveillance proposals

Pierrou concluded the interview by asking what Johansson thought about the image of the proposal being painted as a mass surveillance proposal (20:19). Johansson then answered the following.

“Yes, that is a completely wrong picture. It is not about anyone monitoring at all.” – Ylva Johansson (2023-09-23)

Mass surveillance is, by definition, surveillance of the masses (as opposed to targeted surveillance of selected suspects). As Pierrou highlighted in a previous question, Chat Control 2.0 scanning requires no suspicion of crime against any individual. Service providers are to monitor what the masses write and say on their platforms and report suspicious conversations to the new EU centre to be set up in The Hague.

The proposal is thus, by definition, a mass surveillance proposal.

However, Johansson chose to ignore the fact that a mass surveillance proposal requires mass surveillance. Instead, she tried to dismiss the criticism with the following argument and a pat on her own back (20:34).

“It is obvious that when you are a bit of a pioneer, as I am in this case, you have to expect that you will also be questioned.” – Ylva Johansson (2023-09-23)

Unfortunately, I must crush Commissioner Johansson’s self-image and state that she has never been questioned for being a pioneer. Johansson is not even a pioneer in the field, something she herself should know.

It has barely been 30 years since the Stasi was disbanded.

 

Karl Emil Nikka

 


This article is republished from nikkasystems.com under CC BY 4.0.

About the author

Karl Emil Nikka is the founder of Nikka Systems, Security Profile of the Year 2021, author and an IT security expert.


“I have nothing to hide”

Mass surveillance

Ten reasons privacy matters for everyone.

Published 8 January 2025
– By Naomi Brockwell
Is there nothing in your life that is actually private and concerns you and only you?

Challenging the myth

“I have nothing to hide”. It’s a phrase we’ve all heard, and perhaps even said ourselves, when privacy comes up. But it reveals a dangerous misunderstanding of what privacy is and why it matters.

Privacy isn’t about hiding—it’s about control. It’s about having the freedom to decide who gets access to your data and how it’s used. Over the last decade, that freedom has eroded. Today, governments, corporations, and hackers routinely collect and exploit our personal information, often without our consent.

Worse still, the narrative around privacy has shifted. Those who value it are seen as secretive, even criminal, while surveillance is sold to us as a tool for safety and transparency. This mindset benefits only those who profit from our data.

It’s time to push back. Here are 10 arguments you can use the next time someone says, “I have nothing to hide”.

1. Privacy is about consent, not secrecy

Privacy isn’t about hiding secrets—it’s about having control over your information. It’s the ability to decide for yourself who gets access to your data.

We don’t have to hand over all our personal information just because it’s requested. Tools like email aliases, VoIP numbers, and masked credit cards allow us to protect our data while still using online services. Privacy-focused companies like ProtonMail or Signal respect this principle, giving you more control over your information.

2. Nothing to hide, everything to protect

Even if you think you have nothing to hide, you have everything to protect. Oversharing data makes you vulnerable to hackers, scammers, and malicious actors.

For example:

  • Hackers can use personal details like your home address or purchase history to commit fraud or even locate you.
  • Data brokers can manipulate you with targeted content and even influence your political beliefs, as seen in the Cambridge Analytica scandal.

Protecting your data is about safeguarding yourself from these threats and protecting your autonomy.

3. Your data is forever

Data collected about you today will still exist decades from now. Governments change, laws evolve, and what’s harmless now could be used against you or your children in the future.

Surveillance infrastructure rarely disappears once it’s built. Limiting the data collected about you now is essential for protecting yourself from unknown risks down the line.

4. It’s not about you

Privacy isn’t just a personal issue—it’s about protecting others. Activists, journalists, and whistleblowers rely on privacy to do their work safely. By dismissing privacy, you’re ignoring the people for whom it’s a matter of life and death.

For example, Pegasus spyware has been used to track and silence journalists and activists. We should be leaning into privacy tools, supporting the privacy ecosystem, and ensuring that those helping to keep our society free and safe are protected, whether we personally feel like we need privacy or not.

5. Surveillance isn’t about criminals

The claim that surveillance is “only for catching bad guys” is a myth. Once surveillance tools are deployed, they almost always expand beyond their original purpose.

History has shown how governments use surveillance to target dissenters, minorities, and anyone challenging the status quo. Privacy isn’t just for criminals—it’s a safeguard against abuse of power.

6. Your choices put others at risk

When you disregard privacy, you expose not just yourself but also the people around you.

For example:

  • Using apps that access your contact list can leak your friends’ and family’s phone numbers and addresses without their consent.
  • Insisting on non-private communication tools can expose sensitive conversations to surveillance or data breaches.
  • Uploading your photos to a non-private cloud like Google Drive allows those in your photos to be identified using facial recognition, and profiled based on information Google AI sees in your photos.

Respecting privacy isn’t just about protecting yourself—it’s about respecting the privacy boundaries of others.

7. Privacy is not dead

For some people, “I have nothing to hide” is a coping mechanism.
“Privacy is dead, so why bother?”

This defeatist attitude is both false and harmful. Privacy is alive—it’s a choice we can make every day. Let’s stop disempowering others by convincing them they shouldn’t even try.

There are countless privacy tools you can incorporate into your life. By choosing these tools, you take back control over your information and send a clear message that privacy matters.

8. Your data can be weaponized

All it takes is one bad actor—a rogue employee, an ex-partner, or a hacker—to turn your data against you. From revenge hacking to identity theft, the consequences of oversharing are real and dangerous.

Limiting the amount of data collected about you reduces your vulnerability and makes it harder for others to exploit your information.

9. Surveillance stifles creativity and dissent

Surveillance doesn’t just invade your privacy—it affects how you think and behave. Studies show that people censor themselves when they know they’re being watched.

This “chilling effect” stifles creativity, innovation, and dissent. Without privacy, we lose the ability to think freely, explore controversial ideas, and push back against authority.

10. Your choices send a signal

Every decision you make about technology sends a message. Choosing privacy-focused companies tells the market, “This matters”. It encourages innovation and creates demand for tools that protect individual freedom.

Conversely, supporting data-harvesting companies reinforces the status quo and pushes privacy-focused alternatives out of the market. People who say “I have nothing to hide” instead of leaning into the privacy tools around them ignore the role we all play in shaping the future of society.

Takeaways: Why privacy matters

  1. Privacy is about consent, not secrecy. It’s your right to control who accesses your data.
  2. You have everything to protect. Data breaches and scams are real threats.
  3. Data is forever. What’s collected today could harm you tomorrow.
  4. Privacy protects others. Journalists and activists depend on it to do their work safely.
  5. Surveillance tools expand. They rarely stop at targeting criminals.
  6. Your choices matter. Privacy tools send a message to the market and inspire change.
  7. Privacy isn’t dead. We have tools to protect ourselves—it’s up to us to use them.

A fight we can’t afford to lose

Privacy isn’t about hiding—it’s about protecting your rights, your choices, and your future. Surveillance is a weapon that can silence opposition, suppress individuality, and enforce conformity. Without privacy, we lose the freedom to dissent, innovate, and live without fear.

The next time someone says, “I have nothing to hide”, remind them: privacy is normal. It’s necessary. And it’s a fight we can’t afford to lose.

 

Yours in privacy,
Naomi

Naomi Brockwell is a privacy advocate and professional speaker, MC, interviewer, producer and podcaster, specialising in blockchain, cryptocurrency and economics. She runs the NBTV channel on YouTube.

Police used Tesla driver data: “A double-edged sword”

Mass surveillance

Published 6 January 2025
– By Editorial Staff
The Tesla Cybertruck that exploded outside the Trump Hotel in Las Vegas, January 1 this year.

The explosion of a Tesla Cybertruck in Las Vegas on New Year’s Day has highlighted how much information modern cars collect about their drivers and events around them. Tesla CEO Elon Musk quickly provided police with data and video footage, which helped the investigation determine that it was a suicide rather than an accident or terrorism.

The data collected has been praised by police for helping to quickly clarify the circumstances. At the same time, the collection has raised questions about privacy and potential abuse.

– It’s a double-edged sword. The companies collecting the data could misuse it, David Choffnes of the Cybersecurity and Privacy Institute in Boston told the Washington Post.

Others, like Tesla enthusiast Justin Demaree, agree on the dual aspect. He emphasizes the importance of helping in the event of a serious incident, but also the concern about how much personal information is being stored:

– We want our privacy and we don’t want our data shared … but you want to help in a situation where terrorism could be a factor.

Tesla and other car companies have access to extensive data that includes camera recordings and location information, among other things. According to a 2023 Mozilla Foundation report, over 75 percent of automakers say they may share or sell driver data, often without drivers being aware of this. Only two brands, Renault and Dacia, offer drivers the option to delete their personal data.

Cars, often associated with freedom and autonomy, risk becoming one of the most monitored spaces in people’s lives, experts warn.

– There’s something deeply ironic that this emblem of personal autonomy might be one of the most heavily surveilled places in many of our lives, said Albert Fox Cahn of the Surveillance Technology Oversight Project.

8 Ways to Fight for Privacy Today

Mass surveillance

How you can make an impact.

Published 19 December 2024
– By Naomi Brockwell
We do not yield to the mass surveillance machine.

Last week, we published the Priv/Acc Manifesto, and I was deeply moved by the outpouring of responses from people eager to take action. So many of you reached out, asking, “How can I help?”

The threat to privacy is obvious—relentless surveillance from governments, corporations, and bad actors alike. Countless bills trying to ban end-to-end encryption and mandate back doors. A sea of complacency from people who have been tricked into thinking privacy is about hiding, instead of about consent.

But the path to meaningful action isn’t always clear.

The good news is that everyone, regardless of background, has a role to play in safeguarding privacy. Whether you’re a coder building better tools, an educator raising awareness, an advocate pushing for change, or simply someone who values personal freedom, there are practical steps you can take to make a difference.

In this newsletter, I want to share some of the ways you can contribute to this critical fight.

#1 Lead by example

The easiest way to contribute is by making deliberate choices about the products and services you use. Switching to privacy-respecting tools not only protects your data, but also sends a powerful market signal that privacy matters. When you choose privacy-focused companies, you help them thrive, fostering the development of even better tools. On the flip side, continuing to use platforms that harvest our data undermines privacy-focused alternatives, pushing them out of the market.

Here are a handful of my favorite tools, but our channel features hundreds of videos showcasing great alternatives you can explore:

  • Messaging: Signal
  • Web Browsing: Brave Browser
  • VPNs: Mullvad VPN
  • Email: ProtonMail and Tutanota
  • Productivity: CryptPad and LibreOffice

#2 Push Back Against Cultural Norms

The phrase “I have nothing to hide” has become a lazy justification for dismissing privacy. It’s time to reframe the conversation. Privacy isn’t about secrecy – it’s about consent. It’s about having the right to choose who gets access to our data and rejecting the idea that valuing privacy is something to be ashamed of.

Privacy protects whistleblowers, activists, and everyday individuals from surveillance and coercion. When someone parrots “nothing to hide,” remind them that privacy safeguards freedom, creativity, and autonomy. Changing this mindset is essential to making privacy a societal priority.

#3 User Manuals and Educational Awareness

You don’t need to be technical to make a huge impact. Writing clear, accessible guides for privacy tools is one of the most valuable ways to help. Blogs with beginner-friendly tutorials or personal experiences using privacy tools contribute to a growing reservoir of educational material for the community. Translating tutorials into other languages can expand their reach even further.

Even super simple tutorials—like explaining that Gmail can read your emails—can be eye-opening for many people. Education is a powerful way to build awareness, and your efforts might help someone take their first step toward reclaiming their privacy.

#4 Contribute to Open-Source Projects

For those with technical expertise, contributing to open-source privacy projects is one of the most effective ways to support the cause. Free and Open Source Software (FOSS) tools like Tor, GrapheneOS, and VeraCrypt are essential for people worldwide, but these projects are often critically underfunded and under-resourced.

Developers can help by building features or fixing bugs, while researchers can perform security audits to identify vulnerabilities. Remember the Heartbleed vulnerability? It was a major flaw in OpenSSL, a cornerstone of internet security, that went undetected for years—illustrating the need for more eyes on open-source projects. Even small contributions, like reviewing code, can make a huge difference.

#5 Test Privacy Tools and Provide Feedback

For privacy tools to succeed, they need to be user-friendly and accessible to everyone—not just tech enthusiasts. By testing privacy platforms and sharing constructive feedback, you can help developers improve default settings and refine the overall user experience (UX). These small adjustments can make tools more intuitive, significantly boosting adoption among non-technical users.

Even if you’re not a coder, your contributions—like testing tools, reporting bugs, or improving documentation—are invaluable to open-source projects. Developers rely on user input to ensure their tools work for everyone, making your efforts critical to advancing privacy.

#6 Financial Support

Financial support is vital for building a robust ecosystem of privacy tools. Many open-source projects rely on donations to survive, and businesses building privacy tools need customers to remain sustainable. FOSS ensures that privacy tools are accessible to everyone, but if you can afford to donate or pay for premium versions, your support keeps these tools available for those who need them most.

#7 Drive Change From Within

If you work for a tech company, advocate for privacy-by-design principles—embedding privacy into products from the ground up. Push for policies like data minimization and transparency, or encourage your organization to invest in privacy research. Cutting-edge technologies like zero-knowledge proofs and homomorphic encryption are redefining what’s possible in privacy-preserving data analysis. Supporting innovation in these areas can have a profound impact on the future of privacy.

#8 Engage in Policy Advocacy

Governments frequently pass laws regulating technology without fully understanding their implications. Your voice can make a difference by shaping these policies to prevent harmful consequences. Push back against attempts to ban privacy tools or mandate backdoors, ensuring that the most vulnerable in society always have a way to protect themselves.

Supporting organizations like the EFF or other advocacy groups is another great way to get involved. These groups lobby for digital rights, educate the public, and fight back against policies that fuel the surveillance state. Together, we can help ensure that privacy remains a fundamental right.

The Power of Community

Privacy advocacy is about more than safeguarding our own information—it’s about defending the fundamental rights that underpin a free and just society. It ensures that those on the front lines—whistleblowers, activists, journalists, and others fighting for change—are equipped with the protection they need to carry out their vital work.

Every time you choose a privacy-respecting tool, educate someone about the importance of privacy, or contribute to an open-source project, you’re strengthening the movement.

Privacy isn’t about having something to hide—it’s about having the freedom to live, think, and act without fear or surveillance. It’s the foundation of creativity, dissent, and progress. Together, we can protect this essential right and ensure a future where privacy empowers us all.

Thanks for being part of this movement, everyone. This week I’m truly thankful and grateful to every one of you.

 

Yours in privacy,
Naomi

Naomi Brockwell is a privacy advocate and professional speaker, MC, interviewer, producer and podcaster, specialising in blockchain, cryptocurrency and economics. She runs the NBTV channel on YouTube.

Microsoft’s Recall saves sensitive information despite ‘security filters’

Advertising partnership with Teuton Systems

Published 14 December 2024
– By Editorial Staff
The Windows feature has led to harsh criticism and questions about the reliability of Microsoft's security measures.

The new Windows “Recall” feature, which is designed to take screenshots of your computer activity “for increased productivity”, has been found to store sensitive information such as credit card numbers and social security numbers – even when the feature to filter sensitive data is enabled.

According to a report by Tom’s Hardware, tests revealed several flaws in Recall’s filtering capabilities: Recall captured information from Windows Notepad and from PDF forms in Microsoft Edge, including credit card details and social security numbers, even though the feature to block sensitive information was enabled. The risk is thus not limited to cloud services; ordinary offline activity is just as vulnerable. This creates a serious security risk for users who expect their private data to be protected.

The feature worked correctly in some cases, such as when it blocked screenshots from payment pages on e-commerce sites like Pimoroni and Adafruit. In contrast, Recall was able to take screenshots of a custom HTML page created by Tom’s Hardware that contained a credit card form and card details, clearly showing flaws in how the filter identifies sensitive data.

Microsoft itself claims that Recall is designed to automatically detect and filter sensitive information, such as credit card details, passwords and social security numbers. The company says it is working on improving the performance of the feature and ensuring that users’ privacy is protected.

Risks being costly

These flaws in Recall’s filtering feature have led to harsh criticism and raised questions about the reliability of Microsoft’s security measures. Users who rely on Recall to document workflows may inadvertently expose sensitive data, which could prove very costly.

Experts therefore recommend that users be very careful about what data they handle while Recall is active – or better yet, stop using Windows altogether and switch to Linux-based solutions instead.

Microsoft has not yet clarified when an update to Recall can be expected, but the discovered security flaws underline the importance of security tools undergoing rigorous testing before being used in practical applications.

Screenshot of the Linux-based desktop environment KDE Plasma, which comes pre-installed on Teuton Systems computers and is described as at least as easy to use as Windows, and more logical.

Linux – a privacy-focused alternative

For those who have grown tired of Microsoft and its products, there are further reasons to look towards Linux, especially in times of privacy breaches and pervasive data collection; in most cases it can fully replace Microsoft Windows. Teuton Systems, a Swedish-based technology company specializing in security- and privacy-focused products and services, offers personal computers with Linux pre-installed, completely free of “cloud connections” and surveillance software.

All included software is open source and selected with your security in mind. In addition, you have access to support and Linux-savvy customer service.