“I have nothing to hide”

Mass surveillance

Ten reasons privacy matters for everyone.

Published January 8, 2025 – By Naomi Brockwell
Is there nothing in your life that is actually private and concerns you and only you?

Challenging the myth

"I have nothing to hide". It’s a phrase we’ve all heard, and perhaps even said ourselves, when privacy comes up. But it reveals a dangerous misunderstanding of what privacy is and why it matters.

Privacy isn’t about hiding—it’s about control. It’s about having the freedom to decide who gets access to your data and how it’s used. Over the last decade, that freedom has eroded. Today, governments, corporations, and hackers routinely collect and exploit our personal information, often without our consent.

Worse still, the narrative around privacy has shifted. Those who value it are seen as secretive, even criminal, while surveillance is sold to us as a tool for safety and transparency. This mindset benefits only those who profit from our data.

It’s time to push back. Here are 10 arguments you can use the next time someone says, "I have nothing to hide".

1. Privacy is about consent, not secrecy

Privacy isn’t about hiding secrets—it’s about having control over your information. It’s the ability to decide for yourself who gets access to your data.

We don’t have to hand over all our personal information just because it’s requested. Tools like email aliases, VoIP numbers, and masked credit cards allow us to protect our data while still using online services. Privacy-focused companies like Proton Mail or Signal respect this principle, giving you more control over your information.

2. Nothing to hide, everything to protect

Even if you think you have nothing to hide, you have everything to protect. Oversharing data makes you vulnerable to hackers, scammers, and malicious actors.

For example:

  • Hackers can use personal details like your home address or purchase history to commit fraud or even locate you.
  • Data brokers can manipulate you with targeted content and even influence your political beliefs, as seen in the Cambridge Analytica scandal.

Protecting your data is about safeguarding yourself from these threats and protecting your autonomy.

3. Your data is forever

Data collected about you today will still exist decades from now. Governments change, laws evolve, and what’s harmless now could be used against you or your children in the future.

Surveillance infrastructure rarely disappears once it’s built. Limiting the data collected about you now is essential for protecting yourself from unknown risks down the line.

4. It’s not about you

Privacy isn’t just a personal issue—it’s about protecting others. Activists, journalists, and whistleblowers rely on privacy to do their work safely. By dismissing privacy, you’re ignoring the people for whom it’s a matter of life and death.

For example, Pegasus spyware has been used to track and silence journalists and activists. We should be leaning into privacy tools, supporting the privacy ecosystem, and ensuring that those helping to keep our society free and safe are protected, whether we personally feel like we need privacy or not.

5. Surveillance isn’t about criminals

The claim that surveillance is "only for catching bad guys" is a myth. Once surveillance tools are deployed, they almost always expand beyond their original purpose.

History has shown how governments use surveillance to target dissenters, minorities, and anyone challenging the status quo. Privacy isn’t just for criminals—it’s a safeguard against abuse of power.

6. Your choices put others at risk

When you disregard privacy, you expose not just yourself but also the people around you.

For example:

  • Using apps that access your contact list can leak your friends’ and family’s phone numbers and addresses without their consent.
  • Insisting on non-private communication tools can expose sensitive conversations to surveillance or data breaches.
  • Uploading your photos to a non-private cloud like Google Drive allows the people in your photos to be identified through facial recognition and profiled based on what Google’s AI detects in those images.

Respecting privacy isn’t just about protecting yourself—it’s about respecting the privacy boundaries of others.

7. Privacy is not dead

For some people, "I have nothing to hide" is a coping mechanism.
"Privacy is dead, so why bother?"

This defeatist attitude is both false and harmful. Privacy is alive—it’s a choice we can make every day. Let’s stop disempowering others by convincing them they shouldn’t even try.

There are countless privacy tools you can incorporate into your life. By choosing these tools, you take back control over your information and send a clear message that privacy matters.

8. Your data can be weaponized

All it takes is one bad actor—a rogue employee, an ex-partner, or a hacker—to turn your data against you. From revenge hacking to identity theft, the consequences of oversharing are real and dangerous.

Limiting the amount of data collected about you reduces your vulnerability and makes it harder for others to exploit your information.

9. Surveillance stifles creativity and dissent

Surveillance doesn’t just invade your privacy—it affects how you think and behave. Studies show that people censor themselves when they know they’re being watched.

This "chilling effect" stifles creativity, innovation, and dissent. Without privacy, we lose the ability to think freely, explore controversial ideas, and push back against authority.

10. Your choices send a signal

Every decision you make about technology sends a message. Choosing privacy-focused companies tells the market, "This matters". It encourages innovation and creates demand for tools that protect individual freedom.

Conversely, supporting data-harvesting companies reinforces the status quo and pushes privacy-focused alternatives out of the market. Saying “I have nothing to hide” instead of leaning into the privacy tools around us ignores the role we all play in shaping the future of society.

Takeaways: Why privacy matters

  1. Privacy is about consent, not secrecy. It’s your right to control who accesses your data.
  2. You have everything to protect. Data breaches and scams are real threats.
  3. Data is forever. What’s collected today could harm you tomorrow.
  4. Privacy protects others. Journalists and activists depend on it to do their work safely.
  5. Surveillance tools expand. They rarely stop at targeting criminals.
  6. Your choices matter. Privacy tools send a message to the market and inspire change.
  7. Privacy isn’t dead. We have tools to protect ourselves—it’s up to us to use them.

A fight we can’t afford to lose

Privacy isn’t about hiding—it’s about protecting your rights, your choices, and your future. Surveillance is a weapon that can silence opposition, suppress individuality, and enforce conformity. Without privacy, we lose the freedom to dissent, innovate, and live without fear.

The next time someone says, "I have nothing to hide", remind them: privacy is normal. It’s necessary. And it’s a fight we can’t afford to lose.

 

Yours in privacy,
Naomi

Naomi Brockwell is a privacy advocate and professional speaker, MC, interviewer, producer, and podcaster specialising in blockchain, cryptocurrency and economics. She runs the NBTV channel on Rumble.


Email was never built for privacy

Mass surveillance

How Proton makes email privacy simple.

Published November 8, 2025 – By Naomi Brockwell

Email was never built for privacy. It’s closer to a digital postcard than a sealed letter: it bounces between and sits on servers you don’t control, and mainstream providers like Gmail read and analyze everything inside it.

Email isn’t going anywhere; it’s baked into how the digital world communicates. Luckily, there are ways to make your emails more private. One tool you can use is PGP, which stands for “Pretty Good Privacy”.

PGP is one of the oldest and most powerful tools for email privacy. It takes your message and locks it with the recipient’s public key, so only they can unlock it with their private key. That means even if someone intercepts the email, whether it’s a hacker, your ISP, or a government agency, they see only scrambled text.
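To make that concrete, here is a minimal sketch of the same idea using GnuPG, a common open-source PGP implementation (alice@example.com is a placeholder address, and the sketch assumes her public key is already in your keyring):

    # Encrypt message.txt so that only Alice's private key can open it
    gpg --encrypt --armor --recipient alice@example.com message.txt
    # This produces message.txt.asc, which is safe to send over any channel

    # On Alice's machine, her private key decrypts it
    gpg --decrypt message.txt.asc

Anyone who intercepts message.txt.asc in transit sees only the scrambled text described above.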

Unfortunately, PGP is notoriously complicated to use on its own. Normally, you’d have to install command-line tools, generate keys manually, and run cryptic commands just to send an encrypted email.

But Proton Mail makes all of that easy, and builds PGP right into your inbox.

How Proton makes PGP simple

Proton is a great, privacy-focused email provider (and no, they’re not sponsoring this newsletter; they’re simply an email provider I like to use).

If you email someone within the Proton ecosystem (i.e. from one Proton user to another), your email is automatically end-to-end encrypted using PGP.

But what if you email someone outside of the Proton ecosystem?

Here’s where it would usually get tricky.

First, you’d need to install a PGP client, which is a program that lets you generate and manage your encryption keys.

Then you’d walk through command-line prompts to choose the key type, size, and expiration, associate the key with the email address you want to use, and finally export your public key. It’s complicated.
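To give a concrete sense of that manual workflow, here is a rough sketch using GnuPG (you@example.com is a placeholder address, and the exact prompts vary by version):

    # Generate a new key pair: you are prompted for the key type,
    # size, expiration date, and the email address to associate with it
    gpg --full-generate-key

    # Export the public half as a shareable .asc file
    gpg --export --armor you@example.com > my-public-key.asc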

But if you use Proton, they make using PGP super easy.

Let’s go through how to use it.

Automatic search for public PGP key

First of all, when you type an email address into the “To” field in Proton Mail, it automatically searches for a public PGP key associated with that address. Proton checks its own network, your contact list, and the Web Key Directory (WKD) of that address’s domain.

WKD is a small web standard that lets someone publish their public key on their own domain in a way that email apps can easily find. If Proton finds a key for an address this way, it will automatically encrypt your message with it.
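As a rough illustration of how that lookup works, the “direct” WKD method serves the key at a predictable URL on the domain (example.org is a placeholder; the hashed part is derived from the lowercased portion of the address before the @):

    https://example.org/.well-known/openpgpkey/hu/<hashed-local-part>?l=<local-part>

A mail client that supports WKD can fetch the key from that path automatically, which is the kind of lookup Proton performs here.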

If a key is found, you’ll see a green lock next to the recipient in the “To” field, indicating the message will be encrypted.

You don’t need to copy, paste, or import anything. It just works.

Great, your email has been automatically encrypted using PGP, and only the recipient of the email will be able to use their private key to decrypt it.

Manually uploading someone’s PGP key

What if Proton doesn’t automatically find someone’s PGP key? You can hunt down the key manually and import it. Some people make their key available on their website, either in plain text or as a .asc file. Proton lets you save this PGP key in your contacts.

To add one manually, first type their email address in the “To” field.

Then right-click on that address and select “view contact details”.

Then click the settings wheel to go to email settings, and select “show advanced PGP settings”.

Under “public keys”, select “upload” and upload their public key as an .asc file.

Once the key is uploaded, the “encrypt emails” toggle will automatically switch on, and all future emails to that contact will automatically be protected with PGP. You can turn that off at any time, and also remove or replace the public key.

How do others secure emails to you using PGP?

Super! So you’ve sent an encrypted email to someone using their PGP key. What if they want to send you an email back? Will that be automatically end-to-end encrypted (E2EE) using PGP? Not necessarily.

In order for someone to send you an end-to-end encrypted email, they need your public PGP key.

Download your public-private key pair inside Proton

Proton automatically generates a public-private key pair for each address that you have configured inside Proton Mail, and manages encryption inside its own network.

If you want people outside Proton to be able to encrypt messages to you, the first step is to export your public key from your Proton account so you can share it with them.

To do this:

  • Go to Settings
  • Click “All settings”
  • Select “Encryption and keys”
  • Under “Email encryption keys” you’ll see a dropdown menu of all the email addresses associated with your Proton account. Select the address you want to export the public key for.
  • Under the “Action” column, click “Export public key”

The key will download as an .asc file; your browser will ask where you want to save it.

Normally a PGP key is raw binary data that only your computer can read. The .asc file is an “ASCII-armored” version: the same key wrapped in readable text characters, in a format that looks something like this:
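(Illustrative placeholder only, with the long encoded middle section abbreviated; a real export runs to dozens of lines.)

    -----BEGIN PGP PUBLIC KEY BLOCK-----

    mQINBF… (many lines of encoded key data) …
    =AbCd
    -----END PGP PUBLIC KEY BLOCK-----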

Sharing your public key

Now that you’ve downloaded the public key, how do you share it with people so that they can contact you privately? There are several ways.

For @proton.me and @protonmail.com addresses, Proton publishes your public key in its WKD automatically. You don’t have to do anything.

For custom domains configured in Proton Mail, Proton doesn’t host WKD for you. You can publish WKD yourself on your own domain by serving it at a special path on your website. Or you can delegate WKD to a managed service. Or if you don’t want to use WKD at all, you can upload your key to a public keyserver like keys.openpgp.org, which provides another way for mail apps to discover it.

We’re not going to cover those setups in this article. Instead here are simpler ways to share your public key:

1) You can send people your .asc file directly if you want them to be able to encrypt emails to you (be sure to let them know which email address is associated with this key), or you can host this .asc file on your website for people to download.

2) You can open the .asc file in a text editor and copy and paste the key, and then send people this text, or upload the text on your website. This is what I have done:

This way if anyone wants to send me an email more privately, they can do so.

But Proton makes it even easier to share your PGP key: you can opt to automatically attach your public key to every email.

To turn this on:

  1. Go to Settings → Encryption & keys → External PGP settings
  2. Enable
    • Sign external messages
    • Attach public key

Once this is on, every email you send will automatically include your public key as a small .asc attachment.

This means anyone using a PGP-capable mail client (like Thunderbird, Mailvelope, etc.) can import it immediately, with no manual steps required.
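For anyone on a command-line setup rather than a graphical client, importing such an attached key is a single step; a minimal sketch, assuming the attachment was saved as publickey.asc:

    # Add the sender's public key to the local keyring
    gpg --import publickey.asc

From then on, their mail client (or gpg itself) can encrypt replies to you with that key.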

Password-protected emails

Proton also lets you send password-protected emails, so even if the other person doesn’t use PGP you can still keep the contents private. This isn’t PGP: Proton encrypts the message and attachments in your browser, and the recipient gets a link to a secure viewing page. They enter a password you share separately to open it. Their provider (like Gmail) only sees a notification email with a link, not the message itself. You can add a password hint, and the message expires after a set time (28 days by default).

The bottom line

Email privacy doesn’t have to be painful. Proton hides the complexity, whether through the password option or by automating much of the PGP process for you: it automatically looks up recipients’ keys, encrypts your messages, and makes your own key easy for others to use when they reply.

As Phil Zimmermann, the creator of PGP, explained in Why I Wrote PGP:

“PGP empowers people to take their privacy into their own hands. There has been a growing social need for it. That’s why I wrote it".

We’re honored to have Mr. Zimmermann on our board of advisors at Ludlow Institute.

Pioneers like him fought hard so we could protect our privacy. It’s on us to use the tools they gave us.

 

Yours in privacy,
Naomi

Naomi Brockwell is a privacy advocate and professional speaker, MC, interviewer, producer, and podcaster specialising in blockchain, cryptocurrency and economics. She runs the NBTV channel on Rumble.

Swedish police secretly using Palantir’s surveillance system for years

Mass surveillance

Published November 4, 2025 – By Editorial staff
Palantir Technologies headquarters in Silicon Valley.

The Swedish Police Authority has for at least five years been using an AI-based analysis tool from the notorious American security company Palantir.

The program, which has been specially adapted for Swedish conditions, can within seconds compile comprehensive profiles of individuals by combining data from various registers.

Palantir, the American tech company behind the system, is internationally controversial and has been accused of involvement in surveillance activities. This summer, the company was identified in a UN report as complicit in genocide in Gaza.

The Swedish version of Palantir's Gotham platform is called Acus and uses artificial intelligence to compile, analyze and visualize large amounts of information. According to an investigation by the left-wing newspaper Dagens ETC, investigators using the system can quickly obtain detailed personal profiles that combine data from surveillance and criminal registers with information from Bank-id (Sweden's national digital identification system), mobile operators and social media.

A former analyst employed by the police, who chooses to remain anonymous, describes to the newspaper how the system was surrounded by great secrecy:

— There was very much hush-hush around that program.

Rejection of document requests

When the newspaper requested information about the system and how it is used, it was met with rejection. The Swedish Police Authority cited confidentiality and stated that it can neither "confirm nor deny relationships with Palantir", referring to a "danger to national security".

This is not the first time Palantir's tools have been used in Swedish law enforcement. In the high-profile Operation Trojan Shield, the FBI, with support from Palantir's technology, managed to infiltrate and intercept the encrypted messaging app Anom.

The operation led to the arrest of a large number of people connected to serious crime, both in Sweden and internationally. The FBI called the operation "a shining example of innovative law enforcement".

But the method has also received criticism. Attorney Johan Grahn, who has represented defendants in several Anom-related cases, is critical of the approach.

— In these cases, it has been indiscriminate mass surveillance, he states.

Mapping dissidents

Palantir has long sparked debate due to its assignments and methods. The company works with both American agencies and foreign security services.

In the United States, the surveillance company's systems are used to map undocumented immigrants. In the United Kingdom, British police have been criticized for using the company's technology to build registers of citizens' sex lives, political views, religious affiliation, ethnicity and union involvement – information that according to observers violates fundamental privacy principles.

This summer, a UN report also identified Palantir as co-responsible for acts of genocide in Gaza, after the company's analysis tools were allegedly used in attacks where Palestinian civilians were killed.

How extensive the Swedish police's use of the system is, and what legal frameworks govern the handling of Swedish citizens' personal data in the platform, remains unclear as long as the Swedish Police Authority chooses to keep the information classified.

IT expert warns: ID requirements online bring us closer to totalitarian surveillance

Mass surveillance

Published November 3, 2025 – By Editorial staff
Swedish Liberal Party politician Nina Larsson wants to introduce age verification – but IT experts warn of serious consequences

IT security specialist Karl Emil Nikka advises Sweden against following the UK's example of mandatory age verification on pornographic websites. The risk of data breaches and increased surveillance is too great, he argues.

Swedish Gender Equality Minister Nina Larsson wants Sweden to introduce technical barriers requiring age verification on pornographic websites to protect children from explicit sexual content.

The proposal is based on the British model where websites must verify users' age or identity, for example through authentication with ID cards or credit cards.

But Karl Emil Nikka, an IT security specialist, is strongly critical of the proposal. He points to serious flaws in the British solution, not least the risk of data breaches.

As an example, he mentions the leak from the messaging platform Discord, where photos of 70,000 users ended up in the wrong hands after a cyberattack in connection with the law change. Additionally, the barriers are easy to circumvent using VPN services, which caused the use of such services to skyrocket when the British law came into effect.

Risks surveillance

Nikka also warns that requirements for online identification bring Sweden closer to a type of surveillance that otherwise only exists in totalitarian states.

— It's a small problem as long as we live in a democracy, but it's damn dangerous to believe we always will, he says.

Instead, parents should be encouraged to use the controls already built into phones and other devices, where one can easily choose which sites to block.

— From a security perspective, it's the only reasonable solution, Nikka states.

Foreign sites attract

An additional risk with technical barriers is that young users turn to lesser-known foreign sites that don't care about legal requirements, Nikka argues. Jannike Tillå, head of communications and social benefit at the Swedish Internet Foundation, confirms this picture.

— According to experts in various countries, it seems that people have turned to other lesser-known websites abroad, she says.

However, Tillå believes that technical solutions can have a place, provided they are more anonymous than the British ones and combined with other measures.

— It can help raise thresholds and reduce exposure.

Conversations crucial

At the same time, she emphasizes the importance of complementing any technical solutions with investments in digital literacy and, above all, conversations between parents and children.

— That's where real protection begins. We know that many parents find it difficult to have the porn conversation, but you should do it early, says Jannike Tillå.

She stresses that the question of privacy and freedom online must not be set against child protection.

— We must find that balance and manage both things, she concludes.

Safety apps normalize surveillance of children

Mass surveillance

Published October 15, 2025 – By Editorial staff
Swedish researcher Katarina Winter warns that surveillance of children has become normalized when technology is packaged as care rather than control.

Apps promised to increase safety are often used for everyday logistics – and normalize secret surveillance.

Researchers at Stockholm University have examined 48 Swedish safety apps and warn that the technology is packaged as care while ethical questions disappear.

In two research projects at Stockholm University, researchers are investigating various safety technologies in Sweden – everything from digital safety maps and security sensors to apps marketed as tools for creating safer communities. But instead of measuring whether the technology works, the researchers critically examine its consequences.

— It's important to ask what kind of safety we're after, and for whom? What is worth calling safety? Which actors and interests determine what constitutes safety in a society? The project on safety apps shows, among other things, how surveillance becomes normalized when we use this technology, says Katarina Winter, associate professor and senior lecturer in criminology and doctor in sociology at Stockholm University.

She leads the projects, which are conducted in collaboration with researchers from the University of Gävle and Södertörn University. The researchers have mapped 48 Swedish safety apps and interviewed both developers and users, including parents who use apps to keep track of their children.

"The technology is so kindly framed"

A central finding is how normalized it has become to monitor children, often without their knowledge.

— One example is how normalized it has become to monitor your children even though they don't know about it, although some have an agreement with their children. Because the technology is so kindly framed – that it's about protecting the children – it doesn't become something you have to stand up for as a parent. The normalization can therefore happen under the radar. When technology is packaged as care, we easily lose sight of the ethical questions, she explains.

The surveillance also affects family relationships.

— Many use the apps to avoid nagging their children, and in the short term it may be convenient and simplify family logistics. But something happens on an interpersonal level, we cut off part of the interaction between each other. It's seen as deviant behavior if you don't want to share your location, which I think is negative.

Confusing messages during adult education center shooting

The researchers see a clear discrepancy between developers' ideals about a safer society and how the apps are actually used. For private individuals, it's often about completely different things than safety.

— In a way, these parents reproduce an insecurity in society related to crime and vulnerability when they justify why they use an app. But in reality, it's often extremely connected to everyday logistics – when should I start cooking the pasta depending on where my child is? explains the criminologist.

The researchers have also examined the school safety app CoSafe, which was used during the shooting at Campus Risbergska, an adult education center in Örebro, central Sweden. The app was criticized for sending contradictory alerts about both evacuation (leaving the building) and lockdown (staying inside and seeking shelter). Of the eleven people killed in total, two were students who followed the instruction to evacuate instead of seeking shelter indoors.

— The Risbergska case demonstrates the complexity of technical solutions for crisis situations. While the app may have helped some seek shelter, the incident raises important questions about responsibility distribution and technical reliability when it comes to life and death, Winter notes.

Private actors profit from insecurity

The researcher also sees how private companies use the public debate about insecurity to sell their solutions, particularly to municipalities.

— We have both a political landscape that focuses on insecurity and a market that takes it on because it's in focus. It's logical that opportunities for entrepreneurship are found in the societal debate we're in, but it becomes more brutal when it comes to safety than with other phenomena. Partly because actors profit from portraying society as unsafe, and partly because companies are generally interested in specific user groups that may not have many safety problems.

She calls for a critical attitude toward technological optimism.

— It's important to pause on these questions that otherwise tend to rush ahead in a kind of faith that 'now everything will be better because we have new technology'. When the overarching word is safety, questions about surveillance and privacy risk being deprioritized.