What I wish I knew about privacy sooner

The hard truths no one warned me about.

Published 22 March 2025
– By Naomi Brockwell
5 minute read

I’ve been deep in the privacy world for years, but I wasn’t always this way. If I could go back, I’d grab my younger self by the shoulders and say: “Wake up. The internet is a battlefield of people fighting for your attention, and many of them definitely don’t have your best interests at heart.”

I used to think I was making my own decisions—choosing what platforms to try, what videos to watch, what to believe. I didn’t realize I was part of a system designed to shape my behavior. Some just wanted to sell me things I didn’t need—or even things that actively harm me. But more importantly, some were paying to influence my thoughts, my votes, and even who I saw as the enemy.

There is a lot at stake when we lose the ability to make choices free from manipulation. When our digital exhaust—every click, every pause, every hesitation—is mined and fed into psychological experiments designed to drive behavior, our ability to think independently is undermined.

No one warned me about this. But it’s not too late—not for you. Here are the lessons I wish I had learned sooner—and the steps you can take now, before you wish you had.

1. Privacy mistakes compound over time—like a credit score, but worse

Your digital history doesn’t reset—once data is out there, it’s nearly impossible to erase.

The hard truth:

  • Companies connect everything—your new email, phone number, or payment method can be linked back to your old identity through data brokers, loyalty programs, and behavioral analysis.
  • Switching to a new device or platform doesn’t give you a blank slate—it just gives companies another data point to connect.

What to do:

  • Break the chain before it forms. Use burner emails, aliases, and virtual phone numbers.
  • Change multiple things at once. A new email won’t help if you keep the same phone number and credit card.
  • Be proactive, not reactive. Once a profile is built, you can’t undo it—so prevent unnecessary links before they happen.

2. You’re being tracked—even when you’re not using the internet

Most people assume tracking only happens when they’re browsing, posting, or shopping—but some of the most invasive tracking happens when you’re idle. Even when you think you’re being careful, your devices continue leaking data, and websites have ways to track you that go beyond cookies.

The hard truth:

  • Your phone constantly pings cell towers, creating a movement map of your location—even if you’re not using any apps.
  • Smart devices send data home at all hours, quietly updating manufacturers without your consent.
  • Websites fingerprint you the moment you visit, using unique device characteristics to track you, even if you clear cookies or use a VPN.
  • Your laptop and phone make hidden network requests, syncing background data you never approved.
  • Even privacy tools like incognito mode or VPNs don’t fully protect you. Websites use behavioral tracking to identify you based on how you type, scroll, or even the tilt of your phone.
  • Battery percentage, Bluetooth connections, and light sensor data can be used to re-identify you after switching networks.

What to do:

  • Use a privacy-focused browser like Mullvad Browser or Brave Browser.
  • Check how unique your device fingerprint is at coveryourtracks.eff.org.
  • Monitor hidden data leaks with a reverse firewall like Little Snitch (for Mac)—you’ll be shocked at how much data leaves your devices when you’re not using them.
  • Use a VPN like Mullvad to prevent network-level tracking, but don’t rely on it alone.
  • Break behavioral tracking patterns by changing your scrolling, typing, and browsing habits.

3. Your deleted data isn’t deleted—it’s just hidden from you

Deleting a file, message, or account doesn’t mean it’s gone.

The hard truth:

  • Most services just remove your access to data, not the data itself.
  • Even if you delete an email from Gmail, Google has already analyzed its contents and added what it learned to your profile.
  • Companies don’t just store data—they train AI models on it. Even if deletion were possible, what they’ve learned can’t be undone.

What to do:

  • Use services that don’t collect your data in the first place. Try ProtonMail instead of Gmail, or Brave Search instead of Google Search.
  • Assume that if a company has your data, it may never be deleted—so don’t hand it over in the first place.

4. The biggest privacy mistake: Thinking privacy isn’t important because “I have nothing to hide”

Privacy isn’t about hiding—it’s about control over your own data, your own life, and your own future.

The hard truth:

  • Data collectors don’t care who you are—they collect everything. If laws change, or you become notable, your past is already logged and available to be used against you.
  • “I have nothing to hide” becomes “I wish I had hidden that.” Your past purchases, social media comments, or medical data could one day be used against you.
  • Just because you don’t feel the urgency of privacy now doesn’t mean you shouldn’t be choosing privacy-focused products. Every choice you make funds a future—you’re either supporting companies that protect people or ones that normalize surveillance. Which future are you contributing to?
  • Anonymity only works if there’s a crowd. The more people use privacy tools, the safer we all become. Even if your own safety doesn’t feel like a concern right now, your choices help protect the most vulnerable members of society by strengthening the privacy ecosystem.

What to do:

  • Support privacy-friendly companies.
  • Normalize privacy tools in your circles. The more people use them, the less suspicious they seem.
  • Act now, not when it’s too late. Privacy matters before you need it.

5. You’re never just a customer—you’re a product

Free services don’t serve you—they serve the people who pay for your data.

The hard truth:

  • When I first signed up for Gmail, I thought I was getting a free email account. In reality, I was handing over my private conversations for them to scan, profile, and sell.
  • Even paid services can sell your data. Many “premium” apps still track and monetize your activity.
  • AI assistants and smart devices extract data from you. Be intentional about the data you give them, knowing they are mining your information.

What to do:

  • Ask: “Who profits from my data?”
  • Use privacy-respecting alternatives.
  • Think twice before using free AI assistants that explicitly collect your data, or speaking near smart devices.

Final thoughts: The future isn’t written yet

Knowing what I know now, I’d tell my younger self this: you are not powerless. The tools you use, the services you fund, and the choices you make shape the world we all live in.

Take your first step toward reclaiming your privacy today. Because every action counts, and the future isn’t written yet.

 

Yours in privacy,
Naomi

Naomi Brockwell is a privacy advocate and professional speaker, MC, interviewer, producer, and podcaster specialising in blockchain, cryptocurrency and economics. She runs the NBTV channel on Rumble.


AI company pays billions in damages to authors

Published 10 September 2025
– By Editorial Staff
The AI company has used pirated books to train its AI bot Claude.
1 minute read

AI company Anthropic is paying $1.5 billion to hundreds of thousands of authors in a copyright lawsuit. The settlement is the first and largest of its kind in the AI field.

Last year, authors Andrea Bartz, Charles Graeber and Kirk Wallace Johnson filed a lawsuit against Anthropic for using pirated books to train its AI chatbot Claude.

In June, a federal judge ruled that it was not illegal to train AI chatbots on copyrighted books, but that Anthropic had wrongfully obtained millions of books via pirate sites.

Now Anthropic has agreed to pay approximately $3,000 for each of the estimated 500,000 books covered. In total, this amounts to $1.5 billion.

First of its kind

The settlement is the first in a series of legal proceedings ongoing against AI companies regarding the use of copyrighted material for AI training. Among others, George R.R. Martin together with 16 other authors has sued OpenAI for copyright infringement.

“As best as we can tell, it’s the largest copyright recovery ever,” says Justin Nelson, a lawyer for the authors, according to The Guardian. “It’s the first of its kind in the AI era.”

If Anthropic had not agreed to the settlement, experts say it could have cost significantly more.

“We were looking at a strong possibility of multiple billions of dollars, enough to potentially cripple or even put Anthropic out of business,” says William Long, a legal analyst at Wolters Kluwer.

Spyware takes photos of porn users for blackmail

Published 9 September 2025
– By Editorial Staff
Strangely enough, Stealerium is distributed as free open source code on GitHub.
2 minute read

Security company Proofpoint has discovered malicious software that automatically photographs users through their webcams when they visit pornographic sites. The images are then used for extortion purposes.

The new spyware Stealerium has a particularly disturbing function: it monitors the victim’s browser for pornography-related search terms like “sex” and “porn”, while simultaneously taking screenshots and webcam photos of the user, sending everything to the hacker.

Security company Proofpoint discovered the software in tens of thousands of email messages sent since May this year. Victims were tricked into downloading the program through fake invoices and payment demands, primarily targeting companies in hospitality, education and finance.

— When it comes to infostealers, they typically are looking for whatever they can grab, says Selena Larson, a researcher at Proofpoint, to Wired.

— This adds another layer of privacy invasion and sensitive information that you definitely wouldn’t want in the hands of a particular hacker. It’s gross. I hate it, she adds.

Available openly on GitHub

In addition to the automated sextortion function, Stealerium also steals traditional data such as banking information, passwords and cryptocurrency wallet keys. All information is sent to the hacker via services like Telegram, Discord or email.

Strangely, Stealerium is distributed as free open source code on GitHub. The developer, who calls himself witchfindertr and claims to be a “malware analyst” in London, maintains that the program is “for educational purposes only”.

— How you use this program is your responsibility. I will not be held accountable for any illegal activities. Nor do i give a shit how u use it, the developer writes on the page.

Kyle Cucci, also a researcher at Proofpoint, calls automated webcam images of users browsing porn “pretty much unheard of”. The only similar case was an attack against French-speaking users in 2019.

New trend among cybercriminals

According to Larson, the new type of attacks may be part of a larger trend where smaller hacker groups are turning away from large-scale ransomware attacks that attract authorities’ attention.

— For a hacker, it’s not like you’re taking down a multimillion-dollar company that is going to make waves and have a lot of follow-on impacts. They’re trying to monetize people one at a time. And maybe people who might be ashamed about reporting something like this, Larson explains.

Proofpoint has not identified specific victims of the sextortion function, but believes that the function’s existence suggests it has likely already been used.

New robot takes on household chores

The future of AI

Published 7 September 2025
– By Editorial Staff
1 minute read

The AI robot Helix can wash dishes, fold laundry and collaborate with other robots. It is the first robot of its kind able to control its entire upper body.

The American robotics company Figure AI’s new humanoid robot has visual perception, language understanding and full control over fingers, wrists, torso and head. This enables the robot to pick up small objects and thereby help with household tasks.

Helix is powered by a so-called dual-system architecture: a “two-brain” design in which one part interprets language and vision while the other controls movements quickly and precisely.

Among other things, the company demonstrates that the robot can load dishes into the dishwasher, fold laundry and sort groceries. The robot can also sort and weigh packages at postal facilities.

It can also handle thousands of new objects in cluttered environments, without prior demonstrations or custom programming. This means it can perform tasks it is not programmed for and is designed to solve problems independently in an unpredictable environment.

It can follow voice commands much as if you were talking with a human, and act accordingly. What also makes the robot special is that it can collaborate with other robots. In tests, for example, two Helix robots have successfully worked together to unpack groceries.

Stop feeding Apple your data

Homebrew is the app store that doesn’t spy on you.

Published 6 September 2025
– By Naomi Brockwell
5 minute read

If you’re on a Mac, chances are you download apps from Apple’s App Store. Add your Apple ID, and everything is neatly in one place, updated with the click of a button.

But convenience comes at a price. Linking an Apple ID to your computer ties all your activity together and makes profiling you effortless.

In past articles, we’ve shown how much data Apple collects, and explained that Linux is the gold standard for privacy. But if you’re not ready to switch, there are still steps you can take right now to make your Mac more private.

This article focuses on Apple IDs, the App Store, and a powerful alternative called Homebrew. It’s a package manager that gives you the convenience of centralized updates without the surveillance.

Apple ID and the App Store

It may seem impossible to avoid Apple IDs and the App Store. On an iPhone, you’re locked in: You need to add an Apple ID and use the App Store to download any apps. (The EU recently forced Apple to allow sideloading, but that doesn’t apply everywhere.)

On a Mac, things are different. You don’t need the App Store at all. You can download software directly from each developer’s website, which means you never need to attach an Apple ID to your computer. And that’s one of the best privacy moves you can make.

Unfortunately, Apple makes it a little tricky to opt out.

When you buy a new Mac, the store will push you to hand over an Apple ID at checkout. You should tell them you don’t have one.

Then when you first set up your computer, it will prompt you to add an Apple ID, and it’s not immediately clear how to skip this step. The “Continue” button is grayed out unless you fill in your ID. What’s easy to miss is the “Set Up Later” option in the bottom left corner. Click that.

But Apple still puts up roadblocks. Gatekeeper, the macOS security feature that controls which apps are allowed to run on your Mac, by default only allows apps from the App Store or from developers Apple has verified. If you want to allow downloads from elsewhere, you first have to relax Gatekeeper’s strict enforcement from the command line, and then go back into your settings and select the option to allow apps from “Anywhere”.
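
For reference, relaxing Gatekeeper from Terminal looks roughly like this. Treat it as a sketch: the exact behaviour varies between macOS versions, and on recent releases Apple has made the “Anywhere” option harder to re-enable.

    # Relax Gatekeeper so "Anywhere" appears under
    # System Settings > Privacy & Security (asks for an admin password)
    sudo spctl --master-disable

    # Check Gatekeeper's current status afterwards
    spctl --status

Only do this if you understand the trade-off: you, rather than Apple, become responsible for checking that what you install is trustworthy.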

Apple really wants every download to run through them. That way, they can log every install, every update, and build a permanent profile of your habits and interests.

App Store: Convenient, But Costly

Of course, there are perks. The App Store makes managing your apps painless. You can update everything with a single click. If you’ve downloaded apps from a dozen different sites, updating becomes a chore. Each app has to be opened and checked manually.

You can always enable auto-updates, but that means your apps constantly ping servers in the background. For many people, that’s a privacy trade-off not worth making.

In fact, I use the firewall software Little Snitch to block my apps from unnecessarily talking to the internet, which makes auto updates even harder. I have to disable the firewall, check every app one by one, and then remember to re-enable the firewall afterwards. It’s easy to slip up, and Apple knows most people won’t bother with this manual process.

Enter Homebrew

This is where Homebrew comes in handy, by providing the convenience of the App Store without all the tracking.

Homebrew is a package manager for macOS and Linux. A package manager is similar to an app store in many ways. Think of it like a hub: a single place to find, install, and update apps quickly and reliably.

Homebrew is well known and open source, but it looks different from the stores you’re used to. There’s no visual storefront or GUI: instead, you use the command line in Terminal. You can’t buy anything in it; the software is free. Some apps have paid upgrade features, but Homebrew itself has no ability to collect payments. You don’t need any account to access it, there’s no hidden tracking, and no ads.
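
To give you a feel for it, here is roughly what getting started looks like. The install one-liner below is the one published on brew.sh, and the cask name is just an example; use brew search to find the apps you actually want.

    # Install Homebrew (official one-liner from https://brew.sh)
    /bin/bash -c "$(curl -fsSL https://raw.githubusercontent.com/Homebrew/install/HEAD/install.sh)"

    # Find and install a regular GUI app (a "cask")
    brew search brave
    brew install --cask brave-browser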

Benefits

There are three main benefits, in my opinion, to using Homebrew over downloading apps directly from random websites.

Convenience

Instead of bouncing between dozens of sites to find, install, and update apps, Homebrew gives you simple commands that do it all in one place. One system, and you can use it without handing over telemetry about everything you’re doing.
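
In practice, keeping everything current comes down to a couple of commands, roughly:

    brew update      # refresh Homebrew's list of available versions
    brew outdated    # see which of your installed apps have updates
    brew upgrade     # upgrade everything in one go
    brew list        # everything you've installed through Homebrew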

Safety

Homebrew can help decrease the risk of installing fake or malicious software. When you download apps manually, there’s always the chance that you spelled the URL wrong or landed on a fake site through a phishing link. Homebrew pulls apps from official, verified sources, and it automatically checks the integrity of every file. Open source projects generally publish a checksum on their website: a hash of the exact file they released. The checksum of what you downloaded should be identical to the one the developer provided, and Homebrew verifies that they match, so you’re not getting a tampered or unsafe version. Its code is watched over by a large community, and every change is logged publicly. No system is 100% safe, but Homebrew is highly reputable and widely trusted in the open source community.
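
If you’ve never done a manual checksum comparison, this is the chore Homebrew automates. A rough sketch on macOS, with a placeholder file name:

    # Compute the SHA-256 hash of a file you downloaded yourself
    shasum -a 256 ~/Downloads/SomeApp.dmg

    # Compare the output to the checksum published on the developer's
    # website; if they differ, the file isn't what the developer shipped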

Gateway to Linux

Homebrew is also compatible with Linux. If you ever decide to switch operating systems, Homebrew is a great way to make the transition easier, and get comfortable with tools you’ll probably use on Linux too.

Tutorial coming soon

Friday week we’ll release a full video tutorial on how to install and use Homebrew, so keep an eye out.

For now, the takeaway is simple: Homebrew gives you the convenience of centralized updates without the privacy trade-offs. You get easy installs, built-in safety checks, and you never have to tie your Mac to an Apple ID.

If you want the benefits of an app store without the profiling that comes with it, Homebrew is the smarter choice.

 

Yours in Privacy,
Naomi

 

Naomi Brockwell is a privacy advocate and professional speaker, MC, interviewer, producer, and podcaster specialising in blockchain, cryptocurrency and economics. She runs the NBTV channel on Rumble.
