Polaris of Enlightenment

What I wish I knew about privacy sooner

The hard truths no one warned me about.

Published 22 March 2025
– By Naomi Brockwell

I’ve been deep in the privacy world for years, but I wasn’t always this way. If I could go back, I’d grab my younger self by the shoulders and say: “Wake up. The internet is a battlefield of people fighting for your attention, and many of them definitely don’t have your best interests at heart”.

I used to think I was making my own decisions—choosing what platforms to try, what videos to watch, what to believe. I didn’t realize I was part of a system designed to shape my behavior. Some just wanted to sell me things I didn’t need—or even things that actively harmed me. But more importantly, some were paying to influence my thoughts, my votes, and even who I saw as the enemy.

There is a lot at stake when we lose the ability to make choices free from manipulation. When our digital exhaust—every click, every pause, every hesitation—is mined and fed into psychological experiments designed to drive behavior, our ability to think independently is undermined.

No one warned me about this. But it’s not too late—not for you. Here are the lessons I wish I had learned sooner—and the steps you can take now, before you wish you had.

1. Privacy mistakes compound over time—like a credit score, but worse

Your digital history doesn’t reset—once data is out there, it’s nearly impossible to erase.

The hard truth:

  • Companies connect everything—your new email, phone number, or payment method can be linked back to your old identity through data brokers, loyalty programs, and behavioral analysis.
  • Switching to a new device or platform doesn’t give you a blank slate—it just gives companies another data point to connect.

What to do:

  • Break the chain before it forms. Use burner emails, aliases, and virtual phone numbers.
  • Change multiple things at once. A new email won’t help if you keep the same phone number and credit card.
  • Be proactive, not reactive. Once a profile is built, you can’t undo it—so prevent unnecessary links before they happen.

2. You’re being tracked—even when you’re not using the internet

Most people assume tracking only happens when they’re browsing, posting, or shopping—but some of the most invasive tracking happens when you’re idle. Even when you think you’re being careful, your devices continue leaking data, and websites have ways to track you that go beyond cookies.

The hard truth:

  • Your phone constantly pings cell towers, creating a movement map of your location—even if you’re not using any apps.
  • Smart devices send data home at all hours, quietly updating manufacturers without your consent.
  • Websites fingerprint you the moment you visit, using unique device characteristics to track you, even if you clear cookies or use a VPN.
  • Your laptop and phone make hidden network requests, syncing background data you never approved.
  • Even privacy tools like incognito mode or VPNs don’t fully protect you. Websites use behavioral tracking to identify you based on how you type, scroll, or even the tilt of your phone.
  • Battery percentage, Bluetooth connections, and light sensor data can be used to re-identify you after switching networks.

What to do:

  • Use a privacy-focused browser like Mullvad Browser or Brave Browser.
  • Check how unique your device fingerprint is at coveryourtracks.eff.org.
  • Monitor hidden data leaks with a reverse firewall like Little Snitch (for Mac)—you’ll be shocked at how much data leaves your devices when you’re not using them.
  • Use a VPN like Mullvad to prevent network-level tracking, but don’t rely on it alone.
  • Break behavioral tracking patterns by changing your scrolling, typing, and browsing habits.
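To see why clearing cookies doesn't defeat fingerprinting, here is a minimal sketch of the technique. The attribute names and values below are invented for illustration, not a real browser API, but the principle is what sites like coveryourtracks.eff.org measure: many individually harmless traits, hashed together, form a near-unique identifier.

```python
import hashlib

# Sketch of browser fingerprinting (illustrative attribute values, not a real
# API): individually harmless traits combine into a near-unique identifier
# that survives cookie deletion and VPN switches.
attributes = {
    "user_agent": "Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7)",
    "screen": "2560x1440x24",
    "timezone": "Europe/Stockholm",
    "language": "en-US",
    "fonts": "Arial,Georgia,Helvetica,Verdana",
    "canvas_hash": "9f2d1c",   # hypothetical canvas-rendering checksum
}

fingerprint = hashlib.sha256(
    "|".join(f"{k}={v}" for k, v in sorted(attributes.items())).encode()
).hexdigest()

print(fingerprint[:16])  # same device, same hash; no cookies required
```

Change any single attribute (switch your timezone, install a font) and the hash changes completely, which is why fingerprinting defenses work by making your browser look like everyone else's rather than unique.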

3. Your deleted data isn’t deleted—it’s just hidden from you

Deleting a file, message, or account doesn’t mean it’s gone.

The hard truth:

  • Most services just remove your access to data, not the data itself.
  • Even if you delete an email from Gmail, Google has already analyzed its contents and added what it learned to your profile.
  • Companies don’t just store data—they train AI models on it. Even if deletion were possible, what they’ve learned can’t be undone.

What to do:

  • Use services that don’t collect your data in the first place. Try ProtonMail instead of Gmail, or Brave instead of Google Search.
  • Assume that if a company has your data, it may never be deleted—so don’t hand it over in the first place.

4. The biggest privacy mistake: Thinking privacy isn’t important because “I have nothing to hide”

Privacy isn’t about hiding—it’s about control over your own data, your own life, and your own future.

The hard truth:

  • Data collectors don’t care who you are—they collect everything. If laws change, or you become notable, your past is already logged and available to be used against you.
  • “I have nothing to hide” becomes “I wish I had hidden that.” Your past purchases, social media comments, or medical data could one day be used against you.
  • Just because you don’t feel the urgency of privacy now doesn’t mean you shouldn’t be choosing privacy-focused products. Every choice you make funds a future—you’re either supporting companies that protect people or ones that normalize surveillance. Which future are you contributing to?
  • Anonymity only works if there’s a crowd. The more people use privacy tools, the safer we all become. Even if your own safety doesn’t feel like a concern right now, your choices help protect the most vulnerable members of society by strengthening the privacy ecosystem.

What to do:

  • Support privacy-friendly companies.
  • Normalize privacy tools in your circles. The more people use them, the less suspicious they seem.
  • Act now, not when it’s too late. Privacy matters before you need it.

5. You’re never just a customer—you’re a product

Free services don’t serve you—they serve the people who pay for your data.

The hard truth:

  • When I first signed up for Gmail, I thought I was getting a free email account. In reality, I was handing over my private conversations for them to scan, profile, and sell.
  • Even paid services can sell your data. Many “premium” apps still track and monetize your activity.
  • AI assistants and smart devices extract data from you. Be intentional about the data you give them, knowing they are mining your information.

What to do:

  • Ask: “Who profits from my data?”
  • Use privacy-respecting alternatives.
  • Think twice before using free AI assistants that explicitly collect your data, or speaking near smart devices.

Final thoughts: The future isn’t written yet

Knowing what I know now, I’d tell my younger self this: you are not powerless. The tools you use, the services you fund, and the choices you make shape the world we all live in.

Take your first step toward reclaiming your privacy today. Because every action counts, and the future isn’t written yet.

 

Yours in privacy,
Naomi

Naomi Brockwell is a privacy advocate and professional speaker, MC, interviewer, producer and podcaster, specialising in blockchain, cryptocurrency and economics. She runs the NBTV channel on YouTube.


Exposing the lies that keep you trapped in surveillance culture

Debunking the biggest myths about data collection

Published 19 April 2025 at 7:39
– By Naomi Brockwell

Let’s be honest: data is useful. But we’re constantly told that in order to benefit from modern tech—and the insights that come with it—we have to give up our privacy. That useful data only comes from total access. That once your info is out there, you’ve lost control. That there’s no point in trying to protect it anymore.

These are myths. And they’re holding us back.

The truth is, you can benefit from data-driven tools without giving away everything. You can choose which companies to trust. You can protect one piece of information while sharing another. You can demand smarter systems that deliver insights without exploiting your identity.

Privacy isn’t about opting out of technology—it’s about choosing how you engage with it.

In this issue, we’re busting four of the most common myths about data collection. Because once you understand what’s possible, you’ll see how much power you still have.

Myth #1: “I gave data to one company, so my privacy is already gone”.

This one is everywhere. Once people sign up for a social media account or share info with a fitness app, they often throw up their hands and say, “Well, I guess my privacy’s already gone”.

But that’s not how privacy works.

Privacy is about choice. It’s about context. It’s about setting boundaries that make sense for you.

Just because you’ve shared data with one company doesn’t mean you’re giving blanket permission to every app, government agency, or ad network to track you forever.

You’re allowed to:

  • Share one piece of information and protect another.
  • Say yes to one service and no to others.
  • Change your mind, rotate your identifiers, and reduce future exposure.

Privacy isn’t all or nothing. And it’s never too late to take some power back.

Myth #2: “If I give a company data, they can do whatever they want with it”.

Not if you pick the right company.

Many businesses are committed to ethical data practices. Some explicitly state in their terms that they’ll never share your data, sell it, or use it outside the scope of the service you signed up for.

Look for platforms that don’t retain unnecessary data. There are more of them out there than you think.

Myth #3: “To get insights, a company needs to see my data”.

This one’s finally starting to crumble—thanks to game-changing tech like homomorphic encryption.

Yes, really: companies can now perform computations on encrypted data without ever decrypting it.

It’s already in use in financial services, research, and increasingly, consumer apps. It proves that privacy and data analysis can go hand in hand.

Imagine this: a health app computes your sleep averages, detects issues, and offers recommendations—without ever seeing your raw data. It stays encrypted the whole time.
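The homomorphic property is easiest to see in a toy example. The sketch below uses the Paillier cryptosystem, which is additively homomorphic: multiplying two ciphertexts yields an encryption of the sum of their plaintexts, so a server can total your numbers without ever seeing them. The sleep figures and the small demo primes are my own illustration; real deployments use 2048-bit keys and audited libraries, not hand-rolled code.

```python
import math
import random

# Toy Paillier cryptosystem (additively homomorphic): multiplying two
# ciphertexts gives an encryption of the sum of their plaintexts.
# Demo-sized primes for readability; never use these parameters for real data.
p, q = 1000003, 1000033
n, n2 = p * q, (p * q) ** 2
lam = math.lcm(p - 1, q - 1)
g = n + 1
mu = pow(lam, -1, n)                     # modular inverse of lambda mod n

def encrypt(m: int) -> int:
    r = random.randrange(1, n)           # random blinding factor, coprime to n
    while math.gcd(r, n) != 1:
        r = random.randrange(1, n)
    return (pow(g, m, n2) * pow(r, n, n2)) % n2

def decrypt(c: int) -> int:
    return (pow(c, lam, n2) - 1) // n * mu % n

# The "health app" sums encrypted nightly sleep minutes without decrypting them.
nights = [420, 390, 450, 465]            # hypothetical sleep data, in minutes
encrypted_sum = math.prod(encrypt(m) for m in nights) % n2
print(decrypt(encrypted_sum))            # prints 1725, i.e. sum(nights)
```

Only the key holder (you) can run `decrypt`; the server that multiplied the ciphertexts learns nothing about the individual nights or the total.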

We need to champion this kind of innovation. More research. More tools. More adoption. And more support for companies already doing it—because our patronage signals that the investment was worth making, and encourages other companies to jump on board.

Myth #4: “To prove who you are, you have to hand over sensitive data.”

You’ve heard this from banks, employers, and government forms: “We need your full ID to verify who you are”.

But here’s the problem: every time we hand over sensitive data, we increase our exposure to breaches and identity theft. It’s a bad system.

There’s a better way.
With zero-knowledge proofs, we can prove things like being over 18, or matching a record—without revealing our address, birthdate, or ID number.

The tech already exists. But companies and institutions are slow to adopt it or even recognize it as legitimate. This won’t change until we demand better.
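To make this concrete, here is a toy sketch of one classic zero-knowledge building block: a Schnorr proof of knowledge, made non-interactive with the Fiat–Shamir heuristic. The prover convinces a verifier that they know a secret x behind a public value y, without revealing x itself. The tiny group parameters are for illustration only; real systems use standardized elliptic-curve groups and audited libraries.

```python
import hashlib
import secrets

# Toy Schnorr proof of knowledge (non-interactive via Fiat-Shamir).
# The prover shows they know x with y = g^x mod p, without revealing x.
# Tiny demo group; real deployments use standardized elliptic-curve groups.
p, q, g = 2039, 1019, 4        # p = 2q + 1; g generates the subgroup of order q

x = secrets.randbelow(q)       # prover's secret (e.g. a credential key)
y = pow(g, x, p)               # public value the verifier already knows

# Prover: commit to a random nonce, derive the challenge by hashing public
# values, then respond with a value that blends the nonce and the secret.
k = secrets.randbelow(q)
t = pow(g, k, p)
c = int.from_bytes(hashlib.sha256(f"{g},{y},{t}".encode()).digest(), "big") % q
s = (k + c * x) % q

# Verifier: recompute the challenge from public values and check the equation
# g^s == t * y^c (mod p). The pair (t, s) reveals nothing about x.
c_check = int.from_bytes(hashlib.sha256(f"{g},{y},{t}".encode()).digest(), "big") % q
assert pow(g, s, p) == (t * pow(y, c_check, p)) % p
print("proof verified; x was never revealed")
```

Age-verification schemes build on the same idea: a credential issuer signs your attributes once, and you later prove a statement like "over 18" against that signature without disclosing the underlying document.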

Let’s push for a world where:

  • Our identity isn’t a honeypot for hackers.
  • We can verify ourselves without becoming vulnerable.
  • Privacy-first systems are the norm—not the exception.

Takeaways

The idea that we have to trade privacy for progress is a myth. You can have both. The tools exist. The choice is ours.

Privacy isn’t about hiding—it’s about control. You can choose to share specific data without giving up your rights or exposing everything.

Keep these in mind:

  • Pick tools that respect you. Look for platforms with strong privacy practices and transparent terms.
  • Use privacy-preserving tech. Homomorphic encryption and zero-knowledge proofs are real—and growing.
  • Don’t give up just because you shared once. Privacy is a spectrum. You can always take back control.
  • Talk about it. The more people realize they have options, the faster we change the norm.

Being informed doesn’t have to mean being exploited.
Let’s demand better.

 

Yours in privacy,
Naomi


NATO implements AI system for military operations

The future of AI

Published 17 April 2025
– By Editorial Staff
Modern warfare increasingly resembles what only a few years ago was science fiction.

The military pact NATO has entered into an agreement with the American tech company Palantir to introduce the AI-powered system Maven Smart System (MSS) in its military operations.

The Nordic Times has previously highlighted Palantir’s founder Peter Thiel and his influence over the circle around Trump, and how the company’s AI technology has been used to develop drones that can identify Russians and automate killing.

NATO announced on April 14 that it has signed a contract with Palantir Technologies to implement the Maven Smart System (MSS NATO), within the framework of Allied Command Operations, reports DefenceScoop.

MSS NATO uses generative AI and machine learning to quickly process information, and the system is designed to provide a sharper situational awareness by analyzing large amounts of data in real time.

This ranges from satellite imagery to intelligence reports, which are then used to identify targets and plan operations.

Terminator
In the “Terminator” movies, the remnants of the Earth’s population fight against the AI-controlled Skynet weapon system.

Modernizing warfare

According to the NATO Communications and Information Agency (NCIA), the aim is to modernize warfare capabilities: what used to require hundreds of intelligence analysts can now, with the help of MSS, be handled by a small group of 20–50 soldiers.

Palantir has previously supplied similar technology to the US Army, Air Force and Space Force. In September 2024, the company also signed a $100 million contract with the US military to expand the use of AI in targeting.

The system is expected to be operational as early as mid-May 2025.

The new deal has also caused financial markets to react and Palantir’s stock has risen. The company has also generally seen strong growth in recent years, with revenues increasing by 50% between 2022 and 2024.

Criticism and concerns

Palantir has previously been criticized for its cooperation with the Israeli Defense Forces, which led a major Nordic investor to cancel its involvement in the company. Criticisms include the risk of AI technology being used in ways that could violate human rights, especially in conflict zones.

On social media, the news has provoked mixed reactions. Mario Nawfal, a well-known voice on platform X, wrote in a post that “NATO goes full Skynet”, referring to the fictional AI system in the Terminator movies, where technology takes control of the world.

Several critics express concerns about the technology’s implications, while others see it as a necessary step to counter modern threats.

NATO and Palantir stress that the technology does not replace human decision-making, emphasizing that the system is designed to support military leaders, not to act independently.

Nevertheless, there is a growing debate and concern about how AI’s role in warfare could affect future conflicts and global security. Some analysts also see the use of US technologies such as MSS as a way for NATO to strengthen ties across the Atlantic.

OpenAI may develop AI weapons for the Pentagon

The future of AI

Published 14 April 2025
– By Editorial Staff
Sam Altman's OpenAI is already working with defense technology company Anduril Industries.

OpenAI CEO Sam Altman does not rule out that he and his company will help the Pentagon develop new AI-based weapons systems in the future.

– I will never say never, because the world could get really weird, the tech billionaire cryptically states.

The statement came during Thursday’s Vanderbilt Summit on Modern Conflict and Emerging Threats, and Altman added that he does not believe he will be working on developing weapons systems for the US military “in the foreseeable future” – unless it is deemed the best of several bad options.

– I don’t think most of the world wants AI making weapons decisions, he continued.

The fact that companies developing consumer technology are also developing military weapons has long been highly controversial – and in 2018, for example, led to widespread protests within Google’s own workforce, with many also choosing to leave voluntarily or being forced out by company management.

Believes in “exceptionally smart” systems before year-end

However, the AI industry in particular has shown a much greater willingness to enter into such agreements, and OpenAI has revised its policy on work related to “national security” in the past year. Among other things, it has publicly announced a partnership with defense technology company Anduril Industries Inc to develop anti-drone technology.

Altman also stressed the need for the US government to increase its expertise in AI.

– I don’t think AI adoption in the government has been as robust as possible, he said, adding that there will be “exceptionally smart” AI systems in operation ready before the end of the year.

Altman and Paul Nakasone, a retired four-star general, attended the event ahead of the launch of OpenAI’s upcoming AI model, which is scheduled to be released next week. The audience included hundreds of representatives from intelligence agencies, the military and academia.

Don’t hit “Restore from backup” on your new device

A clean slate is better for privacy—and your peace of mind.

Published 12 April 2025
– By Naomi Brockwell

We all get a new computer or phone at some point. And when we do, there’s a screen that pops up: “Restore from backup?” One tap, and your whole digital life is right back where you left it. Easy, fast, familiar.

But what most people don’t realize is that restoring from backup doesn’t just bring back your apps. It reactivates years of old permissions, forgotten vulnerabilities, and tracking infrastructure that follows you from one device to the next.

It’s not a fresh start. It’s a rerun of your entire surveillance footprint.

Why you shouldn’t migrate everything

When you restore from a backup, you’re not just getting your apps and data—you’re reintroducing all your digital clutter. Here’s what comes with it:

  • App bloat
    Those one-off apps you installed become permanent squatters. Even if you’ve forgotten them, they could still be harvesting your activity in the background, sending your data out to third parties.
  • Attack surface
    Every piece of software has vulnerabilities, and the more apps you install, the higher your security risk. Rather than transferring everything over, use this moment to think carefully about which apps are truly worth the added exposure.
  • Accounts and tracking
    Your Apple ID, Google account, or other login credentials build up a massive behavioral profile on you. A new device can also help you sever those old data pipelines. By starting fresh with a new account, you make it far harder for data brokers to link your future activity to the massive profile built under your old ID.
  • Ghost data
    Resetting from scratch also clears out “ghost data”: old settings, hidden config files, and leftover profiles you might not even realize you’re lugging around. Restoring from a backup can drag in outdated privacy defaults or security practices that no longer make sense. Plus, even uninstalled apps can leave behind bits of data—like login tokens or lingering preferences. Starting fresh ensures you’re adopting the newest, most secure configurations and leaving all that digital baggage behind.
  • Habit traps
    Sometimes we keep using apps just because they auto-restore, not because they actually serve us. Starting fresh is like a mini “reset”—you can ditch old routines and make room for better, more privacy-focused tools. Maybe there’s an app you’ve been curious to try but never got around to because your usual go-tos were already at your fingertips. A clean slate finally gives you that push to explore new options and live intentionally.
  • Mental bandwidth
    Clutter weighs us down—physically, mentally, and digitally. Fewer icons, fewer updates, fewer random notifications equals more headspace for the apps and tasks that truly matter.

Instead of dragging all that over, why not start with a clean slate?

7 smart moves when starting fresh

Here are seven clear, privacy-focused steps to help you make the most of your fresh start when setting up a new device:

    1. Start with essentials
      Install just the critical apps you truly can’t live without. Leave everything else off until you discover a real need for it.
    2. Use a browser
      Skip invasive native apps where you can. A privacy-friendly browser often demands fewer permissions and leaks less of your data.
    3. Pack light
      Think carefully about what needs to be on your phone 24/7. Not every app has to follow you everywhere—some can stay on a secondary device.
    4. Try privacy-focused alternatives
      While adding new apps, consider switching to more secure, privacy-respecting services. It’s a perfect time to level up your toolkit.
    5. Set up new accounts
      If you’re able, create fresh IDs instead of reusing old ones clogged with data exhaust. This cuts the thread linking your activity to outdated profiles. There might be costs associated with purchasing some new apps again, so decide if this is the right choice for you.
    6. Check permissions
      Pay attention to each permission request—location, contacts, camera—and limit or deny wherever possible. Don’t dish out unnecessary access.
    7. Be selective with backups
      Only migrate the essentials. Export contacts separately, store photos in a secure cloud, and keep old voice memos on a local drive if needed. Bringing less forward keeps your new device clutter-free.

Takeaways

Reclaiming your privacy isn’t about being perfect. It’s about making intentional choices. One of the easiest but most impactful things you can do is say no to restoring from backup.

This single decision sets the tone for your entire digital footprint. It gives you a clean slate. And it lets you rebuild on your terms.

Start with a handful of tools. Skip the bloat. Be picky about what gets installed and who gets your data.

It’s not about inconvenience—it’s about control. The digital world is filled with people trying to make decisions for you. Starting fresh is a way to take that power back.

 

Yours in privacy,
Naomi

