
Lock down your Mac

No Apple ID, no tracking, no nonsense.

Published 17 May 2025, 8:16
– By Naomi Brockwell

Apple markets itself as a privacy-focused company. And compared to Google or Microsoft, it is. But let’s be clear: Apple is still collecting a lot of your data.

If you want the most private computer setup, your best option is to switch to Linux. Not everyone is ready to take that step though, and many might prefer to keep their existing computer instead.

If you want to keep your current device but make it more private, what are your options?

Windows is basically a privacy disaster. Privacy expert Michael Bazzell says in his book Extreme Privacy:

“I do not believe any modern Microsoft Windows system is capable of providing a secure or private environment for our daily computing needs. Windows is extremely vulnerable to malicious software and their telemetry of user actions is worse than Apple’s. I do not own a Windows computer and I encourage you to avoid them for any sensitive tasks”.

If you want to keep your Mac without handing over your digital life to Apple, there are ways to lock it down and make it more private.

In this article, I’ll walk you through how to set up a Mac for better privacy—from purchasing the computer to tweaking your system settings, installing tools, and blocking unwanted data flows.

We’ll be following the setup laid out by Michael Bazzell in Extreme Privacy, with some added tips from my own experience.

We also made a video tutorial that you can follow along with.

You don’t need to do everything. Each chapter is modular. But if you follow the full guide, you’ll end up with a Mac that doesn’t require an Apple ID, doesn’t leak constant data, and gives you control over your digital environment.

Buying your Mac

Choose a model that still gets security updates

Apple eventually drops support for older devices. A privacy-hardened system isn’t useful if it doesn’t receive security updates, so before buying, check that the model you’re considering is still supported.

Pay with cash in a physical store

If you buy a Mac with a credit card, the serial number is forever linked to your identity.
Cash keeps you anonymous. You might get strange looks, but it’s completely within your rights. Be polite. Be firm. They’ll grumble. That’s fine.

Fresh install of macOS

If it’s a refurbished Mac—or even brand new—it’s worth doing a clean install.

Update macOS

  • System Settings > General > Software Update
  • Install updates, reboot, and reach the welcome screen.

Erase all content

  • System Settings > General > Transfer or Reset > Erase All Content and Settings
  • Enter your password, confirm warnings
  • Your Mac will restart and erase itself

This restores factory defaults: user data and settings are gone, but the OS remains installed.

Optional: Wipe the disk completely (advanced)

If you want a truly clean install, you’ll need to manually erase the entire internal disk. Only do this if you’re comfortable in recovery mode.

Modern Macs split the system into two parts—a sealed system volume and a data volume—tied together with something called firmlinks. If you don’t erase both correctly, you can end up with phantom volumes that clog your disk and break things silently.

Steps:

  • Enter Recovery Mode:
    • Apple Silicon: Hold power > click “Options”
    • Intel: Hold Command + R on boot
  • Open Disk Utility
  • Click View > Show All Devices
  • Select the top-level physical disk (e.g., “Apple SSD”)
  • Click Erase
    • Name: Macintosh HD
    • Format: APFS
    • Scheme: GUID Partition Map

Warning: Skip “Show All Devices” or erase the wrong item and you could brick your Mac. Only do this if you understand what you’re doing.

Once erased, return to the recovery menu and choose Reinstall macOS.

First boot setup

macOS wants to immediately link your device to iCloud and Apple services. Stay offline as long as possible.

Setup tips:

  • Region: Choose your location
  • Accessibility: Skip
  • Wi-Fi: Click “Other Network Options” > “My computer does not connect to the internet”
  • Data & Privacy: Continue
  • Migration Assistant: Skip (we’re starting fresh!)
  • Apple ID: Choose “Set up later”
  • Terms: Agree
  • Computer Name: Use a generic name like Laptop or Computer
  • Password: Strong and memorable. No hint. Write it down somewhere safe.
  • Location Services: Off
  • Time Zone: Set manually
  • Analytics: Off
  • Screen Time: Skip
  • Siri: Skip
  • Touch ID: Optional
  • Display Mode: Your choice

Harden system settings

Wi-Fi & Bluetooth

  • System Settings > Wi-Fi: Turn off
    • Disable “Ask to join networks” and “Ask to join hotspots”
  • System Settings > Bluetooth: Turn off
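If you’re comfortable in the Terminal, the Wi-Fi toggle above can also be scripted. A minimal sketch, assuming your Wi-Fi interface is en0 (confirm the name first; there is no built-in command-line switch for Bluetooth, so use System Settings for that):

  # List hardware ports to find the Wi-Fi interface name (usually en0)
  networksetup -listallhardwareports

  # Turn Wi-Fi off for that interface
  networksetup -setairportpower en0 off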

Firewall (built-in)

  • System Settings > Network > Firewall: Turn on
    • Disable “Automatically allow built-in software…”
    • Disable “Automatically allow downloaded signed software…”
    • Enable Stealth Mode
    • Remove any pre-approved entries
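If you prefer to script this, macOS ships a command-line front end for the application firewall. A hedged sketch using socketfilterfw (flag names as on recent macOS versions; verify with the tool’s help output before relying on it):

  # Turn the application firewall on
  sudo /usr/libexec/ApplicationFirewall/socketfilterfw --setglobalstate on

  # Stop automatically allowing built-in and downloaded signed software
  sudo /usr/libexec/ApplicationFirewall/socketfilterfw --setallowsigned off
  sudo /usr/libexec/ApplicationFirewall/socketfilterfw --setallowsignedapp off

  # Enable stealth mode so the Mac ignores probes such as ping
  sudo /usr/libexec/ApplicationFirewall/socketfilterfw --setstealthmode on

  # Confirm the result
  sudo /usr/libexec/ApplicationFirewall/socketfilterfw --getglobalstate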

Notifications

  • System Settings > Notifications
    • Show Previews: Never
    • Turn off for Lock Screen, Sleep, and Mirroring
    • Manually disable for each app

Sound settings

  • System Settings > Sound
    • Alert Volume: Minimum
    • Disable sound effects and interface feedback

AirDrop & sharing

  • System Settings > General > AirDrop & Handoff: Turn everything off
  • System Settings > General > Sharing: Disable all toggles

Siri & Apple Intelligence

  • System Settings > Siri & Dictation: Disable all
  • Disable Apple Intelligence and per-app Siri access

Switch time server

Your Mac pings Apple to sync the time—leaking your IP every time it does.
Switch to a decentralized time server instead.

How:

  • System Settings > General > Date & Time
  • Click “Set…” > Enter password
  • Enter: pool.ntp.org
  • Click Done
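The same change can be made from the Terminal with Apple’s systemsetup utility (it needs administrator rights). A minimal sketch:

  # Keep automatic time sync on, but point it at the NTP pool instead of Apple's server
  sudo systemsetup -setusingnetworktime on
  sudo systemsetup -setnetworktimeserver pool.ntp.org

  # Verify the change
  sudo systemsetup -getnetworktimeserver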

Spotlight & Gatekeeper

Spotlight

  • System Settings > Spotlight: Turn off “Help Apple improve search”

Gatekeeper

Gatekeeper prevents you from opening non-Apple-approved apps and sends app data to Apple.

If you’re a confident user, disable it:

  • Open Terminal and run: sudo spctl --master-disable
  • System Settings > Privacy & Security: Allow apps from anywhere
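To verify the change from the Terminal, a quick sketch (Apple keeps tightening Gatekeeper, so on the newest macOS releases the disable command may be renamed or require an extra confirmation in System Settings; check the spctl man page for your version):

  # Check whether Gatekeeper assessments are enabled
  spctl --status

  # Disable Gatekeeper, then approve "Anywhere" under Privacy & Security
  sudo spctl --master-disable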

FileVault & lockdown mode

FileVault

Encrypt your entire disk:

  • System Settings > Privacy & Security > FileVault: Turn on
  • Choose “Create a recovery key and do not use iCloud”
  • Write down your recovery key. Store it OFF your computer.
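FileVault can also be enabled from the Terminal with fdesetup, which prints a personal recovery key rather than escrowing one to iCloud. A minimal sketch:

  # Check the current encryption status
  sudo fdesetup status

  # Enable FileVault; write down the recovery key it prints and store it off the machine
  sudo fdesetup enable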

Lockdown Mode (optional)

Restricts features like USB accessories, AirDrop, and others. Useful for high-risk users.

Customize appearance & Finder

Desktop & Dock

  • Disable “Show Suggested and Recent Apps”
  • Disable “Recent apps in Stage Manager”
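If you’d rather script it, the Dock’s recent-apps row has a well-known defaults key. A small sketch (restarting the Dock applies the change):

  # Hide suggested and recent apps in the Dock
  defaults write com.apple.dock show-recents -bool false
  killall Dock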

Wallpaper

Use a solid color instead of version-specific defaults to reduce your system’s fingerprint.

Lock screen

  • Screensaver: Never
  • Require password: Immediately
  • Sleep timer: Your preference (e.g. 1 hour)

Finder preferences

  • Show all file extensions
  • Hide Recents and Tags
  • Set default folder to Documents
  • View hidden files: Shift + Command + .
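Two of these Finder settings also have well-known defaults keys if you prefer the Terminal. A minimal sketch (restart Finder afterwards so the change takes effect):

  # Show all filename extensions system-wide
  defaults write NSGlobalDomain AppleShowAllExtensions -bool true

  # Keep hidden files visible in Finder (same effect as Shift + Command + .)
  defaults write com.apple.finder AppleShowAllFiles -bool true
  killall Finder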

Block outbound connections

macOS and many apps connect to servers without asking. You’ll want to monitor and block them.

Use Little Snitch (or the free, open-source LuLu) to monitor outbound connections and block the ones you don’t approve.

Browser

Install a privacy-respecting browser like Brave or Mullvad Browser.

Compare options at privacytests.org

VPN

Use trusted providers like Mullvad or ProtonVPN.

Be careful which VPN you download – many are scamware and data-collection tools. We cover how to choose a trustworthy one in a separate video.

Optional: Use Homebrew

Instead of the App Store, install software via Homebrew.

We’ll cover this more in a future guide.
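As a preview, installing Homebrew and a couple of apps looks roughly like this. The install command is the one published at brew.sh, and the cask names (brave-browser, lulu) were correct at the time of writing; confirm them with brew search:

  # Install Homebrew (official command from https://brew.sh)
  /bin/bash -c "$(curl -fsSL https://raw.githubusercontent.com/Homebrew/install/HEAD/install.sh)"

  # Install graphical apps without the App Store or an Apple ID
  brew install --cask brave-browser lulu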

Final takeaways

If you followed this guide, you now have:

  • A Mac with no Apple ID
  • No iCloud tether
  • Full disk encryption (FileVault)
  • A silent firewall
  • Blocked outbound connections
  • A private browser and VPN setup

You’ve taken serious steps to reclaim your digital autonomy. Well done.

In an upcoming guide, we’ll explore how to take the next step: switching to Linux.

Thanks again to Michael Bazzell for his work.

Find his book Extreme Privacy at: inteltechniques.com/book7.html

 

Yours in privacy,
Naomi

Naomi Brockwell is a privacy advocate and professional speaker, MC, interviewer, producer, and podcaster specialising in blockchain, cryptocurrency, and economics. She runs the NBTV channel on YouTube.

TNT is truly independent!

We don’t have a billionaire owner, and our unique reader-funded model keeps us free from political or corporate influence. This means we can fearlessly report the facts and shine a light on the misdeeds of those in power.

Consider a donation to keep our independent journalism running…

Specialist doctor warns: Social media is hacking our brains

The rise of technocracy

Published 17 May 2025, 12:08
– By Editorial Staff
Over the past 15 years, eating disorders have almost doubled – a rise largely attributed to morbid trends on social media.

Users are not the customers of social media giants – they are the product itself, where the most important thing is to capture our attention for as long as possible and at any cost.

This is the conclusion of psychiatrist Anders Hansen, who points to TikTok as an example of a platform that “creates information that our brains cannot look away from”.

On the Swedish public television channel SVT, he explains what happens to users’ brains when they use social media and how harmful much of the content actually is – especially for young users.

– We humans want to belong to a group at any cost. It’s pure survival. We constantly ask ourselves: ‘Am I good enough for the group, am I attractive enough, or smart enough, or thin enough?’ When we are exposed to this two to three hours a day, we perceive that we are not good enough, that we are not worthy.

– Our brains register this as a threat to our survival, which is why it makes us feel so bad. Some people then try to do something about it, such as starving themselves… These are deeply biological mechanisms within us that are being hacked by this extremely advanced and sophisticated technology.

“The companies don’t care”

Although most platforms formally prohibit targeting children and young people with weight-loss tips and similar thinness ideals, such content is still very common.

Although eating disorders are complex illnesses with many potential causes, Hansen says it cannot be ignored that they have almost doubled since 2010 across the Western world – and that this is likely due to the ideals promoted on social media.

– Companies don’t care if you develop a distorted self-image, they just want to squeeze every last second out of you. If you think about it, maybe you can awaken your inner rebel and not let companies take up your time, he explains.

Profit lost from restrictions

The psychiatrist also points out that all types of regulations and restrictions on algorithms and content mean that users will spend less time on social media – and that this is why social media companies systematically oppose such requirements.

– They have no interest whatsoever in trying to stop this.

Although TikTok is highlighted as the clearest example, there is now a long list of competitors that work in a similar way – including Instagram Reels (Meta), YouTube Shorts (Google), and Snapchat Spotlight.

Your therapist, your doctor, your insurance plan – now in Google’s ad system

Blue Shield exposed 4.7 million patients’ private health info to Google. Your most private information may now be fueling ads, pricing decisions, and phishing scams.

Published 10 May 2025
– By Naomi Brockwell

The healthcare sector is one of the biggest targets for cyberattacks—and it’s only getting worse.

Every breach spills sensitive information—names, medical histories, insurance details, even Social Security numbers. But this time, it wasn’t hackers breaking down the doors.

It was Blue Shield of California leaving the front gate wide open.

Between April 2021 and January 2024, Blue Shield exposed the health data of 4.7 million members by misconfiguring Google Analytics on its websites. That’s right—your protected health information was quietly piped to Google’s advertising systems.

Let’s break down what was shared:

  • Your insurance plan name and group number
  • Your ZIP code, gender, family size
  • Patient names, financial responsibility, and medical claim service dates
  • “Find a Doctor” searches—including provider names and types
  • Internal Blue Shield account identifiers

They didn’t just leak names. They leaked context. The kind of data that paints a detailed picture of your life.

And what’s worse—most people have become so numb to these data breaches that the most common response is “Why should I care?”

Let’s break it down.

1. Health data is deeply personal

This isn’t just a password or an email leak. This is your health. Your body. Your medical history. Maybe your therapist. Maybe a cancer screening. Maybe reproductive care.

This is the kind of stuff people don’t even tell their closest friends. Now imagine it flowing into a global ad system run by one of the biggest surveillance companies on earth.

Once shared, you don’t get to reel it back in. That vulnerability sticks.

2. Your family’s privacy is at risk—even if it was your data

Health information doesn’t exist in a vacuum. A diagnosis on your record might reveal a hereditary condition your children could carry. A test result might imply something about your partner. An STD might not just be your business.

This breach isn’t just about people directly listed on your health plan—it’s about your entire household being exposed by association. When sensitive medical data is shared without consent, it compromises more than your own privacy. It compromises your family’s.

3. Your insurance rates could be affected—without your knowledge

Health insurers already buy data from brokers to assess risk profiles. They don’t need your full medical chart to make decisions—they just need signals: a recent claim, a high-cost provider, a chronic condition inferred from your search history or purchases.

Leaks like this feed that ecosystem.

Even if the data is incomplete or inaccurate, it can still be used to justify higher premiums—or deny you coverage entirely. And good luck challenging that decision. The burden of proof rarely falls on the companies profiling you. It falls on you.

4. Leaked health data fuels exploitative advertising

When companies know which providers you’ve visited, which symptoms you searched, or what procedures you recently underwent, it gives advertisers a disturbingly precise psychological profile.

This kind of data isn’t used to help you—it’s used to sell to you.
You might start seeing ads for drugs, miracle cures, or dubious treatments. You may be targeted with fear-based campaigns designed to exploit your pain, anxiety, or uncertainty. And it can all feel eerily personal—because it is.

This is surveillance operating in a very predatory form. In recent years, the FTC has cracked down on companies like BetterHelp and GoodRx for leaking health data to Facebook and Google to power advertising algorithms.

This breach could be yet another entry in the growing pattern of companies exploiting your data to target you.

5. It’s a goldmine for hackers running spear phishing campaigns

Hackers don’t need much to trick you into clicking a malicious link. But when they know:

  • Your doctor’s name
  • The date you received care
  • How much you owed
  • Your exact insurance plan and member ID

…it becomes trivially easy to impersonate your provider or insurance company.

You get a message that looks official. It references a real event in your life. You click. You log in. You enter your bank info.
And your accounts are drained before you even realize what happened.

6. You can’t predict how this data will be used—and that’s the problem

We tend to underestimate the power of data until it’s too late. It feels abstract. It doesn’t hurt.

But data accumulates. It’s cross-referenced. Sold. Repackaged. Used in ways you’ll never be told—until you’re denied a loan, nudged during an election, or flagged as a potential problem.

The point isn’t to predict every worst-case scenario. It’s that you shouldn’t have to. You should have the right to withhold your data in the first place.

Takeaways

The threat isn’t always a hacker in a hoodie. Sometimes it’s a quiet decision in a California boardroom that compromises millions of people at once.

We don’t get to choose when our data becomes dangerous. That choice is often made for us—by corporations we didn’t elect, using systems we can’t inspect, in a market that treats our lives as inventory.

But here’s what we can do:

  • Choose tools that don’t monetize your data. Every privacy-respecting service you use sends a signal.
  • Push for legislation that treats data like what it is—power. Demand the right to say no.
  • Educate others. Most people still don’t realize how broken the system is. Be the reason someone starts paying attention.
  • Support organizations building a different future. Privacy won’t win by accident. It takes all of us.

Control over your data is control over your future—and while that control is slipping, we’re not powerless.

We can’t keep waiting for the next breach to “wake people up.” Let this be the one that shifts the tide.

Privacy isn’t about secrecy. It’s about consent. And you never consented to this.

So yes, you should care. Because when your health data is treated as a business asset instead of a human right, no one is safe—unless we fight back.

 

Yours in privacy,
Naomi

Naomi Brockwell is a privacy advocate and professional speaker, MC, interviewer, producer, and podcaster specialising in blockchain, cryptocurrency, and economics. She runs the NBTV channel on YouTube.

PoX: New memory chip from China sets speed record

Published 7 May 2025
– By Editorial Staff

A research team at Fudan University in China has developed the fastest semiconductor memory reported to date. The new memory, called PoX, is a type of non-volatile flash memory that can write a single bit in just 400 picoseconds – one write every 400 × 10⁻¹² seconds, equivalent to about 2.5 billion operations per second.

The results were recently published in the scientific journal Nature. Unlike traditional RAM (such as SRAM and DRAM), which is fast but loses its data when power is cut, non-volatile memory such as flash retains stored information without power. The problem has been that these memories are significantly slower – often thousands of times slower – which is a bottleneck for today’s AI systems that handle huge amounts of data in real time.

The research team, led by Professor Zhou Peng, achieved the breakthrough by replacing silicon channels with two-dimensional Dirac graphene – a material that allows extremely fast charge transfer. By fine-tuning the so-called “Gaussian length” of the channel, the researchers were able to create a phenomenon they call two-dimensional superinjection, which allows effectively unlimited charge transfer to the memory storage.

– Using AI-driven process optimization, we drove non-volatile memory to its theoretical limit. This paves the way for future high-speed flash memory, Zhou told the Chinese news agency Xinhua.

“Opens up new applications”

Co-author Liu Chunsen compares the difference to going from a USB flash drive that can do 1,000 writes per second to a chip that does a billion – in the same amount of time.

The technology combines low power consumption with extreme speed and could be particularly valuable for AI in battery-powered devices and systems with limited power supplies. If PoX can be mass-produced, it could reduce the need for separate caches, cut energy use and enable instant start-up of computers and mobiles.

Fudan engineers are now working on scaling up the technology and developing prototypes. No commercial partnerships have yet been announced.

– Our breakthrough can reshape storage technology, drive industrial upgrades and open new application scenarios, Zhou asserts.

Without consent

How parents unknowingly build surveillance files on their children.

Published 3 May 2025
– By Naomi Brockwell

Your child’s first digital footprint isn’t made by them—it’s made by you

What does the future look like for your child?

Before they can even talk, many kids already have a bigger digital footprint than their parents did at 25.

Every ultrasound shared on Facebook.
Every birthday party uploaded to Instagram.
Every proud tweet about a funny thing they said.

Each post seems harmless—until you zoom out and realize you’re building a permanent, searchable, biometric dossier on your child, curated by you.

This isn’t fearmongering. It’s the reality of a world where data is forever.
And it’s not just your friends and family who are watching.

Your kid is being profiled before they hit puberty

Here’s the uncomfortable truth:

When you upload baby photos, you’re training facial recognition databases on their face—at every age and stage.

When you post about their interests, health conditions, or behavior, you’re populating detailed profiles that can predict who they might become.

These profiles don’t just sit idle.
They’re analyzed, bought, and sold.

By the time your child applies for a job or stands up for something they believe in, they may already be carrying a hidden score assigned by an algorithm—built on data you posted.

When their childhood data comes back to haunt them

Imagine your child years from now, applying for a travel visa, a job, or just trying to board a flight.

A background check pulls information from facial recognition databases and AI-generated behavior profiles—flagging them for additional scrutiny based on “historic online associations”.

They’re pulled aside. Interrogated. Denied entry. Or worse, flagged permanently.

Imagine a future law that flags people based on past “digital risk indicators”—and your child’s online record becomes a barrier to accessing housing, education, or financial services.

Insurance companies can use their profile to label them a risky customer.

Recruiters might quietly filter them out based on years-old digital behavior.

Not because they did something wrong—but because of something you once shared.

Data doesn’t disappear.
Governments change. Laws evolve.
But surveillance infrastructure rarely gets rolled back.

And once your child’s data is out there, it’s out there forever.
Feeding systems you’ll never see.
Controlled by entities you’ll never meet.

For purposes you’ll never fully understand.

The rise of biometric surveillance—and why it targets kids first

Take Discord’s new AI selfie-based age verification. To prove they’re 13+, children are encouraged to submit selfies—feeding sensitive biometric data into AI systems.

You can change your password. You can’t change your face.

And yet, we’re normalizing the idea that kids should hand over their most immutable identifiers just to participate online.

Some schools already collect facial scans for attendance. Some toys use voice assistants that record everything your child says.

Some apps marketed as “parental control” tools grant third-party employees backend access to your child’s texts, locations—even live audio.

Ask yourself: Do you trust every single person at that company with your child’s digital life?

“I know you love me, and would never do anything to harm me…”

In the short film Without Consent, by Deutsche Telekom, a future version of a young girl named Ella speaks directly to her parents. She pleads with them to protect her digital privacy before it’s too late.

She imagines a future where:

  • Her identity is stolen.
  • Her voice is cloned to scam her mom into sending money.
  • Her old family photo is turned into a meme, making her a target of school-wide bullying.
  • Her photos appear on exploitation sites—without her knowledge or consent.

It’s haunting because it’s plausible.

This is the world we’ve built.
And your child’s data trail—your posts—is the foundation.

The most powerful privacy lesson you can teach? How you live online.

Children learn how to navigate the digital world by watching you.

What are you teaching them if you trade their privacy for likes?

The best gift you can give them isn’t a new device—it’s the mindset and tools to protect themselves in a world that profits from their exposure.

Even “kid-safe” tech often betrays that trust.

Baby monitors have leaked footage.

Tracking apps have glitched and exposed locations of random children (yes, really).

Schools collect and store sensitive information with barely any safeguards—and breaches happen all the time.

How to protect your child’s digital future

Stop oversharing
Avoid posting photos, birthdays, locations, or anecdotes about your child online—especially on platforms that monetize engagement.

Ditch spyware apps
Instead of surveillance, foster open dialogue. If monitoring is necessary, choose open-source, self-hosted tools where you control the data—not some faceless company.

Teach consent early
Help your child understand that their body, thoughts, and information are theirs to control. Make digital consent a family value.

Opt out of biometric collection
Say no to tools that demand selfies, facial scans, or fingerprints. Fight back against the normalization of biometric surveillance for kids.

Use aliases and VoIP numbers
When creating accounts for your child, use email aliases and VoIP numbers to avoid linking their real identity across platforms.

Push schools and apps for better policies
Ask your child’s school: What data do they collect? Who has access? Is it encrypted?
Push back on apps that demand unnecessary permissions. Ask hard questions.

This isn’t paranoia—it’s parenting in the digital age

This is about protecting your child’s right to grow up without being boxed in by their digital past.

About giving them the freedom to explore ideas, try on identities, and make mistakes—without it becoming a permanent record.

Privacy is protection.
It’s dignity.
It’s autonomy.

And it’s your job to help your child keep it.
Let’s give the next generation a chance to write their own story.

 

Yours in privacy,
Naomi

Naomi Brockwell is a privacy advocate and professional speaker, MC, interviewer, producer, and podcaster specialising in blockchain, cryptocurrency, and economics. She runs the NBTV channel on YouTube.

Our independent journalism needs your support!
Consider a donation.

You can donate any amount of your choosing, one-time payment or even monthly.
We appreciate all of your donations to keep us alive and running.

Don’t miss another article!

Sign up for our newsletter today!

Get uncensored news – free from industry interests and political correctness – from the Polaris of Enlightenment, every week.