Lock down your Mac

No Apple ID, no tracking, no nonsense.

Published 17 May 2025
– By Naomi Brockwell
6 minute read

Apple markets itself as a privacy-focused company. And compared to Google or Microsoft, it is. But let’s be clear: Apple is still collecting a lot of your data.

If you want the most private computer setup, your best option is to switch to Linux. Not everyone is ready to take that step though, and many might prefer to keep their existing computer instead.

If you want to keep your current device but make it more private, what are your options?

Windows is basically a privacy disaster. Privacy expert Michael Bazzell says in his book Extreme Privacy:

I do not believe any modern Microsoft Windows system is capable of providing a secure or private environment for our daily computing needs. Windows is extremely vulnerable to malicious software and their telemetry of user actions is worse than Apple’s. I do not own a Windows computer and I encourage you to avoid them for any sensitive tasks.

If you want to keep your Mac without handing over your digital life to Apple, there are ways to lock it down and make it more private.

In this article, I’ll walk you through how to set up a Mac for better privacy—from purchasing the computer to tweaking your system settings, installing tools, and blocking unwanted data flows.

We’ll be following the setup laid out by Michael Bazzell in Extreme Privacy, with some added tips from my own experience.

We also made a video tutorial that you can follow along with.

You don’t need to do everything. Each chapter is modular. But if you follow the full guide, you’ll end up with a Mac that doesn’t require an Apple ID, doesn’t leak constant data, and gives you control over your digital environment.

Buying your Mac

Choose a model that still gets security updates

Apple eventually drops support for older devices. A privacy-hardened system isn’t useful if it doesn’t receive security updates.

Two helpful sites:

Pay with cash in a physical store

If you buy a Mac with a credit card, the serial number is forever linked to your identity.
Cash keeps you anonymous. You might get strange looks, but it’s completely within your rights. Be polite. Be firm. They’ll grumble. That’s fine.

Fresh install of macOS

If it’s a refurbished Mac—or even brand new—it’s worth doing a clean install.

Update macOS

  • System Settings > General > Software Update
  • Install updates, reboot, and reach the welcome screen.
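
If you prefer the command line, Apple’s built-in softwareupdate tool does the same thing. A minimal sketch (run it from Terminal and expect a reboot if firmware updates are included):

  # List available updates
  softwareupdate --list

  # Install all available updates and restart if needed (requires admin rights)
  sudo softwareupdate --install --all --restart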

Erase all content

  • System Settings > General > Transfer or Reset > Erase All Content and Settings
  • Enter your password, confirm warnings
  • Your Mac will restart and erase itself

This restores factory defaults: user data and settings are gone, but the OS remains installed.

Optional: Wipe the disk completely (advanced)

If you want a truly clean install, you’ll need to manually erase the entire internal disk. Only do this if you’re comfortable in recovery mode.

Modern Macs split the system into two parts—a sealed system volume and a data volume—tied together with something called firmlinks. If you don’t erase both correctly, you can end up with phantom volumes that clog your disk and break things silently.

Steps:

  • Enter Recovery Mode:
    • Apple Silicon: Hold power > click “Options”
    • Intel: Hold Command + R on boot
  • Open Disk Utility
  • Click View > Show All Devices
  • Select the top-level physical disk (e.g., “Apple SSD”)
  • Click Erase
    • Name: Macintosh HD
    • Format: APFS
    • Scheme: GUID Partition Map

Warning: Skip “Show All Devices” or erase the wrong item and you could brick your Mac. Only do this if you understand what you’re doing.
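
If you’d rather use Terminal (Utilities > Terminal in Recovery), the same erase can be done with diskutil. This is a sketch only: the identifier disk0 below is an assumption, so run diskutil list first and make absolutely sure you’re pointing at the internal physical disk.

  # Identify the internal physical disk first -- do not guess
  diskutil list

  # Erase the whole physical disk (assumed here to be disk0) as a single APFS volume
  diskutil eraseDisk APFS "Macintosh HD" GPT disk0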

Once erased, return to the recovery menu and choose Reinstall macOS.

First boot setup

macOS wants to immediately link your device to iCloud and Apple services. Stay offline as long as possible.

Setup tips:

  • Region: Choose your location
  • Accessibility: Skip
  • Wi-Fi: Click “Other Network Options” > “My computer does not connect to the internet”
  • Data & Privacy: Continue
  • Migration Assistant: Skip (we’re starting fresh!)
  • Apple ID: Choose “Set up later”
  • Terms: Agree
  • Computer Name: Use a generic name like Laptop or Computer
  • Password: Strong and memorable. No hint. Write it down somewhere safe.
  • Location Services: Off
  • Time Zone: Set manually
  • Analytics: Off
  • Screen Time: Skip
  • Siri: Skip
  • Touch ID: Optional
  • Display Mode: Your choice
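
If you want to change the computer name later, or confirm that nothing identifying is being broadcast on the local network, scutil handles it. A quick sketch, assuming the generic name “Laptop”:

  sudo scutil --set ComputerName "Laptop"
  sudo scutil --set HostName "Laptop"
  sudo scutil --set LocalHostName "Laptop"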

Harden system settings

Wi-Fi & Bluetooth

  • System Settings > Wi-Fi: Turn off
    • Disable “Ask to join networks” and “Ask to join hotspots”
  • System Settings > Bluetooth: Turn off
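
Wi-Fi can also be toggled from Terminal with networksetup; a small sketch, assuming your Wi-Fi interface is en0 (Bluetooth has no built-in command-line switch, so turn it off in System Settings or with a third-party tool):

  # Confirm which device is the Wi-Fi interface
  networksetup -listallhardwareports

  # Turn Wi-Fi off (assumes the interface is en0)
  networksetup -setairportpower en0 off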

Firewall (built-in)

  • System Settings > Network > Firewall: Turn on
    • Disable “Automatically allow built-in software…”
    • Disable “Automatically allow downloaded signed software…”
    • Enable Stealth Mode
    • Remove any pre-approved entries
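
The application firewall has a command-line front end, socketfilterfw, which covers the same settings. A rough sketch:

  FW=/usr/libexec/ApplicationFirewall/socketfilterfw

  sudo $FW --setglobalstate on        # turn the firewall on
  sudo $FW --setstealthmode on        # stealth mode: don't answer probes
  sudo $FW --setallowsigned off       # don't auto-allow built-in signed software
  sudo $FW --setallowsignedapp off    # don't auto-allow downloaded signed software
  sudo $FW --listapps                 # review any pre-approved entries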

Notifications

  • System Settings > Notifications
    • Show Previews: Never
    • Turn off for Lock Screen, Sleep, and Mirroring
    • Manually disable for each app

Sound settings

  • System Settings > Sound
    • Alert Volume: Minimum
    • Disable sound effects and interface feedback

AirDrop & sharing

  • System Settings > General > AirDrop & Handoff: Turn everything off
  • System Settings > General > Sharing: Disable all toggles

Siri & Apple Intelligence

  • System Settings > Siri & Dictation: Disable all
  • Disable Apple Intelligence and per-app Siri access

Switch time server

Your Mac pings Apple to sync the time—leaking your IP every time it does.
Switch to a decentralized time server instead.

How:

  • System Settings > General > Date & Time
  • Click “Set…” > Enter password
  • Enter: pool.ntp.org
  • Click Done
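
The same change can be scripted with systemsetup; a minimal sketch:

  sudo systemsetup -setnetworktimeserver pool.ntp.org
  sudo systemsetup -setusingnetworktime on

  # Verify the new setting
  sudo systemsetup -getnetworktimeserver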

Spotlight & Gatekeeper

Spotlight

  • System Settings > Spotlight: Turn off “Help Apple improve search”

Gatekeeper

Gatekeeper prevents you from opening non-Apple-approved apps and sends app data to Apple.

If you’re a confident user, disable it:

  • Terminal: sudo spctl --master-disable
  • System Settings > Privacy & Security: Allow apps from anywhere
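
You can check the current Gatekeeper state, and revert, at any time. Note that Apple has been tightening this on recent macOS releases (the flag is considered deprecated and the “Anywhere” option may disappear again after updates), so treat this as a sketch:

  spctl --status               # "assessments enabled" means Gatekeeper is on
  sudo spctl --master-disable  # allow apps from anywhere
  sudo spctl --master-enable   # restore the default behavior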

FileVault & lockdown mode

FileVault

Encrypt your entire disk:

  • System Settings > Privacy & Security > FileVault: Turn on
  • Choose “Create a recovery key and do not use iCloud”
  • Write down your recovery key. Store it OFF your computer.
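
FileVault can also be turned on from Terminal with fdesetup, which prints a personal recovery key rather than storing one in iCloud. A minimal sketch:

  sudo fdesetup enable   # prompts for your account password, then prints the recovery key
  fdesetup status        # confirm that encryption is on or in progress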

Lockdown mode (Optional)

Restricts features like USB accessories, AirDrop, and others. Useful for high-risk users.

Customize appearance & Finder

Desktop & Dock

  • Disable “Show Suggested and Recent Apps”
  • Disable “Recent apps in Stage Manager”
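
The Dock’s recent-apps behaviour can also be switched off with a defaults write; a small sketch (the show-recents key is the one used by current macOS versions, so treat it as an assumption on older releases):

  defaults write com.apple.dock show-recents -bool false
  killall Dock   # restart the Dock so the change takes effect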

Wallpaper

Use a solid color instead of version-specific defaults to reduce your system’s fingerprint.

Lock screen

  • Screensaver: Never
  • Require password: Immediately
  • Sleep timer: Your preference (e.g. 1 hour)

Finder preferences

  • Show all file extensions
  • Hide Recents and Tags
  • Set default folder to Documents
  • View hidden files: Shift + Command + .
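
Most of these Finder tweaks can also be set with defaults. A sketch of the commonly used (long-standing but unofficial) preference keys:

  # Show all file extensions everywhere
  defaults write NSGlobalDomain AppleShowAllExtensions -bool true

  # Always show hidden files in Finder
  defaults write com.apple.finder AppleShowAllFiles -bool true

  # Restart Finder so the changes apply
  killall Finder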

Block outbound connections

macOS and many apps connect to servers without asking. You’ll want to monitor and block them.

Use Little Snitch (or the free, open-source LuLu) to review each outbound connection and block the ones you don’t want.

Browser

Install a privacy-respecting browser like Brave or Mullvad Browser.

Compare options at privacytests.org

VPN

Use trusted providers like Mullvad or ProtonVPN.

Be careful which VPN you download — they’re often scamware and data collection tools.
Watch this video for more

Optional: Use Homebrew

Instead of the App Store, install software via Homebrew.
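
As a rough sketch, here’s the official install command published at brew.sh, plus a couple of example casks (cask names can change, so verify with brew search first):

  # Install Homebrew (command published at https://brew.sh)
  /bin/bash -c "$(curl -fsSL https://raw.githubusercontent.com/Homebrew/install/HEAD/install.sh)"

  # Examples: a privacy-respecting browser and an outbound firewall
  brew install --cask brave-browser
  brew install --cask lulu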

We’ll cover this more in a future guide.

Final takeaways

If you followed this guide, you now have:

  • A Mac with no Apple ID
  • No iCloud tether
  • Full disk encryption (FileVault)
  • A silent firewall
  • Blocked outbound connections
  • A private browser and VPN setup

You’ve taken serious steps to reclaim your digital autonomy. Well done.

In an upcoming guide, we’ll explore how to take the next step: switching to Linux.

Thanks again to Michael Bazzell for his work.

Find his book Extreme Privacy at: inteltechniques.com/book7.html

 

Yours in privacy,
Naomi

Naomi Brockwell is a privacy advocate and professional speaker, MC, interviewer, producer, and podcaster, specialising in blockchain, cryptocurrency and economics. She runs the NBTV channel on Rumble.

TNT is truly independent!

We don’t have a billionaire owner, and our unique reader-funded model keeps us free from political or corporate influence. This means we can fearlessly report the facts and shine a light on the misdeeds of those in power.

Consider a donation to keep our independent journalism running…

Swedes turn to private apps as social media sharing declines

Published today 13:33
– By Editorial Staff
It is primarily young men who are posting less on social media.
1 minute read

Fewer than half of Swedes now regularly share their own posts on social media, according to a new report from the Internet Foundation (Internetstiftelsen), a Swedish internet research organization. At the same time, time spent on open platforms is decreasing – instead, people are increasingly turning to private channels.

In Sweden, the five largest platforms are YouTube, Facebook, Instagram, Snapchat and LinkedIn. However, looking at daily usage, LinkedIn is replaced by TikTok, which has become increasingly popular.

The report, “Swedes and the Internet 2025”, shows that fewer and fewer Swedes are posting their own content on the platforms. Only 45 percent regularly share their own posts on social media, a decrease of four percentage points compared to last year.

It is primarily young men who are making fewer posts on social media, and the decrease has mainly occurred on Snapchat. Men born in the 1970s have also essentially stopped making their own posts. Women, however, make their own posts to roughly the same extent as the previous year.

— The larger the services become, the more they are filled with content that users haven’t asked for. Then you feel more like a consumer than someone who participates and contributes, says Måns Jonasson at the Internet Foundation to Sweden’s public broadcaster SVT.

Swedes increasingly prefer to be on private platforms instead, such as WhatsApp – which is growing for the third consecutive year. Among children and young people, more use WhatsApp than Facebook Messenger.

Denmark gears up for digital independence: “We’ve been asleep for too long”

Published 21 September 2025
– By Editorial Staff
Denmark's complete dependence on non-European software, hardware and digital services is a very serious problem, according to Danish digitalization minister Caroline Stage Olsen.
2 minute read

Fear that critical IT systems could suddenly be shut down is driving European countries to strengthen their digital capabilities. Denmark is leading the development with pilot projects for open source, while the municipality of Copenhagen maps alternatives to Silicon Valley giants.

— We have been in a Sleeping Beauty slumber in Europe for too long, says Denmark’s digitalization minister Caroline Stage Olsen.

The statement comes amid a growing European debate about digital sovereignty, where Denmark has taken a leading role through concrete initiatives at both national and municipal levels.

The municipality of Copenhagen is now driving an ambitious effort to map alternatives to today’s dominant IT suppliers. Henrik Appel Esbensen, who leads the municipality’s internal audit, draws parallels to the energy sector:

— For gas and electricity we have alternative suppliers. Now we want to see if we can also become as supply-secure for IT as we want to be.

He emphasizes that the focus is on supply security rather than specifically avoiding American solutions: “For us it’s important not necessarily to get rid of American tech specifically, but that the supply security to Copenhagen is good”.

Pilot project with open source

Denmark’s digitalization ministry has started a pilot project exploring alternatives to Silicon Valley giants’ products, primarily through solutions based on open source. The initiative has gained renewed relevance following recent tensions between Denmark and the US regarding Greenland.

— We are dependent on products, services, software, hardware that come from countries outside Europe and that is a problem, states digitalization minister Caroline Stage Olsen.

Denmark is not alone in taking action. In Germany, the state of Schleswig-Holstein plans to replace Windows with Linux and seek domestic cloud providers. Meanwhile, Poland and the Baltic states are developing plans for large-scale AI data centers – a so-called “AI gigafactory” – to secure their own capacity for artificial intelligence.

— Estonia today uses the major American tech companies’ services, but we want to develop alternatives to secure our digital sovereignty, explains Estonia’s economy and industry minister Erkki Keldo to Swedish public television SVT.

“Must dare to invest”

The view on digital independence has undergone a dramatic change in a short time. Tech investor Johan Brenner from venture capital firm Creandum illustrates the shift:

— If you had asked the question a year ago, I would have just laughed at it. But now you don’t know, you might need to have a plan A and a plan B for European companies.

The path toward greater digital autonomy will be neither simple nor quick, according to Henrik Appel Esbensen in Copenhagen:

— I think it will take a long time. But it requires massive investments because there aren’t that many suppliers in the field right now. There’s no doubt that we must dare to invest in this in Europe.

Concern about what happens if critical IT systems are suddenly shut down or contracts are terminated has transformed digital sovereignty from an abstract discussion into a concrete security issue for European countries – a development that has accelerated markedly over the past year.

OpenAI monitors ChatGPT chats – can report users to police

Mass surveillance

Published 20 September 2025
– By Editorial Staff
What were perceived as private AI conversations can now end up with the police.
2 minute read

OpenAI has quietly begun monitoring users’ ChatGPT conversations and can report content to law enforcement authorities.

The revelation comes after incidents where AI chatbots have been linked to self-harm behavior, delusions, hospitalizations and suicide – what experts call “AI psychosis”.

In a blog post, the company acknowledges that it systematically scans users’ messages. When the system detects users planning to harm others, the conversations are directed to a review team that can suspend accounts and contact police.

“If human reviewers determine that a case involves an imminent threat of serious physical harm to others, we may refer it to law enforcement”, writes OpenAI.

The new policy means in practice that millions of users have their conversations scanned and that what many perceived as private conversations with an AI are now subject to systematic surveillance where content can be forwarded to authorities.

Tech journalist Noor Al-Sibai at Futurism points out that OpenAI’s statement is “short and vague” and that the company does not specify exactly what types of conversations could lead to police reports.

“It remains unclear which exact types of chats could result in user conversations being flagged for human review, much less getting referred to police”, she writes.

Security problems ignored

Ironically, ChatGPT has proven vulnerable to “jailbreaks” where users have been able to trick the system into giving instructions for building neurotoxins or step-by-step guides for suicide. Instead of addressing these fundamental security flaws, OpenAI is now choosing extensive surveillance of users.

The surveillance stands in sharp contrast to the tech company’s stance in its legal battle with the New York Times, where the company “steadfastly rejected” demands to hand over ChatGPT logs, citing user privacy.

“It’s also kind of bizarre that OpenAI even mentions privacy, given that it admitted in the same post that it’s monitoring user chats and potentially sharing them with the fuzz”, Al-Sibai notes.

May be forced to hand over chats

OpenAI CEO Sam Altman has recently acknowledged that ChatGPT does not offer the same confidentiality as conversations with real therapists or lawyers, and due to the lawsuit, the company may be forced to hand over user chats to various courts.

“OpenAI is stuck between a rock and a hard place”, writes Al-Sibai. The company is trying to handle the PR disaster from users who have suffered mental health crises, but since it is “clearly having trouble controlling its own tech”, it falls back on “heavy-handed moderation that flies in the face of its own CEO’s promises”.

The tech company says it is “currently not” reporting self-harm cases to police, but the wording suggests that even this could change. The company has also not responded to requests to clarify what criteria are used for the surveillance.

The internet is a manipulation machine

Be careful you're not playing an avatar in someone else’s propaganda war.

Published 20 September 2025
– By Naomi Brockwell
8 minute read

We’re more polarized than ever. Conversations have turned into shouting matches. Opposing ideas feel like threats, not something to debate.

But here’s something many people don’t realize: privacy and surveillance have everything to do with it. Most people never connect those dots.

Why surveillance is the key to polarization

Surveillance is the engine that makes platform-driven polarization work.

Platforms have one overriding goal: to keep us online as long as possible. And they’ve learned that nothing hooks us like outrage. If they can rile us up, we’ll stay, scroll, and click.

Outrage drives engagement. Engagement drives profit. But when outrage becomes the currency of the system, polarization is the natural byproduct. The more the platforms know about us, the easier it is to feed us the content that will push our buttons, confirm our biases, and keep us in a cycle of anger. And that anger doesn’t just keep us scrolling, it also pushes us further apart.

These platforms are not neutral spaces, they are giant marketplaces where influence is bought and sold. Every scroll, every feed, every “recommended” post is shaped by algorithms built to maximize engagement and auction off your attention. And it’s not just companies pushing shoes or handbags. It’s political groups paying to shift your vote. It’s movements paying to make you hate certain people because you think they hate you. It’s hostile governments paying to fracture our society.

Because our lives are so transparent to the surveillance machine, we’re more susceptible to manipulation than ever. Polarization isn’t cultural drift. When surveillance becomes the operating system of the internet, polarization and manipulation are the natural consequences.

The internet is a manipulation machine

Few people are really aware of how much manipulation there is online. We all fancy ourselves to be independent thinkers. We like to think we make up our own mind about things. That we choose for ourselves which videos to watch next. That we discover interesting articles all on our own.

We want to believe we’re in control. But in a system where people are constantly paying to influence us, that independence is hard to defend. The truth is, our autonomy is far more fragile than we’d like to admit.

This influence creeps into our entire online experience.

Every time you load a web page, you’ll notice that the text appears first, alongside empty white boxes, and there’s a split second before those boxes are filled up. What’s going on in that split second is an auction, as part of what’s called a real-time bidding (RTB) system.

For example, in Google’s RTB system, what’s going on behind the scenes in that split second is that Google announces to its list of Authorized Buyers, the bidders plugged into Google’s ad exchange:

“Hey, this person just opened up her webpage, here’s everything we know about her. She has red hair. She rants a lot about privacy. She likes cats. Here’s her device, location, browsing history, and this is her inferred mood. Who wants to bid to put an ad in front of her?”

These authorized buyers have milliseconds to decide whether to bid and how much.

This “firehose of data” is sprayed at potentially thousands of entities. And the number of data points included can be staggering. Google knows a LOT about you. Only one buyer wins the ad slot and pays, but potentially thousands will get access to that data.

Google doesn’t make their Authorized Buyers list public, but they do publish a Certified External Vendors list: a public-facing list of vendors (demand-side platforms, ad servers, analytics providers, etc.) that Google has certified to interact with their ad systems. This CEV list is the closest proxy the public gets to knowing who is involved in this real-time bidding system.

And if you scroll through the names of some of these vendors, you won’t even find a Wikipedia page for many of them. A huge number have scrubbed themselves from the internet. It’s a mix of ad companies, data brokers, even government shell companies. And many of them you can bet are just sitting quietly in these auctions so they can scrape this data, to share or sell elsewhere, or use for other purposes. Regardless of what Google’s own Terms of Service say, once this data leaves Google’s hands, they have no control.

This real-time bidding system is just one of the behind-the-scenes mechanisms of the influence economy. But this machinery of influence is everywhere, not just when you load a webpage.

When you go to watch a video, there are thumbnails next to the video suggesting what you should watch next, and you click on one if it looks interesting. Those thumbnails are not accidental.

When you scroll a social media timeline, the posts that populate are intentional. Everywhere you go, you’re seeing things that people have paid to put in front of you, hoping to nudge you one way or another. Even search results, which feel like neutral gateways to information, are arranged according to what someone else wants you to see.

This system of manipulation isn’t limited to simple commercial influence, where companies just want to get us to buy a new pair of shoes.

There are faceless entities paying to shape our thoughts, shift our behavior, and sway our votes. They work to bend our worldview, to manipulate our emotions, even to make us hate other people by convincing us those people hate us.

Where privacy comes in

This is where privacy comes into play.

The more a company or government knows about us, the easier it is to manipulate us.

  • If we allow every email to be scanned and analyzed, every message to be read, every like, scroll, and post to be fed into a profile about us…
  • If companies scrape every browser click, every book we read, every piece of music we listen to, every film we watch…
  • When faceless entities know everywhere we go, whom we meet, what we do, and then they trace who those people meet, where they go, and what they do, and our entire social graph is mapped…

In this current reality, the surveillance industrial complex knows us better than we know ourselves, and it becomes easy to figure out exactly what will make us click.

“Oh, Naomi is sad today. She’ll be more susceptible to this kind of messaging. Push it to her now.”

Profiles aren’t just about facts. They’re about state of mind. If the system can see that you’re tired, lonely, or angry, it knows exactly when to time the nudge.

Who are the players?

This isn’t just about platforms experimenting with outrage to keep us online. Entire government departments now study these manipulation strategies. When something goes viral, they try to trace where it started: “Was it seeded by a hostile nation, a domestic political shop, or a corporation laying the groundwork for its next rent-seeking scheme?”

Everyone with resources uses these tools. Governments, parties, corporations, activist networks. The mechanism is the same, and the targets are us.

The entire internet runs on a system where people are competing for our attention, and some of the agendas of those involved are downright nefarious.

These systems don’t just predict what we like and hate, they actively shape it, and we have to start realizing that sometimes division itself is the intended outcome.

Filter bubbles were only the beginning

For years, the filter bubble was the go-to explanation for polarization. Algorithms showed us more of what we already agreed with, so we became trapped in echo chambers. We assumed polarization was just the natural consequence of people living in separate informational worlds.

But that story is only half right, and dangerously incomplete.

The real problem isn’t just that we see different things.
It’s that we are being deliberately targeted.

Governments, corporations, and movements know so much about us that they can do more than keep us in bubbles. They can reach inside those bubbles to provoke us, push us, and agitate us.

Filter bubbles were about limiting information. Surveillance-driven targeting is about exploiting information. With enough data, platforms and their partners can predict what will outrage you, when you’re most vulnerable, and which message will make you react.

And that’s the crucial shift. Polarization today isn’t just a byproduct of passive algorithms. It’s the direct result of an influence machine that knows us better than we know ourselves, and uses that knowledge to bend us toward someone else’s agenda.

Fakes, fragments, and manufactured consensus

We live in a world of deepfakes.

We live in a world of soundbites taken out of context.

We live in an era where it’s easier than ever to generate AI fluff. If someone wants to make a point of view seem popular, they can instantly create thousands of websites, all parroting the same slightly tweaked narrative. When we go searching for information, it looks like everyone is in consensus.

Volume now looks like truth, and repetition now looks like proof. And both are cheap.

Remember your humanity

In this era of artificial interactions, manipulation, and engineered outrage, we can’t forget our humanity.

That person you’re fighting with might not actually be a human; they might be a bot.

That story about that political candidate might have been taken completely out of context, and deliberately targeted at you to make you angry.

Online, we dehumanize each other. But we should instead remember how to talk. Ideas can be discussed without becoming triggers. They don’t have to send us spiraling into four hours of doomscrolling.

Fear is the mindkiller. When something online pushes you to react, pause. Ask whose agenda this serves. Ask what context you might be missing.

The path forward

We are more polarized than ever, largely because we’ve become so transparent to those who profit from using our emotions against us.

Privacy is our ally in this fight. The less companies and governments know about us, the harder it is for them to manipulate us. Privacy protects our autonomy in the digital age.

And we need to see each other as humans first, not as avatars in someone else’s propaganda war. The person you’re arguing with was probably targeted by a completely opposite campaign.

We’ll all be better off if we lift the veil on this manipulation, and remember that we are independent thinkers with the power to make up our own minds, instead of being led by those who want to control us.

 

Yours in Privacy,
Naomi

Naomi Brockwell is a privacy advocate and professional speaker, MC, interviewer, producer, and podcaster, specialising in blockchain, cryptocurrency and economics. She runs the NBTV channel on Rumble.

Our independent journalism needs your support!
Consider a donation.

You can donate any amount of your choosing, one-time payment or even monthly.
We appreciate all of your donations to keep us alive and running.

Don’t miss another article!

Sign up for our newsletter today!

Get uncensored news – free from industry interests and political correctness – from the Polaris of Enlightenment, every week.