
Lock down your Mac

No Apple ID, no tracking, no nonsense.

Published 17 May 2025
– By Naomi Brockwell
6 minute read

Apple markets itself as a privacy-focused company. And compared to Google or Microsoft, it is. But let’s be clear: Apple is still collecting a lot of your data.

If you want the most private computer setup, your best option is to switch to Linux. Not everyone is ready to take that step though, and many might prefer to keep their existing computer instead.

If you want to keep your current device but make it more private, what are your options?

Windows is basically a privacy disaster. Privacy expert Michael Bazzell says in his book Extreme Privacy:

“I do not believe any modern Microsoft Windows system is capable of providing a secure or private environment for our daily computing needs. Windows is extremely vulnerable to malicious software and their telemetry of user actions is worse than Apple’s. I do not own a Windows computer and I encourage you to avoid them for any sensitive tasks.”

If you want to keep your Mac without handing over your digital life to Apple, there are ways to lock it down and make it more private.

In this article, I’ll walk you through how to set up a Mac for better privacy—from purchasing the computer to tweaking your system settings, installing tools, and blocking unwanted data flows.

We’ll be following the setup laid out by Michael Bazzell in Extreme Privacy, with some added tips from my own experience.

We also made a video tutorial that you can follow along with.

You don’t need to do everything. Each section is modular. But if you follow the full guide, you’ll end up with a Mac that doesn’t require an Apple ID, doesn’t leak constant data, and gives you control over your digital environment.

Buying your Mac

Choose a model that still gets security updates

Apple eventually drops support for older devices. A privacy-hardened system isn’t useful if it doesn’t receive security updates.

Two helpful sites:

Pay with cash in a physical store

If you buy a Mac with a credit card, the serial number is forever linked to your identity.
Cash keeps you anonymous. You might get strange looks, but it’s completely within your rights. Be polite. Be firm. They’ll grumble. That’s fine.

Fresh install of macOS

If it’s a refurbished Mac—or even brand new—it’s worth doing a clean install.

Update macOS

  • System Settings > General > Software Update
  • Install updates, reboot, and reach the welcome screen.
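
If you prefer the Terminal, Apple’s built-in softwareupdate tool does the same thing (a minimal sketch; flags as documented in its man page):

  softwareupdate --list                 # list available updates
  sudo softwareupdate --install --all   # install everything available (may trigger a restart)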

Erase all content

  • System Settings > General > Transfer or Reset > Erase All Content and Settings
  • Enter your password, confirm warnings
  • Your Mac will restart and erase itself

This restores factory defaults: user data and settings are gone, but the OS remains installed.

Optional: Wipe the disk completely (advanced)

If you want a truly clean install, you’ll need to manually erase the entire internal disk. Only do this if you’re comfortable in recovery mode.

Modern Macs split the system into two parts—a sealed system volume and a data volume—tied together with something called firmlinks. If you don’t erase both correctly, you can end up with phantom volumes that clog your disk and break things silently.

Steps:

  • Enter Recovery Mode:
    • Apple Silicon: Hold power > click “Options”
    • Intel: Hold Command + R on boot
  • Open Disk Utility
  • Click View > Show All Devices
  • Select the top-level physical disk (e.g., “Apple SSD”)
  • Click Erase
    • Name: Macintosh HD
    • Format: APFS
    • Scheme: GUID Partition Map

Warning: Skip “Show All Devices” or erase the wrong item and you could brick your Mac. Only do this if you understand what you’re doing.

Once erased, return to the recovery menu and choose Reinstall macOS.

First boot setup

macOS wants to immediately link your device to iCloud and Apple services. Stay offline as long as possible.

Setup tips:

  • Region: Choose your location
  • Accessibility: Skip
  • Wi-Fi: Click “Other Network Options” > “My computer does not connect to the internet”
  • Data & Privacy: Continue
  • Migration Assistant: Skip (we’re starting fresh!)
  • Apple ID: Choose “Set up later”
  • Terms: Agree
  • Computer Name: Use a generic name like Laptop or Computer (this can also be changed later from the Terminal; see the sketch after this list)
  • Password: Strong and memorable. No hint. Write it down somewhere safe.
  • Location Services: Off
  • Time Zone: Set manually
  • Analytics: Off
  • Screen Time: Skip
  • Siri: Skip
  • Touch ID: Optional
  • Display Mode: Your choice
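
As noted above, the computer name can also be changed after setup with Apple’s scutil tool (a quick sketch; “Laptop” is just an example of a generic, non-identifying name):

  sudo scutil --set ComputerName "Laptop"    # name shown in Finder and on the network
  sudo scutil --set LocalHostName "Laptop"   # Bonjour/local-network name (no spaces)
  sudo scutil --set HostName "Laptop"        # name reported at the command line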

Harden system settings

Wi-Fi & Bluetooth

  • System Settings > Wi-Fi: Turn off
    • Disable “Ask to join networks” and “Ask to join hotspots”
  • System Settings > Bluetooth: Turn off
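
These radios can also be toggled from the Terminal. networksetup is built in; controlling Bluetooth from the command line needs a third-party tool such as blueutil (a sketch; "en0" is the usual Wi-Fi interface on Mac laptops, so confirm yours with networksetup -listallhardwareports):

  networksetup -setairportpower en0 off   # turn Wi-Fi off
  blueutil -p 0                           # turn Bluetooth off (requires: brew install blueutil)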

Firewall (built-in)

  • System Settings > Network > Firewall: Turn on
    • Disable “Automatically allow built-in software…”
    • Disable “Automatically allow downloaded signed software…”
    • Enable Stealth Mode
    • Remove any pre-approved entries
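
The same firewall settings can be applied from the Terminal with Apple’s socketfilterfw tool (a sketch; the flag names reflect recent macOS versions and are worth double-checking with --help):

  FW=/usr/libexec/ApplicationFirewall/socketfilterfw
  sudo $FW --setglobalstate on       # turn the firewall on
  sudo $FW --setstealthmode on       # don't respond to probing traffic
  sudo $FW --setallowsigned off      # don't auto-allow built-in signed software
  sudo $FW --setallowsignedapp off   # don't auto-allow downloaded signed software
  sudo $FW --listapps                # review (and then remove) any pre-approved entries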

Notifications

  • System Settings > Notifications
    • Show Previews: Never
    • Turn off for Lock Screen, Sleep, and Mirroring
    • Manually disable for each app

Sound settings

  • System Settings > Sound
    • Alert Volume: Minimum
    • Disable sound effects and interface feedback

AirDrop & sharing

  • System Settings > General > AirDrop & Handoff: Turn everything off
  • System Settings > General > Sharing: Disable all toggles

Siri & Apple Intelligence

  • System Settings > Siri & Dictation: Disable all
  • Disable Apple Intelligence and per-app Siri access

Switch time server

Your Mac pings Apple to sync the time—leaking your IP every time it does.
Switch to a decentralized time server instead.

How:

  • System Settings > General > Date & Time
  • Click “Set…” > Enter password
  • Enter: pool.ntp.org
  • Click Done
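
The same change can be made from the Terminal with Apple’s systemsetup utility (a sketch; on recent macOS versions it needs sudo and may ask you to grant Terminal extra permissions):

  sudo systemsetup -setnetworktimeserver pool.ntp.org
  sudo systemsetup -getnetworktimeserver   # verify the change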

Spotlight & Gatekeeper

Spotlight

  • System Settings > Spotlight: Turn off “Help Apple improve search”

Gatekeeper

Gatekeeper prevents you from opening non-Apple-approved apps and sends app data to Apple.

If you’re a confident user, disable it:

  • Open Terminal and run: sudo spctl --master-disable
  • System Settings > Privacy & Security: Allow apps from anywhere
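
You can confirm the change from the Terminal (spctl is the command-line front end to Gatekeeper):

  spctl --status   # prints "assessments enabled" or "assessments disabled"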

FileVault & Lockdown Mode

FileVault

Encrypt your entire disk:

  • System Settings > Privacy & Security > FileVault: Turn on
  • Choose “Create a recovery key and do not use iCloud”
  • Write down your recovery key. Store it OFF your computer.
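
FileVault can also be managed from the Terminal with fdesetup (a sketch; enabling it this way prints a personal recovery key, which, as above, belongs on paper rather than on the machine):

  sudo fdesetup status   # check whether FileVault is already on
  sudo fdesetup enable   # turn it on and print the recovery key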

Lockdown Mode (optional)

Restricts features like USB accessories, AirDrop, and others. Useful for high-risk users.

Customize appearance & Finder

Desktop & dock

  • Disable “Show Suggested and Recent Apps”
  • Disable “Recent apps in Stage Manager”

Wallpaper

Use a solid color instead of version-specific defaults to reduce your system’s fingerprint.

Lock screen

  • Screensaver: Never
  • Require password: Immediately
  • Sleep timer: Your preference (e.g. 1 hour)

Finder preferences

  • Show all file extensions
  • Hide Recents and Tags
  • Set default folder to Documents
  • View hidden files: Shift + Command + .
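
Most of these Finder tweaks can also be scripted with the defaults command. The keys below have been stable for years but are not formally documented by Apple, so treat them as an assumption and verify them on your macOS version:

  defaults write NSGlobalDomain AppleShowAllExtensions -bool true   # show all file extensions
  defaults write com.apple.finder AppleShowAllFiles -bool true      # always show hidden files
  defaults write com.apple.finder ShowRecentTags -bool false        # hide Tags in the sidebar
  killall Finder                                                    # relaunch Finder to apply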

Block outbound connections

macOS and many apps connect to servers without asking. You’ll want to monitor and block them.

Use Little Snitch (or the free, open-source LuLu) to monitor outbound connections and approve or deny them per app.
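
For a quick look at what is already talking to the internet, the built-in tools work too (a sketch using standard macOS utilities):

  lsof -i -nP | grep ESTABLISHED   # list open connections and the processes that own them
  nettop -m tcp                    # live per-process view of TCP traffic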

Browser

Install a privacy-respecting browser like Brave or Mullvad Browser.

Compare options at privacytests.org

VPN

Use trusted providers like Mullvad or ProtonVPN.

Be careful which VPN you download: many are scamware and data-collection tools.
Watch this video for more.

Optional: Use Homebrew

Instead of the App Store, install software via Homebrew.
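
A minimal sketch of the usual flow (the install command is from brew.sh; the cask names are examples and worth confirming with brew search):

  /bin/bash -c "$(curl -fsSL https://raw.githubusercontent.com/Homebrew/install/HEAD/install.sh)"
  brew analytics off                       # opt out of Homebrew's own usage analytics
  brew install --cask brave-browser lulu   # example: a private browser and an outbound firewall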

We’ll cover this more in a future guide.

Final takeaways

If you followed this guide, you now have:

  • A Mac with no Apple ID
  • No iCloud tether
  • Full disk encryption (FileVault)
  • A silent firewall
  • Blocked outbound connections
  • A private browser and VPN setup

You’ve taken serious steps to reclaim your digital autonomy. Well done.

In an upcoming guide, we’ll explore how to take the next step: switching to Linux.

Thanks again to Michael Bazzell for his work.

Find his book Extreme Privacy at: inteltechniques.com/book7.html

 

Yours in privacy,
Naomi

Naomi Brockwell is a privacy advocate and professional speaker, MC, interviewer, producer, and podcaster, specialising in blockchain, cryptocurrency, and economics. She runs the NBTV channel on YouTube.


Spotify fills playlists with fake music – while CEO invests millions in military AI

The future of AI

Published today 15:55
– By Editorial Staff
Spotify CEO Daniel Ek accused of diverting artist royalties to military AI development.
3 minute read

Swedish streaming giant Spotify promotes anonymous pseudo-musicians and computer-generated music to avoid paying royalties to real artists, according to a new book by music journalist Liz Pelly.

Meanwhile, criticism grows against Spotify CEO Daniel Ek, who recently invested over €600 million in a company developing AI technology for future warfare.

In the book Mood Machine: The Rise of Spotify and the Costs of the Perfect Playlist, Liz Pelly reveals that Spotify has long been running a secret internal program called Perfect Fit Content (PFC). The program creates cheap, generic background music – often called “muzak” – through a network of production companies with ties to Spotify. This music is then placed in Spotify’s popular playlists, often without crediting any real artists.

The program was tested as early as 2010 and is described by Pelly as Spotify’s most profitable strategy since 2017.

“But it also raises worrying questions for all of us who listen to music. It puts forth an image of a future in which – as streaming services push music further into the background, and normalize anonymous, low-cost playlist filler – the relationship between listener and artist might be severed completely”, Pelly writes.

By 2023, the PFC program controlled hundreds of playlists. More than 150 of them – with names like Deep Focus, Cocktail Jazz, and Morning Stretch – consisted entirely of music produced within PFC.

“Only soulless AI music will remain”

A jazz musician told Pelly that Spotify asked him to create an ambient track for a few hundred dollars as a one-time payment. However, he couldn’t retain the rights to the music. When the track later received millions of plays, he realized he had likely been deceived.

Social media criticism has been harsh. One user writes: “In a few years, only soulless AI music will remain. It’s an easy way to avoid paying royalties to anyone.”

“I deleted Spotify and cancelled my subscription”, comments another.

Spotify has previously faced criticism for similar practices. The Guardian reported in February that the company’s Discovery Mode system allows artists to gain more visibility – but only if they agree to receive 30 percent less payment.

Spotify’s CEO invests in AI for warfare

Meanwhile, CEO Daniel Ek has faced severe criticism for investing over €600 million through his investment firm Prima Materia in the German AI company Helsing. The company develops software for drones, fighter aircraft, submarines, and other military systems.

– The world is being tested in more ways than ever before. That has sped up the timeline. There’s an enormous realisation that it’s really now AI, mass and autonomy that is driving the new battlefield, Ek commented in an interview with Financial Times.

With this investment, Ek has also become chairman of Helsing. The company is working on a project called Centaur, where artificial intelligence will be used to control fighter aircraft.

The criticism was swift. Australian producer Bluescreen explained in an interview with music site Resident Advisor why he chose to leave Spotify – a decision several other music creators have also made.

– War is hell. There’s nothing ethical about it, no matter how you spin it. I also left because it became apparent very quickly that Spotify’s CEO, as all billionaires, only got rich off the exploitation of others.

Competitor chooses different path

Spotify has previously been questioned for its proximity to political power. The company donated $150,000 to Donald Trump’s inauguration fund in 2017 and hosted an exclusive brunch the day before the ceremony.

While Spotify is heavily investing in AI-generated music and voice-controlled DJs, competitor SoundCloud has chosen a different path.

– We do not develop AI tools or allow third parties to scrape or use SoundCloud content from our platform for AI training purposes, explains communications director Marni Greenberg.

– In fact, we implemented technical safeguards, including a ‘no AI’ tag on our site to explicitly prohibit unauthorised use.

FUTO – the obvious choice for privacy-friendly voice and text input on mobile devices

Advertising partnership with Teuton Systems

Ditch Google's input apps and keep what you type and say on your phone.

Published today 12:16
3 minute read

In our series about open, surveillance-free apps, we take a closer look at FUTO Voice Input and FUTO Keyboard – two apps that together challenge the established alternatives for voice input and keyboards on mobile devices. Most smartphone users are accustomed to dictating text using Google or using standard keyboards like Gboard or SwiftKey.

However, few consider that these popular tools often collect what you say and write privately, sending it to tech giants. The FUTO team themselves emphasize that their solution completely eliminates this problem – everything runs locally on the device without any data leaving the phone (offline with no connection requirements).

Here’s what the FUTO apps offer:

  • Privacy focus: FUTO apps run completely offline – no data is sent to the cloud.
  • Full functionality: Swipe typing, text suggestions, autocorrection, and voice-to-text with punctuation – everything works without internet connection (all keyboard functions available offline).
  • High precision: Offline dictation using advanced AI model (OpenAI Whisper) provides fast and accurate transcription (local voice recognition with high accuracy).
  • Multilingual support: Support for many languages and continuous improvements via the open-source community.

FUTO Keyboard

On the keyboard front, FUTO Keyboard impresses by delivering modern convenience without compromising privacy. Unlike conventional keyboards that constantly transmit user data, FUTO requires neither network access nor cloud services – yet it offers features on par with the best.

You can swipe words with your finger across the screen, get relevant text suggestions and automatic spell correction, and customize the theme to your liking – all while the app consistently refuses to send a single keystroke to any external server (all data stays with you). FUTO Keyboard also integrates FUTO Voice Input through a built-in microphone button, allowing ‘speech to text’ to be activated from the same interface.

FUTO Voice Input

For voice input, we have FUTO Voice Input, which lets you dictate text directly in apps like messages or notes – completely without an internet connection. All processing happens locally using a compact language model, meaning no audio needs to be sent away to become text. According to users who have compared it with Google’s cloud-based solution, FUTO can keep pace with it and even surpass it in both speed and grammatical accuracy.

An enthusiastic tester reported that FUTO provided a completely new experience – no delays or strange autocorrections that he previously suffered from with Gboard. This means you can safely speak freely and see the text appear almost immediately, without worrying about unauthorized “listening” on the other end.

Ongoing development and alternatives

Despite FUTO Keyboard being young, it’s already surprisingly capable. The interface feels polished and user-friendly, and the amount of features makes it almost comparable to established alternatives. Currently, text input works excellently in English, while support for smaller languages like Swedish is still being refined. However, development pace is high and the team behind FUTO has announced improvements specifically to autocorrection and expanded language support in upcoming updates. Moreover, global collaboration is encouraged: since the source code is open, engaged developers and users can contribute improvements and new language data to the project.

Among free alternatives, there’s Sayboard, an open source keyboard using Vosk for speech recognition. For pure keyboards, there’s AnySoftKeyboard and FlorisBoard, which are excellent from a privacy perspective but lack some of the advanced features that FUTO offers in one package (especially built-in voice input).

An essential part of the Matrix Phone ecosystem

FUTO Voice Input and Keyboard demonstrate that you can combine the best of both worlds: the convenience of smart text and voice functions, and the security of keeping your data private. For users of Teuton Systems’ Matrix Phone (GrapheneOS phone), these apps come pre-installed as part of the privacy-secure ecosystem. But they’re available to everyone – via Google Play or F-Droid – and constitute a highly recommended switch for anyone who values their privacy in everyday life.

As a tech writer recently put it: you no longer need to choose between functionality and security – with FUTO you get both without compromises.

Swedish regional healthcare app run by chatbot makes serious errors

Published yesterday 10:19
– By Editorial Staff
In one documented case, the app classified an elderly man's symptoms as mild - he died the following day.
2 minute read

An AI-based healthcare app used by the Gävleborg Regional Healthcare Authority in Sweden is now under scrutiny following serious assessment errors. In one notable case, an elderly man’s condition was classified as mild – he died the following day.

Healthcare staff are raising alarms about deficiencies deemed to threaten patient safety, and the app is internally described as a “disaster”.

Min vård Gävleborg (My Healthcare Gävleborg) is used when residents seek digital healthcare or call 1177 (Sweden’s national healthcare advice line). A chatbot asks questions to make an initial medical assessment and then refers the patient to an appropriate level of care. However, according to several doctors in the region, the system is not functioning safely enough.

In one documented case, the app classified an elderly man’s symptoms as mild. He died the following day. An incident report shows that the prioritization was incorrect, although it couldn’t be established that this directly caused the death.

In another case, an inmate at the Gävle Correctional Facility sought care for breathing difficulties – but was referred to a chat with a doctor in Ljusdal, instead of being sent to the emergency room.

– She should obviously have been sent to the emergency room, says Elisabeth Månsson Rydén, a doctor in Ljusdal and board member of the Swedish Association of General Medicine in Gävleborg, speaking to the tax-funded SVT.

“Completely insane”

Criticism from healthcare staff is extensive. Several doctors warn that the app underestimates serious symptoms, which could have life-threatening consequences. Meanwhile, there are examples of the opposite – where patients are given too high priority – which risks unnecessarily burdening healthcare services and causing delays for severely ill patients.

– Doctors have expressed in our meetings that Min vård Gävleborg is a disaster. This is completely insane, says Månsson Rydén.

Despite the death incident, Region Gävleborg has chosen not to report the event to either the Health and Social Care Inspectorate (IVO) or the Swedish Medical Products Agency.

– We looked at the case and decided it didn’t need to be reported, says Chief Medical Officer Agneta Larsson.

Other regions have reacted

The app was developed by Platform24, a Swedish company whose digital systems are used in several regions. In Västra Götaland Region, the app was paused after a report showed that three out of ten patients were assessed incorrectly. In Region Östergötland, similar deficiencies have led to a report to the Swedish Medical Products Agency. An investigation is ongoing.

Despite this, Agneta Larsson defends the version used in Gävleborg:

– We have reviewed our own system, and we cannot see these errors.

Platform24 has declined to be interviewed, but in a written response to Swedish Television, the company’s Medical Director Stina Perdahl defends the app’s basic principles.

“For patient safety reasons, the assessment is deliberately designed to be a bit more cautious initially”, it is claimed.

Your TV is spying on you

Your TV is taking snapshots of everything you watch.

Published 28 June 2025
– By Naomi Brockwell
6 minute read

You sit down to relax, put on your favorite show, and settle in for a night of binge-watching. But while you’re watching your TV… your TV is watching you.

Smart TVs take constant snapshots of everything you watch. Sometimes hundreds of snapshots a second.

Welcome to the future of “entertainment”.

What’s actually happening behind the screens?

Smart TVs are simply what TVs are now: it’s almost impossible to buy a non-smart TV anymore, and they’re basically oversized internet-connected computers. They come preloaded with apps like Amazon Prime Video, YouTube, and Hulu.

They also come preloaded with surveillance.

A recent study from UC Davis researchers tested TVs from Samsung and LG, two of the biggest players in the market, and came across something known as ACR: Automatic Content Recognition.

What is ACR and why should you care?

ACR is a surveillance technology built into the operating systems of smart TVs. This system takes continuous snapshots of whatever is playing to identify exactly what is on the screen.

LG’s privacy policy states they take a snapshot every 10 milliseconds. That’s 100 per second.
Samsung does it every 500 milliseconds.

From these snapshots, the TV generates a content fingerprint and sends it to the manufacturer. That fingerprint is then matched against a massive database to figure out exactly what you’re watching.

Let that sink in. Your television is taking snapshots of everything you’re watching.

And it doesn’t just apply to shows you’re watching on the TV. Even if you plug in your laptop and use the TV as a dumb monitor, it’s still taking snapshots.

  • Zoom calls
  • Emails
  • Banking apps
  • Personal photos

Snapshots of the audio, the video, or sometimes both are being collected for all of it.

Currently, the way ACR works, the snapshots themselves are not necessarily sent off-device, but your TV is still collecting them. And we all know that AI is getting better and better. It’s now possible for AI to identify everything in a video or photo: faces, emotions, background details.

As the technology continues to improve, we should presume that TVs will move from fingerprint-based ACR to automatic AI-driven content recognition.

As Toby Lewis from Darktrace told The Guardian:

“Facial recognition, speech-to-text, content analysis—these can all be used together to build an in-depth picture of an individual user”.

This is where we’re headed.

This data doesn’t exist in a vacuum

TV manufacturers don’t just sit on this data. They monetize it.

Viewing habits are combined with data from your other devices: phones, tablets, smart fridges, wearables. Then it’s sold to third parties. Advertisers. Data brokers. Political campaigns.

One study found that almost every TV they tested contacted Netflix servers, even when no Netflix account was configured.

So who’s getting your data?

We don’t know. That’s the point.

How your data gets weaponized

Let’s say your TV learns that:

  • You watch sports every Sunday
  • You binge true crime on weekdays
  • You play YouTube fashion hauls in the afternoons

These habits are then tied to a profile of your IP address, email, and household.

Now imagine that profile combined with:

  • Your Amazon purchase history
  • Your travel patterns
  • Your social media behavior
  • Your voting record

That’s the real goal: total psychological profiling. Knowing not just what you do, but what you’re likely to do. What you’ll buy, how you’ll vote, who you’ll trust.

In other words, your smart TV isn’t just spying.

It’s helping others manipulate you.

Why didn’t I hear about this when I set up my TV?

Because they don’t want you to know.

When TV manufacturers first started doing this, they never informed users. The practice slipped quietly by.

A 2017 FTC lawsuit revealed that Vizio was collecting viewing data from 11 million TVs and selling it without ever getting user consent.

These days, companies technically include “disclosures” in their Terms of Service. But they’re buried under vague names like:

  • “Viewing Information Services”
  • “Live Plus”
  • “Personalized Experiences”

Have you ever actually read those menus? Didn’t think so.

These aren’t written to inform you. They’re written to shield corporations from lawsuits.

If users actually understood what was happening, many would opt out entirely. But the system is designed to confuse, hiding the truth that surveillance devices have entered our living rooms and bedrooms without us realizing it.

Researchers are being silenced

Not only are these systems intentionally opaque and confusing, companies design them to discourage scrutiny.

And when researchers try to investigate these systems, they hit two major roadblocks:

  1. Technical – Jailbreaking modern Smart TVs is nearly impossible. Their systems are locked down, and the code is proprietary.
  2. Legal – Researchers who attempt to reverse-engineer modern-day tech risk being sued under the Computer Fraud and Abuse Act (CFAA), a vague and outdated law that doesn’t distinguish between malicious actors and researchers trying to inform the public.

As a result, most of what we know about these TVs comes from inference: guessing what’s happening by watching network traffic, since direct access is often blocked.

That means most of this surveillance happens in the dark. Unchallenged, unverified, and largely unnoticed.

We need stronger protections for privacy researchers, clearer disclosures for users, and real pressure on companies to stop hiding behind complexity.

Because if we can’t see what the tech is doing, we can’t choose to opt out.

What you can do

Here are the most effective steps you can take to protect your privacy:

1. Don’t connect your TV to the internet.
If you keep the Wi-Fi off, the TV can’t send data to manufacturers or advertisers. Use a laptop or trusted device for streaming instead. If the TV stays offline forever, the data it collects never leaves the device.

2. Turn off ACR settings.
Dig through the menus and disable everything related to viewing info, advertising, and personalization. Look for settings like “Live Plus” or “Viewing Information Services.” Be thorough. These options are often buried.

3. Use dumb displays.
It’s almost impossible to buy a non-smart TV today. The market is flooded with “smart” everything. But a few dumb projectors still exist, and some computer monitors are a safer choice too, though they aren’t yet available in TV sizes.

4. Be vocal.
Ask hard questions when buying devices. Demand that manufacturers disclose how they use your data. Let them know that privacy matters to you.

5. Push for CFAA reform.
The CFAA is being weaponized to silence researchers who try to expose surveillance. If we want to understand how our tech works, researchers must be protected, not punished. We need to fight back against these chilling effects and support organizations doing this work.

The Ludlow Institute is now funding researchers who reverse-engineer surveillance tech. If you’re a researcher, or want to support one, get in touch.

This is just one piece of the puzzle

Smart TVs are among the most aggressive tracking devices in your home. But they’re not alone. Nearly every “smart” device has the same capabilities to build a profile on you: phones, thermostats, lightbulbs, doorbells, fridges.

This surveillance has been normalized. But it’s not normal.

We shouldn’t have let faceless corporations and governments into our bedrooms and living rooms. But now that they’re here, we have to push back.

That starts with awareness. Then it’s up to us to make better choices and help others do the same.

Let’s take back our homes.
Let’s stop normalizing surveillance.

Because privacy isn’t extreme.
It’s common sense.

 

Yours in Privacy,
Naomi

Naomi Brockwell is a privacy advocate and professional speaker, MC, interviewer, producer, and podcaster, specialising in blockchain, cryptocurrency, and economics. She runs the NBTV channel on YouTube.
