Polaris of Enlightenment

Your TV is spying on you

Your TV is taking snapshots of everything you watch.

Published 28 June 2025
– By Naomi Brockwell
6 minute read

You sit down to relax, put on your favorite show, and settle in for a night of binge-watching. But while you’re watching your TV… your TV is watching you.

Smart TVs take constant snapshots of everything you watch. Sometimes as many as 100 snapshots a second.

Welcome to the future of “entertainment”.

What’s actually happening behind the screens?

Smart TVs are just modern TVs. It’s almost impossible to buy a non-smart TV anymore. And they’re basically just oversized internet-connected computers. They come preloaded with apps like Amazon Prime Video, YouTube, and Hulu.

They also come preloaded with surveillance.

A recent study from UC Davis researchers tested TVs from Samsung and LG, two of the biggest players in the market, and came across something known as ACR: Automatic Content Recognition.

What is ACR and why should you care?

ACR is a surveillance technology built into the operating systems of smart TVs. This system takes continuous snapshots of whatever is playing to identify exactly what is on the screen.

LG’s privacy policy states they take a snapshot every 10 milliseconds. That’s 100 per second.
Samsung does it every 500 milliseconds.

From these snapshots, the TV generates a content fingerprint and sends it to the manufacturer. That fingerprint is then matched against a massive database to figure out exactly what you’re watching.
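To make the idea concrete, here is a toy sketch of how snapshot fingerprinting can work in principle: hash each frame into a compact fingerprint, then look it up in a database by similarity. The hashing scheme and the "database" below are made-up placeholders, not any manufacturer's actual pipeline.

```python
# Illustrative sketch of frame fingerprinting, in the spirit of ACR.
# The hash scheme and content database are hypothetical stand-ins.

def dhash(frame):
    """Difference hash: 1 bit per horizontally adjacent pixel pair."""
    bits = []
    for row in frame:
        for left, right in zip(row, row[1:]):
            bits.append(1 if left > right else 0)
    return tuple(bits)

def hamming(a, b):
    """Number of positions where two fingerprints differ."""
    return sum(x != y for x, y in zip(a, b))

def identify(frame, database, max_distance=2):
    """Match a frame's fingerprint against known content fingerprints."""
    fp = dhash(frame)
    best = min(database, key=lambda title: hamming(fp, database[title]))
    return best if hamming(fp, database[best]) <= max_distance else None

# Toy 3x4 grayscale frames standing in for screen captures.
frame_a = [[10, 20, 30, 25], [5, 50, 40, 45], [90, 80, 70, 75]]
frame_b = [[12, 21, 31, 24], [6, 52, 41, 44], [91, 82, 71, 74]]  # same scene, slight noise

database = {"Show X, Ep 1": dhash(frame_a)}
print(identify(frame_b, database))  # a near-identical frame still matches
```

The point of the difference-hash trick is that small changes in brightness or compression barely change the fingerprint, so the same scene matches even when the pixels aren't identical.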

Let that sink in. Your television is taking snapshots of everything you’re watching.

And it doesn’t just apply to shows you’re watching on the TV. Even if you plug in your laptop and use the TV as a dumb monitor, it’s still taking snapshots.

  • Zoom calls
  • Emails
  • Banking apps
  • Personal photos

Audio snapshots, video snapshots, or sometimes both, are collected of all of it.

Currently, the snapshots themselves are not necessarily sent off-device, but your TV is still collecting them. And we all know that AI is getting better and better. It’s now possible for AI to identify everything in a video or photo: faces, emotions, background details.

As the technology continues to improve, we should presume that TVs will move from fingerprint-based ACR to automatic AI-driven content recognition.

As Toby Lewis from Darktrace told The Guardian:

“Facial recognition, speech-to-text, content analysis—these can all be used together to build an in-depth picture of an individual user”.

This is where we’re headed.

This data doesn’t exist in a vacuum

TV manufacturers don’t just sit on this data. They monetize it.

Viewing habits are combined with data from your other devices: phones, tablets, smart fridges, wearables. Then it’s sold to third parties. Advertisers. Data brokers. Political campaigns.

One study found that almost every TV it tested contacted Netflix servers, even when no Netflix account was configured.

So who’s getting your data?

We don’t know. That’s the point.

How your data gets weaponized

Let’s say your TV learns that:

  • You watch sports every Sunday
  • You binge true crime on weekdays
  • You play YouTube fashion hauls in the afternoons

These habits are then tied to a profile of your IP address, email, and household.

Now imagine that profile combined with:

  • Your Amazon purchase history
  • Your travel patterns
  • Your social media behavior
  • Your voting record

That’s the real goal: total psychological profiling. Knowing not just what you do, but what you’re likely to do. What you’ll buy, how you’ll vote, who you’ll trust.

In other words, your smart TV isn’t just spying.

It’s helping others manipulate you.

Why didn’t I hear about this when I set up my TV?

Because they don’t want you to know.

When TV manufacturers first started doing this, they never informed users. The practice slipped quietly by.

A 2017 FTC lawsuit revealed that Vizio was collecting viewing data from 11 million TVs and selling it without ever getting user consent.

These days, companies technically include “disclosures” in their Terms of Service. But they’re buried under vague names like:

  • “Viewing Information Services”
  • “Live Plus”
  • “Personalized Experiences”

Have you ever actually read those menus? Didn’t think so.

These aren’t written to inform you. They’re written to shield corporations from lawsuits.

If users actually understood what was happening, many would opt out entirely. But the system is designed to confuse, hiding the truth that surveillance devices have entered our living rooms and bedrooms without our realizing it.

Researchers are being silenced

Not only are these systems intentionally opaque and confusing; companies also design them to discourage scrutiny.

And when researchers try to investigate these systems, they hit two major roadblocks:

  1. Technical – Jailbreaking modern smart TVs is nearly impossible. Their systems are locked down, and the code is proprietary.
  2. Legal – Researchers who attempt to reverse-engineer modern-day tech risk being sued under the Computer Fraud and Abuse Act (CFAA), a vague and outdated law that doesn’t distinguish between malicious actors and researchers trying to inform the public.

As a result, most of what we know about these TVs comes from inference: guessing what’s happening by watching network traffic, since direct access is often blocked.
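That kind of inference can be surprisingly simple. Here is a sketch of the basic idea: take the hostnames a TV looks up (for example, from a router's DNS log) and flag the ones that match known telemetry endpoints. The tracker list and log entries below are hypothetical examples, not real domains.

```python
# Sketch of traffic-based inference: given hostnames observed in a TV's
# DNS queries, flag likely telemetry endpoints. The tracker suffixes and
# the observed log below are made-up placeholders.

TRACKER_SUFFIXES = (
    "ads.example-tv.com",
    "telemetry.example-tv.com",
    "metrics.example-acr.net",
)

def flag_telemetry(dns_log):
    """Return hostnames that are, or fall under, a known tracker domain."""
    return [host for host in dns_log
            if any(host == s or host.endswith("." + s) for s in TRACKER_SUFFIXES)]

observed = [
    "app.netflix.com",
    "us.ads.example-tv.com",
    "time.google.com",
    "metrics.example-acr.net",
]
print(flag_telemetry(observed))
```

Real research tools do much more (decrypting TLS where possible, inspecting payloads), but even this crude domain matching reveals who a device is talking to, which is exactly why traffic analysis is the fallback when the device itself is locked down.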

That means most of this surveillance happens in the dark. Unchallenged, unverified, and largely unnoticed.

We need stronger protections for privacy researchers, clearer disclosures for users, and real pressure on companies to stop hiding behind complexity.

Because if we can’t see what the tech is doing, we can’t choose to opt out.

What you can do

Here are the most effective steps you can take to protect your privacy:

1. Don’t connect your TV to the internet.
If you keep the Wi-Fi off, the TV can’t send data to manufacturers or advertisers. Use a laptop or trusted device for streaming instead. If the TV stays offline forever, the data it collects never leaves the device.

2. Turn off ACR settings.
Dig through the menus and disable everything related to viewing info, advertising, and personalization. Look for settings like “Live Plus” or “Viewing Information Services.” Be thorough. These options are often buried.

3. Use dumb displays.
It’s almost impossible to buy a non-smart TV today. The market is flooded with “smart” everything. But a few dumb projectors still exist. And some monitors are safer too, though they don’t come in TV sizes yet.

4. Be vocal.
Ask hard questions when buying devices. Demand that manufacturers disclose how they use your data. Let them know that privacy matters to you.

5. Push for CFAA reform.
The CFAA is being weaponized to silence researchers who try to expose surveillance. If we want to understand how our tech works, researchers must be protected, not punished. We need to fight back against these chilling effects and support organizations doing this work.

The Ludlow Institute is now funding researchers who reverse-engineer surveillance tech. If you’re a researcher, or want to support one, get in touch.

This is just one piece of the puzzle

Smart TVs are among the most aggressive tracking devices in your home. But they’re not alone. Nearly every “smart” device has the same capabilities to build a profile on you: phones, thermostats, lightbulbs, doorbells, fridges.

This surveillance has been normalized. But it’s not normal.

We shouldn’t have let faceless corporations and governments into our bedrooms and living rooms. But now that they’re here, we have to push back.

That starts with awareness. Then it’s up to us to make better choices and help others do the same.

Let’s take back our homes.
Let’s stop normalizing surveillance.

Because privacy isn’t extreme.
It’s common sense.

 

Yours in Privacy,
Naomi

Naomi Brockwell is a privacy advocate and professional speaker, MC, interviewer, producer, and podcaster specialising in blockchain, cryptocurrency, and economics. She runs the NBTV channel on Rumble.

TNT is truly independent!

We don’t have a billionaire owner, and our unique reader-funded model keeps us free from political or corporate influence. This means we can fearlessly report the facts and shine a light on the misdeeds of those in power.

Consider a donation to keep our independent journalism running…

IP addresses are used in Sweden to track unemployed people

Published 1 September 2025
– By Editorial Staff
2 minute read

The Swedish Public Employment Service (Arbetsförmedlingen) has begun tracking the IP addresses of unemployed individuals to verify that they are actually located in Sweden. Approximately 4,000 people who logged in from foreign IP numbers now risk losing their benefits.

To be eligible for unemployment insurance (A-kassa) and other forms of compensation linked to being unemployed, certain requirements must be met. One of these requirements is that individuals must be located in Sweden, in order to be available in case a job opportunity arises.

When job seekers log into the Swedish Public Employment Service’s website, their IP address is now checked. If a person logs in from a foreign IP number, this suggests that they are located in another country.
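The check itself amounts to testing whether a login's source address falls inside IP ranges allocated to networks in Sweden. Here is a minimal sketch of that logic; the CIDR blocks are documentation-reserved placeholders, not real Swedish allocations.

```python
# Sketch of an IP-based location check of the kind described above.
# The address ranges here are made-up placeholders (RFC 5737 test nets),
# not actual Swedish allocations.
import ipaddress

SWEDISH_RANGES = [ipaddress.ip_network(c)
                  for c in ("192.0.2.0/24", "198.51.100.0/24")]

def looks_domestic(ip_str):
    """True if the login IP falls inside a listed Swedish range."""
    ip = ipaddress.ip_address(ip_str)
    return any(ip in net for net in SWEDISH_RANGES)

print(looks_domestic("192.0.2.45"))   # inside a listed range
print(looks_domestic("203.0.113.9"))  # outside all ranges: would be flagged
```

Note how blunt this instrument is: a VPN or a foreign mobile carrier can make a person in Sweden look foreign, and vice versa, which is presumably why the agency also checks for VPN use.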

The Swedish Public Employment Service has been tracking job seekers since the end of June, and the agency has already identified approximately 4,000 people who appear to have logged in from a country other than Sweden.

“It’s a way to counteract the risk of incorrect payments. We’re talking about people who are abroad even though they should be in Sweden looking for work or participating in labor market policy programs”, says Andreas Malmgren, operations controller at the Swedish Public Employment Service, to the Bonnier publication DN.

None of these individuals have been contacted yet, but the agency plans to make contact during September. These people risk having their benefits withdrawn.

The agency has also set up a special tool to check whether job seekers are using VPN services, so that no one ends up among those flagged by mistake.

Wifi signals can identify people with 95 percent accuracy


Published 21 August 2025
– By Editorial Staff
2 minute read

Italian researchers have developed a technique that can track and identify individuals by analyzing how wifi signals reflect off human bodies. The method works even when people change clothes and can be used for surveillance.

Researchers at La Sapienza University in Rome have developed a new method for identifying and tracking people using wifi signals. The technique, which the researchers call “WhoFi”, can recognize people with an accuracy rate of up to 95 percent, reports Sweclockers.

The method is based on the fact that wifi signals reflect and refract in different ways when they hit human bodies. By analyzing these reflection patterns using machine learning and artificial neural networks, researchers can create unique “fingerprints” for each individual.
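A toy illustration of the matching step: compare a new signal-feature vector against stored per-person profiles by cosine similarity, and report a match only above a threshold. Real systems like the one described learn embeddings from wifi channel data with neural networks; the tiny hand-made vectors below are placeholders for those learned fingerprints.

```python
# Toy fingerprint matching in the spirit of "WhoFi": nearest stored
# profile by cosine similarity, with a confidence threshold.
# Profiles and the sample vector are made-up placeholders.
import math

def cosine(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(y * y for y in b))
    return dot / (na * nb)

profiles = {
    "person_A": [0.9, 0.1, 0.4],
    "person_B": [0.2, 0.8, 0.5],
}

def identify(sample, threshold=0.95):
    """Return the best-matching profile name, or None below threshold."""
    name, score = max(((n, cosine(sample, v)) for n, v in profiles.items()),
                      key=lambda t: t[1])
    return name if score >= threshold else None

print(identify([0.88, 0.12, 0.41]))  # very close to person_A's profile
```

The unsettling part is not the arithmetic, which is trivial, but that the input features come from how your body reflects radio signals, something you cannot take off like a jacket.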

Works despite clothing changes

Experiments show that these digital fingerprints are stable enough to identify people even when they change clothes or carry backpacks. The average recognition rate is 88 percent, which researchers say is comparable to other automatic identification methods.

The research results were published in mid-July and describe how the technology could be used in surveillance contexts. According to the researchers, WhoFi can solve the problem of re-identifying people who were first observed via a surveillance camera in one location and then need to be found in footage from cameras in other locations.

Can be used for surveillance

The technology opens up new possibilities in security surveillance, but simultaneously raises questions about privacy and personal security. The fact that wifi networks, which are ubiquitous in today’s society, can be used to track people without their knowledge represents a new dimension of digital surveillance.

The researchers present their discovery as a breakthrough in the field of automatic person identification, but do not address the ethical implications that the technology may have for individuals’ privacy.

Danish students build drone that flies and swims

Published 18 August 2025
– By Editorial Staff
2 minute read

Four students at Aalborg University in Denmark have developed a revolutionary drone that seamlessly transitions between air and water. The prototype uses innovative rotor technology that automatically adapts to different environments.

Four students at Aalborg University in Denmark have created something that sounds like science fiction – a drone that can literally fly down into water, swim around and then jump back up into the air to continue flying, reports Tom’s Hardware.

Students Andrei Copaci, Pawel Kowalczyk, Krzysztof Sierocki and Mikolaj Dzwigalo have developed a prototype as their thesis project that demonstrates how future amphibious drones could function. The project has attracted attention from technology media after a demonstration video showed the drone flying over a pool, crashing down into the water, navigating underwater and then taking off into the air again.

Intelligent rotor technology solves the challenge

The secret behind the impressive performance lies in what the team calls a “variable rotor system”. The individual rotor blades can automatically adjust their pitch angle depending on whether the drone is in air or water.

When the drone flies through the air, the rotor blades work at a higher angle for optimal lift capacity. Underwater, the blade pitch is lowered to reduce resistance and improve efficiency during navigation. The system can also reverse thrust to increase maneuverability when the drone moves through tight passages underwater.
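The control logic described above can be sketched in a few lines: pick a blade pitch for the current medium, and allow reversed thrust only underwater. The angles and the water-detection input are made-up placeholders, not the students' actual parameters.

```python
# Hypothetical sketch of the variable-pitch rotor logic described above.
# Pitch angles and inputs are illustrative placeholders.

AIR_PITCH_DEG = 18.0    # steeper pitch for lift in air
WATER_PITCH_DEG = 6.0   # shallow pitch to cut drag underwater

def rotor_command(in_water, reverse=False):
    """Choose blade pitch for the medium; reverse thrust only in water."""
    pitch = WATER_PITCH_DEG if in_water else AIR_PITCH_DEG
    direction = -1 if (in_water and reverse) else 1
    return {"pitch_deg": pitch, "direction": direction}

print(rotor_command(in_water=False))
print(rotor_command(in_water=True, reverse=True))
```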

Most components in the prototype have been manufactured by the students themselves using 3D printers, since equivalent parts were not available on the market.

Although the project is still in an early concept stage and exists only as a single prototype, it demonstrates the possibilities for future amphibious vehicles. The technology could have applications in everything from rescue operations to environmental monitoring where vehicles need to move both above and below the water surface.

What I learnt at DEFCON

Why hacker culture is essential if we want to win the privacy war.

Published 16 August 2025
– By Naomi Brockwell
6 minute read

DEFCON is the world’s largest hacker conference. Every year, tens of thousands of people gather in Las Vegas to share research, run workshops, compete in capture-the-flag tournaments, and break things for sport. It’s a subculture. A testing ground. A place where some of the best minds in security and privacy come together not just to learn, but to uncover what’s being hidden from the rest of us. It’s where curiosity runs wild.

But to really get DEFCON, you have to understand the people.

What is a hacker?

I love hacker conferences because of the people. Hackers are notoriously seen as dangerous. The stereotype is that they wear black hoodies and Guy Fawkes masks.

But that’s not why they’re dangerous: They’re dangerous because they ask questions and have relentless curiosity.

Hackers have a deep-seated drive to learn how things work, not just at the surface, but down to their core.

They aren’t content with simply using tech. They want to open it up, examine it, and see the hidden gears turning underneath.

A hacker sees a device and doesn’t just ask, “What does it do?”
They ask, “What else could it do?”
“What isn’t it telling me?”
“What’s under the hood, and why does no one want me to look?”

They’re curious enough to pull back curtains others want to remain closed.

They reject blind compliance and test boundaries.
When society says “Do this,” hackers ask “Why?”

They don’t need a rulebook or external approval.
They trust their own instincts and intelligence.
They’re guided by internal principles, not external prescriptions.
They’re not satisfied with the official version. They challenge it.

Because of this, hackers are often at the fringes of society. They’re comfortable with being misunderstood or even vilified. Hackers are unafraid to reveal truths that powerful entities want buried.

But that position outside the mainstream gives them perspective: They see what others miss.

Today, the word “hack” is everywhere:
Hack your productivity.
Hack your workout.
Hack your life.

What it really means is:
Don’t accept the defaults.
Look under the surface.
Find a better way.

That’s what makes hacker culture powerful.
It produces people who will open the box even when they’re told not to.
People who don’t wait for permission to investigate how the tools we use every day are compromising us.

That insistence on curiosity, noncompliance, and pushing past the surface to see what’s buried underneath is exactly what we need in a world built on hidden systems of control.

We should all aspire to be hackers, especially when it comes to confronting power and surveillance.

Everything is computer

Basically every part of our lives runs on computers now.
Your phone. Your car. Your thermostat. Your TV. Your kid’s toys.
And much of this tech has been quietly and invisibly hijacked for surveillance.

Companies and governments both want your data. And neither wants you asking how these data collection systems work.

We’re inside a deeply connected world, built on an opaque infrastructure that is extracting behavioral data at scale.

You have a right to know what’s happening inside the tech you use every day.
Peeking behind the curtain is not a crime. It’s a public service.

In today’s world, the hacker mindset is not just useful. It’s necessary.

Hacker culture in a surveillance world

People who ask questions are a nightmare for those who want to keep you in the dark.
They know how to dig.
They don’t take surveillance claims at face value.
They know how to verify what data is actually being collected.
They don’t trust boilerplate privacy policies or vague legalese.
They reverse-engineer SDKs.
They monitor network traffic.
They intercept outgoing requests and inspect payloads.

And they don’t ask for permission.

That’s what makes hacker culture so important. If we want any hope of reclaiming privacy, we need people with the skills and the willingness to pull apart the systems we’re told not to question.

On top of that, governments and corporations both routinely use outdated and overbroad legislation like the Computer Fraud and Abuse Act (CFAA) to prosecute public-interest researchers who investigate tech. Not because those researchers cause harm, but because they reveal things that others want kept hidden.

Laws like this pressure people towards compliance, and make them afraid to ask questions. The result is that curiosity feels like a liability, and it becomes harder for the average person to understand how the digital systems around us actually work.

That’s why the hacker mindset matters so much: Because no matter how hard the system pushes back, they keep asking questions.

The researchers I met at DEFCON

This year at DEFCON, I met researchers who are doing exactly that.

People uncovering surveillance code embedded in children’s toys.
People doing analysis on facial recognition SDKs.
People testing whether your photo is really deleted after “verification”.
People capturing packets and discovering that the “local only” systems you’re using aren’t local at all, and are sending your data to third parties.
People analyzing “ephemeral” IDs, and finding that your data was being stored and linked back to real identities.

You’ll be hearing from some of them on our channel in the coming months.
Their work is extraordinary, and it’s helping all of us move towards a world of informed consent instead of blind compliance. Without this kind of research, the average person has no way to know what’s happening behind the scenes. We can’t make good decisions about the tech we use if we don’t know what it’s doing.

Make privacy cool again

Making privacy appealing is not just about education.
It’s about making it cool.

Hacker culture has always been at the forefront of turning fringe ideas into mainstream trends. Films like Hackers and The Matrix made hackers a status symbol. Movements like the Crypto Wars (when the government fought Phil Zimmermann over PGP) and the Clipper Chip fights (when they tried to standardize surveillance backdoors across hardware) made cypherpunks and privacy activists aspirational.

Hackers take the things mainstream culture mocks or fears, and make them edgy and cool.

That’s what we need here. We need a cultural transformation and to push back against the shameful language that demands we justify our desire for privacy.

You shouldn’t have to explain why you don’t want to be watched.
You shouldn’t have to defend your decision to protect your communications.

Make privacy a badge of honor.
Make privacy tools a status symbol.
Make the act of encrypting, self-hosting, and masking your identity a signal that says you’re independent, intelligent, and not easily manipulated.

Show that the people who care about privacy are the same people who invent the future.

Most people don’t like being trailblazers, because it’s scary. But if you’re reading this, you’re one of the early adopters, which means you’re already one of the fearless ones.

When you take a stand visibly, you create a quorum and make it safer for others to join in. That’s how movements grow, and we go from being weirdos in the corner to becoming the majority.

If privacy is stigmatized, reclaiming it will take bold, fearless, visible action.
The hacker community is perfectly positioned to lead that charge, and to make it safe for the rest of the world to follow.

When you show up and say, “I care about this,” you give others permission to care too.

Privacy may be on the fringe right now, but that’s where all great movements begin.

Final Thoughts

What I learnt at DEFCON is that curiosity is powerful.
Refusal to comply is powerful.
The simple act of asking questions can be revolutionary.

There are systems all around us extracting data and consolidating control, and most people don’t know how to fight that, and are too scared to try.

Hacker culture is the secret sauce.

Let’s apply this drive to the systems of surveillance.
Let’s investigate the tools we’ve been told to trust.
Let’s explain what’s actually happening.
Let’s give people the knowledge they need to make better choices.

Let’s build a world where curiosity isn’t criminalized but celebrated.

DEFCON reminded me that we don’t need to wait for permission to start doing that.

We can just do things.

So let’s start now.

 

Yours in privacy,
Naomi

Naomi Brockwell is a privacy advocate and professional speaker, MC, interviewer, producer, and podcaster specialising in blockchain, cryptocurrency, and economics. She runs the NBTV channel on Rumble.

Our independent journalism needs your support!
Consider a donation.

You can donate any amount of your choosing, one-time payment or even monthly.
We appreciate all of your donations to keep us alive and running.

Don’t miss another article!

Sign up for our newsletter today!

Get uncensored news – free from industry interests and political correctness – from the Polaris of Enlightenment, every week.