
Google introduces digital fingerprint tracking technology

Published 24 January 2025
– By Editorial Staff
2 minute read

The UK data protection authority ICO has reacted strongly to Google’s revised advertising policy.

Google has updated its advertising policy, allowing advertisers to track web users with so-called digital fingerprints. The new policy will come into effect in February.

Digital fingerprinting uses signals such as IP address, location, language, software and operating system to identify devices and users online. This tracking technology is more comprehensive than cookies, as information can be collected without the user’s knowledge or consent. Unlike cookies, which can be easily deleted, digital fingerprint data is stored remotely and cannot be deleted by the user.
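In practice, such a fingerprint amounts to hashing many weak signals into one stable identifier. The sketch below illustrates the general idea only – it is not Google’s or any specific vendor’s implementation, and the signal values are invented examples:

    # Minimal illustration: combining device/browser signals into a fingerprint.
    # The signals and their values here are hypothetical examples.
    import hashlib
    import json

    def fingerprint(signals: dict) -> str:
        # Serialize the signals deterministically and hash them, so the same
        # device produces the same identifier on every visit - no cookie needed.
        canonical = json.dumps(signals, sort_keys=True)
        return hashlib.sha256(canonical.encode("utf-8")).hexdigest()

    device_signals = {
        "ip_prefix": "203.0.113.0/24",   # documentation-range address, not a real user
        "language": "en-GB",
        "timezone": "Europe/London",
        "os": "Windows 11",
        "screen": "2560x1440",
        "user_agent": "Mozilla/5.0 (example)",
    }

    print(fingerprint(device_signals))

Because the identifier is derived from the device itself rather than stored in the browser, clearing cookies does not remove it – which is what makes the technique harder to opt out of.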

The new policy will take effect on February 16, Ghacks.net reports. Another change is that Google has removed a previous rule that prohibited advertisers from sending permanent identifying information, like a cell phone’s unique device ID. This means that advertisers can now identify users based on their devices and pass the information to Google for tracking.

As recently as 2019, Google judged that this type of tracking “undermines users’ freedom of choice and is wrong”, but has since changed its stance. The company now says that techniques to protect user privacy have improved significantly and that more essential services are now funded by advertising.

“Must be used legally and transparently”

The UK’s Information Commissioner’s Office (ICO) was among the first to react to Google’s change in advertising policy, stressing that companies cannot use digital fingerprinting in any way they see fit.

“Like all advertising technology, it must be lawfully and transparently deployed – and if it is not, the ICO will act”, the authority wrote in its blog.

Some content blockers can provide protection against digital fingerprinting, according to Ghacks.net. Browsers such as Firefox and Brave also offer built-in protection against this type of tracking.


Wifi signals can identify people with 95 percent accuracy

Mass surveillance

Published 21 August 2025, 17:03
– By Editorial Staff
2 minute read

Italian researchers have developed a technique that can track and identify individuals by analyzing how wifi signals reflect off human bodies. The method works even when people change clothes and can be used for surveillance.

Researchers at La Sapienza University in Rome have developed a new method for identifying and tracking people using wifi signals. The technique, which the researchers call “WhoFi”, can recognize people with an accuracy rate of up to 95 percent, reports Sweclockers.

The method is based on the fact that wifi signals reflect and refract in different ways when they hit human bodies. By analyzing these reflection patterns using machine learning and artificial neural networks, researchers can create unique “fingerprints” for each individual.
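The published paper’s pipeline is not reproduced here, but the basic idea of turning reflection measurements into comparable signatures can be sketched as follows. The channel data is synthetic, and a fixed random projection stands in for the researchers’ trained neural network – this illustrates the matching step in general, not the WhoFi method itself:

    # Illustrative re-identification from wifi channel measurements.
    # Synthetic data; a random projection replaces the trained network.
    import numpy as np

    rng = np.random.default_rng(0)
    PROJECTION = rng.normal(size=(64, 16))   # stand-in for a learned embedding

    def embed(csi: np.ndarray) -> np.ndarray:
        """Map a 64-dim channel measurement to a unit-length 16-dim signature."""
        v = csi @ PROJECTION
        return v / np.linalg.norm(v)

    # Enrollment: one raw measurement per known person (synthetic).
    raw = {"person_a": rng.normal(size=64), "person_b": rng.normal(size=64)}
    gallery = {name: embed(x) for name, x in raw.items()}

    # A new observation of person_a, with measurement noise added.
    probe = embed(raw["person_a"] + 0.1 * rng.normal(size=64))

    # Re-identify by picking the enrolled signature most similar to the probe.
    best = max(gallery, key=lambda name: float(probe @ gallery[name]))
    print("matched:", best)   # expected: person_a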

Works despite clothing changes

Experiments show that these digital fingerprints are stable enough to identify people even when they change clothes or carry backpacks. The average recognition rate is 88 percent, which researchers say is comparable to other automatic identification methods.

The research results were published in mid-July and describe how the technology could be used in surveillance contexts. According to the researchers, WhoFi can solve the problem of re-identifying people who were first observed via a surveillance camera in one location and then need to be found in footage from cameras in other locations.

Can be used for surveillance

The technology opens up new possibilities in security surveillance, but simultaneously raises questions about privacy and personal security. The fact that wifi networks, which are ubiquitous in today’s society, can be used to track people without their knowledge represents a new dimension of digital surveillance.

The researchers present their discovery as a breakthrough in the field of automatic person identification, but do not address the ethical implications that the technology may have for individuals’ privacy.

Danish students build drone that flies and swims

Published 18 August 2025
– By Editorial Staff
2 minute read

Four students at Aalborg University in Denmark have developed a drone that transitions seamlessly between air and water. The prototype uses a variable-pitch rotor system that automatically adapts to each environment.

Four students at Aalborg University in Denmark have created something that sounds like science fiction – a drone that can literally fly down into water, swim around and then jump back up into the air to continue flying, reports Tom’s Hardware.

Students Andrei Copaci, Pawel Kowalczyk, Krzysztof Sierocki and Mikolaj Dzwigalo developed the prototype as their thesis project to demonstrate how future amphibious drones could function. The project has attracted attention from technology media after a demonstration video showed the drone flying over a pool, plunging into the water, navigating beneath the surface and then taking off into the air again to continue flying.

Intelligent rotor technology solves the challenge

The secret behind the impressive performance lies in what the team calls a “variable rotor system”. The individual rotor blades can automatically adjust their pitch angle depending on whether the drone is in air or water.

When the drone flies through the air, the rotor blades operate at a steeper pitch for maximum lift. Underwater, the pitch is lowered to reduce drag and improve efficiency. The system can also reverse thrust to increase maneuverability when the drone moves through tight passages underwater.
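The students’ control code has not been published, but the behaviour described – steeper blade pitch in air, shallower pitch and optional reverse thrust in water – can be sketched as a simple mode switch. The angles and the sensor interface below are invented for illustration:

    # Illustrative mode switch for a variable-pitch rotor. Pitch angles and the
    # "submerged" flag are hypothetical, not the students' actual parameters.
    from dataclasses import dataclass

    AIR_PITCH_DEG = 14.0     # steeper pitch for lift in air (example value)
    WATER_PITCH_DEG = 4.0    # shallow pitch to cut drag underwater (example value)

    @dataclass
    class RotorCommand:
        pitch_deg: float
        reverse_thrust: bool

    def rotor_command(submerged: bool, tight_passage: bool = False) -> RotorCommand:
        """Choose blade pitch (and optionally reverse thrust) for the current medium."""
        if not submerged:
            return RotorCommand(pitch_deg=AIR_PITCH_DEG, reverse_thrust=False)
        # Underwater: lower the pitch; reverse thrust helps in tight passages.
        return RotorCommand(pitch_deg=WATER_PITCH_DEG, reverse_thrust=tight_passage)

    print(rotor_command(submerged=True, tight_passage=True))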

Most components in the prototype have been manufactured by the students themselves using 3D printers, since equivalent parts were not available on the market.

Although the project is still in an early concept stage and exists only as a single prototype, it demonstrates the possibilities for future amphibious vehicles. The technology could have applications in everything from rescue operations to environmental monitoring where vehicles need to move both above and below the water surface.

What I learnt at DEFCON

Why hacker culture is essential if we want to win the privacy war.

Published 16 August 2025
– By Naomi Brockwell
6 minute read

DEFCON is the world’s largest hacker conference. Every year, tens of thousands of people gather in Las Vegas to share research, run workshops, compete in capture-the-flag tournaments, and break things for sport. It’s a subculture. A testing ground. A place where some of the best minds in security and privacy come together not just to learn, but to uncover what’s being hidden from the rest of us. It’s where curiosity runs wild.

But to really get DEFCON, you have to understand the people.

What is a hacker?

I love hacker conferences because of the people. Hackers are notoriously seen as dangerous. The stereotype is that they wear black hoodies and Guy Fawkes masks.

But that’s not why they’re dangerous: They’re dangerous because they ask questions and have relentless curiosity.

Hackers have a deep-seated drive to learn how things work, not just at the surface, but down to their core.

They aren’t content with simply using tech. They want to open it up, examine it, and see the hidden gears turning underneath.

A hacker sees a device and doesn’t just ask, “What does it do?”
They ask, “What else could it do?”
“What isn’t it telling me?”
“What’s under the hood, and why does no one want me to look?”

They’re curious enough to pull back curtains others want to remain closed.

They reject blind compliance and test boundaries.
When society says “Do this,” hackers ask “Why?”

They don’t need a rulebook or external approval.
They trust their own instincts and intelligence.
They’re guided by internal principles, not external prescriptions.
They’re not satisfied with the official version. They challenge it.

Because of this, hackers are often at the fringes of society. They’re comfortable with being misunderstood or even vilified. Hackers are unafraid to reveal truths that powerful entities want buried.

But that position outside the mainstream gives them perspective: They see what others miss.

Today, the word “hack” is everywhere:
Hack your productivity.
Hack your workout.
Hack your life.

What it really means is:
Don’t accept the defaults.
Look under the surface.
Find a better way.

That’s what makes hacker culture powerful.
It produces people who will open the box even when they’re told not to.
People who don’t wait for permission to investigate how the tools we use every day are compromising us.

That insistence on curiosity, noncompliance, and pushing past the surface to see what’s buried underneath is exactly what we need in a world built on hidden systems of control.

We should all aspire to be hackers, especially when it comes to confronting power and surveillance.

Everything is computer

Basically every part of our lives runs on computers now.
Your phone. Your car. Your thermostat. Your TV. Your kid’s toys.
And much of this tech has been quietly and invisibly hijacked for surveillance.

Companies and governments both want your data. And neither wants you asking how these data collection systems work.

We’re inside a deeply connected world, built on an opaque infrastructure that is extracting behavioral data at scale.

You have a right to know what’s happening inside the tech you use every day.
Peeking behind the curtain is not a crime. It’s a public service.

In today’s world, the hacker mindset is not just useful. It’s necessary.

Hacker culture in a surveillance world

People who ask questions are a nightmare for those who want to keep you in the dark.
They know how to dig.
They don’t take surveillance claims at face value.
They know how to verify what data is actually being collected.
They don’t trust boilerplate privacy policies or vague legalese.
They reverse-engineer SDKs.
They monitor network traffic.
They intercept outgoing requests and inspect payloads.

And they don’t ask for permission.

That’s what makes hacker culture so important. If we want any hope of reclaiming privacy, we need people with the skills and the willingness to pull apart the systems we’re told not to question.

On top of that, governments and corporations both routinely use outdated and overbroad legislation like the Computer Fraud and Abuse Act (CFAA) to prosecute public-interest researchers who investigate tech. Not because those researchers cause harm, but because they reveal things that others want kept hidden.

Laws like this pressure people towards compliance, and make them afraid to ask questions. The result is that curiosity feels like a liability, and it becomes harder for the average person to understand how the digital systems around us actually work.

That’s why the hacker mindset matters so much: no matter how hard the system pushes back, hackers keep asking questions.

The researchers I met at DEFCON

This year at DEFCON, I met researchers who are doing exactly that.

People uncovering surveillance code embedded in children’s toys.
People doing analysis on facial recognition SDKs.
People testing whether your photo is really deleted after “verification”.
People capturing packets and discovering that the “local only” systems you’re using aren’t local at all, and are sending your data to third parties.
People analyzing “ephemeral” IDs, and finding that your data was being stored and linked back to real identities.

You’ll be hearing from some of them on our channel in the coming months.
Their work is extraordinary, and it is helping all of us move towards a world of informed consent instead of blind compliance. Without this kind of research, the average person has no way to know what’s happening behind the scenes. We can’t make good decisions about the tech we use if we don’t know what it’s doing.

Make privacy cool again

Making privacy appealing is not just about education.
It’s about making it cool.

Hacker culture has always been at the forefront of turning fringe ideas into mainstream trends. Films like Hackers and The Matrix made hackers a status symbol. Movements like the Crypto Wars (when the US government fought Phil Zimmermann over PGP) and the Clipper Chip fight (when it tried to standardize surveillance backdoors across hardware) made cypherpunks and privacy activists aspirational.

Hackers take the things mainstream culture mocks or fears, and make them edgy and cool.

That’s what we need here. We need a cultural transformation, and we need to push back against the shaming language that demands we justify our desire for privacy.

You shouldn’t have to explain why you don’t want to be watched.
You shouldn’t have to defend your decision to protect your communications.

Make privacy a badge of honor.
Make privacy tools a status symbol.
Make the act of encrypting, self-hosting, and masking your identity a signal that says you’re independent, intelligent, and not easily manipulated.

Show that the people who care about privacy are the same people who invent the future.

Most people don’t like being trailblazers, because it’s scary. But if you’re reading this, you’re one of the early adopters, which means you’re already one of the fearless ones.

When you take a stand visibly, you create a quorum and make it safer for others to join in. That’s how movements grow, and we go from being weirdos in the corner to becoming the majority.

If privacy is stigmatized, reclaiming it will take bold, fearless, visible action.
The hacker community is perfectly positioned to lead that charge, and to make it safe for the rest of the world to follow.

When you show up and say, “I care about this,” you give others permission to care too.

Privacy may be on the fringe right now, but that’s where all great movements begin.

Final Thoughts

What I learnt at DEFCON is that curiosity is powerful.
Refusal to comply is powerful.
The simple act of asking questions can be revolutionary.

There are systems all around us extracting data and consolidating control, and most people don’t know how to fight that, and are too scared to try.

Hacker culture is the secret sauce.

Let’s apply this drive to the systems of surveillance.
Let’s investigate the tools we’ve been told to trust.
Let’s explain what’s actually happening.
Let’s give people the knowledge they need to make better choices.

Let’s build a world where curiosity isn’t criminalized but celebrated.

DEFCON reminded me that we don’t need to wait for permission to start doing that.

We can just do things.

So let’s start now.

 

Yours in privacy,
Naomi

Naomi Brockwell is a privacy advocate and professional speaker, MC, interviewer, producer, and podcaster specialising in blockchain, cryptocurrency, and economics. She runs the NBTV channel on Rumble.

Facebook’s insidious surveillance: VPN app spied on users

Mass surveillance

Published 9 August 2025
– By Editorial Staff
2 minute read

In 2013, Facebook acquired the Israeli company Onavo for approximately 120 million dollars. Onavo was marketed as a VPN app that would protect users’ data, reduce mobile data usage, and secure online activity. Over 33 million people downloaded the app believing it would strengthen their privacy.

In reality, Onavo gave Facebook complete insight into users’ phones – including which apps were used, how long they were open, and which websites were visited.

According to court documents and regulatory authorities, Facebook used this data to identify trends and map potential competitors. By analyzing user patterns in apps like Houseparty, YouTube, Amazon, and Snapchat, the company could determine which platforms posed a threat to its market dominance.

When Snapchat’s popularity began to explode in 2016, Facebook encountered a problem: encrypted traffic prevented insight into users’ behavior, reports Business Today. To circumvent this, Facebook launched an internal operation called “Project Ghostbusters”.

Facebook engineers developed specially adapted code based on Onavo’s infrastructure. The app installed a so-called root certificate on users’ phones – consent was hidden in legal documentation – which enabled Facebook to create fake certificates that mimicked Snapchat’s servers.

This made it possible to decrypt and analyze Snapchat’s traffic internally. The purpose was to use the information as a basis for strategic decisions, product development, or potential acquisitions.
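The mechanism described – a trusted root certificate plus forged server certificates that let an intermediary read otherwise-encrypted traffic – is the same one that open-source tools such as mitmproxy use for legitimate research on one’s own devices. A minimal addon script (hypothetical filename) that logs decrypted outgoing requests might look like this, assuming the test device has been set to trust mitmproxy’s root certificate:

    # log_requests.py - run with: mitmdump -s log_requests.py
    # Once a device trusts the proxy's root certificate, the proxy can present
    # forged certificates for the sites the device visits and inspect the
    # decrypted requests - the same general technique the article describes.
    from mitmproxy import http

    def request(flow: http.HTTPFlow) -> None:
        print(flow.request.method, flow.request.pretty_url)
        if flow.request.content:
            # First part of the (now decrypted) request body.
            print(flow.request.content[:200])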

Snapchat said no – Facebook copied instead

Based on data from Onavo, Facebook offered to buy Snapchat for 3 billion dollars. When Snapchat CEO Evan Spiegel declined, Facebook responded by launching Instagram Stories – a direct copy of Snapchat’s most popular feature. This became a decisive move in the competition between the two platforms.

In 2018, Apple removed Onavo from the App Store on the grounds that the app violated the company’s data protection rules. Facebook responded by launching a new app: Facebook Research, internally called Project Atlas, which offered similar surveillance functions. This time, the company paid users – some as young as 13 – up to 20 dollars per month to install the app.

When Apple discovered this, it responded forcefully and revoked Facebook’s enterprise development certificates. As a result, all of Facebook’s internal iOS apps temporarily stopped working – one of Apple’s most far-reaching measures ever.

In 2020, the Australian Competition and Consumer Commission (ACCC) sued Facebook, now called Meta, for misleading users with false promises about privacy. In 2023, Meta’s subsidiaries were fined a total of 20 million Australian dollars (approximately €11 million) for misleading behavior.

Why it still matters

Business Insider emphasizes that the Onavo story is not just about a misleading app. It also illustrates how one of the world’s most powerful tech companies built a surveillance system disguised as a privacy tool.

The fact that Facebook used the data to map competitors, copy features, and maintain control over the social media market – and also targeted underage users for data collection – raises additional ethical questions.

“Even a decade later, Onavo remains a case study in how ‘data is power’ and how far companies are willing to go to get it”, the publication concludes.
