What I learnt at DEFCON

Why hacker culture is essential if we want to win the privacy war.

Published 16 August 2025
– By Naomi Brockwell
6 minute read

DEFCON is the world’s largest hacker conference. Every year, tens of thousands of people gather in Las Vegas to share research, run workshops, compete in capture-the-flag tournaments, and break things for sport. It’s a subculture. A testing ground. A place where some of the best minds in security and privacy come together not just to learn, but to uncover what’s being hidden from the rest of us. It’s where curiosity runs wild.

But to really get DEFCON, you have to understand the people.

What is a hacker?

I love hacker conferences because of the people. Hackers have a reputation for being dangerous. The stereotype is the black hoodie and the Guy Fawkes mask.

But that’s not what makes them dangerous: they’re dangerous because they ask questions and have relentless curiosity.

Hackers have a deep-seated drive to learn how things work, not just at the surface, but down to their core.

They aren’t content with simply using tech. They want to open it up, examine it, and see the hidden gears turning underneath.

A hacker sees a device and doesn’t just ask, “What does it do?”
They ask, “What else could it do?”
“What isn’t it telling me?”
“What’s under the hood, and why does no one want me to look?”

They’re curious enough to pull back curtains others want to remain closed.

They reject blind compliance and test boundaries.
When society says “Do this,” hackers ask “Why?”

They don’t need a rulebook or external approval.
They trust their own instincts and intelligence.
They’re guided by internal principles, not external prescriptions.
They’re not satisfied with the official version. They challenge it.

Because of this, hackers are often at the fringes of society. They’re comfortable with being misunderstood or even vilified. Hackers are unafraid to reveal truths that powerful entities want buried.

But that position outside the mainstream gives them perspective: They see what others miss.

Today, the word “hack” is everywhere:
Hack your productivity.
Hack your workout.
Hack your life.

What it really means is:
Don’t accept the defaults.
Look under the surface.
Find a better way.

That’s what makes hacker culture powerful.
It produces people who will open the box even when they’re told not to.
People who don’t wait for permission to investigate how the tools we use every day are compromising us.

That insistence on curiosity, noncompliance, and pushing past the surface to see what’s buried underneath is exactly what we need in a world built on hidden systems of control.

We should all aspire to be hackers, especially when it comes to confronting power and surveillance.

Everything is computer

Basically every part of our lives runs on computers now.
Your phone. Your car. Your thermostat. Your TV. Your kid’s toys.
And much of this tech has been quietly and invisibly hijacked for surveillance.

Companies and governments both want your data. And neither wants you asking how these data collection systems work.

We’re inside a deeply connected world, built on an opaque infrastructure that is extracting behavioral data at scale.

You have a right to know what’s happening inside the tech you use every day.
Peeking behind the curtain is not a crime. It’s a public service.

In today’s world, the hacker mindset is not just useful. It’s necessary.

Hacker culture in a surveillance world

People who ask questions are a nightmare for those who want to keep you in the dark.
They know how to dig.
They don’t take surveillance claims at face value.
They know how to verify what data is actually being collected.
They don’t trust boilerplate privacy policies or vague legalese.
They reverse-engineer SDKs.
They monitor network traffic.
They intercept outgoing requests and inspect payloads.

And they don’t ask for permission.
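To make that concrete, here is a minimal sketch of the kind of traffic inspection described above, written as a mitmproxy addon. The tool and its scripting API are real; the suspect hostnames are made-up placeholders, and a real investigation would adapt the filter to whatever device or app is under test.

"""Minimal mitmproxy addon: log outgoing requests and flag suspected trackers.

Run with:  mitmdump -s inspect_traffic.py
(Route the device or app under test through the proxy first.)
The SUSPECT_HOSTS below are made-up examples, not a real blocklist.
"""
from mitmproxy import http

# Hypothetical third-party endpoints we want to keep an eye on.
SUSPECT_HOSTS = {"analytics.example.com", "telemetry.example.net"}


def request(flow: http.HTTPFlow) -> None:
    host = flow.request.pretty_host
    # Print every outgoing request so nothing leaves the device unnoticed.
    print(f"[{flow.request.method}] {flow.request.pretty_url}")

    if host in SUSPECT_HOSTS:
        # Dump the payload so we can see exactly what is being sent where.
        body = flow.request.get_text(strict=False) or "<empty body>"
        print(f"  !! data sent to {host}: {body[:500]}")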

That’s what makes hacker culture so important. If we want any hope of reclaiming privacy, we need people with the skills and the willingness to pull apart the systems we’re told not to question.

On top of that, governments and corporations both routinely use outdated and overbroad legislation like the Computer Fraud and Abuse Act (CFAA) to prosecute public-interest researchers who investigate tech. Not because those researchers cause harm, but because they reveal things that others want kept hidden.

Laws like this pressure people towards compliance, and make them afraid to ask questions. The result is that curiosity feels like a liability, and it becomes harder for the average person to understand how the digital systems around us actually work.

That’s why the hacker mindset matters so much: no matter how hard the system pushes back, hackers keep asking questions.

The researchers I met at DEFCON

This year at DEFCON, I met researchers who are doing exactly that.

People uncovering surveillance code embedded in children’s toys.
People analyzing facial recognition SDKs.
People testing whether your photo is really deleted after “verification”.
People capturing packets and discovering that the “local only” systems you’re using aren’t local at all, but are quietly sending your data to third parties (a sketch of that kind of check follows this list).
People analyzing “ephemeral” IDs and finding that your data is being stored and linked back to real identities.
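For instance, one cheap way to test a “local only” claim is simply to watch the device’s DNS lookups. Here is a rough sketch using scapy; the device address is a placeholder, and this only catches traffic that goes through a resolver, but a gadget that keeps resolving cloud hostnames is clearly not staying on your LAN.

"""Quick sketch: watch DNS lookups from a "local only" device on your network.

Requires scapy (pip install scapy) and usually root privileges to sniff.
DEVICE_IP is a placeholder; point it at the gadget you're testing.
"""
from scapy.all import sniff, DNSQR, IP

DEVICE_IP = "192.168.1.42"  # hypothetical address of the device under test


def log_lookup(pkt) -> None:
    # Only report DNS queries that originate from the device we're watching.
    if pkt.haslayer(DNSQR) and pkt.haslayer(IP) and pkt[IP].src == DEVICE_IP:
        print(f"{DEVICE_IP} looked up {pkt[DNSQR].qname.decode()}")


if __name__ == "__main__":
    # BPF filter keeps the capture limited to DNS traffic.
    sniff(filter="udp port 53", prn=log_lookup, store=False)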

You’ll be hearing from some of them on our channel in the coming months.
Their work is extraordinary, and it’s helping all of us move toward a world of informed consent instead of blind compliance. Without this kind of research, the average person has no way to know what’s happening behind the scenes. We can’t make good decisions about the tech we use if we don’t know what it’s doing.

Make privacy cool again

Making privacy appealing is not just about education.
It’s about making it cool.

Hacker culture has always been at the forefront of turning fringe ideas into mainstream trends. Films like Hackers and The Matrix made hackers a status symbol. Battles like the Crypto Wars (when the government fought Phil Zimmermann over PGP) and the Clipper Chip fight (when it tried to standardize surveillance backdoors across hardware) made cypherpunks and privacy activists aspirational.

Hackers take the things mainstream culture mocks or fears, and make them edgy and cool.

That’s what we need here. We need a cultural transformation, one that pushes back against the shaming language that demands we justify our desire for privacy.

You shouldn’t have to explain why you don’t want to be watched.
You shouldn’t have to defend your decision to protect your communications.

Make privacy a badge of honor.
Make privacy tools a status symbol.
Make the act of encrypting, self-hosting, and masking your identity a signal that says you’re independent, intelligent, and not easily manipulated.

Show that the people who care about privacy are the same people who invent the future.

Most people don’t like being trailblazers, because it’s scary. But if you’re reading this, you’re one of the early adopters, which means you’re already one of the fearless ones.

When you take a stand visibly, you create a critical mass and make it safer for others to join in. That’s how movements grow, and how we go from being weirdos in the corner to becoming the majority.

If privacy is stigmatized, reclaiming it will take bold, fearless, visible action.
The hacker community is perfectly positioned to lead that charge, and to make it safe for the rest of the world to follow.

When you show up and say, “I care about this,” you give others permission to care too.

Privacy may be on the fringe right now, but that’s where all great movements begin.

Final Thoughts

What I learnt at DEFCON is that curiosity is powerful.
Refusal to comply is powerful.
The simple act of asking questions can be revolutionary.

There are systems all around us extracting data and consolidating control. Most people don’t know how to fight that, and many are too scared to try.

Hacker culture is the secret sauce.

Let’s apply this drive to the systems of surveillance.
Let’s investigate the tools we’ve been told to trust.
Let’s explain what’s actually happening.
Let’s give people the knowledge they need to make better choices.

Let’s build a world where curiosity isn’t criminalized but celebrated.

DEFCON reminded me that we don’t need to wait for permission to start doing that.

We can just do things.

So let’s start now.

 

Yours in privacy,
Naomi

Naomi Brockwell is a privacy advocate and professional speaker, MC, interviewer, producer and podcaster, specialising in blockchain, cryptocurrency and economics. She runs the NBTV channel on Rumble.


Swedes turn to private apps as social media sharing declines

Published 30 September 2025, 13:33
– By Editorial Staff
It is primarily young men who are posting less on social media.
1 minute read

Fewer than half of Swedes now regularly share their own posts on social media, according to a new report from the Internet Foundation (Internetstiftelsen), a Swedish internet research organization. At the same time, time spent on open platforms is decreasing – instead, people are increasingly turning to private channels.

In Sweden, the five largest platforms are YouTube, Facebook, Instagram, Snapchat and LinkedIn. However, looking at daily usage, LinkedIn is replaced by TikTok, which has become increasingly popular.

The report, “Swedes and the Internet 2025”, shows that fewer and fewer Swedes are posting their own content on the platforms. Only 45 percent regularly share their own posts on social media, a decrease of four percentage points compared to last year.

It is primarily young men who are making fewer posts on social media, and the decrease has mainly occurred on Snapchat. Men born in the 1970s have also essentially stopped making their own posts. Women, however, make their own posts to roughly the same extent as the previous year.

“The larger the services become, the more they are filled with content that users haven’t asked for. Then you feel more like a consumer than someone who participates and contributes”, says Måns Jonasson at the Internet Foundation to Sweden’s public broadcaster SVT.

Swedes increasingly prefer to be on private platforms instead, such as WhatsApp – which is growing for the third consecutive year. Among children and young people, more use WhatsApp than Facebook Messenger.

Denmark gears up for digital independence: “We’ve been asleep for too long”

Published 21 September 2025
– By Editorial Staff
Denmark's complete dependence on non-European software, hardware and digital services is a very serious problem, according to Danish digitalization minister Caroline Stage Olsen.
2 minute read

Fear that critical IT systems could suddenly be shut down is driving European countries to strengthen their digital capabilities. Denmark is leading the development with pilot projects for open source, while the municipality of Copenhagen maps alternatives to Silicon Valley giants.

— We have been in a Sleeping Beauty slumber in Europe for too long, says Denmark’s digitalization minister Caroline Stage Olsen.

The statement comes amid a growing European debate about digital sovereignty, where Denmark has taken a leading role through concrete initiatives at both national and municipal levels.

The municipality of Copenhagen is now driving an ambitious effort to map alternatives to today’s dominant IT suppliers. Henrik Appel Esbensen, who leads the municipality’s internal audit, draws parallels to the energy sector:

— For gas and electricity we have alternative suppliers. Now we want to see if we can also become as supply-secure for IT as we want to be.

He emphasizes that the focus is on supply security rather than specifically avoiding American solutions: “For us it’s important not necessarily to get rid of American tech specifically, but that the supply security to Copenhagen is good”.

Pilot project with open source

Denmark’s digitalization ministry has started a pilot project exploring alternatives to Silicon Valley giants’ products, primarily through solutions based on open source. The initiative has gained renewed relevance following recent tensions between Denmark and the US regarding Greenland.

— We are dependent on products, services, software, hardware that come from countries outside Europe and that is a problem, states digitalization minister Caroline Stage Olsen.

Denmark is not alone in taking action. In Germany, the state of Schleswig-Holstein plans to replace Windows with Linux and seek domestic cloud providers. Meanwhile, Poland and the Baltic states are developing plans for large-scale AI data centers – a so-called “AI gigafactory” – to secure their own capacity for artificial intelligence.

— Estonia today uses the major American tech companies’ services, but we want to develop alternatives to secure our digital sovereignty, explains Estonia’s economy and industry minister Erkki Keldo to Swedish public television SVT.

“Must dare to invest”

The view on digital independence has undergone a dramatic change in a short time. Tech investor Johan Brenner from venture capital firm Creandum illustrates the shift:

— If you had asked the question a year ago, I would have just laughed at it. But now you don’t know, you might need to have a plan A and a plan B for European companies.

The path toward greater digital autonomy will be neither simple nor quick, according to Henrik Appel Esbensen in Copenhagen:

— I think it will take a long time. But it requires massive investments because there aren’t that many suppliers in the field right now. There’s no doubt that we must dare to invest in this in Europe.

Concern about what happens if critical IT systems are suddenly shut down or contracts are terminated has transformed digital sovereignty from an abstract discussion into a concrete security issue for European countries – a development that has accelerated markedly over the past year.

OpenAI monitors ChatGPT chats – can report users to police

Mass surveillance

Published 20 September 2025
– By Editorial Staff
What many perceived as private AI conversations can now end up with the police.
2 minute read

OpenAI has quietly begun monitoring users’ ChatGPT conversations and can report content to law enforcement authorities.

The revelation comes after incidents where AI chatbots have been linked to self-harm behavior, delusions, hospitalizations and suicide – what experts call “AI psychosis”.

In a blog post, the company acknowledges that they systematically scan users’ messages. When the system detects users planning to harm others, the conversations are directed to a review team that can suspend accounts and contact police.

“If human reviewers determine that a case involves an imminent threat of serious physical harm to others, we may refer it to law enforcement”, writes OpenAI.

The new policy means in practice that millions of users have their conversations scanned and that what many perceived as private conversations with an AI are now subject to systematic surveillance where content can be forwarded to authorities.

Tech journalist Noor Al-Sibai at Futurism points out that OpenAI’s statement is “short and vague” and that the company does not specify exactly what types of conversations could lead to police reports.

“It remains unclear which exact types of chats could result in user conversations being flagged for human review, much less getting referred to police”, she writes.

Security problems ignored

Ironically, ChatGPT has proven vulnerable to “jailbreaks” where users have been able to trick the system into giving instructions for building neurotoxins or step-by-step guides for suicide. Instead of addressing these fundamental security flaws, OpenAI is now choosing extensive surveillance of users.

The surveillance stands in sharp contrast to the tech company’s stance in the lawsuit brought by the New York Times, where the company “steadfastly rejected” demands to hand over ChatGPT logs, citing user privacy.

“It’s also kind of bizarre that OpenAI even mentions privacy, given that it admitted in the same post that it’s monitoring user chats and potentially sharing them with the fuzz”, Al-Sibai notes.

May be forced to hand over chats

OpenAI CEO Sam Altman has recently acknowledged that ChatGPT does not offer the same confidentiality as conversations with real therapists or lawyers, and due to the lawsuit, the company may be forced to hand over user chats to various courts.

“OpenAI is stuck between a rock and a hard place”, writes Al-Sibai. The company is trying to manage the PR disaster caused by users who have suffered mental health crises, but since it is “clearly having trouble controlling its own tech”, it falls back on “heavy-handed moderation that flies in the face of its own CEO’s promises”.

The company says it is “currently not” reporting self-harm cases to police, but the wording suggests that even this could change. It has also not responded to requests to clarify what criteria trigger the surveillance.

The internet is a manipulation machine

Be careful you're not playing an avatar in someone else’s propaganda war.

Published 20 September 2025
– By Naomi Brockwell
8 minute read

We’re more polarized than ever. Conversations have turned into shouting matches. Opposing ideas feel like threats, not something to debate.

But here’s something many people don’t realize: privacy and surveillance have everything to do with it. Most people never connect those dots.

Why surveillance is the key to polarization

Surveillance is the engine that makes platform-driven polarization work.

Platforms have one overriding goal: to keep us online as long as possible. And they’ve learned that nothing hooks us like outrage. If they can rile us up, we’ll stay, scroll, and click.

Outrage drives engagement. Engagement drives profit. But when outrage becomes the currency of the system, polarization is the natural byproduct. The more the platforms know about us, the easier it is to feed us the content that will push our buttons, confirm our biases, and keep us in a cycle of anger. And that anger doesn’t just keep us scrolling, it also pushes us further apart.

These platforms are not neutral spaces, they are giant marketplaces where influence is bought and sold. Every scroll, every feed, every “recommended” post is shaped by algorithms built to maximize engagement and auction off your attention. And it’s not just companies pushing shoes or handbags. It’s political groups paying to shift your vote. It’s movements paying to make you hate certain people because you think they hate you. It’s hostile governments paying to fracture our society.

Because our lives are so transparent to the surveillance machine, we’re more susceptible to manipulation than ever. Polarization isn’t cultural drift. When surveillance becomes the operating system of the internet, polarization and manipulation are the natural consequences.

The internet is a manipulation machine

Few people are really aware of how much manipulation there is online. We all fancy ourselves independent thinkers. We like to think we make up our own minds about things. That we choose for ourselves which videos to watch next. That we discover interesting articles all on our own.

We want to believe we’re in control. But in a system where people are constantly paying to influence us, that independence is hard to defend. The truth is, our autonomy is far more fragile than we’d like to admit.

This influence creeps into our entire online experience.

Every time you load a web page, you’ll notice that the text appears first, alongside empty white boxes, and there’s a split second before those boxes are filled up. What’s going on in that split second is an auction, as part of what’s called a real-time bidding (RTB) system.

For example, in Google’s RTB system, what’s going on behind the scenes in that split second is Google announcing to its list of Authorized Buyers, the bidders plugged into its ad exchange:

“Hey, this person just opened up her webpage, here’s everything we know about her. She has red hair. She rants a lot about privacy. She likes cats. Here’s her device, location, browsing history, and this is her inferred mood. Who wants to bid to put an ad in front of her?”

These authorized buyers have milliseconds to decide whether to bid and how much.

This “firehose of data” is sprayed at potentially thousands of entities. And the number of data points included can be staggering. Google knows a LOT about you. Only one buyer wins the ad slot and pays, but potentially thousands will get access to that data.
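To make the shape of that auction concrete, here is a toy sketch in Python. It is not Google’s actual protocol or data format; the profile fields, bidder names, and bidding logic are invented for illustration. The point it captures is structural: every bidder receives the profile just to consider bidding, but only one of them wins and pays.

"""Toy sketch of a real-time bidding auction (not any real exchange's protocol).

Every bidder is handed the full user profile just to *decide* whether to bid,
even though only one of them ends up winning the slot and paying.
"""
import random
from dataclasses import dataclass
from typing import List, Tuple


@dataclass
class BidRequest:
    # Hypothetical profile fields an exchange might share with bidders.
    user_id: str
    location: str
    interests: List[str]
    inferred_mood: str


def run_auction(request: BidRequest, bidders: List[str]) -> Tuple[str, float]:
    bids = {}
    for bidder in bidders:
        # Every bidder sees `request`, whether or not it ends up bidding.
        # A quiet data broker could simply log the profile here and never bid.
        bids[bidder] = random.uniform(0.1, 2.0)  # stand-in for a real bidding model
    winner = max(bids, key=bids.get)
    return winner, bids[winner]  # only the winner pays; everyone saw the data


if __name__ == "__main__":
    req = BidRequest("abc123", "Las Vegas", ["privacy", "cats"], "curious")
    winner, price = run_auction(req, ["ad-net-1", "data-broker-2", "dsp-3"])
    print(f"{winner} wins the ad slot at ${price:.2f}")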

Google doesn’t make its Authorized Buyers list public, but it does publish a Certified External Vendors (CEV) list: a public-facing list of vendors such as demand-side platforms, ad servers and analytics providers that Google has certified to interact with its ad systems. This CEV list is the closest proxy the public gets to knowing who is involved in this real-time bidding system.

And if you scroll through the names of some of these vendors, you won’t even find a Wikipedia page for many of them. A huge number have scrubbed themselves from the internet. It’s a mix of ad companies, data brokers, even government shell companies. And many of them you can bet are just sitting quietly in these auctions so they can scrape this data, to share or sell elsewhere, or use for other purposes. Regardless of what Google’s own Terms of Service say, once this data leaves Google’s hands, they have no control.

This real-time bidding system is just one of the behind-the-scenes mechanisms of the influence economy. But this machinery of influence is everywhere, not just when you load a webpage.

When you go to watch a video, there are thumbnails next to it suggesting what you should watch next, and you click on one if it looks interesting. Those thumbnails are not accidental.

When you scroll a social media timeline, the posts that populate are intentional. Everywhere you go, you’re seeing things that people have paid to put in front of you, hoping to nudge you one way or another. Even search results, which feel like neutral gateways to information, are arranged according to what someone else wants you to see.

This system of manipulation isn’t limited to simple commercial influence, where companies just want to get us to buy a new pair of shoes.

There are faceless entities paying to shape our thoughts, shift our behavior, and sway our votes. They work to bend our worldview, to manipulate our emotions, even to make us hate other people by convincing us those people hate us.

Where privacy comes in

This is where privacy comes into play.

The more a company or government knows about us, the easier it is to manipulate us.

  • If we allow every email to be scanned and analyzed, every message to be read, every like, scroll, and post to be fed into a profile about us…
  • If companies scrape every browser click, every book we read, every piece of music we listen to, every film we watch…
  • When faceless entities know everywhere we go, whom we meet, what we do, and then they trace who those people meet, where they go, and what they do, and our entire social graph is mapped…

In this current reality, the surveillance industrial complex knows us better than we know ourselves, and it becomes easy to figure out exactly what will make us click.

“Oh, Naomi is sad today. She’ll be more susceptible to this kind of messaging. Push it to her now.”

Profiles aren’t just about facts. They’re about state of mind. If the system can see that you’re tired, lonely, or angry, it knows exactly when to time the nudge.

Who are the players?

This isn’t just about platforms experimenting with outrage to keep us online. Entire government departments now study these manipulation strategies. When something goes viral, they try to trace where it started: “Was it seeded by a hostile nation, a domestic political shop, or a corporation laying the groundwork for its next rent-seeking scheme?”

Everyone with resources uses these tools. Governments, parties, corporations, activist networks. The mechanism is the same, and the targets are us.

The entire internet runs on a system where people are competing for our attention, and some of the agendas of those involved are downright nefarious.

These systems don’t just predict what we like and hate, they actively shape it, and we have to start realizing that sometimes division itself is the intended outcome.

Filter bubbles were only the beginning

For years, the filter bubble was the go-to explanation for polarization. Algorithms showed us more of what we already agreed with, so we became trapped in echo chambers. We assumed polarization was just the natural consequence of people living in separate informational worlds.

But that story is only half right, and dangerously incomplete.

The real problem isn’t just that we see different things.
It’s that we are being deliberately targeted.

Governments, corporations, and movements know so much about us that they can do more than keep us in bubbles. They can reach inside those bubbles to provoke us, push us, and agitate us.

Filter bubbles were about limiting information. Surveillance-driven targeting is about exploiting information. With enough data, platforms and their partners can predict what will outrage you, when you’re most vulnerable, and which message will make you react.

And that’s the crucial shift. Polarization today isn’t just a byproduct of passive algorithms. It’s the direct result of an influence machine that knows us better than we know ourselves, and uses that knowledge to bend us toward someone else’s agenda.

Fakes, fragments, and manufactured consensus

We live in a world of deepfakes.

We live in a world of soundbites taken out of context.

We live in an era where it’s easier than ever to generate AI fluff. If someone wants to make a point of view seem popular, they can instantly create thousands of websites, all parroting the same slightly tweaked narrative. When we go searching for information, it looks like everyone is in consensus.

Volume now looks like truth, and repetition now looks like proof. And both are cheap.

Remember your humanity

In this era of artificial interactions, manipulation, and engineered outrage, we can’t forget our humanity.

The person you’re fighting with might not actually be a human; they might be a bot.

That story about a political candidate might have been taken completely out of context and deliberately targeted at you to make you angry.

Online, we dehumanize each other. But we should instead remember how to talk. Ideas can be discussed without becoming triggers. They don’t have to send us spiraling into four hours of doomscrolling.

Fear is the mind-killer. When something online pushes you to react, pause. Ask whose agenda this serves. Ask what context you might be missing.

The path forward

We are more polarized than ever, largely because we’ve become so transparent to those who profit from using our emotions against us.

Privacy is our ally in this fight. The less companies and governments know about us, the harder it is for them to manipulate us. Privacy protects our autonomy in the digital age.

And we need to see each other as humans first, not as avatars in someone else’s propaganda war. The person you’re arguing with was probably targeted by a completely opposite campaign.

We’ll all be better off if we lift the veil on this manipulation, and remember that we are independent thinkers with the power to make up our own minds, instead of being led by those who want to control us.

 

Yours in Privacy,
Naomi

Naomi Brockwell is a privacy advocate and professional speaker, MC, interviewer, producer and podcaster, specialising in blockchain, cryptocurrency and economics. She runs the NBTV channel on Rumble.
