We’re more polarized than ever. Conversations have turned into shouting matches. Opposing ideas feel like threats, not something to debate.
But here’s something many people never realize: privacy and surveillance have everything to do with it. Few of us ever connect those dots.
Why surveillance is the key to polarization
Surveillance is the engine that makes platform-driven polarization work.
Platforms have one overriding goal: to keep us online as long as possible. And they’ve learned that nothing hooks us like outrage. If they can rile us up, we’ll stay, scroll, and click.
Outrage drives engagement. Engagement drives profit. But when outrage becomes the currency of the system, polarization is the natural byproduct. The more the platforms know about us, the easier it is to feed us the content that will push our buttons, confirm our biases, and keep us in a cycle of anger. And that anger doesn’t just keep us scrolling, it also pushes us further apart.
These platforms are not neutral spaces; they are giant marketplaces where influence is bought and sold. Every scroll, every feed, every “recommended” post is shaped by algorithms built to maximize engagement and auction off your attention. And it’s not just companies pushing shoes or handbags. It’s political groups paying to shift your vote. It’s movements paying to make you hate certain people because you think they hate you. It’s hostile governments paying to fracture our society.
Because our lives are so transparent to the surveillance machine, we’re more susceptible to manipulation than ever. Polarization isn’t cultural drift. When surveillance becomes the operating system of the internet, polarization and manipulation are the natural consequences.
The internet is a manipulation machine
Few people are really aware of how much manipulation there is online. We all fancy ourselves independent thinkers. We like to think we make up our own minds about things. That we choose for ourselves which videos to watch next. That we discover interesting articles all on our own.
We want to believe we’re in control. But in a system where people are constantly paying to influence us, that independence is hard to defend. The truth is, our autonomy is far more fragile than we’d like to admit.
This influence creeps into our entire online experience.
Every time you load a web page, you’ll notice that the text appears first alongside empty white boxes, and there’s a split second before those boxes fill in. What’s happening in that split second is an auction, part of what’s called a real-time bidding (RTB) system.
Take Google’s RTB system as an example. Behind the scenes in that split second, Google announces to its Authorized Buyers, the bidders plugged into its ad exchange, something like this:
“Hey, this person just opened up her webpage, here’s everything we know about her. She has red hair. She rants a lot about privacy. She likes cats. Here’s her device, location, browsing history, and this is her inferred mood. Who wants to bid to put an ad in front of her?”
These Authorized Buyers have milliseconds to decide whether to bid and how much.
This “firehose of data” is sprayed at potentially thousands of entities, and the number of data points included can be staggering. Google knows a LOT about you. Only one buyer wins the ad slot and pays, but every entity in the auction gets access to that data.
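To make the mechanics concrete, here’s a minimal sketch of such an auction in Python. Everything in it, from the field names to the coin-flip “should I bid?” decision to the highest-bid-wins pricing, is an illustrative assumption rather than Google’s actual protocol; real exchanges speak standardized formats such as OpenRTB and enforce hard millisecond deadlines.

```python
# A deliberately simplified sketch of a real-time bidding (RTB) auction.
# Everything here (field names, the coin-flip bidding decision, the pricing
# rule) is an illustrative assumption, not any real exchange's protocol.

from dataclasses import dataclass
import random


@dataclass
class BidRequest:
    """The 'firehose' payload fanned out to every bidder."""
    user_id: str                  # pseudonymous identifier
    device: str
    location: str
    browsing_history: list[str]
    inferred_interests: list[str]
    inferred_mood: str


@dataclass
class Bid:
    bidder: str
    price_cpm: float              # offered price per thousand impressions


def run_auction(request: BidRequest, bidders: list[str]) -> Bid | None:
    """Fan the request out, collect bids, pick a winner."""
    bids = []
    for bidder in bidders:
        # Key point: every bidder receives (and can log) the full request,
        # whether or not it chooses to bid, and whether or not it wins.
        wants_to_bid = random.random() < 0.5      # stand-in for a bidding model
        if wants_to_bid:
            bids.append(Bid(bidder, round(random.uniform(0.5, 5.0), 2)))
    if not bids:
        return None
    winner = max(bids, key=lambda b: b.price_cpm)  # highest bid takes the slot
    print(f"{winner.bidder} wins at ${winner.price_cpm:.2f} CPM; "
          f"{len(bidders)} bidders saw the user data, one paid for the ad.")
    return winner


if __name__ == "__main__":
    request = BidRequest(
        user_id="u-123",
        device="phone",
        location="Seattle",
        browsing_history=["privacy-blog.example", "cat-videos.example"],
        inferred_interests=["privacy", "cats"],
        inferred_mood="frustrated",
    )
    run_auction(request, [f"dsp-{i:04d}" for i in range(1, 1001)])
```

The detail that matters is inside the loop: every bidder receives the full profile whether it wins the slot, loses, or never bids at all.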
Google doesn’t make its Authorized Buyers list public, but it does publish a Certified External Vendors (CEV) list: a public-facing list of vendors, such as demand-side platforms, ad servers, and analytics providers, that Google has certified to interact with its ad systems. This CEV list is the closest proxy the public gets to knowing who is involved in this real-time bidding system.
And if you scroll through the names of some of these vendors, you won’t even find a Wikipedia page for many of them. A huge number have scrubbed themselves from the internet. It’s a mix of ad companies, data brokers, even government shell companies. And you can bet many of them are just sitting quietly in these auctions to scrape this data and share it, sell it, or put it to other uses. Regardless of what Google’s own Terms of Service say, once this data leaves Google’s hands, Google has no control over it.
This real-time bidding system is just one behind-the-scenes mechanism of the influence economy. But the machinery of influence is everywhere, not just when you load a webpage.
When you go to watch a video, thumbnails appear next to it suggesting what you should watch next, and you click on one if it looks interesting. Those thumbnails are not accidental.
When you scroll a social media timeline, the posts that populate are intentional. Everywhere you go, you’re seeing things that people have paid to put in front of you, hoping to nudge you one way or another. Even search results, which feel like neutral gateways to information, are arranged according to what someone else wants you to see.
This system of manipulation isn’t limited to simple commercial influence, where companies just want to get us to buy a new pair of shoes.
There are faceless entities paying to shape our thoughts, shift our behavior, and sway our votes. They work to bend our worldview, to manipulate our emotions, even to make us hate other people by convincing us those people hate us.
Where privacy comes in
This is where privacy comes into play.
The more a company or government knows about us, the easier it is to manipulate us.
- If we allow every email to be scanned and analyzed, every message to be read, every like, scroll, and post to be fed into a profile about us…
- If companies scrape every browser click, every book we read, every piece of music we listen to, every film we watch…
- If faceless entities know everywhere we go, whom we meet, and what we do, and then trace who those people meet, where they go, and what they do, until our entire social graph is mapped…
…then the surveillance industrial complex knows us better than we know ourselves, and it becomes easy to figure out exactly what will make us click. That is our current reality.
“Oh, Naomi is sad today. She’ll be more susceptible to this kind of messaging. Push it to her now.”
Profiles aren’t just about facts. They’re about state of mind. If the system can see that you’re tired, lonely, or angry, it knows exactly when to time the nudge.
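As a purely hypothetical illustration, and not any real platform’s code, the shape of that timing logic might look something like this:

```python
# Purely hypothetical sketch of timing-based targeting. Nothing here comes
# from any real platform; it only illustrates the idea described above.

from dataclasses import dataclass


@dataclass
class Profile:
    name: str
    inferred_mood: str             # e.g. "sad", "angry", "tired"
    hot_button_topics: list[str]   # topics predicted to trigger a reaction


def pick_message(profile: Profile, variants: dict[str, str]) -> str | None:
    """Return the campaign variant predicted to land hardest right now."""
    # Wait for a vulnerable moment, then lead with a hot-button topic.
    if profile.inferred_mood in {"sad", "angry", "tired"} and profile.hot_button_topics:
        return variants.get(profile.hot_button_topics[0])
    return None  # otherwise hold the message for a better moment


variants = {
    "privacy": "They are reading your messages right now.",
    "elections": "Your vote won't be counted anyway.",
}
naomi = Profile("Naomi", inferred_mood="sad", hot_button_topics=["privacy"])
print(pick_message(naomi, variants))  # -> "They are reading your messages right now."
```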
Who are the players?
This isn’t just about platforms experimenting with outrage to keep us online. Entire government departments now study these manipulation strategies. When something goes viral, they try to trace where it started: “Was it seeded by a hostile nation, a domestic political shop, or a corporation laying the groundwork for its next rent-seeking scheme?”
Everyone with resources uses these tools. Governments, parties, corporations, activist networks. The mechanism is the same, and the targets are us.
The entire internet runs on a system where people are competing for our attention, and some of the agendas of those involved are downright nefarious.
These systems don’t just predict what we like and hate, they actively shape it, and we have to start realizing that sometimes division itself is the intended outcome.
Filter bubbles were only the beginning
For years, the filter bubble was the go-to explanation for polarization. Algorithms showed us more of what we already agreed with, so we became trapped in echo chambers. We assumed polarization was just the natural consequence of people living in separate informational worlds.
But that story is only half right, and dangerously incomplete.
The real problem isn’t just that we see different things.
It’s that we are being deliberately targeted.
Governments, corporations, and movements know so much about us that they can do more than keep us in bubbles. They can reach inside those bubbles to provoke us, push us, and agitate us.
Filter bubbles were about limiting information. Surveillance-driven targeting is about exploiting information. With enough data, platforms and their partners can predict what will outrage you, when you’re most vulnerable, and which message will make you react.
And that’s the crucial shift. Polarization today isn’t just a byproduct of passive algorithms. It’s the direct result of an influence machine that knows us better than we know ourselves, and uses that knowledge to bend us toward someone else’s agenda.
Fakes, fragments, and manufactured consensus
We live in a world of deepfakes.
We live in a world of soundbites taken out of context.
We live in an era where it’s easier than ever to generate AI fluff. If someone wants to make a point of view seem popular, they can instantly create thousands of websites, all parroting the same slightly tweaked narrative. When we go searching for information, it looks like everyone is in consensus.
Volume now looks like truth, and repetition now looks like proof. And both are cheap.
Remember your humanity
In this era of artificial interactions, manipulation, and engineered outrage, we can’t forget our humanity.
That person you’re fighting with might not actually be a human; they might be a bot.
That story about a political candidate might have been taken completely out of context and deliberately targeted at you to make you angry.
Online, we dehumanize each other. But we should instead remember how to talk. Ideas can be discussed without becoming triggers. They don’t have to send us spiraling after four hours of doomscrolling.
Fear is the mind-killer. When something online pushes you to react, pause. Ask whose agenda this serves. Ask what context you might be missing.
The path forward
We are more polarized than ever, largely because we’ve become so transparent to those who profit from using our emotions against us.
Privacy is our ally in this fight. The less companies and governments know about us, the harder it is for them to manipulate us. Privacy protects our autonomy in the digital age.
And we need to see each other as humans first, not as avatars in someone else’s propaganda war. The person you’re arguing with was probably targeted by a completely opposite campaign.
We’ll all be better off if we lift the veil on this manipulation, and remember that we are independent thinkers with the power to make up our own minds, instead of being led by those who want to control us.
Yours in Privacy,
Naomi