
The Twitter files – how the internal discussions unfolded when the Hunter Biden story was censored

Published 8 December 2022
– By Editorial Staff
Left: leaked footage from Hunter Biden's computer. Right: Jack Dorsey, CEO of Twitter before Elon Musk's takeover.

Elon Musk continues to release new revelations about Twitter’s censorship tools and the company’s behind-the-scenes decision-making. The first thing he chose to address is how Twitter censored the story of Hunter Biden’s laptop.

It was at the end of November that Twitter’s new owner, Elon Musk, promised to reveal to the public how the platform had practiced strict censorship of its users before his takeover. This month, Twitter began releasing the results of a major internal investigation, comprising thousands of documents that have come to be known as the “Twitter files.”

What has been revealed is that the tools were initially used to remove spam and financial scams, for example. But this initial form of censorship slowly took on other forms, with Twitter executives and employees finding more and more ways to use the tools, which were later made available to other companies as well.

For example, some political organizations had networks of contacts with access to these censorship tools, allowing them to request that certain posts be removed or at least reviewed. In the United States, both Democrats and Republicans had this access, and the documents indicate that requests were made by both the Trump and Biden campaigns in 2020. But since Twitter’s values were primarily shaped by employees sympathetic to the Democrats, “the [censorship] system was not balanced.”

“Because Twitter was and is overwhelmingly staffed by people with a political bent, there were more channels, more ways to complain, open to the left (well, Democrats) than to the right,” writes Matt Taibbi, who is one of those reporting on the Twitter documents.

In this context, it’s not particularly surprising that Twitter did its best to suppress the story of Hunter Biden’s laptop during the ongoing US presidential campaign. It resorted to several methods to keep the New York Post article about the then-candidate’s son from spreading, such as removing links and marking tweets as “unsafe.” It even went so far as to block links to the article in direct messages, a measure otherwise reserved for material such as child pornography.
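In mechanical terms, interventions like these come down to checking every posted or messaged link against a server-side blocklist. Twitter’s actual implementation has never been published, so the Python sketch below is purely illustrative, with hypothetical names throughout:

```python
from urllib.parse import urlparse

# Hypothetical blocklist; a real system would sync entries from a
# moderation backend rather than hard-code them.
BLOCKED_DOMAINS = {"blocked-story.example"}

def normalize(url: str) -> str:
    """Lower-case the host and strip a leading 'www.' so trivial
    variants of the same link are still caught."""
    host = urlparse(url).netloc.lower()
    return host[4:] if host.startswith("www.") else host

def allow_message(urls: list[str]) -> bool:
    """Return False if any link resolves to a blocklisted domain;
    the same check can gate public tweets and private DMs alike."""
    return all(normalize(u) not in BLOCKED_DOMAINS for u in urls)

print(allow_message(["https://www.blocked-story.example/article"]))  # False
print(allow_message(["https://news.example/other-story"]))           # True
```

The point of the sketch is that a single blocklist entry applies silently everywhere – in public timelines and private messages alike.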

Kayleigh McEnany, for example, who was then the White House Press Secretary, was locked out of her account merely for addressing the article in a tweet, prompting the White House to contact Twitter.

The employees who made the decision justified it by citing “hacking” – the belief that the New York Post had based the article on hacked material, which would have violated Twitter’s “hacked materials policy.”

“‘Hacking’ was the excuse, but within a few hours almost everyone realized it wouldn’t hold up,” a former employee of the platform stated. “But no one had the guts to turn it around.”

There was even an internal discussion on the subject, questioning the decision.

For example, Trenton Kennedy, the company’s then-Communications Director, wrote: “I have a hard time understanding the political basis for marking this as unsafe.”

Democratic Congressman Ro Khanna even wrote to Twitter to question the censorship, noting in his letter that it possibly violated the First Amendment of the US Constitution. He was in fact the only prominent Democrat to question the censorship of the Hunter Biden laptop article, sharing his reasoning in an internal discussion with Twitter executives on why it “does more harm than good.”

“Even if the New York Post is Right-wing, restricting the dissemination of newspaper articles during the current presidential campaign will backfire more than it will help,” Khanna reasoned, asking that the discussion be kept internal between Twitter’s then-CEO, Jack Dorsey, and the Democrats and not discussed with other employees.

 


Safety apps normalize surveillance of children

Mass surveillance

Published today 8:51
– By Editorial Staff
Swedish researcher Katarina Winter warns that surveillance of children has become normalized when technology is packaged as care rather than control.

Apps that promise to increase safety are often used for everyday logistics instead – and they normalize covert surveillance.

Researchers at Stockholm University have examined 48 Swedish safety apps and warn that the technology is packaged as care while ethical questions disappear.

In two research projects at Stockholm University in Sweden, researchers are investigating various safety technologies in Sweden – everything from digital safety maps and security sensors to apps marketed as tools for creating safer communities. But instead of measuring whether the technology works, the researchers critically examine its consequences.

— It’s important to ask what kind of safety we’re after, and for whom? What is worth calling safety? Which actors and interests determine what constitutes safety in a society? The project on safety apps shows, among other things, how surveillance becomes normalized when we use this technology, says Katarina Winter, associate professor and senior lecturer in criminology and doctor of sociology at Stockholm University.

She leads the projects, which are conducted in collaboration with researchers from the University of Gävle and Södertörn University. The researchers have mapped 48 Swedish safety apps and interviewed both developers and users, including parents who use apps to keep track of their children.

“The technology is so kindly framed”

A central finding is how normalized it has become to monitor children, often without their knowledge.

— One example is how normalized it has become to monitor your children even though they don’t know about it, although some have an agreement with their children. Because the technology is so kindly framed – as being about protecting the children – it isn’t something you have to justify as a parent. The normalization can therefore happen under the radar. When technology is packaged as care, we easily lose sight of the ethical questions, she explains.

The surveillance also affects family relationships.

— Many use the apps to avoid nagging their children, and in the short term that may be convenient and simplify family logistics. But something happens on an interpersonal level: we cut off part of the interaction with each other. And it’s seen as deviant behavior if you don’t want to share your location, which I think is negative.

Confusing messages during adult education center shooting

The researchers see a clear discrepancy between developers’ ideals about a safer society and how the apps are actually used. For private individuals, it’s often about completely different things than safety.

— In a way, these parents reproduce an insecurity in society related to crime and vulnerability when they justify why they use an app. But in reality, it’s often extremely connected to everyday logistics – when should I start cooking the pasta depending on where my child is? explains the criminologist.

The researchers have also examined the school safety app CoSafe, which was used during the shooting at Campus Risbergska, an adult education center in Örebro, Sweden. The app was criticized for sending contradictory alerts calling both for evacuation (leaving the building) and lockdown (staying inside and seeking shelter). Of the eleven people killed in total, two were students who had followed the instruction to evacuate instead of seeking shelter indoors.

— The Risbergska case demonstrates the complexity of technical solutions for crisis situations. While the app may have helped some seek shelter, the incident raises important questions about responsibility distribution and technical reliability when it comes to life and death, Winter notes.
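One way contradictory instructions can arise is an alert feed with no rule that a newer alert supersedes an older one. The Python sketch below illustrates that generic failure mode; it is hypothetical and not based on CoSafe’s actual code, which is not public:

```python
from dataclasses import dataclass

@dataclass
class Alert:
    instruction: str  # e.g. "EVACUATE" or "LOCKDOWN"
    issued_at: int    # seconds since the incident began

class AlertFeed:
    """Naive feed: every alert is delivered in arrival order, and
    nothing marks a newer alert as replacing an earlier one."""
    def __init__(self):
        self.delivered: list[Alert] = []

    def push(self, alert: Alert):
        self.delivered.append(alert)

feed = AlertFeed()
feed.push(Alert("EVACUATE", issued_at=60))   # initial assessment
feed.push(Alert("LOCKDOWN", issued_at=240))  # revised instruction

# A user opening the app sees both instructions and must guess
# which one to trust.
for alert in feed.delivered:
    print(alert.instruction, alert.issued_at)
```

A more defensive design would attach a supersedes marker to each alert, or simply display only the most recent instruction.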

Private actors profit from insecurity

The researcher also sees how private companies use the public debate about insecurity to sell their solutions, particularly to municipalities.

— We have both a political landscape that focuses on insecurity and a market that latches onto it because it’s in focus. It’s logical that opportunities for entrepreneurship are found in the societal debate we’re in, but it becomes more brutal when it comes to safety than with other phenomena. Partly because actors profit from portraying society as unsafe, and partly because companies are generally interested in specific user groups that may not have many safety problems.

She calls for a critical attitude toward technological optimism.

— It’s important to pause on these questions that otherwise tend to rush ahead in a kind of faith that ‘now everything will be better because we have new technology’. When the overarching word is safety, questions about surveillance and privacy risk being deprioritized.

Netherlands invokes emergency law to seize Chinese chip manufacturer

Published yesterday 12:46
– By Editorial Staff
Nexperia, a Dutch semiconductor manufacturer, specializes in large-scale production of microchips for the automotive industry and consumer electronics.

The Netherlands has seized the Chinese-owned semiconductor manufacturer Nexperia by invoking a never-before-used emergency law.

Owner Wingtech Technology is hitting back hard, calling the decision “excessive intervention driven by geopolitical bias” – and stating that the company has followed all laws and regulations.

Late Sunday evening, the Dutch Ministry of Economic Affairs announced that an emergency law had been used to nationalize Nexperia. The decision is justified by the risk that access to the company’s microchips could “become unavailable in an emergency” – something the government says threatens economic security in both the Netherlands and the EU.

Amsterdam describes the measure as “highly exceptional” and refers to “recent and acute signals of serious governance shortcomings and actions” within the company.

Nexperia produces microchips in large volumes for the automotive and electronics industries. The company was previously part of Dutch electronics giant Philips before being acquired by Chinese company Wingtech Technology.

The news had an immediate effect on the stock market. Wingtech shares fell 10 percent in Shanghai on Monday and were forced into a trading halt after reaching the daily decline limit.

In a later filing to the Shanghai Stock Exchange, Wingtech announced that the company’s control over Nexperia would be temporarily restricted due to the Dutch decision and court rulings affecting decision-making and operational efficiency.

Trade tensions between EU and China

Wingtech condemned the Dutch government’s action in a now-deleted WeChat post. The company called the decision “excessive intervention driven by geopolitical bias, rather than a fact-based risk assessment”.

The technology company added that it intends to take action to protect its rights and seek government support.

The nationalization comes amid sharply increased trade tensions between the EU and China. Over the past year, the two sides have clashed over what the EU claims is Beijing’s dumping of key goods and industrial overproduction. China has in turn accused the EU of protectionism.

Last week, China tightened its restrictions on exports of rare earth metals and magnets – a move that could further damage European industries dependent on these materials.

You’re being tracked – everywhere you drive

Published 11 October 2025
– By Naomi Brockwell

You might have noticed cameras quietly going up all over your city or town. They blend in, mounted next to traffic lights or tucked onto poles by the roadside. They’re easy to miss. But they’re part of a growing network designed to track where everyone drives, at all times.

If you drive to a protest, that trip is logged.
Visit an opposition party meeting? That visit is in a searchable government database.
Go to a shooting range, a reproductive health clinic, a mosque, your lover’s home, your child’s school… every movement is documented.

Welcome to the world of automatic license plate readers, or ALPRs.

How it works

Automatic license plate readers were once a niche police tool. But today, thanks to companies like Flock Safety, they’ve spread into neighborhoods, HOAs, schools, and businesses across the country.

They’re not just cameras, they’re part of a cloud-based surveillance system. Every scan is uploaded to Flock’s servers. The footage (including the license plate, the time, the location, and even the make, model, and color of your vehicle) becomes part of a centralized, searchable database.

Police are one of Flock’s primary clients. When a department buys a Flock subscription, they’re not just getting access to cameras in their city. They’re getting access to a national database of vehicle movements.

Here’s how it works:
If a police department agrees to share its own Flock camera data, it can then search data from every other participant that has done the same.

The result is a real-time surveillance grid.
One that logs where millions of people drive.
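In data terms, the arrangement behaves like a reciprocal query rule over a pooled store of scans. Flock’s real schema and API are not public, so the Python sketch below is only a hypothetical model of the opt-in logic described above:

```python
from dataclasses import dataclass

@dataclass
class Scan:
    plate: str
    camera_id: str
    agency: str
    timestamp: str

@dataclass
class Agency:
    name: str
    shares_data: bool  # has this agency opted in to sharing?

class PlateNetwork:
    """Hypothetical model of the reciprocity rule: share your own
    scans and you may search every other sharing agency's scans."""

    def __init__(self):
        self.agencies: dict[str, Agency] = {}
        self.scans: list[Scan] = []

    def register(self, agency: Agency):
        self.agencies[agency.name] = agency

    def upload(self, scan: Scan):
        self.scans.append(scan)

    def search(self, requester: str, plate: str) -> list[Scan]:
        if not self.agencies[requester].shares_data:
            # Non-sharing agencies see only their own scans.
            return [s for s in self.scans
                    if s.agency == requester and s.plate == plate]
        # Sharing agencies can query the whole pool of sharing agencies.
        return [s for s in self.scans if s.plate == plate
                and self.agencies[s.agency].shares_data]

net = PlateNetwork()
net.register(Agency("Springfield PD", shares_data=True))
net.register(Agency("Shelbyville PD", shares_data=True))
net.upload(Scan("ABC123", "cam-17", "Shelbyville PD", "2025-10-01T08:14"))

# Springfield owns no camera at this location, but because both
# departments opted in, it can retrieve Shelbyville's sighting.
print(net.search("Springfield PD", "ABC123"))
```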

Wait… Is this even legal?

Let’s talk about the Fourth Amendment. It was written to protect us from exactly this: blanket surveillance without a warrant.

“The right of the people to be secure in their persons, houses, papers, and effects, against unreasonable searches and seizures, shall not be violated, and no Warrants shall issue, but upon probable cause, supported by Oath or affirmation, and particularly describing the place to be searched, and the persons or things to be seized”.

The Fourth Amendment and its warrant requirement are a safeguard against unchecked power. The point is to protect you from government searches (whether of your location, your movements, or other personal information) unless there’s a specific reason, backed by probable cause, and approved by a judge.

If police want to track your phone for more than a moment, or access your location history, they need a warrant. The Supreme Court made that clear in Carpenter v. United States.

The court ruled that the government can’t access historical location data from your phone without a warrant, even if that data was collected by a third party.

They also ruled in U.S. v. Jones that tracking your car with a GPS device counts as a Fourth Amendment search.

Put those two decisions together, and the logic is clear:
If tracking your car in real time requires a warrant…
And accessing location data held by third parties requires a warrant…
Then using license plate readers to retroactively trace your movements without a warrant seems like a clear constitutional violation.

And yet, that’s exactly what’s happening.

Police can search databases like those aggregated by Flock without any judicial oversight.

Even if the camera that captured your car was owned by an HOA, a school, or a private business, in many cases, the footage is fed into the same system law enforcement accesses, with no warrant required and no notification to the person being searched.

The government isn’t supposed to go back and retrace your movements using data it didn’t have a warrant for at the time – not even a few days’ worth. Courts have ruled that this kind of retroactive surveillance violates your reasonable expectation of privacy.

So how is this current situation even allowed? It’s not clear that it is. But until we get a ruling, agencies are relying on some semantic acrobatics to justify it.

The loophole

Well, here’s the loophole: the government isn’t building the dragnet.

Private companies are.

The government can’t track your location, but private companies can.
Flock can legally install cameras in neighborhoods, collect location data, and store it all in their own systems.
But they can’t install those cameras on light poles, traffic intersections, or public roads without government approval.

So what happens?
The government gives them permission.
In many cases, the city even pays for the cameras.
And in return, the government gets access to the data.

It’s a perfect circle:
The government can’t build a dragnet directly, because the Constitution forbids it.
Private companies can’t scale a dragnet without government infrastructure.
So they join forces, each side helping the other bypass the restrictions meant to protect our civil liberties.

How law enforcement justifies it

Here’s how it typically works:

A city contracts with Flock or another ALPR vendor. They buy a certain number of cameras. The vendor installs them, usually on city infrastructure like streetlights or utility poles.
Some cities even mount them on garbage trucks or buses, so they sweep entire neighborhoods as they move.

The vendor maintains the cameras, and all footage is uploaded to the cloud.
And if you’re a police department that has opted in to share footage, then you in turn get access to all the footage from everyone who has also opted in.

If law enforcement were the ones actively setting up and operating these cameras, it would be much harder to argue this isn’t a government search, one that triggers a warrant requirement under the Constitution.

So what do they do instead? They argue:

“We’re not collecting this data — the vendor is. We’re just accessing publicly available information”.

But there’s nothing passive about that.
If you’re procuring the cameras, approving the locations, and contracting to receive the footage, you’re not a bystander — you’re an active participant in the surveillance.
In fact, one could argue you’re actually building the system.

The mosaic theory

There’s also something called the “mosaic theory” of privacy.

The mosaic theory says that while one tile might not show much, if you put enough tiles together you see the whole picture of someone’s life.

In constitutional terms, each individual bit of information may be gathered legally, but combine enough of them into a full picture of someone’s life and the aggregate can amount to an unlawful search.

For example, it might be legal to take a picture of someone’s car in public. But imagine a scenario where someone takes thousands of pictures of your car, and from these pictures is able to recreate your travel patterns.
At that point, it’s not just “observation”. It’s a search, and the Constitutional protections should kick into gear.
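To make the mosaic concrete, here is how little code it takes to turn individually innocuous sightings into a profile. The data is invented and the heuristic deliberately crude:

```python
from collections import Counter

# Invented sightings for one plate: (location, hour of day). Each is
# a single photograph taken lawfully in public.
sightings = [
    ("oak st & 5th", 8), ("downtown garage", 9), ("downtown garage", 9),
    ("clinic parking", 17), ("oak st & 5th", 18), ("oak st & 5th", 19),
]

morning = Counter(loc for loc, hour in sightings if hour < 12)
evening = Counter(loc for loc, hour in sightings if hour >= 12)

# Repeated morning destinations suggest a workplace; repeated evening
# locations suggest home; everything else charts habits and visits.
print("likely workplace:", morning.most_common(1)[0][0])
print("likely home area:", evening.most_common(1)[0][0])
print("one-off stops:", [loc for loc in evening if evening[loc] == 1])
```

Each tuple on its own is a lawful public photograph; the inferred home, workplace, and one-off stop are the mosaic.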

At what cost?

Supporters of automatic license plate readers often cherry-pick their success stories. ALPRs are marketed as tools to stop crime and protect children.

But we can’t just look at the benefits of this technology. We must also weigh the costs.

The problem with mass surveillance infrastructure is what happens when the wrong people inherit this system:

Imagine the most dangerous person you can think of in power.
Now imagine they inherit the surveillance network you just said yes to.

The stakes are too high to ignore

We need to get back to warrant requirements.

We need real checks and balances.
Because a dragnet system that monitors hundreds of millions of innocent people is a huge danger to freedom.

Jen Barber from Jen’s Two Cents said it plainly:

“I now live in a community where I cannot get in or out of my neighborhood without a Flock camera watching me. I don’t need Big Brother building a lifetime record of my whereabouts. It’s none of their business”.

This isn’t just about your car.
It’s about whether privacy and freedom can exist outside your front door.

Freedom of movement isn’t really free if you can’t go anywhere without being tracked.
And I’m not quite ready to give up my freedom of movement yet.

 

Yours in privacy,
Naomi

Naomi Brockwell is a privacy advocate and professional speaker, MC, interviewer, producer, and podcaster specialising in blockchain, cryptocurrency and economics. She runs the NBTV channel on Rumble.

Researcher: We risk losing ourselves to AI gods

The future of AI

Published 10 October 2025
– By Editorial Staff
"We sell our ability to be courageous and buy security from the machines", argues researcher Carl Öhman.

The upcoming book “Gods of data” examines AI from a religion-critical perspective. Carl Öhman, a researcher at the Department of Government at Uppsala University in Sweden, argues that today’s AI systems can be compared to humanity’s relationship with gods – and investigates what would happen if power were handed over entirely to AI.

As AI has developed, a tool initially used as a more personal version of Google has also taken on the role of household advisor. People increasingly turn to AI with personal questions: healthcare advice, psychological support and even relationship counseling.

Öhman argues that AI has begun to resemble the gods – a “kind of personified amalgamation of society’s collective knowledge and authority” – and in a research project he studies what humanity would lose if it allowed itself to be completely governed by technology, even flawless technology.

In a thought experiment, he describes an everyday couple who have started arguing over differing values and turn to an AI relationship counselor for help.

— They ask: ‘Hi, should we continue being together?’ The AI has access to all their data: their DNA, childhood photos, everything they’ve ever written and searched for and so on, and has been trained on millions of similar couples. It says: ‘With 98 percent probability this will end in catastrophe. You should break up with each other today. In fact, I’ve already found replacement partners for you who are much better matches’, he says in the Research Podcast.

Buying security

Öhman argues that even if there is no rational reason for the couple not to obey the AI and break up, one is left with the feeling of having lost something. In this particular case, the couple would lose faith in themselves and their relationship.

— Love is always a risk. All interpersonal relationships involve a risk of being betrayed, of being hurt, of something going wrong. We can absolutely use technology to minimize that risk, perhaps even eliminate it entirely. The point is that something is then lost. We sell our ability to be brave and buy security from the machines, he says.

World daddy in AI form

The research project also examines other relationships where AI has taken an increasingly larger role, for example parenthood. Today there are a number of AI apps designed to help adults handle their relationship with their children. Among other things, this can involve AI giving personalized responses or trying to prevent conflicts from arising.

— Just like in the example of the young couple in love, something is lost here. In this particular chapter I use Sigmund Freud and his idea that belief in God is a kind of refusal to be an adult – that there is some kind of world daddy who ultimately always has the right answers. And here it is much the same. There is a world daddy in the form of AI who then becomes the real parent in your relationship with the children. And you increasingly identify as a kind of child to the AI parent who has the final answers, he says.

Handing over power over ourselves

Öhman argues that it might feel nice to avoid getting your heart broken, or to prevent conflicts with your children, but that one must be aware that there is a price when AI gets the power. He argues that when people talk about AI taking over, it is usually imagined as something violent – “the machines come and take our lives from us.”

— But the point in my book, and in this project, is that it is we who hand over power over our lives, our courage, our faith, and ultimately ourselves, he says.
