Telegram temporarily blocked in Spain

Published 26 March 2024
– By Editorial Staff
Telegram has previously been blocked in China, Thailand and Pakistan, among other countries.
1 minute read

Spain’s High Court has ordered the temporary blocking of the popular messaging app Telegram throughout the country. The court’s decision comes after media companies in the country complained that the platform was being used to distribute copyrighted material without permission.

Media companies Mediaset, Atresmedia, Movistar and Egeda complained that Telegram was distributing content they had created, as well as other copyrighted material, without the rights holders’ permission. The court, headed by Judge Santiago Pedraz, contacted the platform’s owners and asked them to provide information related to the case.

The requested information concerned who was behind certain accounts. When the company behind Telegram did not respond, Judge Pedraz decided on Friday to block the application, Euronews reports. The block is only temporary, however, and is expected to last a few days.

In Spain, about 18 percent of the population uses Telegram, which corresponds to about eight million people. The decision has been criticized by many users as well as by Facua, a consumer rights organization in the country. According to Facua, the ruling will cause “enormous damage” to millions of Telegram users.

In 2015, the app was suspended in China and later banned in Thailand, Pakistan, Iran and Cuba.


How young people are manipulated into shopping more

Published today 7:33
– By Editorial Staff
Every time young people pick up their phones, they're met with purchase prompts from fashion companies – a constant stream of manipulative content.
2 minute read

Several fashion companies use so-called “purchase trigger mechanisms” on their websites to manipulate customers into buying more, according to a survey by the Swedish Society for Nature Conservation (Naturskyddsföreningen). Another study shows that young girls are constantly exposed to a stream of these purchase triggers on their mobile phones.

The survey examined the presence of purchase triggers on the websites of nine major fashion companies. The sites reviewed were H&M, Lindex, KappAhl, NA-KD, Nelly, Boozt, Zalando, Ellos, and Shein.

Purchase triggers are content whose sole aim is to increase consumption on the sites, for example by exploiting consumers’ psychological and cognitive weaknesses.

In total, eleven such purchase triggers were examined, identified as: urgency/time scarcity, scarcity, popularity, exclusivity, inspiration, low risk, incentives/offers, shortcuts, lock-in mechanisms, good deals, and user-generated content.

The results show that all reviewed fashion companies use purchase triggers, but to varying degrees and intensity. Nelly, Boozt, Zalando, and Shein, for example, use all identified triggers, while H&M uses them to a more limited extent. Shein was the worst offender, followed by Ellos.

“It’s frightening that all the major fashion companies we looked at use so many purchase triggers to get us to shop more and more. These triggers are based on psychological shortcuts that make us act on our buying impulses and awaken our desire to have things, which in turn leads to an unhealthy consumption pattern. This affects both the environment and climate, as well as our well-being,” says Beatrice Rindevall, chair of the Swedish Society for Nature Conservation, in a press release.

“Commercial media”

In another study, the Swedish Society for Nature Conservation commissioned Ungdomsbarometern (Youth Barometer) to investigate how eight fashion-interested girls aged 16–17 are exposed to purchase triggers. The girls were asked to observe and take screenshots over several days, as well as answer questions about how they acted on what appeared on their phones in social media and their inbox.

The results show that the girls had a constant stream of purchase triggers on their phones with content that encouraged purchases.

“It shouldn’t be called social media anymore, but more accurately commercial media. It’s high time that politicians realize how vulnerable young people are and that stricter regulations are introduced for how digital consumption environments may be designed,” says Rindevall.

Safety apps normalize surveillance of children

Mass surveillance

Published yesterday 8:51
– By Editorial Staff
Swedish researcher Katarina Winter warns that surveillance of children has become normalized when technology is packaged as care rather than control.
3 minute read

Apps that promise to increase safety are often used for everyday logistics – and they normalize secret surveillance.

Researchers at Stockholm University have examined 48 Swedish safety apps and warn that the technology is packaged as care while ethical questions disappear.

In two research projects at Stockholm University, researchers are investigating various safety technologies in Sweden – everything from digital safety maps and security sensors to apps marketed as tools for creating safer communities. But instead of measuring whether the technology works, the researchers critically examine its consequences.

— It’s important to ask what kind of safety we’re after, and for whom? What is worth calling safety? Which actors and interests determine what constitutes safety in a society? The project on safety apps shows, among other things, how surveillance becomes normalized when we use this technology, says Katarina Winter, associate professor and senior lecturer in criminology at Stockholm University, who holds a doctorate in sociology.

She leads the projects, which are conducted in collaboration with researchers from the University of Gävle and Södertörn University. The researchers have mapped 48 Swedish safety apps and interviewed both developers and users, including parents who use apps to keep track of their children.

“The technology is so kindly framed”

A central finding is how normalized it has become to monitor children, often without their knowledge.

— One example is how normalized it has become to monitor your children even though they don’t know about it, although some have an agreement with their children. Because the technology is so kindly framed – that it’s about protecting the children – it doesn’t become something you have to stand up for as a parent. The normalization can therefore happen under the radar. When technology is packaged as care, we easily lose sight of the ethical questions, she explains.

The surveillance also affects family relationships.

— Many use the apps to avoid nagging their children, and in the short term it may be convenient and simplify family logistics. But something happens on an interpersonal level, we cut off part of the interaction between each other. It’s seen as deviant behavior if you don’t want to share your location, which I think is negative.

Confusing messages during adult education center shooting

The researchers see a clear discrepancy between developers’ ideals about a safer society and how the apps are actually used. For private individuals, it’s often about completely different things than safety.

— In a way, these parents reproduce an insecurity in society related to crime and vulnerability when they justify why they use an app. But in reality, it’s often extremely connected to everyday logistics – when should I start cooking the pasta depending on where my child is? explains the criminologist.

The researchers have also examined the school safety app CoSafe, which was used during the shooting at Campus Risbergska, an adult education center in Örebro in central Sweden. The app was criticized for sending contradictory alerts calling both for evacuation (leaving the building) and for lockdown (staying inside and seeking shelter). Of the eleven people killed in total, two were students who had followed the instruction to evacuate instead of seeking shelter indoors.

— The Risbergska case demonstrates the complexity of technical solutions for crisis situations. While the app may have helped some seek shelter, the incident raises important questions about responsibility distribution and technical reliability when it comes to life and death, Winter notes.

Private actors profit from insecurity

The researcher also sees how private companies use the public debate about insecurity to sell their solutions, particularly to municipalities.

— We have both a political landscape that focuses on insecurity and a market that takes it on because it’s in focus. It’s logical that opportunities for entrepreneurship are found in the societal debate we’re in, but it becomes more brutal when it comes to safety than with other phenomena. Partly because actors profit from portraying society as unsafe, and partly because companies are generally interested in specific user groups that may not have many safety problems.

She calls for a critical attitude toward technological optimism.

— It’s important to pause on these questions that otherwise tend to rush ahead in a kind of faith that ‘now everything will be better because we have new technology’. When the overarching word is safety, questions about surveillance and privacy risk being deprioritized.

Netherlands invokes emergency law to seize Chinese chip manufacturer

Published 14 October 2025
– By Editorial Staff
Nexperia, a Dutch semiconductor manufacturer, specializes in large-scale production of microchips for the automotive industry and consumer electronics.
2 minute read

The Netherlands has seized the Chinese-owned semiconductor manufacturer Nexperia by invoking a never-before-used emergency law.

Owner Wingtech Technology is hitting back hard, calling the decision “excessive intervention driven by geopolitical bias” – and stating that the company has followed all laws and regulations.

Late Sunday evening, the Dutch Ministry of Economic Affairs announced that an emergency law had been used to nationalize Nexperia. The decision is justified by the risk that the company’s microchips could “become unavailable in an emergency” – something the government says threatens the economic security of both the Netherlands and the EU.

The Dutch government describes the measure as “highly exceptional” and refers to “recent and acute signals of serious governance shortcomings and actions” within the company.

Nexperia produces microchips in large volumes for the automotive and electronics industries. The company was previously part of Dutch electronics giant Philips before being acquired by Chinese company Wingtech Technology.

The news had an immediate effect on the stock market. Wingtech shares fell 10 percent in Shanghai on Monday and were forced into a trading halt after reaching the daily decline limit.

In a later filing to the Shanghai Stock Exchange, Wingtech announced that the company’s control over Nexperia would be temporarily restricted due to the Dutch decision and court rulings affecting decision-making and operational efficiency.

Trade tensions between EU and China

Wingtech condemned the Dutch government’s action in a now-deleted WeChat post. The company called the decision “excessive intervention driven by geopolitical bias, rather than a fact-based risk assessment”.

The technology company added that it intends to take action to protect its rights and seek government support.

The nationalization comes amid a period of sharply increased trade tensions between the EU and China. Over the past year, the two sides have clashed over what the EU claims is Beijing’s dumping of key goods and industrial overproduction. China has in turn accused the EU of protectionism.

Last week, China tightened its restrictions on exports of rare earth metals and magnets – a move that could further damage European industries dependent on these materials.

You’re being tracked – everywhere you drive

Published 11 October 2025
– By Naomi Brockwell
6 minute read

You might have noticed cameras quietly going up all over your city or town. They blend in, mounted next to traffic lights or tucked onto poles by the roadside. They’re easy to miss. But they’re part of a growing network designed to track where everyone drives, at all times.

If you drive to a protest, that trip is logged.
Visit an opposition party meeting? That visit is in a searchable government database.
Go to a shooting range, a reproductive health clinic, a mosque, your lover’s home, your child’s school… every movement is documented.

Welcome to the world of automatic license plate readers, or ALPRs.

How it works

Automatic license plate readers were once a niche police tool. But today, thanks to companies like Flock Safety, they’ve spread into neighborhoods, HOAs, schools, and businesses across the country.

They’re not just cameras; they’re part of a cloud-based surveillance system. Every scan is uploaded to Flock’s servers. The footage (including the license plate, the time, the location, and even the make, model, and color of your vehicle) becomes part of a centralized, searchable database.

Police are one of Flock’s primary clients. When a department buys a Flock subscription, they’re not just getting access to cameras in their city. They’re getting access to a national database of vehicle movements.

Here’s how it works:
If a police department agrees to share its own Flock camera data, it can then search data from every other participant that has done the same.

The result is a real-time surveillance grid.
One that logs where millions of people drive.
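
To make that data flow concrete, here is a minimal illustrative sketch in Python. It is not Flock’s actual schema or API; every class, field, and function name below is hypothetical, chosen only to mirror the description above (plate, time, location, vehicle details, and reciprocal sharing between participants).

from dataclasses import dataclass
from datetime import datetime

# Hypothetical record mirroring the fields described above.
@dataclass
class PlateScan:
    plate: str
    timestamp: datetime
    latitude: float
    longitude: float
    camera_owner: str   # e.g. a police department, an HOA, or a business
    make: str
    model: str
    color: str

def search_network(scans, plate, sharing_participants):
    # A participant that shares its own data can search every sighting
    # recorded by every other participant that has opted in.
    return sorted(
        (s for s in scans
         if s.plate == plate and s.camera_owner in sharing_participants),
        key=lambda s: s.timestamp,
    )

# Sorting the matches by timestamp turns isolated sightings into a
# movement timeline for that vehicle.

Nothing in this sketch is sophisticated, and that is the point: once the scans sit in one searchable pool, reconstructing where a car has been is a few lines of code.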

Wait… Is this even legal?

Let’s talk about the Fourth Amendment. It was written to protect us from exactly this: blanket surveillance without a warrant.

“The right of the people to be secure in their persons, houses, papers, and effects, against unreasonable searches and seizures, shall not be violated, and no Warrants shall issue, but upon probable cause, supported by Oath or affirmation, and particularly describing the place to be searched, and the persons or things to be seized”.

The Fourth Amendment and its warrant requirement are a safeguard against unchecked power. The point is to protect you from government searches (whether of your location, your movements, or other personal information) unless there’s a specific reason, backed by probable cause, and approved by a judge.

If police want to track your phone for more than a moment, or access your location history, they need a warrant. The Supreme Court made that clear in Carpenter v. United States.

The court ruled that the government can’t access historical location data from your phone without a warrant, even if that data was collected by a third party.

They also ruled in U.S. v. Jones that tracking your car with a GPS device counts as a Fourth Amendment search.

Put those two decisions together, and the logic is clear:
If tracking your car in real time requires a warrant…
And accessing location data held by third parties requires a warrant…
Then using license plate readers to retroactively trace your movements without a warrant seems like a clear constitutional violation.

And yet, that’s exactly what’s happening.

Police can search databases like those aggregated by Flock without any judicial oversight.

Even if the camera that captured your car was owned by an HOA, a school, or a private business, in many cases, the footage is fed into the same system law enforcement accesses, with no warrant required and no notification to the person being searched.

The government isn’t supposed to go back and retrace your movements using data it didn’t have a warrant for at the time, and it generally isn’t allowed to reconstruct your steps from location data covering more than, at most, a few days. Courts have ruled that this kind of retroactive surveillance violates your reasonable expectation of privacy.

So how is this current situation even allowed? It’s not clear that it is. But until we get a ruling, agencies are relying on some semantic acrobatics to justify it.

The loophole

Well, here’s the loophole: the government isn’t building the dragnet.

Private companies are.

The government can’t track your location, but private companies can.
Flock can legally install cameras in neighborhoods, collect location data, and store it all in their own systems.
But they can’t install those cameras on light poles, traffic intersections, or public roads without government approval.

So what happens?
The government gives them permission.
In many cases, the city even pays for the cameras.
And in return, the government gets access to the data.

It’s a perfect circle:
The government can’t build a dragnet directly, because the Constitution forbids it.
Private companies can’t scale a dragnet without government infrastructure.
So they join forces, each side helping the other bypass the restrictions meant to protect our civil liberties.

How law enforcement justifies it

Here’s how it typically works:

A city contracts with Flock or another ALPR vendor. They buy a certain number of cameras. The vendor installs them, usually on city infrastructure like streetlights or utility poles.
Some cities even mount them on garbage trucks or buses, so they sweep entire neighborhoods as they move.

The vendor maintains the cameras, and all footage is uploaded to the cloud.
And if you’re a police department that has opted in to share footage, then you in turn get access to all the footage from everyone who has also opted in.

If law enforcement were the ones actively setting up and operating these cameras, it would be much harder to argue this isn’t a government search, one that triggers a warrant requirement under the Constitution.

So what do they do instead? They argue:

“We’re not collecting this data — the vendor is. We’re just accessing publicly available information”.

But there’s nothing passive about that.
If you’re procuring the cameras, approving the locations, and contracting to receive the footage, you’re not a bystander — you’re an active participant in the surveillance.
In fact, one could argue you’re actually building the system.

The mosaic theory

There’s also something called the “mosaic theory” of privacy.

The mosaic theory says that while one tile might not show much, if you put enough tiles together you see the whole picture of someone’s life.

In constitutional terms, each individual bit of information might be legal to gather on its own. But once you combine all those bits into a full picture of someone’s life, the collection can amount to an unconstitutional search.

For example, it might be legal to take a picture of someone’s car in public. But imagine a scenario where someone takes thousands of pictures of your car and, from those pictures, is able to reconstruct your travel patterns.
At that point, it’s not just “observation”. It’s a search, and the constitutional protections should kick in.
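
As a toy illustration of that mosaic effect, consider a handful of individually unremarkable sightings of the same car. The data below is invented for the example and not drawn from any real system:

from collections import Counter
from datetime import datetime

# Each tile of the mosaic: one timestamped sighting of the same car.
sightings = [
    ("2025-03-03 08:02", "school"),
    ("2025-03-03 17:45", "mosque"),
    ("2025-03-04 08:01", "school"),
    ("2025-03-04 18:10", "clinic"),
    ("2025-03-05 08:03", "school"),
]

# No single photo reveals much, but counting sightings by hour and place
# exposes a routine: a weekday-morning school run, evening visits.
pattern = Counter(
    (datetime.strptime(t, "%Y-%m-%d %H:%M").hour, place)
    for t, place in sightings
)
for (hour, place), count in pattern.most_common():
    print(f"seen near the {place} around {hour:02d}:00 on {count} day(s)")

Each entry on its own is a legal, public observation; the pattern they form together is exactly what the mosaic theory says deserves constitutional protection.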

At what cost?

Supporters of automatic license plate readers often cherry-pick their success stories. ALPRs are marketed as tools to stop crime and protect children.

But we can’t just look at the benefits of this technology. We must also weigh the costs.

The problem with mass surveillance infrastructure is what happens when the wrong people inherit this system:

Imagine the most dangerous person you can think of in power.
Now imagine they inherit the surveillance network you just said yes to.

The stakes are too high to ignore

We need to get back to warrant requirements.

We need real checks and balances.
Because a dragnet system that monitors hundreds of millions of innocent people is a huge danger to freedom.

Jen Barber from Jen’s Two Cents said it plainly:

“I now live in a community where I cannot get in or out of my neighborhood without a Flock camera watching me. I don’t need Big Brother building a lifetime record of my whereabouts. It’s none of their business”.

This isn’t just about your car.
It’s about whether privacy and freedom can exist outside your front door.

Freedom of movement isn’t really free if you can’t go anywhere without being tracked.
And I’m not quite ready to give up my freedom of movement yet.

 

Yours in privacy,
Naomi

Naomi Brockwell is a privacy advocate and professional speaker, MC, interviewer, producer, and podcaster specializing in blockchain, cryptocurrency, and economics. She runs the NBTV channel on Rumble.
