
Your privacy choices affect others too

Mass surveillance

Privacy enjoys company.

Published 9 October 2024
– By Naomi Brockwell
4 minute read

I just interviewed an activist from Bahrain, and she told me that by the nature of her work and threat model, she’s lost many friendships. Her communication is highly sensitive, and she guards it as a matter of safety. But there are some people in her life who just refuse to use secure platforms when they talk with her. The conversation might not be about anything sensitive—they might just be old school friends. But by sending her messages on insecure platforms, they are revealing information to adversaries that might actually put her life in danger. So she’s had to cut many friendships as a result.

We live in a world where sharing has become second nature. We post our thoughts on social media, send quick texts without a second thought, and freely hand over personal information for the sake of convenience. Many of us shrug and say, “I have nothing to hide.” But it’s important to stop and consider that our privacy choices don’t just affect us—they impact those around us too.

Think about it. When you send a text message over an insecure channel like SMS, you weaken the privacy of the other person in that conversation too. Maybe your friend is cautious about their digital footprint, or perhaps they have concerns you’re unaware of. Perhaps you’re linking them to content they might prefer to keep private. By using unsecured channels, you might be exposing them to risks they never agreed to take.

It’s easy to dismiss privacy concerns as paranoia or the realm of those with “something to hide.” But privacy isn’t just about secrets. It’s about control over personal information and respecting others’ preferences and boundaries.

Consider the concept of the social graph—the network of relationships and interactions that map out our social connections. Companies and governments use this data to analyze behavior, predict trends, and market products to us. But it’s not just about selling us things; it’s also about shaping how we think and what we believe. By analyzing our connections, entities can target us with specific information or propaganda to influence our opinions, decisions, and even voting patterns.

When you’re carefree with your interactions, whether on public platforms or through insecure private messaging apps, you’re not just putting yourself on this map; you’re involving everyone you engage with. Your actions help create a detailed web that includes your friends, family, and colleagues. Tagging people in photos, mentioning them in posts, or messaging them on unsecured platforms can link others to ideas, beliefs, or movements that they might not want to be publicly connected with. Choosing to use platforms where these interactions are tracked, monitored, or potentially exposed can have unintended consequences, revealing personal affiliations and potentially impacting the lives of those around us.
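
To make the idea concrete, here is a minimal, purely illustrative Python sketch (the names and the message log are invented, and this is not any real platform’s pipeline): using nothing but metadata about who messaged whom, it rebuilds a contact graph, ranks each person by how many connections they have, and infers links between people who never exchanged a single message.

```python
# Purely illustrative sketch: all names and records below are invented.
# It shows how message metadata alone (who contacted whom, never the content)
# is enough to reconstruct a social graph and rank people by connectedness.
from collections import defaultdict
from itertools import combinations

# Hypothetical metadata records: (sender, recipient). No message bodies needed.
message_log = [
    ("alice", "bob"), ("alice", "carol"), ("bob", "carol"),
    ("carol", "dave"), ("dave", "eve"), ("alice", "bob"),
]

# Build an undirected adjacency map: an edge links two people who interacted.
graph = defaultdict(set)
for sender, recipient in message_log:
    graph[sender].add(recipient)
    graph[recipient].add(sender)

# Degree = number of distinct contacts; a crude centrality score.
degree = {person: len(contacts) for person, contacts in graph.items()}
print(sorted(degree.items(), key=lambda kv: -kv[1]))

# Mutual contacts reveal ties between people who never messaged each other.
for a, b in combinations(list(graph), 2):
    shared = (graph[a] & graph[b]) - {a, b}
    if shared and b not in graph[a]:
        print(f"{a} and {b} are linked through {', '.join(sorted(shared))}")
```

The same idea, applied at scale to billions of records and enriched with timestamps and locations, is what makes the social graph so revealing, even when the content of every message stays private.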

The reality is that we don’t truly know what someone else’s threat model is, what risks they’re facing, or what social circles might make them an unwitting target. So when people reveal their privacy preferences, you should listen.

We often forget that in the digital age, we’re all interconnected. Your decision to overshare or to use less secure communication doesn’t exist in a vacuum. It can have real consequences for others. Maybe your coworker doesn’t want their association with a particular topic or group to be public knowledge. Perhaps a friend is trying to keep a low profile for personal or professional reasons.

So what’s the solution?

Awareness and mindfulness. Before you hit ‘send’ on that message or ‘post’ on that update, consider who else might be affected. Use secure communication methods when sharing sensitive information. Respect others’ wishes when they express concerns about privacy.

By embracing end-to-end encrypted platforms, we allow others to speak freely and be their authentic selves without fear of surveillance or data collection. Secure communication channels enable open dialogue, letting everyone explore ideas and express thoughts without feeling repressed. When we choose these platforms, we not only protect our own privacy but also create a safe space for those we interact with to be genuine and honest. It’s a simple switch that empowers all of us to connect more meaningfully.

We don’t have to live off the grid or become hermits to protect ourselves and others. A little thoughtfulness goes a long way. By being mindful of how our actions impact those around us, we foster a culture that values and protects everyone’s right to privacy.

In the end, it’s not just about guarding our own information but about respecting the autonomy and preferences of others. Let’s remember that our digital choices ripple out in ways we might not immediately see. Your privacy practices matter—not just to you, but to everyone you connect with.

 

Naomi Brockwell

Naomi Brockwell is a privacy advocate and professional speaker, MC, interviewer, producer, and podcaster specializing in blockchain, cryptocurrency, and economics. She runs the NBTV channel on YouTube.


AI surveillance in Swedish workplaces sparks outrage

Mass surveillance

Published 4 June 2025
– By Editorial Staff
In practice, it is possible to analyze not only employees' productivity, but also their facial expressions, voices and emotions.
2 minute read

The rapid development of artificial intelligence has not only brought advantages – it has also created new opportunities for mass surveillance, both in society at large and in the workplace.

Even today, unscrupulous employers use AI to monitor and map every second of their employees’ working day in real time – a development that former Social Democratic politician Kari Parman warns against, calling for decisive action to combat it.

In an opinion piece in the Stampen-owned newspaper GP, he argues that AI-based surveillance of employees poses a threat to staff privacy and calls on the trade union movement to take action against this development.

Parman paints a bleak picture of how AI is used to monitor employees in Swedish workplaces, where technology analyzes everything from voices and facial expressions to productivity and movement patterns – often without the employees’ knowledge or consent.

“It’s a totalitarian control system – in capitalist packaging”, he writes, continuing:

“There is something deeply disturbing about the idea that algorithms will analyze our voices, our facial expressions, our productivity – second by second – while we work”.

“It’s about power and control”

According to Parman, there is a significant risk that people in digital capitalism will be reduced to mere data points, giving employers disproportionate power over their employees.

He sees AI surveillance as more than just a technical issue and warns that this development undermines the Swedish model, which is based on balance and respect between employers and employees.

“It’s about power. About control. About squeezing every last ounce of ‘efficiency’ out of people as if we were batteries”.

If trade unions fail to act, Parman believes, they risk becoming irrelevant in a working life where algorithms are taking over more and more of the decision-making.

To stop this trend, he lists several concrete demands. He wants to see a ban on AI-based individual surveillance in the workplace and urges unions to introduce conditions in collective agreements to review and approve new technology.

Kari Parman previously represented the Social Democrats in Gnosjö. Photo: Kari Parman/FB

“Reduced to an algorithm’s margin of error”

He also calls for training for safety representatives and members, as well as political regulations from the state.

“No algorithm should have the right to analyze our performance, movements, or feelings”, he declares.

Parman emphasizes that AI surveillance not only threatens privacy but also creates a “psychological iron cage” where employees constantly feel watched, blurring the line between work and private life.

At the end of the article, the Social Democrat calls on the trade union movement to take responsibility and lead the resistance against the misuse of AI in the workplace.

He sees it as a crucial issue for the future of working life and human dignity at work.

“If we don’t stand up now, we will be alone when it is our turn to be reduced to an algorithm’s margin of error”, he concludes.

Dutch opinion leader targeted by spyware attack: “Someone is trying to intimidate me”

Mass surveillance

Published 1 May 2025
– By Editorial Staff
According to both Eva Vlaardingerbroek and Apple, it is likely that the opinion leader was attacked because of her views.
3 minute read

Dutch opinion maker and conservative activist Eva Vlaardingerbroek recently revealed that she had received an official warning from Apple that her iPhone had been subjected to a sophisticated attack – of the kind usually associated with advanced surveillance actors or intelligence services.

In a social media post, Vlaardingerbroek shared a screenshot of Apple’s warning and drew parallels to the Israeli spyware program Pegasus, which has been used to monitor diplomats, dissidents, and journalists, among others.

– Yesterday I got a verified threat notification from Apple stating they detected a mercenary spyware attack against my iPhone. We’re talking spyware like Pegasus.

– In the message they say that this targeted mercenary attack is probably happening because of ‘who I am and what I do’, she continues.

The term mercenary spyware is used by Apple to describe advanced surveillance technology, such as the notorious Pegasus software developed by the Israeli company NSO Group. This software can bypass mobile security systems, access calls, messages, emails, and even activate cameras or microphones without the user’s knowledge.

Prominent EU critic

Although Apple does not publicly comment on individual cases, the company has previously confirmed that such warnings are only sent when there is a “high probability” that the user has been specifically targeted. Since 2021, the notifications have mainly been sent to journalists, human rights activists, political dissidents, and officials at risk of surveillance by powerful interests.

Vlaardingerbroek has long been a prominent EU critic, known for her sharp criticism of EU institutions and the bloc’s open-border immigration policy. She insists that the attack is likely politically motivated:

– I definitely don’t know who did it. It could be anyone – name a government that doesn’t like me, name an organization that doesn’t like me. Secret services, you name it.

– All I know for sure right now is that someone is trying to intimidate me. I have a message for them: It won’t work.

“There must be full transparency”

The use of Pegasus-like programs has been heavily criticized by both governments and privacy advocates. The tools, originally marketed for counterterrorism, have since been reported to be used against journalists and opposition leaders in dozens of countries.

In response, Apple sued NSO Group in 2021 and launched a system to warn affected users. However, the company notes that these threats are “rare” and unlike common malware.

The Vlaardingerbroek case is now raising questions about whether such technology is also being used in European domestic political conflicts, and the organization Access Now is calling on authorities in the Netherlands and at the EU level to investigate the attack.

– There must be full transparency. No one in a democratic society – regardless of political views – should be subjected to clandestine spying for expressing opinions or participating in public discourse, said a spokesperson.

Neither Apple nor the Dutch authorities have commented publicly on the case. Vlaardingerbroek says she has not yet seen any signs that data has actually been leaked, but has taken extra security measures.

Swedish government proposes wiretapping children without criminal suspicion

Mass surveillance

Published 1 May 2025
– By Editorial Staff
The government's own investigator proposed that only the Swedish Security Service (Säpo) should be allowed to eavesdrop on children without criminal suspicion – but this is not enough, according to the government.
2 minute read

Gang crime continues to plague Sweden, with recurring bombings, shootings and contract killings spreading fear in society, without those in power managing to get a grip on crime.

Criminal gangs often use minors to carry out serious crimes. For this reason, the Tidö parties (the center-right coalition government) want to give police the authority to wiretap children under the age of 15 – even in cases where there is no specific suspicion of a crime.

During a press conference, the government stated that the trend in society is bleak, that “serious crime is penetrating lower and lower down the age scale” and that children are increasingly “playing central roles in the commission of serious crimes”.

Currently, police are not allowed to use “secret coercive measures” against children under the age of 15 – which allegedly hinders police work when investigating murders and bombings.

At a press conference on Wednesday, representatives of the Tidö parties confirmed that they want to change the legislation so that children, too, can be wiretapped – both when they are themselves being investigated for crimes and for “preventive purposes”, that is, without any actual suspicion of a crime.

– These are far-reaching proposals. But they are justified by developments in society, said Minister for Justice Gunnar Strömmer (M), and continued:

– It is about preventing crime, but also about reaching those who are behind it and who exercise control via children’s cell phones.

Dismisses own investigator’s limitations

The government’s own legal investigator had recommended that only the Swedish Security Service (Säpo) be allowed to use wiretapping without suspicion of a crime. However, the government disagrees, arguing that it is “absolutely necessary” for regular police to also be allowed to wiretap children if they can be linked to serious organized crime.

The government maintains that fighting gang crime is more important than protecting the integrity of minors. Strömmer stated that “there are very significant risks in allowing the current reality to continue as it is”.

The change in the law is proposed to come into force this fall and to apply for at least five years, after which it will be evaluated.

Although most people seem to agree that organized crime needs to be fought, many object to the Moderate-led government’s repeated reliance on expanded wiretapping and surveillance. Critics also point out that there is a real risk that the surveillance apparatus will be abused in the future or applied arbitrarily and without legal certainty.

Amazon updates privacy settings – all voice data to be stored in the cloud

Mass surveillance

Published 26 March 2025
– By Editorial Staff
Amazon itself states that it saves users' voice recordings in order to improve the service.
1 minute read

As of March 28, some Echo devices will no longer be able to process voice data locally – all voice information will be sent to Amazon’s cloud service, regardless of the user’s wishes.

Echo is a series of smart devices, including speakers, developed by Amazon. The device records what you say and sends it to Amazon’s servers to be stored and analyzed, allegedly to improve the service. Privacy settings have previously allowed some devices to process voice data locally without sending it to Amazon.

In an email to Echo users, shared on Reddit, Amazon announced that the ability to process voice commands locally is being removed. Instead, all recordings will be sent to the cloud for processing, as Sweclockers has reported.

If users don’t actively change their settings before March 28, the setting will automatically be switched to “do not save data”. This means that Amazon will still collect and process voice data, but that it will be deleted once Alexa has handled the request. However, it is unclear how long the information will be stored before it is actually deleted.

Amazon states that voice data is needed to train the company’s AI model, Alexa Plus. At the same time, the company promises that all previously saved voice data will be deleted if the user has the “do not save data” feature enabled.
