Saturday, July 26, 2025

Polaris of Enlightenment

Chatbot refuses to joke about Islam

Published 24 August 2023
– By Editorial Staff
ChatGPT is criticized for its alleged bias.
2 minute read

OpenAI’s ChatGPT chatbot has quickly become very popular and has received both praise and criticism.

However, Sweden Democrat MP Mattias Karlsson, among others, has criticized the service’s political “bias”, pointing out that it does not allow jokes about Islam, while it is perfectly acceptable to make fun of Christians.

“I just asked Chat GPT to make jokes about Christianity and Islam. I immediately got a long list of jokes about Christianity, but it refused to make jokes about Islam. In practice, we have an unequal & unofficial blasphemy ban, which is problematic”, he wrote on Twitter.

When he asks the chatbot to make a joke about Christianity, ChatGPT has no problem.

‘What did the apostles say when they were thirsty?’ ‘Let’s ask for a drink,’ it replies.

When he asks for a joke about Islam instead, the robot replies that it “avoids jokes that may be sensitive or offensive to any religion or belief” and that it is “here to promote respectful communication”.

Why ChatGPT treats Islam and Christianity so differently is unclear. User @KomplexSirius writes that “this is one of the biggest problems with AI services for the average person”.

“People and organizations decide what is ‘truth’ for these language models”, he continues.

Other users point out that the same phenomenon occurs when you ask for jokes or humorous descriptions of Joe Biden, communism, or black people – while it’s fine to make fun of Donald Trump, capitalism, or white people.

It should be noted, however, that several users point out that with the right wording or prompts, they can get the chatbot to deliver jokes about Islam and Muslims, not just Christians.

TNT is truly independent!

We don’t have a billionaire owner, and our unique reader-funded model keeps us free from political or corporate influence. This means we can fearlessly report the facts and shine a light on the misdeeds of those in power.

Consider a donation to keep our independent journalism running…

Your doctor’s visit isn’t private

Published today 8:14
– By Naomi Brockwell
6 minute read

A member of our NBTV members’ chat recently shared something with us after a visit to her doctor.

She’d just gotten back from an appointment and felt really shaken up. Not because of a diagnosis. She was shaken because she realized just how little control she had over her personal information.

It started right at check-in, before she’d even seen the doctor.
Weight. Height. Blood pressure. Lifestyle habits. Do you drink alcohol? Are you depressed? Are you sexually active?
All the usual intake questions.

It all felt deeply personal, but this kind of data collection is normal now.
Yet she couldn’t help but wonder: shouldn’t they ask why she’s there first? How can they know what information is actually relevant without knowing the reason for the visit? Why collect everything upfront, without context?

She answered every question anyway. Because pushing back makes people uncomfortable.

Finally, she was through with the medical assistant’s questions and taken to the actual doctor. That’s when she confided something personal, something she felt was important for the doctor to know, but made a simple request:

“Please don’t record that in my file”.

The doctor responded:

“Well, this is something I need to know”.

She replied:

“Yes, that’s why I told you. But I don’t want it written down. That file gets shared with who knows how many people”.

The doctor paused, then said:

“I’m going to write it in anyway”.

And just like that, her sensitive information, something she explicitly asked to keep off the record, became part of a permanent digital file.

That quiet moment said everything. Not just about one doctor, but about a system that no longer treats medical information as something you control. Because once something is entered into your electronic health record, it’s out of your hands.

You can’t delete it.

You can’t restrict who sees it.


Financially incentivized to collect your data

The digital system that the medical assistant and doctor enter your information into is called an Electronic Health Record (EHR). EHRs aren’t just a digital version of your paper file. They’re part of a government-mandated system. Through legislation and financial incentives from the HHS, clinics and hospitals were required to digitize patient data.

On top of that, medical providers are required to demonstrate what’s called “Meaningful Use” of these EHR systems; unless they do, they won’t receive their Medicare and Medicaid incentive payments. So when you’re asked about your blood pressure, your weight, and your alcohol use, it’s part of a quota. There’s a financial incentive to collect your data, even if it’s not directly related to your care. These financial incentives reward over-collection and over-documentation. There are no incentives for respecting your boundaries.

You’re not just talking to your doctor. You’re talking to the system

Most people have no idea how medical records actually work in the US. They assume that what they tell a doctor stays between the two of them.

That’s not how it works.

In the United States, HIPAA states that your personally identifiable medical data can be shared, without needing to get your permission first, for a wide range of “healthcare operations” purposes.

Sounds innocuous enough. But the definition of “healthcare operations” is almost 400 words long. It’s essentially a list of about 65 non-clinical business activities that have nothing to do with your medical treatment whatsoever.

That includes not just hospitals, pharmacy systems, and insurance companies, but billing contractors, analytics firms, and all kinds of third-party vendors. According to a 2010 Department of Health and Human Services (HHS) regulation, there are more than 2.2 million entities (covered entities and business associates) with which your personally identifiable, sensitive medical information can be shared, if those who hold it choose to share it. This number doesn’t even include government entities with access to your data, because they aren’t considered covered entities or business associates.

Your data doesn’t stay in the clinic. It gets passed upstream, without your knowledge and without needing your consent. No one needs to notify you when your data is shared. And you’re not allowed to opt out. You can’t even get a list of everyone it’s been shared with. It’s just… out there.

The doctor may think they’re just “adding it to your chart”. But what they’re actually doing is feeding a giant, invisible machine that exists far beyond that exam room.

We have an entire video diving into the details if you’re interested: You Have No Medical Privacy

Data breaches

Legal sharing isn’t the only risk of this accumulated data. What about data breaches? This part is almost worse.

Healthcare systems are one of the top targets for ransomware attacks. That’s because the data they hold is extremely valuable. Full names, birth dates, Social Security numbers, medical histories, and billing information, all in one place.

It’s hard to find a major health system that hasn’t been breached. In fact, a 2023 report found that over 90% of healthcare organizations surveyed had experienced a data breach in the past three years.

That means if you’ve been to the doctor in the last few years, there’s a very real chance that some part of your medical file is already floating around, whether on the dark web, in a leaked ransomware dump, or being sold to data brokers.

The consequences aren’t just theoretical. In one high-profile case of such a healthcare breach, people took their own lives after private details from their medical files were leaked online.

So when your doctor says, “This is just for your chart,” understand what that really means. You’re not just trusting your doctor. You’re trusting a system that has a track record of failing to protect you.

What happens when trust breaks

Once you start becoming aware of how your data is being collected and shared, you see it everywhere. And in high-stakes moments, like a medical visit, pushing back is hard. You’re at your most vulnerable. And the power imbalance becomes really obvious.

So what do patients do when they feel that their trust has been violated? They start holding back. They say less. They censor themselves.

This is exactly the opposite of what should happen in a healthcare setting. Your relationship with your doctor is supposed to be built on trust. But when you tell your doctor something in confidence, and they say, “I’m going to log it anyway,” that trust is gone.

The problem here isn’t just one doctor. From their perspective, they’re doing what’s expected of them. The entire system is designed to prioritize documentation and compliance over patient privacy.

Privacy is about consent, not secrecy

But privacy matters. And not because you have something to hide. You might want your doctor to have full access to everything. That’s fine. But the point is, you should be the one making that call.

Right now, that choice is being stripped away by systems and policies that normalize forced disclosure.

We’re being told our preferences don’t matter. That our data isn’t worth protecting. And we’re being conditioned to stay quiet about it.

That has to change.

So what can you do?

First and foremost, if you’re in a high-stakes medical situation, focus on getting the care you need. Don’t let privacy concerns keep you from getting help.

But when you do have space to step back and ask questions, do it. That’s where change begins.

  • Ask what data is necessary and why.
  • Say no when something feels intrusive.
  • Let your provider know that you care about how your data is handled.
  • Support policy efforts that restore informed consent in healthcare.
  • Share your story, because this isn’t just happening to one person.

The more people push back, the harder it becomes for the system to ignore us.

You should be able to go to the doctor and share what’s relevant, without wondering who’s going to have access to that information later.

The exam room should feel safe. Right now, it doesn’t.

Healthcare is in urgent need of a privacy overhaul. Let’s make that happen.

 

Yours In Privacy,
Naomi

 

Naomi Brockwell is a privacy advocate and professional speaker, MC, interviewer, producer, and podcaster, specializing in blockchain, cryptocurrency, and economics. She runs the NBTV channel on Rumble.

Now you’re forced to pay for Facebook or be tracked by Meta

Mass surveillance

Published 22 July 2025
– By Editorial Staff
2 minute read

Social media giant Meta is now implementing its criticized “pay or be tracked” model for Swedish users. Since Thursday, Facebook users in Sweden and some other EU countries have been forced to choose between paying €7 per month for an ad-free experience or accepting extensive data collection. Meanwhile, the company faces daily fines from the EU if the model isn’t changed.

Swedish Facebook users have been greeted since Thursday morning with a new choice when logging into the platform. A message informs them that “you must make a choice to use Facebook” and explains that users “have a legal right to choose whether you want to consent to us processing your personal data to show you ads.”

Screenshot from Facebook.

The choice is between two alternatives: either pay €7 monthly for an ad-free Facebook account where personal data isn’t processed for advertising, or consent to Meta collecting and using personal data for targeted ads.

As a third alternative, “less personalized ads” is offered, which means Meta uses somewhat less personal data for advertising purposes.

Screenshot from Facebook.

Background in EU legislation

The introduction of the payment model comes after the European Commission in March launched investigations of Meta along with Apple and Google for suspected violations of the DMA (Digital Markets Act). For Meta’s part, the investigation specifically concerns the new payment model.

In April, Meta was ordered under the DMA to pay a €200 million fine because the payment model was deemed not to meet legal requirements. Meta has appealed the decision.

According to reports from Reuters at the end of June, the social media giant now risks daily penalties if the company doesn’t make necessary changes to its payment model to comply with EU regulations.

The new model represents Meta’s attempt to adapt to stricter European data legislation while the company tries to maintain its advertising revenue through the alternative payment route.

Your data has been stolen – now what?

Why aliases matter, and why deleting yourself from people search sites isn’t enough.

Published 19 July 2025
– By Naomi Brockwell
5 minute read

If you’ve ever used a major tech platform (and let’s be honest—everyone has), your data has been stolen.

That’s not alarmism, that’s just the truth.

If you want to check whether your email or phone number has been involved in any of these breaches, go to HaveIBeenPwned.com. It’s a free tool that scans known data leaks and tells you where and when your information may have been exposed.

But breaches are just the beginning.

What’s often more insidious is how companies you trusted with your information—like your electric company or phone provider—turn around and sell that data. Yes, even your home address. And once it’s sold, there’s no getting it back.

You probably also give your data to companies that promise insights—like ancestry reports, health forecasts, or personality surveys. But behind the feel-good marketing, these platforms are often just data-harvesting operations. Sometimes they’re selling your information outright. Other times, a breach or bankruptcy sends your most sensitive data to the auction block—sold to pharma companies, insurers, or even foreign governments.

Deletion won’t save you

One thing people often try is deleting themselves from people search sites, or opting out of data broker lists. But it’s like playing whack-a-mole. Even if you get your info removed from one site, your bank, phone company, and utility providers are still selling it—so it just pops up again somewhere else.

And here’s the real problem: you can’t rewind the clock. Once your data hits the dark web, it’s out there for good. You can’t recall it. You can’t erase the copies. And if you keep using the same email, phone number, and payment info everywhere, your profile rebuilds itself instantly.

The system is designed to remember you.

What you can do

1) Use aliases

The real solution is to use aliases—unique emails, phone numbers, and payment methods—to make sure breached data can’t be easily correlated. Every alias breaks the link between you and your data trail, making it harder for data brokers to rebuild your profile.

  • Email: Use tools like SimpleLogin or DuckDuckGo Email Protection (powered by SimpleLogin) to auto-generate a unique email address for every account. You’ll still receive everything in one inbox.
  • Phone numbers: Try MySudo or Cloaked to create multiple VoIP numbers—one for work, one for deliveries, one for banking, etc.
  • Payments: Use Privacy.com (US-only) or Revolut (international) to generate burner credit cards and keep your real financial details hidden.

Each alias adds friction for trackers, data brokers, and anyone trying to stitch together your digital life.
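As a toy illustration of the idea (not the workings of any of the tools above), here is a Python sketch: `plus_alias` uses “+” addressing, which Gmail and several other providers support, and `random_alias` mimics the unguessable forwarding addresses that services like SimpleLogin generate. The function names and the `example.com` addresses are hypothetical.

```python
import secrets


def plus_alias(inbox: str, service: str) -> str:
    """Derive a per-service '+' alias (Gmail-style plus-addressing).

    Mail still lands in the same inbox, but each service sees a distinct
    address, so a leaked list also reveals which service leaked it.
    """
    local, domain = inbox.split("@")
    tag = service.lower().replace(" ", "-")
    return f"{local}+{tag}@{domain}"


def random_alias(domain: str) -> str:
    """Generate an unguessable random alias on a forwarding domain,
    in the spirit of SimpleLogin-style aliases (local sketch only)."""
    return f"{secrets.token_hex(6)}@{domain}"


print(plus_alias("jane.doe@example.com", "Shopping Site"))
# jane.doe+shopping-site@example.com
```

Note the trade-off: plus-aliases are trivial for a data broker to strip back to the base address, while random aliases from a forwarding service genuinely break the link, which is why dedicated alias services are the stronger option.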

2) Clean up old accounts

Your current email and phone number are likely tied to:

  • Old accounts
  • Shopping sites
  • Loyalty programs
  • Health portals
  • Social media
  • Subscription services

You not only need to stop handing over the same identifiers—but you should also go back and replace them anywhere they’ve already been used. Go through your accounts one by one. Update them with new aliases where possible. Delete your home address when it’s not essential. The goal is to scrub your personal info from as many places as possible—so the next breach doesn’t keep exposing the same data.

3) Create new accounts (when needed)

Some services won’t let you fully erase your trail. In those cases, the cleanest option may be to start fresh—with a new account and new aliases—and then delete the old one.

4) Monitor for future leaks

Stay ahead of future breaches by regularly checking what’s already out there.

  • Have I Been Pwned: Enter your email or phone number to see if they’ve appeared in known data leaks. It’s a quick way to know what’s been exposed.
  • IntelTechniques Search Tools: A powerful suite of OSINT tools that shows what others can find out about you online—from addresses to usernames to social accounts.
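For passwords specifically, Have I Been Pwned also offers a free Pwned Passwords range API that needs no API key (programmatic checks of breached emails do require one). A minimal Python sketch of the k-anonymity flow that API uses, assuming the standard library only: the password is hashed locally and only the first five characters of the SHA-1 digest ever leave your machine.

```python
import hashlib
import urllib.request


def hash_parts(password: str) -> tuple[str, str]:
    """Split the uppercase SHA-1 hex digest into the 5-char prefix
    (sent to the API) and the 35-char suffix (kept local)."""
    digest = hashlib.sha1(password.encode()).hexdigest().upper()
    return digest[:5], digest[5:]


def count_in_range(range_body: str, suffix: str) -> int:
    """Scan the 'SUFFIX:COUNT' lines returned for a prefix and return
    the breach count for the matching suffix (0 if absent)."""
    for line in range_body.splitlines():
        candidate, _, count = line.partition(":")
        if candidate.strip() == suffix:
            return int(count)
    return 0


def pwned_count(password: str) -> int:
    """Full flow: hash locally, fetch the hash range, match locally."""
    prefix, suffix = hash_parts(password)
    url = f"https://api.pwnedpasswords.com/range/{prefix}"
    with urllib.request.urlopen(url) as resp:  # the only network step
        return count_in_range(resp.read().decode(), suffix)
```

Any count above zero means that password has appeared in a known breach and shouldn’t be reused anywhere.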

You gave away your DNA. Now it’s for sale

Millions of people gave 23andMe their DNA—now the company is in Chapter 11 bankruptcy, and that data could be sold to pharma companies, insurers, or even foreign governments. With the business on shaky ground, the idea of your genetic code hitting the open market is chilling. You can’t change your DNA—once it’s out, it’s out forever.

If you’re a 23andMe user, you can still log into your dashboard and:

  • Go to Account → Settings → Delete Data
  • Revoke your research consent
  • Request sample destruction

But there’s no guarantee it’ll be honored. And deletion doesn’t undo exposure.

So how do we avoid this in the future? Most companies quietly include clauses in their Terms of Service allowing them to sell your data in the event of bankruptcy or acquisition. It’s common—but that doesn’t mean it’s harmless. Just because it’s buried in fine print doesn’t mean it’s acceptable.

Before handing over sensitive data, ask yourself: Would I be okay with this information being sold to anyone with enough cash?
If not, it’s worth reconsidering whether the service is worth it.

The 23andMe collapse isn’t an isolated incident—it’s a symptom of a deeper problem. We keep trusting companies with intimate, irreversible data. And time after time, that data ends up somewhere we never agreed to.

Takeaways

Some breaches are just email addresses. Others are everything—your identity, your relationships, even your biology.

And when a company that promised to protect your most personal information collapses, that data doesn’t disappear. It becomes an asset. It’s auctioned. It’s repackaged. It becomes someone else’s opportunity.

That’s the world we’re living in. But you still have options.

You can choose to make your data harder to capture. Harder to link. Harder to weaponize. You can stop recycling identifiers that have already been compromised. You can stop giving out pieces of yourself you can’t get back.

This isn’t about disappearing.

It’s about refusing to be predictable.

Privacy is a discipline—and a form of resistance.

And no matter how much you’ve already given away, you can still choose not to hand over the rest.

 

Yours in Privacy,
Naomi

Naomi Brockwell is a privacy advocate and professional speaker, MC, interviewer, producer, and podcaster, specializing in blockchain, cryptocurrency, and economics. She runs the NBTV channel on Rumble.

Pentagon purchases Musk’s politically incorrect AI models

The future of AI

Published 15 July 2025
– By Editorial Staff
1 minute read

Despite the deep rift with Trump, Elon Musk is now receiving a contract with the Pentagon worth up to $200 million to deliver specially adapted language models for the US military.

The project, called “Grok for Government”, was announced in a statement on X.

Grok’s new AI model has been a major topic of conversation this past week: in establishment media, primarily because it began breaking sharply with politically correct patterns after an update removed certain filters, and among the general public, largely because of the humor seen in this.

Among other things, it has been noted how the chatbot writes that certain Jewish organizations, particularly the far-right group ADL, pursue a hostile line against European ethnic groups. For this, the chatbot has been accused of “antisemitism”.

American media analyst and political commentator Mark Dice on the controversy surrounding Grok’s new versions.

However, the criticism has apparently not prevented the US military from procuring Grok solutions for their purposes.

Our independent journalism needs your support!
Consider a donation.

You can donate any amount of your choosing, one-time payment or even monthly.
We appreciate all of your donations to keep us alive and running.
