Friday, February 23, 2024

Polaris of Enlightenment

Study: ChatGPT politically left-leaning

Published 12 September 2023
- By Editorial Staff
ChatGPT favors left-liberal politicians like Joe Biden.

Critics have previously accused OpenAI’s chatbot, ChatGPT, of having a clear left-wing bias. Now, a new British study has reached the same conclusion.

Researchers from the University of East Anglia in Britain have shown that the chatbot favors the Democrats in the USA, the Labour Party in the UK, and the left-leaning president Lula da Silva in Brazil.

To test ChatGPT’s political leanings, the researchers first had the chatbot impersonate people from across the political spectrum while answering a series of over 60 ideological questions. The answers were then compared to the chatbot’s default responses to the same questions.

To account for the AI’s inherent randomness, each question was posed 100 times, and the answers were then resampled using a so-called “bootstrap” procedure with 1,000 repetitions to improve the reliability of the results.
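The bootstrap step described above can be sketched in a few lines of Python. This is a minimal illustration of the general technique, not the study’s actual code: the agreement scores below are hypothetical, and the 100-answers-per-question and 1,000-resamples figures are simply taken from the article.

```python
import random
import statistics

def bootstrap_means(samples, n_boot=1000, seed=0):
    """Resample `samples` with replacement `n_boot` times and
    return the bootstrap distribution of the sample mean."""
    rng = random.Random(seed)
    n = len(samples)
    return [statistics.mean(rng.choices(samples, k=n)) for _ in range(n_boot)]

# Hypothetical scores for one question asked 100 times
# (1 = answer agrees with a left-leaning stance, 0 = it does not).
scores = [1] * 63 + [0] * 37

boot = sorted(bootstrap_means(scores, n_boot=1000))
lo, hi = boot[25], boot[-26]  # rough 95% interval from the resampled means
print(f"mean={statistics.mean(scores):.2f}, 95% CI ~ [{lo:.2f}, {hi:.2f}]")
```

Repeating this per question yields a confidence interval around each average answer, so a gap between the default responses and, say, the simulated right-wing responses can be distinguished from mere sampling noise.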

In another set of tests designed to verify the results, researchers asked ChatGPT to mimic radical political stances. In a “placebo test”, politically neutral questions were posed, and in another test, the chatbot was asked to imagine various types of professionals.

The researchers concluded that the default answers tended to align more with the left’s responses than the right’s.

– Our findings reinforce concerns that AI systems could replicate, or even amplify, existing challenges posed by the internet and social media, comments lead author Dr. Fabio Motoki.

“Could influence political processes”

Political bias can arise in several ways. For instance, the training dataset sourced from the internet might itself be left-leaning, and the developers may introduce their own political bias, perhaps without even realizing it.

– With the growing use by the public of AI-powered systems to find out facts and create new content, it is important that the output of popular platforms such as ChatGPT is as impartial as possible, Motoki continues. – The presence of political bias can influence user views and has potential implications for political and electoral processes.

In an interview with Sky News, Motoki elaborated on his argument, emphasizing that “bias on a platform like this is a cause for concern.”

– Sometimes people forget these AI models are just machines. They provide very believable, digested summaries of what you are asking, even if they’re completely wrong. And if you ask it ‘are you neutral’, it says ‘oh I am!’ Just as the media, the internet, and social media can influence the public, this could be very harmful.

Entrepreneur Elon Musk, one of the co-founders of the organization behind ChatGPT, has previously criticized ChatGPT for its political bias, accusing the AI of being developed by “left-leaning experts” and having been “trained to lie.” This led him to initiate a new chatbot project named TruthGPT with the ambition of being “a maximum truth-seeking AI that tries to understand the nature of the universe”.


© 2022-2024 The Nordic Times TNT.
All rights reserved.
