Telegram CEO: “Criminals are abusing our platform”

Internet censorship

Published 7 September 2024
– By Editorial Staff
Critics fear that Durov has been pressured to introduce more censorship and surveillance.
4 minute read

Telegram’s founder and CEO, Pavel Durov, now regrets that criminals are exploiting his platform and promises to do what he can to make Telegram a “safer” place – a process that has reportedly already begun internally.

Meanwhile, Telegram has quietly updated its FAQ, replacing promises that chats are private and that data from them will not be shared with outsiders with instructions on how users can report “suspicious” messages.

In late August, Durov was arrested by French police and released after four days on the condition that he not leave the country while the investigation against him was ongoing.

The reason for Durov’s arrest was that he was not considered to be doing enough to censor, monitor and otherwise prevent criminals from using Telegram for criminal activities – and that, as CEO, he was allegedly personally responsible for crimes committed on Telegram, which is why he reportedly faced a multi-year prison sentence.

Now, critics say that the pressure from the French police has paid off and that Durov has been pushed to introduce more censorship and surveillance on the messaging platform. In Telegram’s quietly updated FAQ, the sentence “All Telegram chats and group chats are private amongst their participants. We do not process any requests related to them” appears to have been removed or hidden.

Instead, users are now greeted with the text “All Telegram apps have ‘Report’ buttons that give users a way to flag illegal content for the app’s moderators”, followed by instructions on how to report posts they want reviewed.

“Always open to dialogue”

That more censorship and surveillance may be in store for Telegram also seems to be confirmed by a statement from Durov himself yesterday.

“Last month I got interviewed by police for 4 days after arriving in Paris. I was told I may be personally responsible for other people’s illegal use of Telegram, because the French authorities didn’t receive responses from Telegram”, he begins, saying that he was “surprised” to learn about this.

Durov stresses that official representatives already respond to requests from the EU and other authorities, and that countries “unhappy” with the way the platform is run have long had established ways to initiate legal proceedings against Telegram.

“We’ve been committed to engaging with regulators to find the right balance. Yes, we stand by our principles: our experience is shaped by our mission to protect our users in authoritarian regimes. But we’ve always been open to dialogue”, he continues, explaining that they have left countries where they have not been able to work with authorities to find “the right balance between privacy and security”.

“When Russia demanded we hand over “encryption keys” to enable surveillance, we refused — and Telegram got banned in Russia. When Iran demanded we block channels of peaceful protesters, we refused — and Telegram got banned in Iran. We are prepared to leave markets that aren’t compatible with our principles, because we are not doing this for money. We are driven by the intention to bring good and defend the basic rights of people, particularly in places where these rights are violated”, Durov added.

“Already started the process”

At the same time, he says that “we hear voices saying that it’s not enough”, and that it has become easier for criminals to abuse the platform as it has grown very quickly.

“That’s why I made it my personal goal to ensure we significantly improve things in this regard. We’ve already started that process internally, and I will share more details on our progress with you very soon. I hope that the events of August will result in making Telegram — and the social networking industry as a whole — safer and stronger”.

Exactly how or what will be “improved” is not clear, but many interpret the move as a sign that Durov has been pressured or persuaded to introduce more censorship and surveillance on Telegram – speculating that the aim is either to prevent the app from being shut down and banned in France and other EU countries, or to spare Durov the risk of imprisonment, heavy damages or other personal punishment for messages written on the platform.


Swedish police urge parents to delete chat apps from children’s phones

Organized crime

Published 13 June 2025
– By Editorial Staff
2 minute read

Ahead of the summer holidays, the Swedish police are warning that criminal gangs are using social media to recruit young people into crime. On Facebook, the authorities have published a list of apps that parents should keep a close eye on – or delete immediately.

Critics argue, however, that the list is arbitrary and that it is strange for the police to urge parents to delete apps that are used by Swedish authorities.

During the summer holidays, adults are often less present in young people’s everyday lives, while screen time increases. According to the police, this creates increased vulnerability. Criminal networks then try to recruit young people to handle weapons, sell drugs, or participate in serious violent crimes such as shootings and explosions.

To prevent this, the police have launched a national information campaign in collaboration with the County Administrative Board, and together the two authorities have compiled a list of mobile apps they believe pose a significant risk:

  • Delete immediately: Signal, Telegram, Wickr Me
  • Keep control over: Snapchat, WhatsApp, Discord, Messenger
  • Monitor closely: TikTok, Instagram

Digital parental presence

Maja Karlsson, municipal police officer in Jönköping, also emphasizes the importance of digital parental presence:

– We need to increase digital control and knowledge about which apps my child is using, who they are in contact with, and why they have downloaded different types of communication apps.

The police recommend that parents talk openly with their children about what they do online and use technical aids such as parental controls.

– There are tools available for parents who find it difficult. It’s not impossible, help is available, Karlsson continues.

Parents are also encouraged to establish fixed routines for their children and ensure they have access to meaningful summer activities.

“Complete madness”

However, the list has been met with harsh criticism from several quarters. Users point out that the Signal app is also used by the Swedish Armed Forces and question why the police list it as dangerous.

“If general apps like Signal are considered dangerous, the phone app and text messaging should be first on the list”, writes one user.

Critics argue that it is not the apps themselves but how they are used that is crucial, and find it remarkable that the police are arbitrarily and without deeper justification telling parents which messaging apps are okay to use and which are not.

“Complete madness to recommend uninstalling chat apps so broadly. You should know better”, comments another upset reader.

Macron seeks to ban children from social media

Internet censorship

Published 12 June 2025
– By Editorial Staff
While most people agree that children need to be protected online, many worry about arbitrary censorship and lack of legal certainty.
3 minute read

French President Emmanuel Macron wants to ban social media for children under the age of 15. At the same time, the European Commission has stated that such decisions are a national matter.

Macron advocates an EU-wide age verification system, but the Commission believes that responsibility lies with individual member states.

The president’s statement came late on Tuesday in response to a tragic knife attack in a Paris suburb where a teacher’s assistant was stabbed to death by a 14-year-old student.

Macron, who has previously advocated a ban on social media for younger users, now raised the tone further and called on the EU and its member states to act quickly.

– I’m giving us a few months to achieve European mobilization. Otherwise, I will negotiate with the Europeans so that we can do it ourselves in France, said the president.

However, the EU Commission’s response was clear: it is up to the French authorities to decide on the issue.

– Let’s be clear… wide social media ban is not what the European Commission is doing. It’s not where we are heading to. Why? Because this is the prerogative of our member states, Commission spokesman Thomas Regnier told reporters yesterday.

Big problem in Denmark

According to the EU’s General Data Protection Regulation (GDPR), member states have the right to set their own minimum age for when social media platforms can process personal data, as long as it is not below 13 years.

The GDPR is an EU law that regulates the handling of personal data and allows for national adaptations – for example, data may be processed for younger users if their parents give their consent.

– Of course, member states can go for that option, Regnier continued.

But introducing such a ban is easier said than done. Technical challenges make it difficult to verify users’ ages. In Denmark, for example, almost half of all children under the age of ten already have social media accounts. By the age of 13, almost everyone is registered, according to the country’s Minister for Digitalization, Caroline Stage Olsen.

Digital Services Act

In addition to the GDPR, the DSA (Digital Services Act) also plays an important role. The DSA is an EU law that regulates digital services and platforms and gives the Commission responsibility and powers to supervise large social media platforms. The law also requires that minors be protected online.

– We want to make the digital space safe but also need to tackle risks coming from it. This is where the DSA comes into place, Regnier claimed.

The Commission is currently working on EU-wide guidelines on how platforms should comply with the DSA on issues relating to the protection of minors. These guidelines are expected to be finalised before the summer break. At the same time, an age verification app is being developed and will be tested in five countries, including France.

Risk of censorship

Despite ongoing initiatives, France and several other EU countries have expressed frustration with the Commission’s pace of work. Denmark, which takes over the presidency of the EU Council of Ministers from July to December, plans to push for better protection for minors online in the coming months.

Although the Digital Services Act is praised by its proponents, the law has also been criticized for threatening the rule of law and freedom of expression. Critics warn that the DSA, which requires the rapid removal of illegal content, risks leading to arbitrary censorship and overblocking, where platforms delete even legal material for fear of sanctions.

There are also concerns that the rules could be abused to silence opposition and political dissent and that protecting children is not really the issue at stake. Since legal review often takes place after the fact, the protection of fundamental rights is also being called into question.

Why does France want to ban social media for children?

French officials have cited several reasons for a ban for children under 15:

  • Mental health: Concerns about increasing mental health problems among young people, linked to the impact of social media on self-esteem, sleep and concentration.
  • Bullying and harassment: Social media is often used as a platform for cyberbullying, which hits children particularly hard.
  • Exposure to harmful content: Children are at risk of being exposed to violent, sexual or extreme content without being able to handle it.
  • Data protection and privacy: Children's personal data is handled by commercial platforms without sufficient control or understanding.
  • School-related violence: The recent knife attack at a school was used as an example of how digital environments can contribute to radicalization or aggressive behaviour.
  • Parental responsibility and control: Macron says the current system makes it difficult for parents to know what their children are doing online.


US shuts down Biden’s censorship agency

Donald Trump's USA

Published 18 April 2025
– By Editorial Staff
Members of the Trump administration have long expressed concerns that freedom of expression is being restricted in various ways.
2 minute read

The United States has now officially shut down an agency that, according to Secretary of State Marco Rubio, was used by the Biden administration to systematically censor US citizens with uncomfortable views.

The Global Engagement Center (GEC) was established in 2016 within the US Department of State, with a mission to “recognize, understand, expose, and counter foreign state and non-state propaganda and disinformation”.

In December, the center was renamed Counter Foreign Information Manipulation and Interference (R/FIMI), but on Wednesday, Marco Rubio announced that it had been permanently shut down.

– Under the previous administration, this office, which cost taxpayers more than $50 million per year, spent millions of dollars to actively silence and censor the voices of Americans they were supposed to be serving, Rubio said.

– This is antithetical to the very principles we should be upholding and inconceivable it was taking place in America.

In an interview published Wednesday with conservative activist Mike Benz, Rubio explained that the GEC was initially intended as a tool to combat extremist propaganda from groups like al-Qaeda and ISIS, but that the operation later began “going after individual American voices”.

– We ended government-sponsored censorship in the United States through the State Department, he declared.

“Worst offender in US government censorship”

Rubio added that the Biden administration had supported groups that were “literally tagging and labeling voices in American politics – Ben Shapiro, The Federalist, others – tagging them as foreign agents”.

The GEC had an annual budget of $61 million and employed about 120 people. In December, Republican members of Congress refused to provide continued funding for the unit.

President Donald Trump and his supporters have long accused Democrats of using government institutions to silence conservative views online. In 2023, tech billionaire Elon Musk also criticized the GEC, calling it “worst offender in US government censorship & media manipulation” and “a threat to our democracy”.

Journalist Matt Taibbi also accused the center of trying to suppress discussions on COVID-19 under the pretext of fighting “Russian personas and proxies”.

As early as last year, a group of Republican members of Congress harshly criticized the GEC in a letter to then-Secretary of State Antony Blinken. The letter accused the center of bias in favor of “American progressives” and of trying to silence opinions that were “deemed politically inconvenient or disagreeable”.

US tech giants cave to EU censorship demands

Internet censorship

Published 29 January 2025
– By Editorial Staff
Most major tech giants have chosen to adopt the EU Code of Conduct – which is supposedly voluntary.
2 minute read

US tech companies Facebook, X and YouTube have agreed to step up their efforts to combat alleged online hate.

The agreement comes under the updated EU Code of Conduct, which is now integrated into the Union’s regulatory framework, the Digital Services Act (DSA).

Meta, Elon Musk’s X (formerly Twitter), Google’s YouTube and several other tech companies have agreed to strengthen their efforts against so-called hate content on their platforms. This includes enhanced efforts to detect and address unauthorized online speech and posts under the updated EU Code of Conduct.

– In Europe there is no place for illegal hate, either offline or online. I welcome the stakeholders’ commitment to a strengthened Code of conduct under the Digital Services Act (DSA), commented EU Tech Commissioner Henna Virkkunen.

The revised code, which is said to be voluntary for companies to sign up to, requires, among other things, faster handling of reports of suspected “cyber hate”. Companies commit to working with non-profit organizations and public bodies to review at least two-thirds of incoming reports within 24 hours. In addition, automated tools will be used to reduce the spread of so-called hate content, and companies will also provide detailed information on the role of algorithms and content recommendations.

In addition to the major platforms, other signatories include TikTok, LinkedIn and Twitch. The EU stresses that compliance with this code of conduct could influence how Union regulators apply the rules of the DSA, which entered into force in 2022 and aims, among other things, to combat illegal content and protect users’ safety online.

Threatened annulment of elections

The EU’s new measures are part of its broader strategy to regulate the tech sector and ensure that companies act in line with what the EU itself claims are “democratic values”. In the past, the EU has also introduced the Digital Markets Act (DMA), which aims to limit the dominance of tech giants and promote competition.

Another example of the EU’s regulatory zeal is the recently reported statement by former European Commissioner Thierry Breton that the Union can use the DSA to annul elections if foreign influence is suspected. Breton mentioned, among others, Elon Musk’s platform X as a potential channel of such influence during the upcoming German elections.

In addition, Google has been criticized for its introduction of “digital fingerprinting”, a technology that critics say undermines users’ privacy. The UK’s Information Commissioner’s Office (ICO) has expressed concerns about the technology and warned that it could be used for widespread surveillance.

Although the EU claims that the aim of the DSA and the updated code of conduct is to combat hate speech and protect democracy, critics have warned that the Union’s rules could severely restrict citizens’ freedom of expression.

By imposing strict requirements on platforms to monitor and filter content, there is a risk of creating a digital landscape where controversial views are censored and the climate of debate is negatively affected.
