Ukraine has been called a “test lab for the future of war”. With the help of the US-based company Palantir, headed by a high-profile member of the steering committee of the infamous Bilderberg Group, drones have now been developed that automate killing and use AI to identify Russian soldiers.
Peter Thiel is not only a member of the steering committee of the globalist power network Bilderberg Group. His company Palantir, described by Time magazine as “the AI arms dealer of the 21st century”, has also developed technology that allows drones to detect Russian soldiers and other “hostile targets” without human intervention. Based on characteristics such as uniforms, weapons and human movement patterns, the drones can not only identify targets for their attacks but also carry the attacks out autonomously.
Ukrainian-American David Kirichenko, a researcher in cyberwarfare and military strategy at the neoconservative think tank Henry Jackson Society, believes that AI and technology operating without direct human intervention will play a greater role on the battlefields of the future, and that the war in Ukraine is a harbinger of this.
“Over time, the battlefield is becoming a clash of algorithms. As the world becomes more digitised, technology’s influence in warfare will only grow. Cheap drones have already transformed the battlefield, accelerating both sides’ need to adapt and develop new technological advancements”, he said.
“Ultimately, the Russia-Ukraine war has highlighted the need for NATO countries to embrace and adapt to the technological advancements seen in Ukraine, many of which are emerging from off-the-shelf commercial technologies. NATO must prepare for the future of warfare, where the first large-scale drone war is rapidly transitioning into the first AI-driven war”, he further writes.
“21st century AI arms dealer”
AI technology is said to have improved the ability of drones to “hit the enemy” from just under 50% in 2023 to nearly 80% in 2024 – and Palantir is described by Time magazine as crucial to this development.
One example is the Saker reconnaissance drone, which uses Palantir’s AI and is said to be able to identify different types of enemy targets on its own and relay this in real time to its command post, which decides when and how to strike.
Ukraine and its allies are said to be particularly excited about the AI’s ability to improve its capabilities on its own and “learn” to identify targets by watching video clips of Russian forces.
The Ukrainian Saker Scout drone is equipped with artificial intelligence. It can identify even camouflaged enemy military vehicles, locate their coordinates and transmit the data to the command center.
Once enemy targets have been acquired, the AI-powered reconnaissance drone…
— Lecomte (@Eric_Lecomte_) January 25, 2024
Serious moral aspects
Although the Ukrainian side is keen to use the new technology to better resist a numerically and militarily superior Russia, critics point out that there are serious moral aspects that should be considered.
For example, giving artificial intelligence the mandate to decide who counts as the “enemy”, and to act on that decision and thus automate the killing, could according to analysts lead to disastrous consequences. Among the many risks mentioned are the dangers to the civilian population: people could be misidentified, for example because of their “movement patterns”, and suddenly marked as hostile targets.
Defenders of the technology argue that it is ultimately humans who decide which targets to destroy, while critics counter that there is a high risk of over-reliance on a tool that lacks a human operator’s capacity to assess nuance on the battlefield.