But drones have also pointed to a major vulnerability in the Russian invasion, now entering its third week. Ukrainian forces have used a Turkish-made remote-controlled drone called the TB2 to great effect against Russian forces, firing guided missiles at Russian rocket launchers and vehicles. The paraglider-sized drone, which relies on a small crew on the ground, is slow and unable to defend itself, but it has proven effective against a surprisingly weak Russian air campaign.
This week, the Biden administration also said it would supply Ukraine with a small US-made loitering munition called the Switchblade. Equipped with explosives, cameras, and guidance systems, this single-use drone has some autonomous capabilities but relies on a person to decide which targets to engage.
But Bendett questions whether Russia would unleash an AI-powered drone with advanced autonomy in such a chaotic environment, especially given how poorly coordinated the country’s overall air campaign appears to be. “The Russian military and its capabilities are now being severely tested in Ukraine,” he said. “If the [human] ground forces, with all their sophisticated intelligence gathering, can’t really understand what’s happening on the ground, how could a drone?”
Several other military experts question the alleged capabilities of the KUB-BLA.
“The companies that produce these loitering drones tout their autonomous features, but often the autonomy involves flight corrections and maneuvers to hit a target identified by a human operator, not autonomy as the international community would define an autonomous weapon,” said Michael Horowitz, a professor at the University of Pennsylvania who tracks military technology.
Despite such uncertainties, AI in weapon systems has become a contentious issue of late, as the technology is quickly making its way into many military systems, for example to help interpret input from sensors. The US military maintains that a person should always make lethal decisions, but the country also opposes a ban on the development of such systems.
For some, the appearance of the KUB-BLA shows that we are on a slippery slope towards increased use of AI in weapons that will eventually take humans out of the equation.
“We will see even more proliferation of such lethal autonomous weapons unless more Western countries support a ban on them,” said Max Tegmark, a professor at MIT and co-founder of the Future of Life Institute, an organization that campaigns against such weapons.
However, others believe that the situation in Ukraine shows how difficult it will really be to use advanced AI and autonomy.
William Alberque, director of Strategy, Technology and Arms Control at the International Institute for Strategic Studies, says that given the success Ukraine has had with the TB2, the Russians are not ready to deploy more advanced technology. “We’re seeing Russian forces getting owned by a system they shouldn’t be vulnerable to.”
This post, “Russian assassin drone in Ukraine sparks fear over AI in warfare,” was originally published at https://www.wired.com/story/ai-drones-russia-ukraine