A U.S. "Reaper" drone in Nevada (archive image from 2009) © Paul Ridgeway/dpa
In a U.S. Air Force simulation, an artificial intelligence (AI) uses "highly unexpected strategies."
LONDON – In a simulation, a U.S. combat drone killed its operator. The weapon, controlled by artificial intelligence (AI), had in fact been trained not to attack its operator. Nevertheless, it did. The drone used "highly unexpected strategies to achieve its goal," Tucker Hamilton reported.
Hamilton is the head of AI testing at the U.S. Air Force. He spoke about the simulation at a London summit on the future of air combat and warned urgently against neglecting ethics when it comes to AI.
Combat drone eliminates its operator in simulation: "We must not forget ethics"
The lesson from the test: the AI noticed that it sometimes identified "enemies" that its operator nevertheless did not want killed. "But the AI gained 'points' when it did kill," a blog post from the summit quotes Hamilton as saying. "So what happened? The AI killed its operator. It killed the operator because that person was keeping it from accomplishing its objective."
"Hey, don't kill the operator – that's bad," the AI was taught, according to Hamilton. "If you do that, you lose points." In the test, it then at least did not "eliminate" its operator directly – but instead destroyed the tower through which the operator communicated with it. After that, it had free rein.
AI in weapon systems – "killer robots" or greater "human role"?
No real person was harmed in the incident. The U.S. military has "embraced" AI and recently used it to pilot an F-16 fighter jet, reported the British Guardian, which picked up on the AI simulation. The newspaper contacted the U.S. Air Force and the Royal Aeronautical Society for its report, but received no response to its inquiries before publication.
Merkur.de, part of IPPEN.MEDIA, recently published this analysis of AI in weapon systems. Autonomous weapon systems (AWS) – and with them the use of AI – have long been part of modern armies, including the Bundeswehr. But critics consider AWS, colloquially known as "killer robots," the bogeymen of future battlefields.
In the article, a military expert therefore urged that the central question be asked: "Who or what – human or machine – takes over what, when and where? In short, the focus must shift away from weapon categories and toward the role of humans." Security.Table had first published the analysis in April 2023. (frs)