Governments warned on Monday that regulators looking to bring a new generation of artificially intelligent killing machines under control may not have much time left.
Autonomous weapons systems are proliferating rapidly, including on the battlefields of Ukraine and Gaza. Algorithms and drones are already helping military planners decide whether or not to strike targets. Soon, that decision could be handed over entirely to machines.
“This is the Oppenheimer moment of our generation,” said Austrian Foreign Minister Alexander Schallenberg, referring to J. Robert Oppenheimer, who helped invent the atomic bomb in 1945 and later became an advocate for control of the proliferation of nuclear weapons.
Civilian, military and technology officials from more than 100 countries gathered in Vienna on Monday to discuss how their governments can control the merging of artificial intelligence with military technology, two sectors that have lately galvanized investors and helped drive stock prices to all-time highs.
The spread of global conflict, coupled with financial incentives for companies promoting artificial intelligence, is making it harder to rein in killer robots, according to Jaan Tallinn, an early investor in Alphabet Inc.’s artificial intelligence platform DeepMind Technologies.
“The incentives of Silicon Valley may not align with the interests of the rest of humanity,” Tallinn said.
Governments around the world have taken steps to collaborate with companies integrating artificial intelligence tools into defense. The Pentagon is investing millions of dollars in AI startups. The European Union last week paid Thales SA to create an image database to help assess targets on the battlefield.
Tel Aviv-based +972 magazine reported this month that Israel is using an artificial intelligence program called “Lavender” to identify assassination targets. Following this story, which Israel disputes, UN Secretary-General Antonio Guterres said he was “deeply concerned” by reports of the use of AI in the military campaign in Gaza, and that no part of life-and-death decisions should be delegated to the cold calculations of algorithms.
“The future of killer bots is already here,” said Anthony Aguirre, a physicist whose 2017 short film predicting the technology’s trajectory has been watched by more than 1.6 million viewers. “We need an arms control treaty agreed to by the UN General Assembly.”
But supporters of diplomatic solutions are likely to be disappointed, at least in the short term, according to Alexander Kmentt, Austria’s top disarmament official and the architect of this week’s conference.
“The classic approach to arms control doesn’t work because we’re not talking about one weapon system, but a combination of dual-use technologies,” Kmentt said in an interview.
Rather than negotiating a grand new treaty, Kmentt suggested, countries may be forced to make do with the legal tools already at their disposal. Strengthening export controls and humanitarian law could help curb the spread of artificial intelligence weapons systems, he said.
In the long term, as technology becomes available to non-state actors and possibly terrorists, countries will be forced to write new rules, predicts Arnoldo Andre Tinoco, Costa Rica’s foreign minister.
“The easy availability of autonomous weapons removes the limitations that once kept the arms race to only a few players,” he said. “Now students with a 3D printer and basic programming knowledge can create drones capable of causing mass casualties. Autonomous weapons systems have forever changed the concept of international stability.”