As I, and no doubt everyone else, suspected, the urgencies of winning wars override concerns about killer robots. David Hambling reports in New Scientist (16 July 2022, paywall):
International attempts to regulate the use of autonomous weapons, sometimes called “killer robots”, are faltering and may be derailed if such weapons are used in Ukraine and seen to be effective.
No country is known to have used autonomous weapons yet. Their potential use is controversial because they would select and attack targets without human oversight. Arms control groups are campaigning for the creation of binding international agreements to cover their use, like the ones we have for chemical and biological weapons, before they are deployed. Progress is being stymied by world events, however.
Russia’s need to win in Ukraine makes concerns about killer robots secondary, whether that need stems from Putin’s egotism, from his alleged devotion to a dead Russian Orthodox mystic, or from a calculation that an overpopulated world makes food sources such as Ukraine, a very large food exporter, worth securing to ensure his legacy is viewed as positive.
A United Nations’ Group of Governmental Experts is holding its final meeting on autonomous weapons from 25 to 29 July. The group has been looking at the issue since 2017, and according to insiders, there is still no agreement. Russia opposes international legal controls and is now boycotting the discussions, for reasons relating to its invasion of Ukraine, making unanimous agreement impossible.
So the question becomes: which evolutionary technological path will see the emergence of “killer robots”, that is, solely AI-directed battlefield weapons? And what special undesirable characteristics will accompany them? At the moment, and I think in line with expectations, drones are the leading candidate. I’ve been hearing about ‘loitering munitions’ for months in reports on Putin’s War: drones that lurk overhead for extended periods and strike only when their operators see a target, or are informed of one by spotters. Everyone worries that the human element could be excluded in favor of an on-board “AI”, that is, recognition and decision-making elements. But that part may be unavoidable:
[Gregory Allen at the Center for Strategic and International Studies] says the extensive use in Ukraine of radio-frequency jamming, which breaks contact between human operators and drones, will increase the interest in autonomous weapons, which don’t need a link to be maintained.
Defensive tactics and technologies are no doubt under development even as we speak, but I haven’t heard much beyond this report.