The other side of autonomous weapons: using artificial intelligence to enhance IHL compliance
Author:
Peter Margulies
In:
The impact of emerging technologies on the law of armed conflict
Publisher:
Oxford: Oxford University Press, 2019
Physical description:
p. 147-174
Languages:
English
Abstract:
The role of autonomy and artificial intelligence (AI) in armed conflict has sparked heated debate. The resulting controversy has obscured the benefits of autonomy and AI for compliance with international humanitarian law (IHL). Compliance with IHL often hinges on situational awareness: information about a possible target's behavior, nearby protected persons and objects, and conditions that might compromise the planner's own perception or judgment. This chapter argues that AI can assist in developing situational awareness technology (SAT) that will make target selection and collateral damage estimation more accurate, thereby reducing harm to civilians. The chapter breaks SAT down into three roles. Gatekeeper SAT ensures that operators have the information they need. Cancellation SAT responds to contingent events, such as the unexpected presence of civilians. The most advanced system, behavioral SAT, can identify flaws in the targeting process and remedy confirmation bias. In each of these contexts, SAT can help fulfill IHL's mandate of "constant care" in the avoidance of harm to civilian persons and objects.