Eve Massingham, Simon McKenzie and Rain Liivoja, members of the Law and the Future of War Research Group at the University of Queensland Law School, discuss the increasing role of AI and machine learning in military hardware in response to the 2019 report of the International Committee of the Red Cross (ICRC).
The ICRC’s 2019 report on International Humanitarian Law and the Challenges of Contemporary Armed Conflict acknowledges that AI and machine learning are playing an increasing role in military hardware across all domains. The ICRC is primarily concerned about autonomous weapon systems and the potential for ‘automatic target recognition’. But the ICRC’s proposal of a ‘human‐centred, and humanity‐centred, approach to the use of these technologies in armed conflict’ to ‘preserve human control’ could shape how these technologies interact with a range of international legal regimes. It forces us to confront an important conceptual issue: what is meant by ‘control’, and how does it relate to the military and legal concept of ‘command’? The answer to this question is key to understanding the law as it applies to the deployment and use of AI-enabled and autonomous platforms.
Read the full article, in which the researchers examine the limits of autonomy, the crucial role of command structures in ensuring accountability, and the need to develop appropriate legal standards sooner rather than later as more states engage with the technology.