Human oversight necessary for weapon systems with artificial intelligence

News - 04 April 2024 - Web editorial team

Artificial intelligence has become indispensable in our society. Think of ChatGPT, automated driving, and medical diagnosis. The Ministry of Defence also uses it, notably in image processing and air defence systems. Human oversight is required for these applications. Lieutenant Colonel Ilse Verdiesen researched how this can be implemented. On 3 April, she defended her thesis on the subject at TU Delft.

Weapon systems equipped with artificial intelligence, known as 'autonomous weapon systems', are increasingly deployed in warfare. Consider, for example, radar-controlled drones in Ukraine.

Such weapon systems offer many advantages, but they can also pose security risks, because responsibility for their decisions, actions, and effects is not always clear. According to Verdiesen, this should be clarified in the decision-making process; after all, a machine should not make decisions about life and death. "The structure I have developed assigns responsibility for the actions of the weapon system. This is done by identifying a supervisor. This reduces the chance of unintended consequences."

Verdiesen made this structure usable in a simulation environment. She identified and programmed a number of criteria that are important in military operations, such as the boundaries of an operational area, areas to be avoided, and weather conditions. Using her structure, a human supervisor can assess whether an autonomous weapon adheres to these criteria. If it does not, investigators can determine what went wrong so that it can be prevented in the future.
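To make the idea concrete, here is a minimal sketch, purely illustrative and not taken from the thesis, of how operational criteria of the kind mentioned above could be encoded and checked so that a human supervisor can review any violations. All names and values are hypothetical.

```python
# Illustrative sketch only: a simplified, hypothetical check of operational
# criteria (operational-area boundary, no-go zones, a weather limit).
# This is not Verdiesen's actual framework or code.
from dataclasses import dataclass

@dataclass
class Criteria:
    area_bounds: tuple        # (x_min, y_min, x_max, y_max) of the operational area
    no_go_zones: list         # list of (x_min, y_min, x_max, y_max) rectangles to avoid
    max_wind_speed: float     # weather limit in m/s

def inside(box, x, y):
    x_min, y_min, x_max, y_max = box
    return x_min <= x <= x_max and y_min <= y <= y_max

def check_action(criteria, x, y, wind_speed):
    """Return a list of violations a human supervisor could review."""
    violations = []
    if not inside(criteria.area_bounds, x, y):
        violations.append("outside operational area")
    if any(inside(zone, x, y) for zone in criteria.no_go_zones):
        violations.append("entered no-go zone")
    if wind_speed > criteria.max_wind_speed:
        violations.append("weather limit exceeded")
    return violations

# Example: reviewing one simulated position report
criteria = Criteria(area_bounds=(0, 0, 100, 100),
                    no_go_zones=[(40, 40, 60, 60)],
                    max_wind_speed=15.0)
print(check_action(criteria, x=50, y=50, wind_speed=10.0))
# -> ['entered no-go zone']
```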

Clear agreements

The Ministry of Defence is pleased that Verdiesen has defended her thesis on this topic. Artificial intelligence will influence all military operations. The organisation therefore considers it important to have clear international agreements on this matter. Verdiesen's thesis makes a valuable contribution to the international and national discussion on the subject.

TU Delft sees that Dutch and European intentions to defend their own and allied territory require a stable foundation of knowledge institutions and companies that provide the Netherlands with the right military knowledge, technology, and capabilities. Scientific insights into responsible human oversight of technologies such as AI systems at the Ministry of Defence contribute to this foundation.

Verdiesen hopes that her research will contribute to the responsible application of autonomous weapon systems and that it can also be translated to other AI applications in sectors such as healthcare, cybersecurity, and transportation.

Read the dissertation.