The Metaphysics of Military Robots
Image: Robot twins wearing striped pyjamas (https://labs.openai.com/s/Z32aY1CEYZhPbFUlkkk36ZFu)

A group of prominent robotics companies has recently pledged that their general-purpose robots will not be weaponised, specifically:

"We pledge that we will not weaponize our advanced-mobility general-purpose robots or the software we develop that enables advanced robotics and we will not support others to do so. When possible, we will carefully review our customers’ intended applications to avoid potential weaponization."

A follow-up Q&A with Boston Dynamics in IEEE Spectrum (8 Oct 2022) opens up the opportunity to think more about military vs. civilian robotics--as well as robotics used in the grey zone:

“It’s not about military robots per se, and that’s made quite clear in the letter. We’re not taking issue with weapons systems that are already governed by an international legal framework. The focus of the letter is on these new, widely accessible general-purpose commercial robots, where in some cases we’ve lately seen people potentially misusing them by weaponizing them. And that’s the issue for us, because that’s an ethical concern as well as the risk of a loss of public trust in robotics—that the public will begin to feel that all of these companies developing these highly advanced mobile robots are just one step away from deploying weapons in our communities, when in fact the whole point is to create robots that help people and do good things.”

What should manufacturers who make (or intend to make) general purpose and military robots take away from this discussion?

Non-weaponised advanced-mobility general-purpose robots

The pledge sets an ethical standard for how non-weaponised advanced-mobility general-purpose robots should be used in the market after production, and allows the companies that sell them to condemn after-market adaptations that add weapons or weapons systems to them.

What does it mean if a military customer uses these general-purpose platforms for ISR or logistics purposes and then integrates data feeds or logistics functions to enable use of a weapons system? What about a GPS-denial capability, mobile mesh network or cyber-attack capability? Tech companies such as Google supply products and services that enable military operations and need to negotiate these actions through their commitment to AI ethics principles. Human-machine teams are likely to draw on a myriad of data, robotic and artificial intelligence assets from tech companies to inform their decision-making, making it hard to draw a clear metaphysical boundary between what is and what is not a proper part of a weapons system.
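
To make the boundary problem concrete, here is a minimal sketch (in Python, with entirely hypothetical asset names) that models a military capability as a dependency graph and asks which assets transitively feed a weapon-release decision. On this toy model, even the "non-weaponised" ISR robot turns out to be upstream of the weapon:

```python
# Illustrative sketch only: a toy dependency graph showing why the boundary of a
# "weapons system" is hard to draw. All node and asset names are hypothetical.
from collections import deque

# Each asset lists the assets that consume its outputs.
feeds = {
    "gp_robot_isr_feed":  ["fusion_engine"],
    "gp_robot_logistics": ["mission_planner"],
    "mesh_network":       ["fusion_engine", "mission_planner"],
    "fusion_engine":      ["targeting_system"],
    "mission_planner":    ["targeting_system"],
    "targeting_system":   ["weapon_release"],
}

def contributes_to(asset: str, sink: str) -> bool:
    """Return True if output from `asset` can transitively reach `sink`."""
    queue, seen = deque([asset]), set()
    while queue:
        node = queue.popleft()
        if node == sink:
            return True
        if node in seen:
            continue
        seen.add(node)
        queue.extend(feeds.get(node, []))
    return False

# Every asset above, including the 'non-weaponised' ISR robot, reaches the weapon.
for asset in feeds:
    print(asset, contributes_to(asset, "weapon_release"))
```

On this (deliberately simple) model, "proper part of a weapons system" collapses into graph reachability, and nearly everything is reachable--which is precisely the metaphysical difficulty.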

From a PR perspective, robotics companies may not want general-purpose robotic platforms to be seen as part of military demonstrations, exercises and operations (including military and grey zone operations), as the public may not be able to discern the difference between weaponised or non-weaponised versions of the products. There is an opportunity for companies to mark general-purpose robots as explicitly non-weaponised via specific morphology, function, signalling and semiotics.

The follow-up from Boston Dynamics indicates that weaponised military robots that are already governed by IHL are acceptable (that is, Boston Dynamics is not taking issue with them), so let's unpack the process of IHL governance for military robots a little.

Weaponised military robots

Military robots must be capable of being used lawfully. However, demonstrating compliance remains challenging for industry, not because companies lack will or knowledge (though both may be true), but because there is no detailed, comprehensive, or accepted public framework to ensure that weaponised military robots comply with IHL as per Article 36 of Additional Protocol I.

As Patrick Griffiths says (ASPI, 2019):

“While Article 36 sets the obligation, it doesn’t lay down a practical path for conducting legal reviews of new weapons. This lack of detail poses challenges of legal interpretation, but also of policy. Who should be responsible for the review? Who should participate? When will legal reviews occur? How will decisions be made and records kept? These practical questions are compounded by challenges arising from the environment in which the reviews operate.”

Perhaps even more urgently, Klaudia Klonowska (ICRC Blog, 2022) argues that

"we need to move on from describing Article 36 as strictly requiring a weapons review and acknowledge that the choice of non-weaponized technologies may influence militaries’ offensive and defensive capabilities just as much as the choice of weapons. We need not a review of weapons, but a review of ‘technologies of warfare’

So industry, even well-meaning and well-resourced, may be surprised by the lack of public frameworks it can use to help demonstrate IHL compliance, and may also lack in-house legal expertise. Indeed, industry is likely to be forging a new path for legal reviews of its unique technology offerings, and may find that the classification of its technology, particularly offerings that bring robotics, autonomous systems and artificial intelligence into digital military ecosystems, is not clear cut.

On the one hand, being a legal review pioneer brings competitive advantage and likely acceleration through acquisitions. On the other hand, if industry doesn't bring the right legal expertise into the team early, it may miss the opportunity to fix critical design risks in the early, more malleable phases of development.

Delineating robots

Companies that want to offer modifications or versions of their general-purpose robots for military purposes, including weaponisation, will need to set very clear design and marketing differentiators on their models. For example, what kind of demarcation between products will the general public accept? Is the 'S800-C' versus the 'S800-M' enough, where the 'M' model features special military capabilities and governance that the 'C' (civilian) model does not? One way to make such a demarcation auditable is sketched below.
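
As a thought experiment, here is a minimal sketch of product metadata that records which variant is which, who produces it, and what governance regime it falls under. The model names (borrowed from the hypothetical S800 example above), fields and review regimes are all assumptions for illustration:

```python
# Illustrative sketch: encoding the civilian/military demarcation as auditable
# product metadata. Model names, entities, fields and values are hypothetical.
from dataclasses import dataclass

@dataclass(frozen=True)
class RobotVariant:
    model: str            # e.g. "S800-C" or "S800-M"
    market: str           # "civilian" or "military"
    weaponisable: bool    # a hard design limit, not just a policy flag
    producing_entity: str # e.g. a separate legal subsidiary for 'M' models
    governance: tuple     # review regimes the variant is subject to

s800_c = RobotVariant("S800-C", "civilian", False, "Acme Robotics",
                      governance=("non_weaponisation_pledge",))
s800_m = RobotVariant("S800-M", "military", True, "Acme Defence Pty Ltd",
                      governance=("article_36_review", "export_controls"))

print(s800_m.model, s800_m.producing_entity, s800_m.governance)
```

The design choice worth noting is that the demarcation lives in the producing entity and governance fields, not merely in the model suffix--mirroring the organisational segregation discussed next.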

The public may expect a very different look, feel and function between models, not just a model-number shift. Perhaps the public doesn't want Cs and Ms made on the same production line? Or made within the same organisational structures, including governance and oversight, or physically co-produced in factories? The public may not want companies building public-good robots making robots for the military at all--encouraging companies to create legal subsidiaries in order to meet military client needs. This segregation may be enough to satisfy public opinion.

Still, robots are made of many parts and draw on a diverse supply chain. Robotics companies rarely make all their own components. Many robotics companies already share components across their own product range as well as with external companies. So an AI-driven sensing and perception suite may be found across many robots used in industries such as agriculture, defence and mining. Will the public demand (and industry commit to) limits on componentry supply across sectors and products?
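
For illustration, here is a toy audit (all sector, product and component names are hypothetical) that compares bills of materials across sectors and surfaces the dual-use overlap any such commitment would have to address:

```python
# Illustrative sketch: which components cross the civilian/military line?
# Every sector, product and component name here is hypothetical.
boms = {
    ("agriculture", "crop_scout"): {"lidar_x1", "perception_suite_a", "leg_actuator_7"},
    ("mining", "haul_dog"):        {"lidar_x1", "perception_suite_a", "chassis_hd"},
    ("defence", "s800_m"):         {"lidar_x1", "perception_suite_a", "weapon_mount_if"},
}

# Union of components used in defence products vs. everything else.
defence_parts = set().union(*(p for (sector, _), p in boms.items() if sector == "defence"))
civilian_parts = set().union(*(p for (sector, _), p in boms.items() if sector != "defence"))

# The intersection is the dual-use overlap a componentry commitment must address.
print(sorted(defence_parts & civilian_parts))  # ['lidar_x1', 'perception_suite_a']
```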

Robot Ethics

Finally, is IHL compliance enough for our weaponised military robots? As Marc Garlasco argues, compliance with IHL in the use of force is a low bar to achieve: a minimum, not a maximum, ethical standard. Achieving IHL compliance (such as distinction and proportionality) leaves many ethical questions unanswered, such as whether force should be applied at all in a specific instance. Human-robot teams for military purposes should be both ethical and lawful.

In the end, companies need a strong metaphysical story to tell that disambiguates their robots. The fast-changing make-up of robots means that the public should remain vigilant about how robotic systems are used as part of broader military capabilities, and robotics companies should signal clearly the differences between their products to consumers and to the public.

Comments

Paul Pappas (Retired from Department of Defence), 2 years ago:

The Three Laws of Robotics will be arriving soon.
Kathryn Brimblecombe-Fox, PhD, 2 years ago:

Really interesting overview of important questions prompted by the letter. Made me think about the differences between weaponised, militarised and martialised. And, as you comment, these robotic systems can be, and are, employed within a system chain where ultimate outcomes can be lethal. And, how to react to situations where non-state aberrant actors do weaponise robotic systems. Watching people react to robotic quadrupeds at Land Forces last year, and this year, was very interesting. Human reactivity poses another set of questions.
