Cybersecurity, the 5th Dimension in Warfare: A Strategic Approach Series #3 - AI, Ethics, and Lethal Autonomous Weapon Systems (AWS)
High-tech nations, including Russia, China, the UK, the USA, Israel, and others such as Turkey, have been developing autonomous weapons that can detect and kill targets without human intervention.
Last year, in Libya, a Turkish-manufactured autonomous drone, the STM Kargu-2, was thought to have "identified and hunted down" soldiers loyal to the Libyan general as they retreated from Libya's capital, Tripoli. The drone uses machine learning for object classification.
The IAI Harop is a loitering kamikaze munition developed by the MBT division of Israel Aerospace Industries. It is an anti-radiation drone that can autonomously home in on radio emissions and attack targets by self-destructing into them. The drone can either operate fully autonomously, using its anti-radar homing system, or in a human-in-the-loop mode. If a target is not engaged, the drone returns and lands itself back at base.
According to the Stockholm International Peace Research Institute (SIPRI), a think tank, the Harop was one of 49 deployed systems that could detect possible targets and attack them without human intervention.
SIPRI published a report on specific control measures that can compensate for the unpredictability of AWS and lower the risks to civilians and fighters, taking different scenarios into account. It identifies three types of control measures on (L)AWS as necessary for ensuring human control over the use of force: (1) controls focusing on the design of the weapon, (2) controls on the environment it is used in, and (3) controls on the way the user interacts with it.
A case against killer robots was first brought to the CCW in 2013 by a coalition of nine NGOs called the Campaign to Stop Killer Robots. Since then the coalition has grown to 89 NGOs from 49 countries. The CCW's progress, however, is not keeping pace with the development of these weapons and the risks they pose.
In 2017, a negotiating mandate for the 2019 CCW meetings was blocked by the US, Israel, Russia, and Australia.
In 2018, all of the states except Russia were prepared to accept a proposal for two full weeks of deliberations on killer robots the following year. Russian officials argued that more than one week was unnecessary, since they did not agree with predictions that fully autonomous weapons systems would emerge in the coming years. They also dismissed the value of the discussion, saying that no one is actually developing these weapons.
Meanwhile, Russian President Vladimir Putin was quoted by Russia Today in 2018 as saying that the future belongs to artificial intelligence and that whoever masters it first will rule the world:
"Artificial intelligence is the future, not only for Russia, but for all humankind. It comes with colossal opportunities, but also threats that are difficult to predict. Whoever becomes the leader in this sphere will become the ruler of the world." Besides Putin, General Gerasimov, Chief of the General Staff of the Russian Armed Forces, has said that Russia seeks to completely automate the battlefield.
The United Nations Convention on Certain Conventional Weapons (CCW or CCWC) last discussed the laws pertaining to these lethal autonomous weapons in December 2021. The UN Secretary-General called for the convention to deliver a coordinated and swift response to autonomous weapons. No formal position has been published as of now.
Zachary Kallenborn wrote an article, "Applying arms control frameworks to autonomous weapons." He poses nine questions that, in my opinion, can help achieve a risk-based approach and bring ethics into autonomous weapons; a minimal scoring sketch follows the list and link below.
1. How does an autonomous weapon decide who to kill?
2. What role do humans have?
3. What payload does an autonomous weapon have?
4. What is the weapon targeting?
5. How many autonomous weapons are being used?
6. Where are autonomous weapons being used?
7. How well tested is the weapon?
8. How have adversaries adapted?
9. How widely available are autonomous weapons?
You can read his article at https://www.brookings.edu/techstream/applying-arms-control-frameworks-to-autonomous-weapons/
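To make that risk-based framing concrete, here is a minimal, hypothetical sketch that encodes the nine questions as a simple scoring rubric. The questions come from Kallenborn's article; the scoring scale, field names, and review thresholds are my own illustrative assumptions, not part of his framework.

```python
# Hypothetical sketch: Kallenborn's nine questions as a risk-assessment rubric.
# The 0-3 scale, weights, and band thresholds below are illustrative assumptions.
from dataclasses import dataclass, fields

@dataclass
class AutonomousWeaponRiskProfile:
    # Each field is scored 0 (low concern) to 3 (high concern) by a human reviewer.
    target_selection_opacity: int   # 1. How does it decide who to kill?
    lack_of_human_role: int         # 2. What role do humans have?
    payload_severity: int           # 3. What payload does it carry?
    target_ambiguity: int           # 4. What is it targeting?
    number_deployed: int            # 5. How many are being used?
    environment_complexity: int     # 6. Where is it being used?
    testing_gaps: int               # 7. How well tested is it?
    adversary_adaptation: int       # 8. How have adversaries adapted?
    proliferation_risk: int         # 9. How widely available is it?

    def total_risk(self) -> int:
        # Sum of all nine scores (0-27).
        return sum(getattr(self, f.name) for f in fields(self))

    def review_band(self) -> str:
        score = self.total_risk()
        if score >= 18:
            return "unacceptable without redesign"
        if score >= 9:
            return "requires enhanced human control and oversight"
        return "standard review"

# Example: a loitering munition with human-in-the-loop engagement.
profile = AutonomousWeaponRiskProfile(2, 1, 2, 1, 1, 2, 1, 1, 2)
print(profile.total_risk(), profile.review_band())
```

Even this rough rubric shows how the questions can translate into comparable risk levels; in practice each answer would come from a structured review with domain experts rather than a single number.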
Artificial Intelligence briefly explained
Artificial intelligence (AI) is the ability of a computer to simulate human intelligence in tasks such as visual perception, natural language processing and translation, speech recognition, and decision making.
There is a difference between AI and machine learning: machine learning is a subset of AI. We already live in a world surrounded by machine learning. It gives systems the ability to learn automatically, focusing on computer programs that can access data and use it to learn for themselves without being explicitly programmed.
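As a concrete illustration of learning from data rather than from hand-written rules, here is a minimal sketch assuming the scikit-learn library and its built-in iris dataset; the choice of model and parameters is illustrative only.

```python
# Minimal sketch of "learning from data without being explicitly programmed":
# no classification rules are written by hand; the model infers them from
# labelled examples. Assumes scikit-learn is installed.
from sklearn.datasets import load_iris
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier
from sklearn.metrics import accuracy_score

# Labelled examples: flower measurements (inputs) and species (labels).
X, y = load_iris(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3, random_state=0)

# The decision rules are learned from the training data.
model = DecisionTreeClassifier(random_state=0)
model.fit(X_train, y_train)

print("accuracy on unseen examples:", accuracy_score(y_test, model.predict(X_test)))
```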
Machine learning enables the analysis of massive quantities of data, and when combined with other AI and cognitive technologies it becomes even more effective at processing large volumes of information. Machines can now recognize objects and translate speech in real time.
Deep learning is a subset of machine learning and usually refers to deep artificial neural networks, a family of algorithms used for tasks such as image and sound recognition.
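For a sense of what such a deep neural network looks like in code, here is a minimal sketch assuming the PyTorch framework; the layer sizes and the dummy input are illustrative, and a real image recognizer would be trained on a large labelled dataset.

```python
# Minimal sketch of a deep artificial neural network for image recognition.
# It stacks layers of learned feature detectors and ends in a classification layer.
import torch
import torch.nn as nn

class SmallImageClassifier(nn.Module):
    def __init__(self, num_classes: int = 10):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 16, kernel_size=3, padding=1),   # low-level edges and textures
            nn.ReLU(),
            nn.MaxPool2d(2),
            nn.Conv2d(16, 32, kernel_size=3, padding=1),  # higher-level shapes
            nn.ReLU(),
            nn.MaxPool2d(2),
        )
        self.classifier = nn.Linear(32 * 7 * 7, num_classes)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        x = self.features(x)
        return self.classifier(x.flatten(start_dim=1))

# Forward pass on a dummy 28x28 grayscale image: output is one score per class.
model = SmallImageClassifier()
scores = model(torch.randn(1, 1, 28, 28))
print(scores.shape)  # torch.Size([1, 10])
```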
Pamela Gupta, President of OutSecure, is a global cybersecurity and AI risk strategist. She leads Technology and Governance initiatives in Security & Privacy for data protection and for emerging risks such as IoT and AI.
She is a leader in governance and cyber technologies with a record of creating unprecedented risk mitigation initiatives that help organizations, including some of the top Global Fortune 500 companies, achieve their business objectives.
In Q4 2020, she founded the WiCyS Trusted AI Affiliate for the global non-profit Women in Cybersecurity (WiCyS), launching an initiative to tackle a critical and extraordinarily complex problem: building trust in AI.
She serves as Co-Chair of the NIST Smart City (GCTC) Security & Privacy group.
Ms. Gupta has published a revolutionary risk-based, holistic AI governance framework: the Artificial Intelligence Transparency, Integrity, Privacy & Security (AI TIPS) model.