Killer robots

For most of human history war has been a giant meat grinder: masses of humans and their sidekicks trying desperately to kill each other with sticks, rocks, swords and now bombs and guns. The promise of precision weapons is to reduce the overall horror of war, but on this trajectory, is Skynet about to go online?

Is it acceptable for a machine to make life-and-death decisions in war? Many situations in war are grey, and that is a real challenge when it comes to autonomous weapons. How would a robot know the difference between what's legal and what's right? How would it know when to tell the difference?

When people worry about autonomous weapons, one of the fears is robots running amok and killing civilians, the sort of thing you see in Terminator movies. Yet autonomous weapons have actually been around for a long time. Landmines, for example, are not intelligent, but once they are armed and placed by a human, they work on their own. One of the arguments for autonomous weapons is that, just as self-driving cars may someday reduce accidents on the roads, machines could do the same in war. Autonomous weapons could avoid civilian casualties, making war more precise and more humane. In World War II, nations bombed entire cities and killed hundreds of thousands of civilians; one of the unfortunate realities of the technology at the time was that bombing was not very precise. Today, advanced militaries can place a bomb precisely where they want to.

Moving forward, militaries are developing next-generation robotic combat systems that would use more autonomy, though how much autonomy is not always clear from their plans. Senior Russian military leaders have said their intention is to build fully roboticized units capable of independent operations. The US Navy already has in its fleet a ship called the Sea Hunter, a fully autonomous vessel. Swarms are the next step in military robotics, and people are experimenting with what this might look like and how it would dramatically change military tactics and warfare. How do you fight with a swarm? What commands do you give a swarm? How do swarms even fight each other? What are the best tactics for swarm warfare?

The idea of handing a lethal weapon to a computer would seem, on the face of it, risky and problematic. We want to be mindful of the risks, but we also don't want to foreclose the possibility that there may be ways to make war more precise and more humane, and to save civilian lives in the process.

As this technology evolves, we have to think about how involved humans should be when technology takes a life. Are there things we don't want machines to decide without us? Some may argue that there are things machines can't deal with: situations that are too complex and require human judgement and moral understanding. For example, the laws of war don't set an age for combatants, so a robot would not know the difference between a five-year-old and a thirty-five-year-old. A robot programmed to comply perfectly with the laws of war would kill regardless of age.

If we fought a war and no one felt bad about the killing, if no one slept uneasily at night afterwards, what would that say about us and about the fact that we were killing other human beings?

Right now the United States is not building lethal autonomous weapons. War presents many complex moral challenges, and I don't think those challenges should be handed to any machine we see today or in the foreseeable future: not because machines cannot make those decisions, but because they shouldn't.

These moments of humanity seem important. They seem like something we wouldn’t want to give up. It’s an opportunity for humans to exercise empathy and recognise the humanity of the other side. If the history of war has taught us anything, it’s that just because we don’t want a technology to be used for killing people, that doesn’t mean it won’t be.

We've seen throughout history that in peacetime militaries often say, "we're not going to use this kind of technology". Before World War I they said, "we won't use poison gas", yet once the shooting started and poison gas was available, militaries used it. Prior to World War II, militaries said they wouldn't carry out aerial attacks on cities… In the course of interviewing experts for "Army of None", Larry Schuette, the former director of research at the Office of Naval Research, asked the really important question about the attack on Pearl Harbor: "when we're thinking about autonomous weapons, is it December 6th or December 8th?" Is it before or after these game-changing events that might change the way we think about risk and what we're willing to tolerate in warfare?



