Course: AI Accountability: Build Responsible and Transparent Systems


Moral and relational reasoning

- [Instructor] You may be familiar with the three laws of robotics that Isaac Asimov wrote about in many of his science fiction books. The idea is that these were rules hardwired into robots to make them helpful and not dangerous to humanity. The first one, law one, was essentially do no harm. Specifically, a robot may not injure a human being or, through inaction, allow a human being to come to harm. Law number two was obey orders. A robot must obey the orders given it by human beings except where such orders would conflict with the first law. And law three, protect itself. A robot must protect its own existence as long as such protection does not conflict with the first or second laws. And the idea is that with these three things, you've got this really wonderful, benevolent, helpful machine. Of course, this is science fiction, and so a lot of people have riffed on this and come up with other…