Keeping Robots and Humans Separate

At the beginning of this month, the Financial Times published an article about hybrid systems, in which automated processes and humans work together, and about the different approaches designers take to combining them.

The story begins with an account of how a pedestrian was killed by one of Uber's self-driving cars. The accident occurred when the pedestrian was crossing the road and the system, unable to react properly, defaulted to driver control. That driver, local police concluded, was distracted, possibly because they were watching a TV show on their smartphone. The accident may have been unavoidable even if the driver's attention had been fully on the road, given that research from Stanford University suggests it takes six seconds for a human driver to regain awareness and take back control of the vehicle.

The article outlines three approaches taken by those designing automation. The first, in use in the Uber case, treats the human as a 'backup' to the automated technology; the second always leaves sensitive decisions to the judgement of a flesh-and-blood person; and the third uses the AI merely as an aid to a person when it cannot handle a task on its own.

In previous posts on this blog, I have written extensively about how robots and humans can work together, taking as a starting point the position that robots are not there to replace humans but to help them by taking on the routine, mundane tasks that require little creativity and depend on data. That is the third approach outlined above. I have not tackled the other approaches, since they do not fall within the operating parameters of current Retresco technology.

So while in the past I have spoken about how automated technology and humans can come together, it is also important to talk about how robots and humans can be kept separate.

It is important to keep in mind that separation is not always feasible. This is because some tasks have potential consequences that are so serious that human oversight is a necessity. How serious? Look at the first paragraph of this post. This was not the first fatality involving self-driving cars, nor is it likely to be the last. And outside of this article, there are interesting questions as to where fault lies when such incidents occur. But, as Fortune points out in another article, the biggest risk with self-driving cars still comes from humans, not the cars themselves.

There will be a lot of debate over which jobs can be farmed out to automated technology, and whether they should be handed over wholly or only in part. But there are a few basic principles we should follow if we want to go down the path of separation.

Firstly, any task not overseen by a human should carry no risk of serious harm. If a machine can do something, that is great; but if the consequences of its actions could be serious and adverse, a rethink is necessary. Likewise, an automated system should never be in a position where it could create issues of libel. Again, this is a judgement call, and one that needs careful and considered thought.

The key to solving this is a clear and robust development process, planned correctly and with a definite objective. Blind spots should be identified at the planning stage, and the possible interpretations of the data should be solid, leaving no room for ambiguity. This requires conceptual work, but it prepares such systems for contingencies.

The data so far, however, shows that self-driving vehicles and automated content are still far less prone to error than their human-produced counterparts. But it will pay to be conscientious and realistic about the limitations of what we offer.


This article was originally published on Retresco's blog.
