Robots Get Personal
By Daniel Burrus
Technology Futurist Keynote Speaker, Business Strategist and Disruptive Innovation Expert
Robots are about to take a big leap forward, and the price tag associated with having an amazing artificial intelligence by your side is about to drop dramatically. I’ve written before about the proliferation of industrial robots over the last three decades, which do everything from picking our fruit to building our cars. But in this article, I’ll focus on the explosion of development taking place in consumer robotics, which will make Roombas look like Model Ts.
What will be driving the expansion of consumer robots? Simply put, the robots’ “brains,” “sensors,” and “bodies” are getting better and cheaper, and if we look at the not-too-distant future, they’re on a collision course with human-level functionality.
The Brain
Let’s examine the robots’ growing “brains,” by which I mean on-board processors linked via wireless connection to a smart, Siri-like e-assistant that can give the robot a voice. The software that creates digital personalities and artificial “thought” in e-assistants is becoming increasingly sophisticated. Siri was the first consumer e-assistant to hit the big time, with Google and Microsoft quickly releasing their own versions, but there are a few on the horizon that will give the current players especially stiff competition.
One that I’ve talked about here before is IBM’s Watson, a cognitive supercomputer with artificial intelligence that has natural language processing ability and therefore the capacity to derive meaning from huge swaths of both structured and unstructured written information. Watson is designed to attack problems and interact with humans with the same kind of attention to nuance we expect from a human brain. You may have noticed that Apple and IBM recently teamed up, and based on what you have just read, it’s not hard to see what they might be doing together.
Some other examples in this space include Viv from Viv Labs (a startup founded by the engineers who originally created Siri, which Apple later acquired), and Google Now (an artificial intelligence that has all of Google’s web search proficiency plus some philosophical insight courtesy of futurist Ray Kurzweil).
The growing sophistication of e-assistants in our lives is a trend I’ve written about over the past twenty years, and it was easy to spot because I applied the Hard Trend model of prediction I have used in my consulting work and writing for years. In addition to identifying e-assistants as a Hard Trend (a trend that will happen), I use the Three Digital Accelerators to put accurate timeframes on their advancements. The Three Accelerators are bandwidth, digital storage, and processing power. Because those three accelerators are always increasing exponentially in capacity and availability while their price drops at the same rate, it’s a sure bet that the technologies resulting from their increased capacity will come about in a predictable way.
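The compounding logic behind the Three Digital Accelerators can be sketched in a few lines of code. This is an illustrative model only: the two-year doubling period and the starting values below are assumptions for the sake of example, not figures from Burrus’s research.

```python
def project(initial_capacity, initial_unit_price, years, doubling_period=2.0):
    """Project capacity and unit price forward, assuming capacity doubles
    and price per unit of capacity halves every `doubling_period` years."""
    factor = 2 ** (years / doubling_period)
    return initial_capacity * factor, initial_unit_price / factor

# After a decade under these assumptions: 5 doublings, so 32x the
# capacity at 1/32 of the original price per unit.
capacity, unit_price = project(initial_capacity=1.0,
                               initial_unit_price=100.0,
                               years=10)
print(f"capacity: {capacity:.0f}x, unit price: {unit_price:.3f}")
```

The point of the sketch is that when both curves compound at the same rate, the price-performance ratio improves as the *square* of either curve alone, which is why capabilities that look exotic today become consumer products on a predictable schedule.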
So the “brains” of the robots are rapidly getting more and more adept at interacting with human beings, understanding and anticipating our needs, and becoming more and more specialized for tasks in a variety of sectors, including the domestic, personal-assistant arena. But when it comes to interacting with robots on a day-to-day basis, just having a big brain isn’t enough.
The Body
The second factor that will drive the mass adoption of consumer robots is the robots’ “bodies,” by which I mean the human-robot interface. Can we ever truly approach and interact with robots with the same ease and comfort we do our friends, advisers, and colleagues?
Japanese innovators saw a need among elderly consumers for low-maintenance companion animals with whom they could make an emotional connection. The solution was to create robotic pets designed to look and feel like small dogs and even baby seals – essentially, loving creatures that place a low demand on their “caregivers,” don’t make messes, and don’t need to be fed or taken on walks.
Way back in 1999, I was one of the first to purchase the Sony AIBO, a robot dog with artificial intelligence that could see in color, hear in stereo, feel with its feet, walk on its four legs, understand commands, and, most impressively, learn and adapt to its environment. Sony sold 130,000 units over its seven-year product cycle, but what the company really wanted was to learn about the human-robot interface on a large scale. That was back then, before smartphones and Google.
Now, before I go any further, do I think real live cats and dogs are going to go the way of the rotary phone? Absolutely not. Not any more than I expect Watson to put all the world’s physicians out of work. New tech does not necessarily extinguish old tech completely. The key to unlocking technology’s potential is to integrate the old with the new to increase the value of both.
I’m bringing the subject of robotic pets into the picture as an example that, when you really think about it, isn’t so far-out after all. If you think “a robot could never possibly fulfill that need,” well, chances are you’re wrong. Robots are going to insinuate themselves into many aspects of our lives that seem unimaginable now, but I would argue are virtually inevitable.
Research into robot-human interaction through touch alone has led to even more advanced artificial pets, dubbed “cuddlebots.” If we can make a robot that feels and purrs like a cat, how long before we have one that smiles and hugs like a human? The answer is much sooner than you think.
In the years to come, we are much more likely to purchase and interact with a variety of consumer robots that can get real work done as well as play the role of cute companion pets. That’s because the interfaces between robots and humans are becoming more intuitive, tactile, and emotional.
So if you have an interest in taking advantage of the growing trend of robots and artificial intelligence getting cozy and personal in our everyday lives, now is a good time to begin.
©2015 Burrus Research, Inc. All Rights Reserved.
###
DANIEL BURRUS is considered one of the world’s leading technology forecasters and innovation experts, and is the founder and CEO of Burrus Research, a research and consulting firm that monitors global advancements in technology driven trends to help clients understand how technological, social and business forces are converging to create enormous untapped opportunities. He is the author of six books including The New York Times best seller Flash Foresight.