Paying attention to the ridiculous
Bob Fesmire
One of the more thought-provoking discussions at ABB Customer World (held in Houston March 4-7) was a panel featuring ABB Chief Technology Officer Bazmi Husain, Chief Digital Officer Guido Jouret and EVgo Chief Technology Officer Ivo Steklac. The trio discussed technology, innovation, industry disruption and the value of low-dose paranoia during a lively hour moderated by ABB’s Allen Burchett. The session covered a lot of ground, but several key themes emerged.
The migratory patterns of technology have changed
Those of us of a certain age remember a time—it wasn’t that long ago, really—when exciting new technologies developed for industry would eventually make their way into consumer products. The rise of digital has inverted that process.
“Everything comes to our wrist before it goes anywhere else,” Husain observed.
Jouret added that the consumer tech industry has taken over what used to be the domain of large companies, and government before that. R&D at companies like Bell Labs, IBM and Xerox—whether on behalf of the government or their own product development efforts—produced a fantastical parade of technological innovation. Then things started to change.
On a visit to Xerox’s Palo Alto Research Center in 1979, Steve Jobs is famously reported to have seen prototypical versions of networked computing, the graphical user interface and the mouse. The experience blew his mind, as the story goes, and shaped the development of the Macintosh into the machine that made its debut in that iconic Super Bowl commercial years later.
It’s a great Promethean story, stealing fire from the tech gods and giving it to the people, and even if it’s not exactly true, it illustrates the emergence of consumer industries as the new wellspring of tech innovation. Xerox had developed everything Jobs saw, but the company failed to see the potential in the technology it had created and left it on the laboratory shelf.
In 2019, forty years on from Jobs’ pilgrimage to Xerox PARC, the flow of tech from consumer applications to industry is manifest in more ways than we could have imagined. ABB’s TXplore service, for example, uses a submersible, cloud-connected robot equipped with multiple cameras to inspect the oil-filled interior of large power transformers. It’s a dramatic improvement over the traditional method of draining the oil and sending a person into the tank, in terms of downtime, cost and safety. And the hardware used to pilot the robot? It’s a video game controller—an inexpensive, reliable and easily sourced device that is already familiar to many a field technician.
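To make the consumer-to-industrial handoff concrete, here is a minimal sketch of how an off-the-shelf game controller might be mapped to simple piloting commands. It uses the pygame library to read the controller; the axis mapping and the send_thruster_command function are hypothetical placeholders for illustration, not ABB’s actual TXplore control software.

```python
# Hypothetical sketch: mapping a consumer game controller to simple
# commands for a submersible inspection robot.
# send_thruster_command() is a made-up stand-in for whatever interface
# a real robot would expose; axis numbers vary by controller model.
import pygame

def send_thruster_command(surge, yaw):
    """Placeholder for the robot's real control interface."""
    print(f"surge={surge:+.2f}  yaw={yaw:+.2f}")

pygame.init()
pygame.joystick.init()

if pygame.joystick.get_count() == 0:
    raise SystemExit("No game controller detected")

stick = pygame.joystick.Joystick(0)

running = True
while running:
    for event in pygame.event.get():
        if event.type == pygame.JOYBUTTONDOWN and event.button == 0:
            running = False              # button 0 ends the session
    surge = -stick.get_axis(1)           # left stick: forward/back
    yaw = stick.get_axis(2)              # right stick: turn left/right
    send_thruster_command(surge, yaw)
    pygame.time.wait(100)                # ~10 updates per second

pygame.quit()
```

The point is not the particulars but the economics: the hard human-interface problem has already been solved, at scale, by the gaming industry, and industrial applications can simply borrow the result.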
From handheld devices to cloud-based services, technology first commercialized for consumer applications is being adapted to industrial purposes at a dizzying pace, and this is likely to continue because some industries in particular have a lot of catching up to do.
We are living in the golden age of industrial digitalization
If you plot various industries along the classic technology adoption S-curve, you’ll find sectors like consumer finance, media and telecom well into the upper right of the graph. These industries have been the early adopters and have seen the most disruption (think online banking, mobile phones, Netflix…).
So why hasn’t digital rewritten other industries like mining or manufacturing in the same way?
Jouret pointed to three culprits: connectivity challenges, the need for much larger computing capacity and the sheer complexity of these industries compared with consumer markets. All three have conspired to slow the pace of digitalization in many of ABB’s client industries.
Now the innovations in the consumer world are the building blocks for “old economy” industries to realize the same potential that digital has brought to other sectors, but there is also danger ahead.
Be concerned about the ridiculous
History is littered with the wreckage of companies and entire industries that didn’t see their disrupters coming. The cardinal sin in most of these cases was a kind of myopia, seeing competition as coming only from existing competitors. The message for digital latecomers is that the real threat lies not in the offices of your rival but in a suburban garage or college dorm room.
“You will be disrupted by someone outside your market, by companies you’re not even aware of,” warned Jouret. “You should be concerned about the ridiculous.”
The problem is that we humans aren’t wired to see disruption coming. By the time we recognize the potential of a new technology, business model or industry, it’s already too late.
“Humans don’t understand exponential change,” Jouret continued. “It takes the same amount of time to go from zero to 1% adoption [of a new technology] as it does to go from 1% to 80%. We need to look at the rate of change.”
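A quick back-of-the-envelope calculation shows why the rate of change matters more than the current level of adoption. The sketch below assumes steady exponential growth and ignores saturation; the 0.01% starting point and the doubling model are illustrative assumptions, not figures from the panel.

```python
# Illustration of Jouret's point about exponential change: with steady
# doubling, climbing from a tiny base to 1% adoption takes about as many
# doubling periods as climbing from 1% to 80%.
# The 0.01% starting point is an arbitrary stand-in for "roughly zero".
import math

def doublings(start, end):
    """Number of doublings needed to grow from `start` to `end`."""
    return math.log2(end / start)

early = doublings(0.0001, 0.01)   # ~0% (0.01%) -> 1% adoption
late = doublings(0.01, 0.80)      # 1% -> 80% adoption

print(f"0.01% -> 1%: {early:.1f} doublings")   # about 6.6
print(f"1%   -> 80%: {late:.1f} doublings")    # about 6.3
```

Both phases take roughly the same number of doublings, which is why a technology that spent years crawling to 1% can seem to conquer the market overnight.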
He also noted it’s useful to maintain a certain amount of paranoia because digital works in mysterious ways. It can make small things big, for example by aggregating the ROI of thousands of rooftop solar installations and serving them up to investors under a single instrument. And there are no sacred cows.
Jouret asked the audience members if they could envision a manufacturing plant that could switch from making one type of product to another, or even a completely different one. A sedan factory switching to SUVs? Sounds… ridiculous, doesn’t it? But if you extrapolate the potential for 3D printing from where we are today—and remember the exponential nature of the change we’re talking about—it begins to look more and more plausible for a manufacturing plant to quickly re-tool for a totally different product.
Certainly, there will be carnage. Some business models will collapse under the pressure of technological disruption, but even in the age of Expedia we still have travel agencies to organize walking tours of Provence, cooking classes in Thailand and other individually tailored offerings. Technology is both a threat and an opportunity, but the threats are mostly to complacent incumbent firms—people are more resilient.
Disruption destroys jobs, but it usually creates many more. McKinsey recently analyzed the jobs lost and gained through the introduction of the personal computer, beginning in 1980. They found that while the arrival of the PC could be blamed for the loss of 3.5 million jobs, it could also be credited with the creation of 19 million new ones, a ratio of more than five to one.
Still, we would be well advised to not lose sight of the human element in the excitement of all of this creative destruction.
It’s still all about people
Isaac Asimov, the prolific author of books ranging from science fiction to saucy limericks, set down three fundamental rules for robots that, though conceived in the context of fiction, are worth taking seriously as we move into the age of artificial intelligence.
- A robot may not injure a human being or, through inaction, allow a human being to come to harm.
- A robot must obey the orders given it by human beings except where such orders would conflict with the First Law.
- A robot must protect its own existence as long as such protection does not conflict with the First or Second Law.
That sounds good for the robots, but what about the humans?
“More digital means more exposure to bad actors,” said Steklac, observing that security and safety must remain paramount in industrial systems.
Speaking about the rise of autonomous vehicles, he presented a vexing question: how much harm should an autonomous vehicle cause in order to avoid greater harm? Given the choice of hitting a pedestrian or an oncoming vehicle, what should the car do?
The answer is anything but clear, but the question puts in stark terms the challenge of turning raw data into something more useful, like a decision.
The currency of digitalization is data but the “money” is domain expertise
Toward the end of the panel, an audience member (ok, this audience member) asked Jouret what firms like ABB could build value on once data and digital tech were fully democratized. After all, our phones now contain innumerable tools and capabilities, some of which are ostensibly free. So, what’s left when everyone has everything they need in their pocket?
“There’s always another layer,” he replied.
Operating systems begat databases, which begat analytics packages, which begat software as a service, and so on, Jouret explained. Basically, every time we think we’ve reached the top of the hierarchy, someone puts another pancake on the stack.
“Hardware incumbency is actually an advantage,” Jouret continued. “Like having EV chargers located in the best locations. Competitiveness is a combination of data, analytics and the physical world.”
So, the future is bright for digitalization. Industries just starting on their digital journey can learn from those that have gone before. They can leverage technologies developed for consumer markets. They can use their specific expertise to deliver value even when all their competitors have access to the same tools. They just need to keep an eye on the rearview mirror for something that looks ridiculous.