Pervasive Uncertainty
When Karl Marx undertook his critique of the capitalistic model of production, the model was in its infancy. Yet Marx directed his efforts in such a way as to give them prognostic value. He analyzed the basic elements and conditions underlying capitalistic production and extrapolated their interactions to show what could be expected of capitalism in the future. His result was that one could expect capitalism not only to exploit the proletariat with increasing intensity, but ultimately to create the conditions that would make its own abolition inevitable.
This thesis seemed self-evident to many of Marx’s peers at the time. Yet what contradicted Marx’s prognostications, and seemed equally self-evident in its own time, was the revolution in productivity set in motion by Frederick Taylor and his peers. Beyond ideology, there is one fundamental difference between the theoretical bases of Marx and Taylor: Marx theorized about the superstructure of capitalism, while Taylor considered the substructure.
In business, as in political science, a transformation of the superstructure takes place more slowly than that of the substructure, making it easy to believe past truths despite emergent evidence of change.
Most people agree that our ability to predict the future is poorly developed. However, difficulty in predicting the future does not warrant a lack of attention to it. After a sudden change, planners often conclude that all the facts needed to foresee the change had been available. Yet we are nearly always caught off guard.
Spotting real trends is like watching waves break on the shore, one after another, while remaining unaware of the deep currents and invisible undertows that cause this surface reality. The specific trends change from year to year, but the stories about them are very predictable: they focus relentlessly on the technologies alone, whereas the real future clearly lies in the complex interrelationship of many technological, human, business and societal forces.
So, rather than focusing on “point” technology trends, we are highlighting what we like to call “emerging research themes” that examine the many reciprocal impacts that are occurring between and among technologies, people and society.
EXPONENTIAL TECHNOLOGY ADVANCEMENT
Exponential technologies, a term coined by futurist Ray Kurzweil in his essay “The Law of Accelerating Returns,” are technologies whose power and/or speed doubles each year, and/or whose cost drops by half. Robotics, AI, IoT, nanotechnology, renewable energy and gene sequencing are just a few examples.
Today, exponential technologies underpin most of modern society. Exponential advances in many technologies are often predictable, such as the rate of growth of computing power that Moore’s Law introduced. Current forecasts for wireless communications anticipate a performance improvement of 10X or more. Full genome sequencing can be done for a few hundred dollars today versus the millions it cost in 2006, or the nearly one billion it cost to produce the first full sequence in 2003. Priced at early-1960s computing costs, the processing power of a single modern smartphone would have cost hundreds of trillions of dollars.
Kurzweil’s critical insight was how difficult it is for humans to imagine exponential growth: “an analysis of the history of technology shows that technological change is exponential, contrary to the common-sense intuitive linear view. So, we won’t experience 100 years of progress in the 21st century — it will be more like 20,000 years of progress (at today’s rate).”
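To make the linear-versus-exponential intuition gap concrete, here is a minimal, illustrative sketch (not from the article; function names are our own) comparing a quantity that grows by a fixed increment each period with one that doubles each period:

```python
def linear_growth(start: float, increment: float, periods: int) -> float:
    """Grow by a fixed amount each period (the intuitive linear view)."""
    return start + increment * periods

def exponential_growth(start: float, periods: int, doubling_every: float = 1) -> float:
    """Double every `doubling_every` periods (the exponential view)."""
    return start * 2 ** (periods / doubling_every)

if __name__ == "__main__":
    for periods in (10, 20, 30):
        lin = linear_growth(1, 1, periods)
        exp = exponential_growth(1, periods)
        print(f"{periods:>2} periods: linear {lin:>4.0f}, exponential {exp:,.0f}")
```

After 10 doublings the exponential curve is about 1,000x the start; after 20, about 1,000,000x, while the linear curve has merely added 10 or 20 increments. This is the asymmetry Kurzweil argues our intuition misses.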
It took just five years for TikTok to reach a billion users (three years less than Instagram and four years less than Facebook). At its present $300 billion valuation, TikTok’s parent, ByteDance, is worth more than Disney, Twitter, SNAP, Pinterest, Omnicom Group, WPP and other peers combined.
The rates of impact will accelerate exponentially, resulting in far-reaching intended and unintended consequences. Industry boundaries will blur, value chains will be reformed with value migrating to new places, and entirely new industries will be created. Every sector of the economy will be affected.
NEW GENERATIONS OF HARDWARE REVOLUTIONIZE SYSTEMS
The days of ‘one size fits all’ processors and hardware are gone. Now we see more and more specialized hardware designs to address very application-specific compute requirements. CPUs are being combined with GPUs along with FPGA solutions, ASICs and custom designs to accelerate compute performance for applications and use cases such as AI/ML workloads and new multi-modal user experiences including AR/VR/MR.
Developing software for these new distributed solutions requires better tools and processes to address rapidly rising levels of complexity: code executing at the level of a node or device must constantly be ported to new architectures and accelerators to keep improving performance, yet most existing code is written for sequential execution, over-optimized for a particular architecture and difficult to port.
All of this creates two significant challenges to efficiently programming machines: how to program machines with increasingly heterogeneous computational resources, and how to port or migrate code from existing machines to new hardware designs and architectures.
The past decade has seen an explosion in processor parallelism with the move to multicores. The next major architectural trend is the move from symmetric multiprocessors to heterogeneous systems, with multiple different processors and memories. These new silicon architectures enable the acceleration of data-parallel applications such as deep learning, image and video processing, and financial applications.
TAMING SOFTWARE COMPLEXITY
Software complexity and challenges don’t end with hardware innovations. Software today is tied to multiple parallel innovations that are increasingly integrated. Each new wave of hardware innovation is inevitably tied to and integral with cloud and networking performance, all of which requires new software. The exponential pace of these parallel innovations is making software development untenable.
Developers expect evolving software development tools to be functional, ubiquitous, and easy-to-use. Within this construct, however, the first two expectations run counter to the third. In order to achieve all three, a new approach is required — but what kind of approach?
What is needed is a common means of managing code that can orchestrate and leverage legacy and new development across families of interrelated hardware and diverse computing domains. The tools we are working with today to develop complex application solutions were not designed to handle the scope of new capabilities, the diversity of usage and devices and the massive volume of data-points and interactions between and among systems.
These challenges are eroding organizations’ ability to manage development efficiently and effectively. The fragmented nature of today’s software development tools makes it extremely difficult to leverage new hardware, cloud and networking innovations. The rate of operational change already exceeds the design-develop-deploy cycle of existing software development tools and is expected to increase 5X or more, which in turn will pressure organizations to develop applications ever faster.
Software architecture is quickly becoming the foundation of every technology-driven organization; not just hardware and software companies but also any organization that is building digital capabilities. However, relatively few companies understand the necessity of re-thinking how their applications will be developed, consumed and managed.
OPEN INNOVATION CHANGES ALL OF THE RULES
Never before has open innovation been as much a part of software development as it is today. Open-source software has become central to virtually all digital strategies, influencing every layer of the tech stack, from operating systems and programming languages to middleware and development tools.
Meanwhile, commercial open-source companies like Red Hat, Docker and many more, along with the cloud vendors, have turned open-source software into viable commercial products. Together, these forces have commoditized software development, shifting the speed, scale, and economics of innovation by offering powerful infrastructure and development capabilities without players having to invest enormous amounts of capital up front.
Embracing open innovation in B2B domains has been much slower to evolve than in the consumer world. And yet, many of society’s biggest challenges are B2B-focused: smarter management of finite resources, including climate change and renewable energy, as well as sustainable management of air, water, food resources and waste. However, product OEMs and machine builders work with software developers and solution players in a much more “command and control” mode and have largely forged only simple relationships with wireless carriers, enterprise application vendors and professional services providers. Open innovation does not work “by invitation only”; rather, it is rapidly moving toward self-service and open collaboration.
Open innovation will force the industrial behemoths to embrace open-source software and collaborative developer communities. But sadly, most OEMs that we speak with fail to see the evolving relationship between these two dimensions as fertile ground for innovation. OEMs need to understand that new open software tools and open collaboration need to be interwoven and mutually supportive to effectively leverage their combined potential.
The B2B and industrial software business cannot continue on its current path. The days of combining existing monolithic applications with new “cloud veneers” and SaaS delivery are over. Applications are now distributed and dynamic in nature, needing to work across different cloud configurations, diverse locations and devices. Software development needs a completely new approach, one where open-source software development tools and libraries are modular and interchangeable, and where applications can be decomposed into microservices and components that can be assembled and then re-used. New development frameworks will need to simplify application development and shorten time to market by better organizing developer tools and run-time support for multi-cloud, edge/IoT applications, new multimodal user experiences and application-specific hardware devices.
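The decomposition-and-reassembly idea can be sketched in a few lines. This is a toy illustration under our own naming (the component names and event shape are hypothetical, not from the article): small, interchangeable steps are written once and assembled into an application flow, so each component can be swapped or re-used independently:

```python
from typing import Callable

# A component is any function that takes an event dict and returns one.
Step = Callable[[dict], dict]

def assemble(*steps: Step) -> Step:
    """Compose independent components into one application pipeline."""
    def pipeline(event: dict) -> dict:
        for step in steps:
            event = step(event)
        return event
    return pipeline

def validate(event: dict) -> dict:
    # Reject malformed events early.
    if "payload" not in event:
        raise ValueError("missing payload")
    return event

def enrich(event: dict) -> dict:
    # Fill in defaults without mutating the input.
    return {**event, "region": event.get("region", "us-east")}

def route(event: dict) -> dict:
    # Decide where the event goes next based on enriched fields.
    return {**event, "queue": f"orders-{event['region']}"}

# Assemble the application from re-usable parts; swapping `route` for a
# different implementation requires no change to the other components.
app = assemble(validate, enrich, route)
```

Real microservice platforms add deployment, transport and discovery on top, but the composability contract is the same shape.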
What’s required are orchestration tools that creatively combine multiple innovations in cloud infrastructure management, workflow automation and data application development to reduce complexity and better leverage the architects, developers and integrators across ecosystems. The “walled gardens” of the B2B industrial and OEM software business will inevitably need to collapse.
AI & INFORMATION AUTOMATION POWERS INVISIBLE BUSINESS
AI and machine learning are creating?an economic and business world that’s vast, automatic and invisible. Information technology’s impact on “autonomy” is moving ahead quickly. Business that once took place primarily among computer-assisted humans is now being executed by ever more complex adaptive systems without human intervention.
Artificial intelligence capabilities made many strides in 2022, with total investments exceeding $100 million since 2020. We believe AI innovations are positioned to make several leaps forward. From seemingly disparate innovations powered by self-organizing sensor networks processing voluminous amounts of data, computers are now able to understand and form associations based on statistical methods. Suddenly, computers can do autonomously what we previously thought only humans could do.
Autonomous systems share a number of common attributes.
What distinguishes AI is that it does things humans can do, but better, faster and cheaper. A new wave of so-called “generative AI” tools has emerged in the market that can generate novel content, rather than simply analyzing or acting on existing data – what we like to call “information automation.” Generative AI models produce text and images, including blog posts, program code and artwork.
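A deliberately tiny, toy “generative” model makes the distinction concrete: instead of only analyzing existing data, the model samples novel sequences from statistics it has learned. The sketch below is an order-1 Markov chain of our own construction, included purely as illustration; production generative AI uses large neural networks, not this:

```python
import random
from collections import defaultdict
from typing import Dict, List

def train(corpus: str) -> Dict[str, List[str]]:
    """Learn, for each word, which words follow it in the corpus."""
    words = corpus.split()
    model: Dict[str, List[str]] = defaultdict(list)
    for a, b in zip(words, words[1:]):
        model[a].append(b)
    return model

def generate(model: Dict[str, List[str]], start: str,
             length: int = 8, seed: int = 0) -> str:
    """Sample a novel word sequence from the learned statistics."""
    rng = random.Random(seed)  # seeded for reproducibility
    out = [start]
    for _ in range(length - 1):
        choices = model.get(out[-1])
        if not choices:
            break  # dead end: no observed successor
        out.append(rng.choice(choices))
    return " ".join(out)
```

The output is “novel” in the sense that the exact sequence need not appear in the training text, which is the essence of generation as opposed to analysis.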
AI’s contributions to transportation, finance, media and more already dwarf many other over-hyped innovations (such as crypto). Generative AI tools have the potential to supplement or replace human creators in many sectors of our economy. They are powerful tools for ideation, generating many variations that can subsequently be refined by humans.
THE FUTURE WILL BE DRIVEN BY SYSTEMS OF SYSTEMS
Complex systems rely on the interdependence of technologies and their synchronous advancement. Interrelated combinations of compute, network, sensor and software innovations will reinforce one another and multiply their impacts. These combinatorial innovations are enabling new, more adaptive systems that are increasingly organized as a “System of Systems” (SoS). A system of systems is an arrangement of independently operated subsystems whose collaborative interactions give rise to emergent capabilities and value, while each constituent system maintains its own capabilities and value.
For example, the development of autonomous vehicles depends not only on advancements in robotics and artificial intelligence to operate vehicles, but also on the maturation of the Internet of Things, so an array of sensors can analyze driving conditions and interact with other cars, not to mention improvements in lithium battery technology so cars can recharge efficiently. The interdependence of these technologies has no doubt contributed to their synchronous advancement. For example, many believed a limiting factor in the emergence of driverless cars was the high cost of the batteries required to travel long distances. In response, battery producers have dramatically increased production to scale down unit costs. Over the last ten years, EV battery prices have fallen 90%.
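It is worth checking what compounding rate the article’s 90%-over-ten-years figure implies. A small calculation (our own, as a sanity check on the cited figure) shows it corresponds to roughly a 21% price decline per year, sustained for a decade:

```python
def annual_decline(total_decline: float, years: int) -> float:
    """Constant annual rate r such that (1 - r) ** years == 1 - total_decline."""
    return 1 - (1 - total_decline) ** (1 / years)

rate = annual_decline(0.90, 10)
print(f"Implied annual decline: {rate:.1%}")  # about 20.6% per year
```

That steady compounding, rather than any single breakthrough, is what makes the cost curve look discontinuous in hindsight.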
RE-DESIGN OF CORPORATE STRUCTURES DRIVES NEW VALUE CREATION
Modern enterprises have been deconstructing for decades and are becoming value-delivery networks consisting of diverse business functions and entities – some owned directly, many sub-contracted, but all requiring orchestration. Agile organizations are extending skills through new relationships and ecosystems increasingly composed of coalitions of diverse self-motivated participants, not sub-contractors tied to “command and control” schemes.
Massive outsourcing of corporate services, combined with the emergence of platform business models is causing the “dis-integration” of traditional enterprises and is restructuring businesses into three broad categories or roles:
The growing influence and disruptiveness of platform models is forcing all businesses to think more carefully about their future roles. Since not all businesses can be platform players, organizations are now focused on which role suits them, and which players (in the other two categories) are prospects for win-win partnerships that will maximize value for customers.
Airbnb provides a compelling example. The company generates roughly $500 million in revenue per employee, which is higher than many of its technology-driven peers and more than ten times what traditional hotel chains generate. Airbnb’s brand strength drives its performance – the company produces twice the operating margin of its hospitality peers. The company can and will continue to invest in its capabilities at a much greater rate than its peers.
SUPERABUNDANT CAPITAL
Capital is superabundant. Global financial assets are more than 10X global GDP, making great ideas more important than capital. At the same time, it’s becoming ever cheaper to form and prove new ventures. Superabundant capital is likely to remain available through the rest of this decade. As the economy has evolved to a more service-oriented and increasingly digital state, the importance of speed and agility has increased dramatically.
Excess capital will be chasing good investment ideas for many years to come, making unique skills and new innovation concepts far more important than capital formation. We also expect continued aggressive investment in innovative digitally-driven players and sectors that have the potential to consolidate into a “winner-takes-all” mode. Virtually any product or service segment likely had twenty or more significant competitors thirty years ago. Today that number is typically 3-5 globally dominant leaders in each segment, collectively earning 75% or more of the profit pool.
UNDERSTANDING CATALYTIC CHANGE
In times of radical change, crises of perception often cause significant failures, particularly in large companies. Such failures result from the inability of managers to see emergent discontinuities. The challenge lies in the frame of reality a manager creates in his or her mind: the inner model of reality. The inner model of reality is a manager’s set of assumptions that structures understanding of the evolving environment. A manager’s inner model of reality never reflects the reality of the external environment; it reflects a construct based on experience. It is a reference model that allows managers to focus on what is perceived as important in a complex world. In other words, it is a simplification of reality and of the decision-making process.
A manager’s decision model is shaped by the past and reinforced by the present. All too often, it is also extrapolated into the future in a linear fashion with the help of traditional forecasting methods. In short, when people are forced into a certain frame of reality, it is difficult for them to see solutions or paths that lie outside their frame of reference.
During successful times, a manager’s reference model and evolving reality tend to match. In times of rapid change and increasing complexity, a manager’s reference model becomes a dangerous mix of experience and uncertain assumptions. Typically, the process of selective attention comes into play. Many managers will ignore important trends because they are difficult to perceive and, therefore, uncertain within their own frame of reference. This situation often leads to an inability to act decisively or effectively.
Current strategy and planning techniques tend to force a tyranny of replication that is in direct conflict with the continuously changing environment in which we live. For technology-driven businesses, the assumption that business as usual will prevail over a given planning period can be tragic. Such assumptions leave little room for the dynamic management (or creation) of change, the early identification and integration of emerging issues, markets and technologies, or the increased presence of unfamiliar competitors.
To be effective planners, managers must reconcile two worlds — the world of facts and the world of perceptions. To act decisively, the future and all its uncertainties must be weighed from multiple perspectives. Strategy and planning do not require a perfect knowledge of the future.