Beer game and feedback loops
Kevin Pang
Bringing systems-level thinking to technology, strategy, and innovation to discover and power outsized growth. | Technology | Strategy | Innovation | Leadership | Teams | Imagineering |
It's been a while. The usual Q4 stuff has kept me from posting random thoughts here. But, as they say, I'm back in the saddle [for now]. So here goes:
Beware long feedback loops.
Peter Senge at MIT is famous for his beer game, in which he challenges executives to manage something as “simple” as a beer ecosystem, from brewery to bottler to wholesaler to retailer. The upshot is that many executives overestimate their ability to control the system, producing imbalances across the supply chain. The intent is not per se to demonstrate managerial incompetence but, more importantly, to illustrate how quickly complexity can be injected into any system.
Source: https://www.atkearney.com/web/beer-distribution-game/the-bullwhip-effect
One way this happens is when information is fed back to us on different temporal loops, e.g., along long supply chains. With multiple streams of information returning to us as outputs of our inputs, each with a different time lag, we are forced to anticipate and estimate the value of that information as it arrives and the impact it might have on our decision-making processes. As Senge and colleagues show, wrong guesses lead to rapid system imbalances.
This “bullwhip effect” is illustrated more simply by the diagram below. For a given energy/information input [the energy of arm movement in this case], the farther you are from the input source, the greater the amplitude of energy [or information] variance you are likely to experience. This holds important lessons both for those upstream in long supply chains and for end deliverers seeking to harmonize or verticalize their supply chains: a more systems-based level of understanding is required, rather than simple, fast near-loop [read: reactive] responses.
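For readers who like to tinker, the amplification can be reproduced in a few lines of Python. This is a toy sketch, not Senge's actual game rules: each tier extrapolates the trend in the orders it sees before ordering upstream, and the `gain` value and two-week information lag below are illustrative assumptions, not calibrated parameters.

```python
import random
import statistics

def simulate_bullwhip(weeks=52, tiers=4, gain=0.5, lag=2, seed=42):
    """Toy beer-game chain: retailer -> wholesaler -> distributor -> brewery.
    Each tier sees only the orders arriving from the tier below (with an
    information lag) and extrapolates the trend it sees before ordering."""
    random.seed(seed)
    # end-customer demand: steady, with a small random wobble
    demand = [10 + random.randint(-2, 2) for _ in range(weeks)]
    streams = [demand]
    for _ in range(tiers):
        seen = streams[-1]
        placed = [float(seen[0])]
        for t in range(1, weeks):
            # hedge against uncertainty by ordering ahead of the trend
            trend = seen[t] - seen[t - 1]
            placed.append(max(0.0, seen[t] + gain * trend))
        # the order stream reaches the next tier upstream `lag` weeks late
        streams.append([placed[0]] * lag + placed[:-lag])
    # order variability at each tier, end customer first
    return [statistics.pstdev(s) for s in streams]

stds = simulate_bullwhip()
```

Each element of `stds` is the standard deviation of orders one tier further from the customer; the variability climbs at every step away from the original demand signal, even though the underlying demand barely moves.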
Source: Wikipedia
The prospect of more temporally integrated decision making is in large part the reason for the excitement around the internet of things: connecting things with sensors and feeding that information back centrally. Its second-generation form, digital transformation (DT), takes this enhanced command and control one step further, enabling greater precision and accuracy in decision making by connecting not just things but also processes and humans into a more in toto integrated system.
To drive this point further, I'll invoke another Senge example: shower faucets. Senge points out that if hot water takes too long to arrive, one is tempted to overturn the faucet until too much hot water arrives, prompting a re-turn of the faucet toward cold, and so on, oscillating until just the right temperature is achieved. Either removing the lag in the system or adopting a better decision-making process, one that accurately accounts for the lag, results in better performance and less waste. In some ways, I believe his seminal 1990 book, The Fifth Discipline, which introduced the concept of learning organizations (those that inherently pursue learning through active feedback), presaged the idea of open innovation, viz., that learning can come from both outside and inside the organization. Similarly, from a technology perspective, it also explains the hope and the value premise of DT technologies for improving our management of systems.
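Senge's faucet makes a nice toy simulation. Assume, purely for illustration, that the water you feel now reflects the valve setting from `lag` steps ago: a bather who reacts strongly to the current temperature oscillates wildly, while one who makes small, lag-aware corrections settles on the target. The lag length, gains, and temperatures below are invented for the sketch.

```python
from collections import deque

def shower(adjust, lag=5, steps=100, start=20.0, target=38.0):
    """Temperature at the shower head reflects the valve setting from
    `lag` steps ago; `adjust` maps the felt error to a valve change."""
    pipe = deque([start] * lag)   # water already travelling through the pipe
    valve = start
    history = []
    for _ in range(steps):
        temp = pipe.popleft()     # what the bather feels right now
        history.append(temp)
        valve += adjust(target - temp)
        pipe.append(valve)
    return history

# impatient bather: big corrections based only on the current temperature
impatient = shower(lambda err: 0.8 * err)
# lag-aware bather: small corrections that let the pipe catch up
patient = shower(lambda err: 0.1 * err)
```

Plotting the two histories shows the impatient policy swinging far past the target in both directions, while the patient one glides up to it; the only difference between them is how the decision rule accounts for the system's lag.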
So where might DT go, what can it do, and what might be some of its limitations? To help us with this, I will invoke yet another MIT denizen, Norbert Wiener.
Professor Wiener worked on predicting the future from imperfect past data by distinguishing signal from noise, inspiring others to build the field of signal processing. He pioneered a systems-based approach to understanding the value of feedback in creating self-correcting systems. Among his many contributions was his work on an automated anti-aircraft system that predicted a target's future path from past observations. This and other work led him to coin the term, and create the field of, cybernetics: the study of the relationship between communication and control. In Wiener's world, anything could be information insofar as it related to systems, which is one reason his work has influenced everything from computers to biology to today's big data analytics. A very quick summary of Wiener's remarkable career can be found here.
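To give a flavor of the idea (this is not Wiener's actual filter, which is a more sophisticated statistical construction): even a least-squares line fitted to a short window of noisy position readings, extrapolated one step ahead, beats simply aiming where the target was last seen. The window size, noise level, and constant-velocity path below are all illustrative assumptions.

```python
import random

def predict_next(observations, window=8):
    """Fit a least-squares line through the last `window` noisy readings
    and extrapolate it one step ahead, a crude stand-in for predicting a
    target's future path from imperfect past data."""
    ys = observations[-window:]
    n = len(ys)
    mean_x = (n - 1) / 2
    mean_y = sum(ys) / n
    slope = sum((x - mean_x) * (y - mean_y) for x, y in enumerate(ys)) / \
            sum((x - mean_x) ** 2 for x in range(n))
    intercept = mean_y - slope * mean_x
    return intercept + slope * n          # predicted position at the next step

random.seed(0)
true_path = [2.0 * t for t in range(80)]               # constant-speed target
noisy = [p + random.gauss(0, 3) for p in true_path]    # imperfect observations

fitted_err = naive_err = 0.0
for t in range(10, len(true_path) - 1):
    fitted_err += abs(predict_next(noisy[: t + 1]) - true_path[t + 1])
    naive_err += abs(noisy[t] - true_path[t + 1])      # "aim where it was last seen"
```

Averaging over the window suppresses the noise while the fitted slope captures the motion, so the extrapolated aim point accumulates noticeably less error than chasing the latest raw reading.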
One of the things I admire about Wiener's foresight is shown in the diagram above right: his very prescient depiction of the man in the machine, and of course the title of his 1950 book, The Human Use of Human Beings. Today's interest and progress in machine learning and artificial intelligence are prompting a dialogue and debate on how humans intend to relate to the machines they build. That this man was contemplating such things almost 70 years ago is quite amazing and humbling.
Why am I discussing such things? Allow me to present my very (very) simple-minded idea of the complexity of information, and why the internet of things (IoT), sensors, edge computing, data analytics, AI, and yes, the beer game come together to create the interest and excitement regarding DT. Imagine, if you will, the following relationship between a person and the immediate data that surrounds them:
We interact with data in every way, every day. But the value of that data to us is determined by the context in which we receive it. Context determines value, and whether and how we choose to react to the context-driven data. Context, in turn, is determined by the environment from which the data comes, which takes into account everything outside the user/perceiver. Complexity arrives when we try to achieve the following:
Shades of the beer game! We can use IoT and data analytics to better understand interactions from each player's perspective, considering how impinging operating environments affect one another and therefore change the context, the data, and the meaning of that data as it fluctuates in time. This works well in defined, small-n linear chains where we focus on next-nearest neighbors and there are no leapfrogging effects.
However, things are different when a system looks more like this:
Ecosystem of systems
Here we have the possibility of multiple environments impinging on one another, or of multiple potential outcomes in a time series, as each prior event changes not only its own environment but somehow also affects the environments of others, changing the operating context, and therefore the meaning and rules of data and information, for everyone else. An example is the inability of machines (AI), as yet, to perform predictive analytics in complex systems, or to “cure cancer” as AI enthusiasts have long hoped and hyped: every manipulation of the system changes the system in ways we cannot yet map out across all the possible changes, and therefore cannot predict. In other words, we can't predict what we don't know or have not yet experienced as potential outcomes. Modeling and teaching machines to think like humans in these cases becomes limiting because we have no pre-existing “ground truth.”
However, this is where further work by Wiener and others might come in. By collecting large sets of data and treating each data source as a particle undergoing Brownian motion, it might just be possible to begin, slowly, to understand how seemingly independent systems impinge on one another to create what we observe at the macro level. Within an overall noisy system of “particles” that impact one another, those that share a relationship of some sort would be expected to “move” in time with one another, revealing inherent dynamic relationships.
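A minimal sketch of that intuition: four “particles” take random steps, two of them quietly following a hidden common driver on top of their own noise. Correlating the step series picks out the linked pair. The four names, the driver, and the noise levels are invented for illustration; real systems would need far more care about spurious correlation.

```python
import random
from itertools import combinations

random.seed(1)
STEPS = 500

def steps_for(n, shared=None):
    """Random step series; if `shared` is given, each step also follows
    the shared driver's step (a hidden common influence)."""
    if shared is None:
        return [random.gauss(0, 1) for _ in range(n)]
    return [s + random.gauss(0, 1) for s in shared]

driver = [random.gauss(0, 1) for _ in range(STEPS)]   # unobserved driver
moves = {
    "a": steps_for(STEPS, driver),   # a and b secretly share the driver
    "b": steps_for(STEPS, driver),
    "c": steps_for(STEPS),           # c and d move independently
    "d": steps_for(STEPS),
}

def correlation(x, y):
    """Pearson correlation of two equal-length step series."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((p - mx) * (q - my) for p, q in zip(x, y))
    vx = sum((p - mx) ** 2 for p in x)
    vy = sum((q - my) ** 2 for q in y)
    return cov / (vx * vy) ** 0.5

scores = {pair: correlation(moves[pair[0]], moves[pair[1]])
          for pair in combinations("abcd", 2)}
linked = max(scores, key=lambda p: scores[p])
```

All six pairwise scores hover near zero except the one pair sharing the driver, which stands out clearly: the hidden relationship reveals itself in how the particles move together, not in any single snapshot.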
Mapping relationships within systems of systems
If so, this will matter as we try in earnest to use machine learning and AI to develop new treatments for disease, to run predictive analytics across, say, a multiplexed transport system, and, as the world becomes even more connected, to move from traditional linear supply chains into more interactive supply and business-model ecosystems. AI could help us understand optimal paths and relationships, and perhaps one day help us predict and construct even better ones.
Today’s simple linear path marking, e.g., connecting obviously related components via IoT and sensors, lays the groundwork for tomorrow’s more richly layered, interconnected systems that will enable the capture and use of the information flowing within and between them.