Scenario Planning + Big Data = Dynamic, Living Scenarios

We live in a world that is increasingly interconnected and changing at ever greater velocity. New business models and disruptive technologies in the "connected world" are creating new opportunities and threatening established businesses faster than ever. There is a significant body of literature on scenario planning for business, from its origins in battlefield planning to its pioneering use alongside Big Data analytics. But learning how to construct viable scenarios that are truly useful, and how to move from scenarios to decisions, requires more than just reading examples.

We like to believe that business and the economy are designed, rational structures of money and things that can be automated. In reality, both are social systems of interacting people that can never be perfectly encoded in diagrams or algorithms.

Using pure logic effectively would require both correct, current data and perfect control, and neither is attainable.

Humans therefore developed the ability to reach decisions under uncertainty using a wide variety of decision biases. These are not fallacies but practical, simplified decision mechanisms, and they form the foundation of modern scenario planning: real-world, quantitative information, interpreted through current data, that enhances human capabilities by keeping critical information up to date; it does not replace qualitative judgment and experience. Further, when humans and computers collaborate on scenarios, repeated human reactions to a particular pattern in the quantitative data are themselves potentially interesting and useful, because they can alert others to the likelihood of a positive action.

Scenario planning is a powerful tool that can provide insights into the potential impact of these shifts and guide innovation planning, at both a sector level and for individual companies. For the purposes of this discussion, scenarios are inductive narratives of the world at some future point in time. Unlike forecasting, scenario planning is not intended to give a definitive prediction of the future, but rather to communicate a wide range of possible outcomes and the possible consequences of each. Although it is surprisingly hard to create good scenarios, they can be enormously valuable in helping companies large and small, as well as governments at all levels, develop strategies that prepare them for the unexpected.

Scenario planning has its roots in the natural resources and commodities sectors, where initial investments are very high and expected returns lie mostly decades in the future, in a world of ongoing inflationary operating costs and unpredictable commodity prices driven by local, regional, national, and global factors. One of the early pioneers of advanced scenario planning was Shell, and according to its corporate reports, scenario planning helped the company to "predict future prospects more clearly, make richer judgments and be more sensitive to uncertainties in development".

Scenarios provide a framework for thinking about trade-offs in the allocation of finite innovation inputs, capacity (labor, capital, government, etc.), and spending across a range of possible futures. By identifying common features across scenarios, sector-wide or regional/national roadmaps can be formulated that optimize resource allocation across the widest range of scenarios. Scenarios are a useful tool for communicating complex ideas and building a platform for change across a diverse set of stakeholders, because they put outcomes in the appropriate context for planning. Scenario planning is useful not only for understanding possible futures and reacting to them, but also for guiding proactive actions that will influence ideal outcomes.

Recent developments in Big Data storage and analytics are already producing new scenario planning tools that offer unprecedented data availability and open the door to tremendous innovation in scenario planning and other foresight methods. They create disruptive opportunities for "living scenarios" that adapt and learn, giving users a robust, dynamic, narrative form for conveying data patterns and associations. The drive to use these new capabilities to better manage and communicate an organization's strategic response to future uncertainty will also create the challenges of data overload and overly complex analysis; all data must, after all, ultimately be transformed into stories for widespread human understanding.

Unfortunately, a key limitation of current big data scenario planning and analytics tools stems from the need to structure data before it can be explored. This inherently limits the use of Big Data to those who can afford to spend millions establishing and structuring their data in a traditional data warehouse, and then millions more trying (often in vain) to keep that valuable data from decaying under an ever-growing maintenance burden.

The biggest problem with applying Big Data approaches to scenario planning is that the traditional approach of setting up and maintaining a data warehouse is a death spiral of ever-increasing cost and technological complexity. Companies must spend many months and many millions of dollars integrating data from source systems into data warehouses using a process known as ETL (Extract, Transform, and Load). To be effective, the IT engineers assembling the data have to be intimately familiar with it and with the ongoing work of building and maintaining the warehouse. Setting up and maintaining a data warehouse is the single largest ongoing cost of any Big Data application: by some estimates, roughly 40% of the total $50 - $60 billion big data analytics market is spent just on the professional services required to integrate data from various systems into data warehouses.
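To make the ETL process described above concrete, here is a minimal, purely illustrative sketch of an Extract-Transform-Load pipeline. The file name, column names, and warehouse schema are all invented for the example; real pipelines are orders of magnitude larger, which is where the cost and maintenance burden comes from.

```python
import csv
import sqlite3

def extract(path):
    """Extract: read raw records from a source-system export (hypothetical CSV)."""
    with open(path, newline="") as f:
        return list(csv.DictReader(f))

def transform(rows):
    """Transform: normalize fields and drop incomplete records."""
    cleaned = []
    for row in rows:
        if not row.get("price"):
            continue  # skip rows the source system left incomplete
        cleaned.append({
            "commodity": row["commodity"].strip().lower(),
            "price_usd": float(row["price"]),
            "reported_on": row["date"],
        })
    return cleaned

def load(rows, conn):
    """Load: insert the cleaned records into the warehouse table."""
    conn.execute(
        "CREATE TABLE IF NOT EXISTS prices "
        "(commodity TEXT, price_usd REAL, reported_on TEXT)"
    )
    conn.executemany(
        "INSERT INTO prices VALUES (:commodity, :price_usd, :reported_on)",
        rows,
    )
    conn.commit()
```

Every transformation rule in the middle step encodes knowledge about the source data; when the source system changes, the rule silently breaks, which is exactly the decay problem described above.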

Scenario planners using Big Data are already discovering that the costs of ETL, warehouse maintenance and support, and data warehouse decay (or aging) can significantly diminish the utility of their data for ongoing use in scenarios, and that these costs grow with the size and complexity of the scenario application. Decaying data integrity makes it progressively harder to integrate new or disparate data into legacy systems, leading to redundant and costly systems and an increasing load of additional ETL activity from data warehouses or data marts before the data can be effectively analyzed.

As the volume, velocity, and complexity of data continue to grow, a diverse team of entrepreneurs, scenario experts, and big data engineers sat down to fundamentally rethink the way we manage and explore the data collected during scenario planning. The first redesign was to replace the scenario planning "report" with a dynamic, ongoing scenario intelligence feed. The software requires that users understand the "character" and content of their data before they can form relevant scenarios. So if you want to know "How cost-effective is a certain medical treatment on a particular set of patients?" or "What is likely to happen to other global commodity prices (like gold) if the price of oil continues to fall?", you must first know something about the systems map of interconnections and potential causalities in your data sources. This led to the principle that each data set must carry a layer of information custom-curated by a data scientist. The strategic premise is that no black box can make the requisite decisions for you, but it can give the decision maker a competitive advantage by providing deeper insight into the odds of succeeding or failing.
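A systems map like the one described above can be sketched, in its simplest form, as a directed graph of hypothesized causal links between data sources. The nodes and edges below are invented examples, not real relationships; the point is only to show how such a map lets software answer "what could this change influence?" questions before any scenario is built.

```python
# Toy "systems map": each key is a data source, each value lists
# the sources it is hypothesized to influence. Links are illustrative only.
SYSTEMS_MAP = {
    "oil_price": ["shipping_costs", "gold_price"],
    "shipping_costs": ["retail_prices"],
    "gold_price": [],
    "retail_prices": [],
}

def downstream(node, graph):
    """Return everything a change in `node` could plausibly influence."""
    seen, stack = set(), [node]
    while stack:
        for nxt in graph.get(stack.pop(), []):
            if nxt not in seen:
                seen.add(nxt)
                stack.append(nxt)
    return seen
```

Asking `downstream("oil_price", SYSTEMS_MAP)` walks the map and returns shipping costs, gold, and retail prices, which is the kind of curated knowledge a data scientist would attach to each data set before scenario questions are posed against it.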

With the wide adoption of the elastic cloud and the rise of intelligent indexing, machine learning, and artificial intelligence (AI), scenario planning is entering a new age of vastly improved flexibility and utility.

We can now develop scenarios that effectively model indicators by endlessly iterating on them with real-time, global data: essentially anything reported can be indexed and incorporated into scenario planning software as it arrives. This allows users to build scenario-planning tools on data platforms that constantly ingest newly reported data and refine the scenario models accordingly. The corporate world is moving fast to exploit these kinds of big data applications for scenario planning. Companies like Palantir, Tableau, Qlik, and IBM (Cognos) have moved quickly to develop world-class software products that make rapid analysis of big data effective and less expensive.
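A hedged sketch of what a "living scenario" update loop could look like in practice. The scenario name, indicator names, thresholds, and scoring rule here are all invented for illustration; a real system would derive its triggers from curated data and far more sophisticated models.

```python
from dataclasses import dataclass, field

@dataclass
class LivingScenario:
    """A scenario whose likelihood estimate is refreshed as new data arrives."""
    name: str
    triggers: dict          # indicator -> threshold suggesting this scenario
    score: float = 0.0      # running, informal likelihood estimate in [0, 1]
    history: list = field(default_factory=list)

    def ingest(self, indicator, value):
        """Fold one new observation into the scenario's running score."""
        self.history.append((indicator, value))
        threshold = self.triggers.get(indicator)
        if threshold is None:
            return
        # Illustrative rule: each threshold breach nudges the score up,
        # each non-breach decays it, keeping the narrative "alive".
        if value >= threshold:
            self.score = min(1.0, self.score + 0.1)
        else:
            self.score = max(0.0, self.score - 0.05)

# Hypothetical usage: feed the scenario two fresh observations.
scenario = LivingScenario(
    name="prolonged oil-price slump",
    triggers={"gold_usd": 1300.0, "oil_inventory_days": 40.0},
)
for indicator, value in [("gold_usd", 1350.0), ("oil_inventory_days", 45.0)]:
    scenario.ingest(indicator, value)
```

The design point is that the scenario narrative stays fixed while its quantitative score and evidence trail update continuously, which is what distinguishes a living scenario feed from a static planning report.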

Intelligent indexing, machine learning, and other tools now being developed can help reveal answers to relevant questions you didn't even think to ask, with "intelligent agent" analytics that learn, guide data queries, and constantly improve the intelligence of both your questions and your answers. Dynamic, constantly updating scenarios can help management shore up the fragility of static scenarios and avoid costly recalculation of predictions by providing always-current quantitative inputs and the reliability needed to support ongoing qualitative analysis. The world is full of proclamations that "machines are becoming more human" and a constant drumbeat of "Can Big Data show us the way?" This may be true, but only on a limited basis, within tight constraints. In an uncertain world where context and assumptions shift as constantly as stock valuations or currency exchange rates, relying on computer software may be as dangerous as it is intriguing. In this respect, scenarios can offer not only a greater divergence of forecasts than an algorithm built on a limited set of assumptions and fueled only by historical data; they can also tell us what might happen if we act on bad advice from those algorithms.

Scenarios aren’t just about managing risk; they are also about seeing or even creating opportunities where others cannot.

Scenario planning, backed by machine intelligence and a foundation of the most current data, will enhance the effective management of all kinds of assets and of the technology that delivers them to decision makers. Dynamic scenario feeds can then be comfortably linked to management options, enabling a process that transforms them into enduring value. Software-supported, dynamic scenario planning can help all markets and organizations become more imaginative, more intellectually agile, and more adaptable in their planning and prediction activities, and that can only be a good thing when navigating our turbulent, high-velocity world.

Many thanks to my colleague and friend Gideon Malherbe with VCI (www.govci.com) for co-authoring this article with me. 

Paul Malyon

Product & data strategy | Product team leadership, data commercialisation, governance, privacy | Delivered at FTSE 100 businesses, SMEs and governments in UAE and UK | CIPPe

3y

Great article, thanks David. Scenario building is definitely an area of interest.

Tripp Braden

Empowering Business Growth through Strategic Partnerships | Executive Coach Developing Future-Ready Entrepreneurs | Simplifying Complexities in Robotics, AI, and Automation with a Leadership Edge

8y

Great article David. I really enjoyed the perspective you shared. I've been doing scenario building since the mid 90's. This new twist is going to change the way we look at things. One of the critical elements I see is how we educate our team members on the benefits of automating these processes to allow people to be more creative in the process. Letting the technology do the heavy lifting should empower leaders to be more innovative in how they use these capabilities. Today's early adopters will be tomorrow's big winners!

Patrick Engelking

VP Architecture & Transformation | Advantage Solutions

8y

Good post! As you recommend, we are also now leaning into the conversation on analytics by talking first not about the technology at all but the "use cases" (aka scenarios) that stakeholders want to address. It streamlines the technology discussion down the road by focusing on the "front office" of the need itself - not always an easy framing conversation.

Josh Loftin

Wordsmith and storyteller

8y

A well-written, insightful article. Data can provide a nice advantage, especially in industries where most companies haven't adapted to it because of the cost. But it still requires human inputs, a human decision on output (data points) and human analysis. I see too many companies purchasing or subscribing to platforms -- in digital marketing, for example -- that they think will do all of lifting for them. When they realize the data still requires human interaction throughout the process, they get frustrated and fail to leverage it.
