The Three Biggest Things We Get Wrong About Analytics
It seems just about every business and industry these days is talking about how it is applying analytics to improve its business decisions. But peek under the hood and you’ll quickly discover that, for the most part, the term analytics is being applied rather loosely. Unless we move the needle on what real analytics can do for a business, we’re shortchanging our future, a future where we must use insights based on all available data to make better decisions -- because in today’s world, what worked yesterday may not work tomorrow.
So what’s stopping us from getting there now? Three main issues come to mind:
People don’t understand what ‘analytics’ truly is
We have become very used to seeing reports and dashboards on what happened an hour ago, yesterday, last month or last year. But while most people understand reports and dashboards -- because they see them every day -- what we aren’t used to is interpreting these historical views: understanding trends, projecting them into the future, or identifying the factors that drive them. We most certainly are not used to building models that can figure out how to optimize the outcomes.
Moving from reports to analytics is a journey, and one that people need to be willing to undertake.
We start with what most people call analytics: reporting. This may be a spreadsheet or the event logs from a process automation; its key function is to help us understand historical output -- what has happened in our organization, in the distant or immediate past -- and to summarize those events for easy oversight. This is useful for identifying performance against plan or process bottlenecks, but it’s a far cry from planning better outcomes for the future.
That is where the next stage of maturity comes in: predictive analytics -- taking what we know of the past, and measuring the processes or policies that shaped historical performance, to predict the future. Some of this is already taking place, but the largest barrier to adoption here is the lack of trust people have in the system. Prognosticators are often wrong, so we have plenty of experience to back up a distrust of predictions: just look at political polls. That, however, is shortsighted. All models of systems and behaviours improve with use and with access to large volumes of data. Since we can store and access virtually unlimited data today, our trend predictions can be very accurate, especially with respect to business processes or consumer and employee behaviour. If we can get past the distrust, we’ll reap the benefits of these models: the more we use them, the better they get, and the more they earn our trust. This is a virtuous cycle.
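To make the idea concrete, here is a minimal sketch of what trend prediction can look like, assuming a short, invented monthly sales history and the simplest possible linear model; a real system would draw on far richer data and many more factors:

```python
# A minimal, hypothetical sketch of trend prediction: fit a straight line to a
# short sales history and project it a few months forward. The figures are
# invented for illustration only.
import numpy as np

monthly_sales = np.array([112, 118, 121, 127, 135, 140, 148, 151])  # invented history
months = np.arange(len(monthly_sales))

# Fit a simple linear trend (slope and intercept) to the history.
slope, intercept = np.polyfit(months, monthly_sales, deg=1)

# Project the trend three months into the future.
future_months = np.arange(len(monthly_sales), len(monthly_sales) + 3)
forecast = slope * future_months + intercept
print(forecast.round(1))  # three projected monthly figures
```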
Once we’ve begun to trust trend analytics, the next stage of maturity is to model potential future outcomes by testing and predicting the impact of process or policy changes. By understanding and weighting the factors that impact our trends, we can model future scenarios to create a picture of what the future might look like under different assumptions or policies. By making adjustments to those models, we can begin to improve our future outcomes and ultimately chart a path to the future we most want to see.
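A toy what-if sketch along these lines might weight a handful of factors and compare scenarios; the factor names and weights below are purely illustrative assumptions, not a real planning model:

```python
# A hypothetical scenario-modelling sketch: score a projected outcome as a
# weighted sum of factors, then compare policy scenarios that change one
# assumption at a time. All names and weights are invented for illustration.
weights = {"marketing_spend": 0.4, "headcount": 0.25, "price": -0.35}
baseline = {"marketing_spend": 1.0, "headcount": 1.0, "price": 1.0}

def projected_outcome(scenario: dict) -> float:
    """Very simple linear model: weighted sum of relative factor levels."""
    return sum(weights[factor] * scenario[factor] for factor in weights)

scenarios = {
    "baseline": baseline,
    "raise prices 5%": {**baseline, "price": 1.05},
    "boost marketing 10%": {**baseline, "marketing_spend": 1.10},
}

for name, scenario in scenarios.items():
    print(f"{name}: outcome index {projected_outcome(scenario):.3f}")
```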
Some industries like electronic design have used such modeling for ages: take for example heat dissipation and power use optimization, where chip designs have long benefited from the detailed understanding of the interaction between silicon and metals in extreme proximity under different power use conditions. However, such examples typically only exist in very closed systems, not as a general rule of business. The closer we get to human behaviour, the less modeling has been attempted.
The fourth stage, and the ultimate goal of analytics, introduces the capabilities of machine learning and AI. With the data and models in place, our analytics can automatically learn from history, improve its decision algorithm with every event and transaction, understand the influence of environmental factors, and make decisions that produce the best outcome.
This stage may seem far off, but we are getting there faster than we might think. For example, the insurance industry makes its coverage decisions by understanding the risk trends, the environmental factors, and the subscribers’ actual historical performance and profitability. Such coverage decisions are made by an algorithm that may be too complex to be fully understood even by the people who administer it. We may not like the outcome, but the effectiveness of the decisions is unquestionable.
Naturally, the AI stage will come with a number of political hurdles, and the sooner we start this debate, the better for all of us.
Analytics can be boring or frightening
Humans naturally rely on instinct or experience, even if these are wildly inaccurate or driven mostly by emotion. Taking the time to collect data and build models of behaviour is usually left to policy wonks or data scientists, who then present their findings to the decision makers, who may or may not take their advice.
This may seem like the natural order of things, but it is missing the point: we make decisions all the time. Every time we ask a question and get an answer, we trigger a host of other questions that can only be answered with access to instant trend analysis and models. The feedback loop between the decision makers and the data scientists is too slow and fraught with fear and misunderstanding. Learning how bad we really are at making effective decisions is very painful.
And so, we fall back on gut feel, instinct, or making decisions while missing critical information.
How, then, can we persuade the decision makers to adopt a more analytical approach to their thinking? This is like any behavioural change: it takes time and repeated reinforcement through success.
The solution here is to make complex analytical systems easy for the user, taking the data scientist and analytical wonk out of the decision loop. Present the decision maker with answers to their questions, let them guide themselves down a path of discovery and test hypotheses, and provide them with optimized suggestions for action.
A good example may be the pricing of produce in a food store: let the produce manager determine the optimal price at which the goods will sell with maximum profit and minimum waste. The cost and shelf life of the goods, the availability of supply, the store traffic, the seasonal price sensitivity, the current state of the economy, and the weather are all known data points. Allowing the produce manager to test a couple of hypotheses and establish the optimal price should be an easy and even fun exercise. And the price can be changed to reflect a change in any of the underlying factors that impact the decision: if it is sunny outside, decrease the price, because fewer people will come in to shop.
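A sketch of that exercise might look like the following; the demand curve, costs, and weather effect are all invented assumptions standing in for the data a real store would have:

```python
# A hypothetical produce-pricing sketch: test a few candidate prices against a
# toy demand model and compare profit and waste. Every number and the demand
# formula are invented assumptions, not a real store's model.
unit_cost = 0.60          # cost per unit of produce
stock_on_hand = 500       # units that will spoil if unsold
expected_traffic = 800    # shoppers expected today
sunny_day = True          # weather factor: sunny days reduce store traffic

def expected_units_sold(price: float) -> float:
    """Toy demand curve: higher prices and sunny weather both reduce sales."""
    traffic = expected_traffic * (0.85 if sunny_day else 1.0)
    purchase_rate = max(0.0, 0.9 - 0.5 * price)   # invented price sensitivity
    return min(stock_on_hand, traffic * purchase_rate)

def evaluate(price: float) -> tuple[float, float]:
    sold = expected_units_sold(price)
    profit = sold * (price - unit_cost)
    waste = stock_on_hand - sold
    return profit, waste

for price in (0.79, 0.99, 1.19, 1.39):
    profit, waste = evaluate(price)
    print(f"price ${price:.2f}: profit ${profit:.0f}, expected waste {waste:.0f} units")
```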
Boring means there’s no emotional connection. But chances are, if we do strike an emotional chord, it’s because we don’t like hard truths.
Making something easy, however, does not guarantee adoption. As humans, we like to receive an emotional reward for the things that we do. Being shown that we perform poorly compared to others, or that we make bad decisions, is not rewarding. Changing behaviour is hard.
And here’s where analytics can really stumble. Anyone who has used a fitness tracker knows that if you aren’t immediately seeing the outcomes you want, you take one of two paths: you are either motivated to improve, or you become disengaged, find ways to ignore the data, and ultimately stop entirely.
People naturally like to see benchmarks; they like to see where they fit in the community. If the data presents an image that matches their own presumptions, it’s easier to adopt the model. If there’s a disconnect, which there often is, there needs to be a strong incentive to change behaviour, or the model won’t be adopted.
Analytics models therefore need to build in feedback mechanisms that reinforce good behaviour and improvement: outlining the paths to success, not just presenting the failures; showing the factors that depress performance; focusing on the important things so that change doesn’t become overwhelming.
Analytics also needs top-down support to become a cultural norm. If the boss asks questions that can only be answered with forward-looking analytics, the organization is likely to adopt that approach. If, on the other hand, the boss makes poor decisions based on gut feel, forget trying to change the organization’s behaviour. The good news is that in today’s fast-moving global economy, luddite bosses have a short half-life.