Trends Driving the Adoption of the Best Practice Analytics Engine
Image by Evalueserve/Geoff Livingston

by Marc Vollenweider, Evalueserve co-founder

As data has scaled, companies have sought to perfect their ability to process data and create ROI, authorizing large-scale investments to build an analytics engine. However, as time has passed, it has become clear that perfection is unattainable: use cases, data volumes and types, and the technologies used to ingest data, analyze information, and distribute insights all evolve too dynamically. Many companies are now focusing instead on building more nimble best practices engines.

Moving away from the “perfect” analytics engine concept

An analytics engine provides tailored data analytics solutions that enable organizations to make decisions. A best practices engine allows you to evolve, meeting each problem and use case with a solution that produces the results a company needs. Rather than chasing perfection, it draws on accumulated experience and use case needs to achieve results time and time again.

Here are five trends moving companies towards best practices engines:

1) The increasing dynamics of change demand faster decision-making, and therefore more flexible and nimble analytics

To say change is impacting global business is the great understatement of the decade. From COVID and supply chain issues to the war in Ukraine and increasing demands for ESG practices, companies face change at every turn. Everything is becoming so dynamic that many companies are eliminating annual budgeting cycles and using analytics to drive decision-making en route to success.

For example, a major pharmaceutical company now plans its functional areas around short-term, medium-term, and long-term outcomes instead of an annual budgeting cycle. Industry and internal dynamics drive this trend, e.g., faster innovation and shorter product lifecycles, emerging digital models, and unexpected local and global competitive dynamics. In addition, we cannot ignore external discontinuities such as supply chain disruptions and geopolitics.

Because of these dynamics, real-time or streaming analytics are gaining significant traction. Real-time decision-making capability provides enterprises with speed, boosts their customer loyalty/outreach, and offers a significant competitive advantage. Plus, continuous analytics can alert users as events happen, help improve supply chains and reduce costs, and bring fast ROI on streaming data pipeline investments. [IDC Future Enterprise Resiliency and Spending Survey Highlights Key Findings in the Areas of Customer Experience, Enterprise Intelligence, and Digital Sovereignty]
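
To make continuous analytics concrete, here is a minimal sketch of threshold-based alerting over an event stream. It is illustrative only: the event shape, window size, and alert rule are assumptions, and a production pipeline would run on a streaming platform such as Kafka or Flink rather than an in-memory loop.

```python
from collections import deque
from statistics import mean

WINDOW = 20          # number of recent observations in the rolling baseline
ALERT_FACTOR = 1.5   # alert when a value exceeds 1.5x the rolling mean

def stream_alerts(events):
    """Yield an alert whenever a value spikes above the rolling baseline."""
    window = deque(maxlen=WINDOW)
    for event in events:
        value = event["value"]
        if len(window) == WINDOW and value > ALERT_FACTOR * mean(window):
            yield {"metric": event["metric"], "value": value,
                   "baseline": round(mean(window), 2)}
        window.append(value)

# Simulate a stream with one obvious spike.
events = [{"metric": "orders_per_min", "value": v}
          for v in [100] * 25 + [210] + [100] * 5]
for alert in stream_alerts(events):
    print("ALERT:", alert)
```

The point is not the specific rule but the shape of the system: insights are emitted as events arrive, not in a batch report days later.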

Meike Escherich, associate research director for European Future of Work at IDC, noted a significant uptake in the implementation of real-time analytics, with one in three European companies already using it to measure team performance or planning an implementation in the next 18 months. Similarly, Gartner predicts more than half of major new business systems will incorporate continuous data intelligence in 2022. [Real-time analytics in 2022: What to expect? | VentureBeat]

At Evalueserve, we work with many Fortune 500 clients and have seen several use cases in practice:

a) Real-time CIMI (competitive and market intelligence) analyzes trends and data as they happen, informing strategists, sales, and marketing of competitive actions and dynamic market evolutions as they unfold.

b) Lead scoring and recommendation engines leverage AI to rank leads and provide sales recommendations (what to sell, to whom, and when), helping the sales team become more productive (see the sketch after this list).

c) Supply chain control tower provides clients with real-time and end-to-end visibility into their supply chain. This helps them make decisions proactively, manage risk better, and become more resilient.

d) Predictive maintenance leverages AI in manufacturing facilities to detect impending machine failures. This helps companies manage downtime risk, maximize output (production), and improve CX.

e) Predictive health leverages AI to assist healthcare professionals in faster decision-making. These models enable faster, more accurate diagnoses and help find the best possible treatments, thus helping both providers and patients.
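
As a concrete illustration of item b, here is a minimal lead scoring sketch. The features, labels, and model choice are all assumptions made for the example; a production engine would use far richer features, more data, and calibrated models.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

# Historical leads: [email_engagement (0-1), firm_size (log10 employees)]
X_train = np.array([[0.9, 3.0], [0.8, 2.5], [0.2, 1.0],
                    [0.1, 2.0], [0.7, 3.5], [0.3, 1.5]])
y_train = np.array([1, 1, 0, 0, 1, 0])  # 1 = lead converted to a sale

model = LogisticRegression().fit(X_train, y_train)

# Score new (hypothetical) leads and rank them for the sales team.
new_leads = {"Acme": [0.85, 3.2], "Globex": [0.15, 1.2], "Initech": [0.6, 2.8]}
scores = {name: model.predict_proba([feats])[0, 1]
          for name, feats in new_leads.items()}
for name, score in sorted(scores.items(), key=lambda kv: -kv[1]):
    print(f"{name}: conversion probability {score:.2f}")
```

The ranked probabilities are what the sales team would see; the "what to sell and when" recommendations layer further models on top of the same idea.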

Many of our clients are having second thoughts about building that perfect engine and are moving to a buy-off-the-shelf mindset. Products with the right features, functionality, and UX can be deployed faster. However, these tools are building blocks rather than complete solutions: on their own, they cannot resolve the vast majority of use cases (approximately 90%). Domain experts solve these use cases by stitching together ‘modular’ components and customizing the ‘last mile’.

2) The growing importance of data-driven decision-making

Insightsfirst is one of the Evalueserve products that companies use to understand competitors and larger markets.

Reaffirming this trend, data-driven decision-making is making big inroads in all functional and geographic areas of companies: marketing, sales, products, R&D, operations, finance, HR, etc. Momentum is driven by a group of early adopters in almost every industry who are successfully harnessing analytics and AI to address domain-specific problems.

According to IDC analysts, most corporations (90%) will treat information as a “critical enterprise asset and analytics as an essential competency” in 2022. As a result, businesses are investing at a record pace, spending $215 billion on analytics solutions in 2021, a 10 percent increase over 2020. [Eight Trends Predicted To Define Data Analytics In 2022 (forbes.com)] These financial commitments reaffirm and accelerate the trend of analytics engine investments.

3) Flexible adaptation of ‘the engine’

Competitive pressure coupled with increasing dynamics creates a situation where there is simply not enough time to develop the ‘perfect’ analytics engine for each analytics use case. Companies that want to remain competitive are moving away from approaches that ‘hardwire’ data feeds, analytic models, human analysts, and distribution engines for each use case.

Managers at all levels need an analytics engine that ‘does the job’ today and can adapt quickly to tomorrow’s challenges. This requires a deep real-time understanding of changing requirements, a ruthless removal of outdated use cases, creative ‘cracking’ of new use cases, and constant life-cycle management of existing use cases.

The whole value chain -- from collecting, cleansing, and analyzing data to building distribution engines -- should produce real so-whats for specific end users and deliver these meaningful analytics on time, in the right format, and in the right medium. Architecting a solution for use cases requires human guidance to direct strategic use, tactical execution, and interactions within the larger analytics engine. Self-serve tools allow professional and citizen data scientists to address modules in the larger effort, developing and modifying reports and models in an agile fashion.

In addition, new KPIs have emerged because of this dynamism, including time to insight and time to value. Most mature analytics organizations keep close track of these KPIs.
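
As an illustration, here is a minimal sketch of tracking time to insight, assuming each delivered insight is logged with timestamps for when the underlying data arrived and when the insight reached its end user (both field names are hypothetical):

```python
from datetime import datetime
from statistics import median

insight_log = [
    {"data_arrived": datetime(2022, 5, 2, 9, 0),
     "insight_delivered": datetime(2022, 5, 2, 13, 30)},
    {"data_arrived": datetime(2022, 5, 3, 8, 0),
     "insight_delivered": datetime(2022, 5, 3, 9, 15)},
    {"data_arrived": datetime(2022, 5, 4, 10, 0),
     "insight_delivered": datetime(2022, 5, 4, 18, 0)},
]

# Time to insight per delivery, in hours, summarized by the median.
hours = [(e["insight_delivered"] - e["data_arrived"]).total_seconds() / 3600
         for e in insight_log]
print(f"Median time to insight: {median(hours):.1f} hours")
```

Time to value works the same way, with the second timestamp replaced by when the insight measurably changed a decision or outcome.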

4) Fast and flexible reallocation of resources

Smartphone image by guteksk7 via Adobe Stock

Estimates suggest that about 50 percent of all analytics no longer have any impact. They are not ‘consumed’ for various reasons or, even worse, they provide the wrong answers because their static decision-making models do not reflect current market dynamics. This inability to use analytics effectively drives the adoption of more flexible best practices engines.

Best practices analytics engines require fast reallocation of analytic resources, both machine and human, from outdated use cases to new ones. Companies should maintain an Analytics Roadmap (a prioritized sequence of use cases) that is refreshed periodically as part of their data strategy program. This helps them stay current and aligned with enterprise strategy and priorities.
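
One way to picture such a roadmap is as a re-scored, re-sorted list. The sketch below is a toy model under assumed 1-5 impact and effort ratings, not a prescribed methodology:

```python
from dataclasses import dataclass

@dataclass
class UseCase:
    name: str
    impact: int            # expected business impact, 1 (low) to 5 (high)
    effort: int            # implementation effort, 1 (low) to 5 (high)
    retired: bool = False  # flagged for removal when outdated

    @property
    def priority(self) -> float:
        return self.impact / self.effort  # simple value-for-effort score

roadmap = [
    UseCase("Real-time CIMI dashboard", impact=5, effort=3),
    UseCase("Legacy weekly sales report", impact=1, effort=1, retired=True),
    UseCase("Lead scoring engine", impact=4, effort=2),
]

# Periodic refresh: drop retired use cases, then re-rank what remains.
for uc in sorted((u for u in roadmap if not u.retired),
                 key=lambda u: -u.priority):
    print(f"{uc.name}: priority {uc.priority:.2f}")
```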

5) More concise and simple analytics provide "so-what" insights and stories

Whatever needs to be told should be told in much less time. Before the pandemic, in-person sales meetings would last one hour, with a bit of small talk before and after the meeting. Thanks to video-based communications, the average meeting is now 30 minutes, including the small talk. This applies to analytics as well.

When clients consume analytics, Evalueserve sees a clear trend towards ‘stories with conclusions’. “Data storytelling” has become a very important part of the data scientist’s job description. Decision-makers do not want to study large tables of data and then have to draw their own conclusions. Further, insights must be delivered when and how decision-makers consume information, ideally embedded in their standard platforms, e.g., Salesforce. This means providing analytics in the right format, in the right channel, when they are needed.

We have learned that data science professionals need to communicate findings in the language that businesses understand. They must tell the end user the so-what behind the analytics, and do so with a meaningful story. Only when executives understand an insight will they trust and adopt it. To ensure the successful adoption of analytics, companies are investing in developing and hiring “Analytics Translators” whose primary job is to act as a bridge between data scientists and business professionals.

In conclusion, these trends are driving businesses to choose more agile and nimble paths toward providing analytics. Static implementation models don’t provide the repeatable Return on Analytics that more nimble analytics engines can provide. As more companies realize this, they, too, will move towards adopting best practices analytics engines.

This article was originally published on Medium. Please follow me there, too.
