A Deep-Dive into Algorithmic Operating Models
Somil Gupta
AI Influencer of the Year | Adaptive BizOps Coach | AI Monetization Expert | Founder & CEO | Keynote Speaker
We had excellent discussions over the last few weeks about Dynamic Pricing and the Order-to-Cash process. This week, we discussed some of the high-level concepts and the importance of Algorithmic Operating Models (AOM).
What is AOM?
"AOM as a set of data-driven business capabilities and assets that can be algorithmically orchestrated to enable value delivery and value capture for dynamically generated customer scenarios."
In the picture above, we visualize the algorithmic business as three distinct layers.
AOM operates in the tactical Data/AI layer, which works as a 'semi-agile' domain representation and translation layer between 'static' business strategies and 'agile' Data/AI development. This layer, therefore, contains the evolving customer scenarios and decision models based on continuous experimentation and validation.
Once we establish such a tactical orchestration layer, we can design a blueprint for the algorithmic operating model using standard platform and component design. In the next few days, let's collaborate and co-create that blueprint.
According to Andreas Welsch, VP - SAP, there are currently different silos or best-of-breed applications divided by business function or even sub-process in the enterprise landscape. The prerequisites for developing the orchestration layer to enable self-adaptation based on new data are a common data layer/model/semantics, defined interfaces (APIs), and a flexible, composable architecture (e.g. microservices).
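These prerequisites can be pictured with a minimal sketch. All class and key names below are illustrative assumptions, not a real product API: a shared record format (the common semantics), one defined interface every capability implements (the API), and a thin orchestration function that composes capabilities without knowing their internals.

```python
from typing import Any, Protocol

class Capability(Protocol):
    """The 'defined interface' prerequisite: every capability exposes
    the same contract, so the orchestration layer can compose them."""
    name: str
    def handle(self, record: dict[str, Any]) -> dict[str, Any]: ...

class ReserveStock:
    """One composable capability (e.g. one microservice behind an API)."""
    name = "reserve_stock"

    def handle(self, record: dict[str, Any]) -> dict[str, Any]:
        # Common semantics: every record carries the shared 'sku'/'qty' keys.
        record["reserved"] = record["qty"]
        return record

def orchestrate(record: dict[str, Any], capabilities: list[Capability]) -> dict[str, Any]:
    """The orchestration layer: routes one common record through capabilities."""
    for cap in capabilities:
        record = cap.handle(record)
    return record

result = orchestrate({"sku": "A-1", "qty": 3}, [ReserveStock()])
assert result["reserved"] == 3
```

The point of the sketch is the seam: new capabilities can be added or swapped at run time as long as they honor the shared record semantics and the `handle` contract.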
And this is how the orchestration layer will probably get implemented. Now, the real question is how to bring the domain knowledge representations, i.e. decision workflows, and the data-driven assets (insights, predictions, aggregates, etc.) together in a meaningful way.
And most importantly, we need the context for decision-making in there. For example, while calculating the safety stock of an SKU, we need:
1. Data assets: demand forecast, current stock level, minimum service levels
2. Objective functions: do we need to reduce cost or improve service?
3. Contextual information: is this a sale season, a high-demand season, or a regular season?
4. Choice alternatives: transport options, lead times, etc.
All of these are 'representational assets' needed to model the business, and they must be encoded in the tactical layer to enable algorithmic decision-making. We are really looking for contributions from the community here.
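The four representational assets above can be encoded as a single context object that the tactical layer hands to a decision algorithm. This is a minimal sketch under stated assumptions: the field names, the seasonal factors, and the use of the textbook safety-stock formula (z-score times demand deviation times the square root of lead time) are illustrative, not from the article.

```python
from dataclasses import dataclass, field

@dataclass
class SafetyStockContext:
    # 1. Data assets
    demand_forecast: float        # forecast demand per period
    demand_std_dev: float         # forecast error (standard deviation)
    current_stock: float
    min_service_level: float      # e.g. 0.95
    # 2. Objective function: 'cost' or 'service' (illustrative)
    objective: str = "service"
    # 3. Contextual information
    season: str = "regular"       # 'sale', 'high_demand', or 'regular'
    # 4. Choice alternatives: transport option -> lead time in periods
    lead_times: dict = field(default_factory=lambda: {"road": 5, "air": 2})

def safety_stock(ctx: SafetyStockContext, transport: str, z: float = 1.65) -> float:
    """Textbook formula z * sigma * sqrt(lead_time), scaled by season."""
    lead_time = ctx.lead_times[transport]
    seasonal = {"sale": 1.5, "high_demand": 1.3, "regular": 1.0}[ctx.season]
    return z * ctx.demand_std_dev * (lead_time ** 0.5) * seasonal

ctx = SafetyStockContext(demand_forecast=100, demand_std_dev=20,
                         current_stock=250, min_service_level=0.95,
                         season="sale")
# Faster transport (shorter lead time) needs less safety stock.
assert safety_stock(ctx, "air") < safety_stock(ctx, "road")
```

The same context object also carries the choice alternatives, so the algorithm can compare paths (air vs. road) at run time instead of having one answer baked in.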
In the second post on Algorithmic Operating Model (AOM), we took a peek into its inner workings. The article is a must-read for anyone considering algorithmic business.
There are 3 key points that the article addresses:
1. Why is it so challenging to transform a traditional Operating Model into an Algorithmic Operating Model?
Historically, companies have optimized their scale, scope, and learning through 'greater focus and specialization', which led to siloed structures. The Business Process Re-engineering of yore focused on centralization and process excellence, which gave birth to the current architecture paradigm of the data warehouse/data lake. Siloed and fragmented data is therefore the result of the traditional operating model, not its cause, and ERPs simply reinforce this phenomenon.
2. How is AOM different?
AOMs are built on a unified, seamless data-driven core where "there are no workers in its 'critical path' of operating activities." What this means is that the 'functionality' offered by ERP systems, like order processing and purchasing, is no longer the prime value driver. The data that flows into the organization and connects silos and business domains is the prime driver, on which 'functionality' can be built after aggregating the data and extracting value through analytics and AI.
With data and algorithms at the core, both learning and network effects amplify their impact on value creation and capture. Decision-making becomes a science: analytics systematically converts internal and external data into predictions, insights, and choices, which in turn guide and automate operational workflows.
3. Processes become 'agile' products
AOM in essence works in two phases. The first phase is identifying the 'micro-opportunity' from analyzing the data. This could be any customer-centric value where an organization can differentiate itself.
The second phase is matching and deploying the right process to capture it. AOM takes traditional processes and transforms them into 'products' that can be positioned towards these micro-opportunities depending on the context. To do that, you need a deep, product-inspired understanding of use cases and user context. We will cover productization in a later section.
In conclusion, building an AOM requires a lot more than software, data pipelines, algorithms, and experimentation; we need to reimagine the organization as a 'network of assets, capabilities, and resources connected and orchestrated through a unified data-driven core'.
Anders Dalgaard from Teradata explained it beautifully with his 'Ming vases vs. IKEA plates' analogy:
"Let me offer you an extra perspective on how to compete on and industrialize ML/AI at scale with 10,000 to millions of models in PROD => Moving away from artistic Ming vases to IKEA plates:"
He suggests the following steps:
1) Implement an Enterprise Feature Store so that already-valuable features are reused rather than rebuilt.
2) Divorce the model BUILDING (!) tool/platform from the PROD platform, so the model can be put into PROD where the data is located via PMML, ONNX, etc. This delivers an enterprise-robust platform yet gives data scientists 100% freedom to bring and use any tool, platform, or language they want, with no Group IT limitations.
3) Don't move data to the analytics - move the analytics to the data - in particular when you are using cloud/multi-cloud (latency and egress fees).
4) Prevent pipeline jungles, as Google Research writes, which means DO NOT have one data pipeline per model!
5) Make use of in-database model scoring rather than containerization every time when running at scale.
6) Implement both DataOps & MLOps with automation including model drift surveillance.
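Step 1 above can be pictured with a minimal in-memory sketch. The class name, method names, and storage are illustrative assumptions, not a real feature-store API: the point is that a feature is registered once with a governed definition and then reused by every model, instead of each model pipeline recomputing its own variant (the "pipeline jungle" of step 4).

```python
class FeatureStore:
    """Toy enterprise feature store: one governed definition per feature."""

    def __init__(self):
        self._features = {}   # feature name -> computation function

    def register(self, name, fn):
        # Guard against duplicate definitions: reuse, don't redefine.
        if name in self._features:
            raise ValueError(f"feature '{name}' already registered; reuse it")
        self._features[name] = fn

    def get(self, name, entity):
        """Compute (or, in a real store, look up) a feature for one entity."""
        return self._features[name](entity)

store = FeatureStore()
store.register("order_count", lambda customer: len(customer["orders"]))

customer = {"orders": [101, 102, 103]}
# Two different models reuse the same governed feature definition.
churn_input = store.get("order_count", customer)
ltv_input = store.get("order_count", customer)
assert churn_input == ltv_input == 3
```

In a real deployment the definitions would live in a shared catalog and the values in a database, which is also what makes step 3 (move the analytics to the data) and step 5 (in-database scoring) practical.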
Henrik Gothberg from Dairdux called it the Enterprise Operating System (EOS), i.e. reimagining an enterprise as a system. "The enterprises need to PIVOT to this new operating model. Then the work of Value Engineering, Data Engineering, and Algorithm Engineering will fly. Your middle layer (the Tactical Layer that hosts AOM) will fly."
"All that we started talking about will have a hard time flying in the old organizational paradigm. I hope that we can soon start sharing what we term the 'DAIR Hypothesis' as a working name."
Henrik further explained that the enterprise data backbone is key. But what does that mean?
"We been figuring this out since the end of 2019 at Scania Financial Services. It came to the front of our thinking driven by figuring out how to merge two huge global initiatives. One is a Digital Transformation (DT) with focus on Data and Analytics and the other is a program for a global ERP/ Common process roll out...."
"The main challenge for the ERP program. when doing heart surgery in 60 local markets replacing local Loan and Leasing Systems and Finance and accounting systems is to deal with the "Twilight Zone" in a Scalable way. I.e all data provisioning that is happening outside the ERP. In, out, in between the core systems. To clarify. One of the problems to reaching automation with an ERP replacement is that. Even if you reach Automation inside the ERP. To reach End2End Automation is all about your data backbone supporting both OLTP and OLAP End2End.
So the Digital Transformation program was merged with the ERP program and the DT was renamed Insights & Data backbone. The portfolio steering was merged So we now simply look at the Use Case first and then decide where in our platform of platforms we solve the data/AI/UX value chain." ~Henrik Gothberg
Algorithmic Operating Model and Servitization
An excellent research paper by David Sjödin, Vinit Parida, and others builds upon a similar concept and identifies and conceptualizes the underlying capabilities associated with AI in a digital servitization context, where AI has played a pivotal role.
Their findings relate to two parts:
1. Organizational AI capabilities
2. Principles underpinning AI business model innovation
Both these parts converge into a holistic framework for servitization.
The most important aspect to remember is that organizations should not consider these capabilities as 'systems and processes'. That's the old way.
We need to envision these capabilities as 'products' and think about how we can productize them.
Andreas Welsch explains:
"Data & data science are an essential aspect but, there’re so many additional themes surrounding the core data work to get to a successful AI project and AI-embedded product. We’ve developed an internal process that we call ‘AI Factory’ which addresses phases, resources, and skills from idea to operation. That helps us prioritize the most valuable use cases and bring structure to the overall ideation/ development process."
Christian Rasmussen, Head of Technology at Grundfos, takes it even further as he talks about the data layer and how it interacts with the user journey:
"I found it very useful to add a data layer to User Journeys. There are many great user journey templates. Some include backend and IT products. Having the tools there lead the team towards a staged approach thinking about known tools in marketing, sales, supply chain, etc.
I found that a data layer is much easier to relate to for people who does not know the tools in details (no one does across a complete business) and build what support the user journey."
Continuing the discussion on the Algorithmic Operating Model (AOM) inspired by the outstanding research by David Sjödin, Vinit Parida, and others, here's their framework for scaling AI capabilities through business model innovation.
The authors identified three core capabilities of Business Model Innovation:
1. Agile customer co-creation: customer partnership and co-creation to demonstrate Proof of Value and support critical KPIs using reusable and reconfigurable AI assets
2. Data-driven delivery operations: using data and insights from AI in operational and strategic decisions for continuous improvement, learning, and innovation; continuous monitoring of Data and AI for intelligent resource orchestration in service delivery, e.g. maintenance, service, technical support, etc.
3. Scalable ecosystem integration: value capture by stimulating, enabling, and orchestrating the competence and capability of internal, customer, and third-party actors to co-produce outcomes and realize opportunities - new value propositions, revenue models, etc.
I find this framework to be a real milestone in understanding the big picture: from discovering value opportunities using data, to deploying AI assets that create customer value, to orchestrating internal and external resources for delivery, to leveraging the ecosystem's competence for value capture. This is a map that all of us should use to develop our AOMs.
The Need for Productization of Data and Business Capabilities in AOM
When we visualize the AOM as the tactical orchestration layer in the Algorithmic Business landscape, AOM can be understood as a "network of assets, capabilities, and resources connected and orchestrated through a unified data-driven core" where internal teams and partners collaborate to identify 'micro-opportunities' (value discovery) and position 'productized' processes and capabilities to create, deliver, and capture value.
Traditionally, business capabilities and processes are designed around one 'Best Operating Point (BOP)' i.e. the most likely scenario or rule as envisioned by the designer of the system. So there is one workflow for order management, one logic for prioritizing inventory, a fixed number of steps. "This is the process in our company, this is how we do it"... sound familiar?
In AOM, you can't have one BOP; there are several BOPs depending on the context and timing of the requirement. The eventual 'process', the actual path the system takes, therefore cannot be determined a priori. The only way forward is to build several potential paths into the business capability and select the right path at run time based on a set of parameters. That set of parameters becomes the 'process features'.
This is why most Algorithmic Operating Models used by GAFA are built upon 'matching' the requirement features with the process features or data features. That's what AI does best: it can detect and match patterns. But if the processes are not developed as products, there are no features to match against. The productization of data, processes, and customer requirements is therefore essential for AOMs, because it is the most efficient way to leverage the power of Data and AI in algorithmically identifying the best possible response to a given requirement.
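The matching idea can be sketched in a few lines. This is a hedged illustration, not how any GAFA system actually works: the process names, the exposed feature vectors, and the simple weighted-overlap score are all assumptions standing in for what would, in practice, be a learned matching model.

```python
# Each productized process exposes a feature vector (its 'process features').
PROCESSES = {
    "express_fulfilment": {"speed": 1.0, "cost": 0.2, "flexibility": 0.5},
    "bulk_fulfilment":    {"speed": 0.3, "cost": 0.9, "flexibility": 0.4},
}

def match_process(requirement: dict) -> str:
    """Route a requirement to the process whose features fit it best.

    The requirement is itself a feature vector (the 'requirement features');
    a real AOM would learn this scoring instead of hand-weighting it.
    """
    def score(proc_features: dict) -> float:
        return sum(requirement.get(k, 0.0) * v for k, v in proc_features.items())
    return max(PROCESSES, key=lambda name: score(PROCESSES[name]))

# A time-critical micro-opportunity selects the fast path at run time...
assert match_process({"speed": 1.0}) == "express_fulfilment"
# ...while a cost-sensitive one selects the bulk path.
assert match_process({"cost": 1.0}) == "bulk_fulfilment"
```

The key design point is that no workflow is chosen a priori: the same requirement schema can route to different 'productized' processes as the context changes, which is exactly why processes need exposed features to match against.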
Jon Cooke from Dataception explained:
"Being able to overlay AI on top of existing business models rather than trying to carve a specific business is absolutely key. And the approach to be able to experiment and flex as one discovers more is vital, fail fast, etc.. otherwise, massive costs and time are wasted.
Also, you are absolutely spot on, to support this, productize the processes so they can expose key features of their process to enable models to access them massively accelerates the outcome without having to re-design ones org."
Stephan Zimmer, Manager - Enterprise Analytics and Automation at BWH Group, adds a cautionary note:
"In principle, this corresponds to nothing less than a paradigm shift in how businesses operate. Whilst I’m a strong proponent of these ideas as they lead to a more data-driven business and value chain I fear that the conversation needs to involve the business leaders more and more immediately. So to be clear: AI and data initiatives are not any longer just a tool but rather the inherent mesh that keeps business units working together."
Henrik Gothberg argues that "Data/AI/Software needs to be part of the core DNA of the organization. Data/AI/Software is part of the core business process, i.e. the process IS Data/AI/Software. Then the old dogma of IT supply and IT demand, and the traditional view of IT vs. Business, is flawed. You will still have central/platform infra-oriented tech vs. business-value-chain tech. But the inherent view of the interface is different. Tech is everywhere. But our jobs with tech are different.
Like in a manufacturing plant. Some are engineers of the building, ventilation, and electricity. Others are engineers setting up and monitoring the health of the assembly line and then you have assembly line workers churning out products....
So, very different tech roles, part of different organizations, with different types of engineering and production skills. Data/AI is no different."
In conclusion, Algorithmic Operating Models are based on the algorithmic analysis, selection, and orchestration of an organization's Data/AI capabilities and AI Business Model Innovation capabilities. Algorithmic selection and orchestration using Data and AI require that both the data-driven opportunities and the business processes and capabilities be developed into 'agile products' with well-defined features, so that ML algorithms can interpret those features and match the right opportunity with the right business capability.
My sincere thanks to all the contributors: Andreas Welsch, Anders Dalgaard, Henrik Gothberg, Stephan Zimmer, Jon Cooke, Christian Rasmussen, Puneet K Bhatia, Mikael Klingvall, Bill Schmarzo, Anuj Kumar Sen, and all the others I might have missed by mistake.
Recommended Reading on the Topic:
2. Competing in the Age of AI