What matters now: AI at Scale
The business environment has become AI-defined: competitiveness depends on the extent of AI utilization for higher customer value, better customer experience and enhanced operational efficiency.
The only viable response to ongoing disruption is to embed AI in all aspects of value creation. The target level should be set at tens or even hundreds of concurrent AI use cases cutting across all company activities.
To embed is to integrate. This is about deploying data products, AI models and enterprise applications to create AI-enabled products, services and processes. Integration calls for an extensive set of digital capabilities combined with solid data, AI and software engineering practices.
The ultimate goal is to turn Business Domains into digital innovation and value capture engines – needed for sustained competitiveness in the Age of AI.
This article explores the prerequisites and key enablers for AI at Scale.
The article ends with an explicit recommendation: as the first step, assess the constraints that prevent achieving AI at Scale. Once constraints have been identified and understood, they can be systematically eliminated one by one.
AI-defined business environment
Right now, AI is penetrating all areas of running a company. Many tasks previously done by humans are now given to AI because it can perform them better and faster.
Successful AI deployment leads to a productivity leap, with more value created from smaller cumulative investment. This simple formula guarantees the availability of investment funds for further AI research, for AI tools and solutions development, and for companies that are skilful and agile in their AI deployment. Money finds productivity.
A virtuous circle of more investment in better AI solutions leads to a productivity leap and makes the business environment “AI-defined”. Competitiveness will depend increasingly on the extent of AI utilization. The disruptive nature of AI technology evolution means that AI laggards will be marginalized. This is not some unlikely scenario; it is a given. Only the timing varies from one industry to another. Money and resources run away from low productivity. In short, the AI revolution speeds up Creative Destruction.
Industry Leaders move the Productivity Frontier themselves with cutting-edge AI use cases. Followers stay near the frontier: their innovative power is not enough to move the frontier itself, but it is enough to keep up. This is a viable strategy provided that the gap stays small enough.
New AI use cases create Competitive Advantage (CA). Companies with a strong AI use case portfolio enjoy strong pricing power and margins. However, CA is eliminated when the rest of the pack catches up. Over time, all AI use cases transform into BCRs – Basic Competitive Requirements.
Not having CAs is not a big problem. Not having BCRs is. A company without BCRs lacks the table stakes to play the game. This is the existential risk created by the ongoing AI revolution.
Things may move surprisingly fast. Taking the Generative AI case, business managers need to ask questions like “How long will it take for our customers to expect a natural language based User Interface to access our products and services?” and “How fast will the ChatGPT consumer experience cross-fertilize into the Customer Experience expected by our B2B customers?” For reference, in the case of consumer-oriented mobile applications, cross-fertilization didn’t take long. Hence the key question becomes “When will a natural language based User Interface become table stakes in our business?” To prepare for that D-Day is to create the digital capabilities to equip products and services with fine-tuned Large Language Models running on proprietary data.
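As a rough illustration of what such a capability could look like, the sketch below pairs a hypothetical in-house retrieval function with an OpenAI-compatible chat endpoint. The model name and the retrieval layer are placeholder assumptions, not a prescription.

```python
# Minimal sketch of a natural language interface over proprietary data.
# Assumes the `openai` Python package and an OpenAI-compatible endpoint;
# `retrieve_product_docs` is a hypothetical stand-in for an in-house
# retrieval layer, and the model name is a placeholder.
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment


def retrieve_product_docs(question: str) -> list[str]:
    """Hypothetical retrieval over proprietary product documentation."""
    # In practice: vector search over embedded, access-controlled documents.
    return ["Example snippet from internal product documentation."]


def answer(question: str) -> str:
    context = "\n".join(retrieve_product_docs(question))
    response = client.chat.completions.create(
        model="gpt-4o-mini",  # placeholder; could be a fine-tuned model ID
        messages=[
            {"role": "system",
             "content": f"Answer using only this product context:\n{context}"},
            {"role": "user", "content": question},
        ],
    )
    return response.choices[0].message.content


print(answer("Which of our service plans include API access?"))
```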
To compete is to embed
To successfully compete in an AI-defined business environment is to embed AI in all aspects of value creation. The Digital Innovation Process provides the keys to the castle – with innovation defined as something useful in the customer’s hands rather than a mere idea.
Value creation builds on digital products, services and business processes. They are the practical means to deliver higher customer value, better customer experience and enhanced operational efficiency. Value capture happens through monetizing all this – with higher margins leading to greater financial gains.
To embed AI is to integrate AI use cases into products, services and processes. Consequently, the Digital Innovation Process consists of use case discovery and integration phases. The process is highly iterative, with lots of experimentation, feedback and learning – leading to continuous optimization and improvement.
To embed is to integrate
While embedding AI in all aspects of value creation is a useful concept, AI use case integration is about concrete engineering. To embed is to integrate. Integration calls for a large set of digital capabilities, from data products to AI models and from enterprise applications to modern software engineering.
As discussed, digital products and services are used to deliver customer value and customer experience. Correspondingly, digital processes provide the path to operational efficiency. They don’t have to be natively digital, but they do need to be digitalized for AI use case integration to take place.
Products, services and business processes become digital through an enterprise application of some type. Consequently, the enterprise application becomes the ultimate AI use case integration point – with software engineering providing the means of implementation.
Operating Model: Organizational pull to match AI technology push
A decentralized operating model is the single biggest factor in achieving AI at Scale. With the Digital Distribution operating model, ownership of data products, AI use cases and models, and enterprise applications is allocated to Business Domains.
Business domains are defined as Bounded Contexts. Through this crucial operating model design principle, the parties involved can focus on domain specifics rather than making a futile attempt to master company-wide details. For example, the marketing team can focus on improving marketing operations with marketing-related data, AI models, applications and processes. Overall, the Digital Distribution operating model can be thought of as a generalization and extension of the Data Mesh principles of decentralization, ownership and self-service capabilities.
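To make domain ownership concrete, a descriptor for a domain-owned data product might look like the sketch below. The fields and names are illustrative assumptions, not a formal Data Mesh schema.

```python
# Illustrative descriptor for a domain-owned data product; the fields are
# hypothetical, not a formal Data Mesh standard.
from dataclasses import dataclass, field


@dataclass
class DataProduct:
    name: str                  # e.g. "campaign-performance"
    domain: str                # owning bounded context, e.g. "marketing"
    owner: str                 # accountable data product owner
    output_ports: list[str] = field(default_factory=list)  # API endpoints
    freshness_sla_minutes: int = 60  # how stale served data may get


campaign_performance = DataProduct(
    name="campaign-performance",
    domain="marketing",
    owner="marketing-analytics-team",
    output_ports=["/api/v1/marketing/campaign-performance"],
)
print(campaign_performance)
```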
Business domains as bounded contexts enable shared semantic understanding. Things become meaningful and tangible. Language, concepts and terminology become common. Combined with a strong data culture, this establishes initiative across all business domains – with business capability owners seeking to leverage AI to improve operations and product/service performance on their own. After this, AI use case discovery no longer requires a centralized push or facilitation. Decentralized initiative is foundational to achieving AI at Scale.
Initiative within Business Domains emerges as the cornerstone of AI at Scale
Overall, the combination of a decentralized operating model and a strong data culture creates organizational pull that matches the AI technology push. The ultimate goal is to turn business domains into innovation and value creation engines. When successful, this leads to sustained competitiveness in the Age of AI.
Minimizing cognitive load on business domains
Business domains are responsible for AI use case discovery and integration. That puts a heavy cognitive load on them. The load related to domain-specific market opportunities, customer needs and business problems remains theirs to carry. However, the cognitive load related to data, AI and software engineering is to be minimized with the Plug-and-Play Framework.
The Plug-and-Play Framework is the centralized part of the Digital Distribution operating model. It applies platform engineering principles, with extensive utilization of automation and self-service platforms, combined with services provided by the Platform Team. In this way, business domain teams can focus on domain-specific business topics rather than on engineering complexities.
New business domains just plug in and start to play
The Plug-and-Play Framework serves all engineering needs related to data products, AI models and application software. Combined with a carefully defined demarcation line between the Platform Team and business domains, onboarding new business domains becomes straightforward: they just plug in and start to play.
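One hypothetical way to picture that demarcation line is a self-service onboarding manifest that a new domain submits to the platform. All keys, values and the provisioning stub below are invented for illustration.

```python
# Hypothetical self-service onboarding manifest for a new business domain.
# The platform team's automation would consume it to provision pipelines,
# registries and runtimes; all keys and values are invented for illustration.
onboarding_manifest = {
    "domain": "after-sales-service",
    "data_products": ["warranty-claims", "service-ticket-history"],
    "ai_use_cases": ["ticket-triage-classifier"],
    "runtime": {"platform": "kubernetes", "region": "eu-west-1"},
    "self_service": {
        "ci_cd": True,           # pipelines provisioned automatically
        "model_registry": True,  # MLOps tooling available out of the box
        "monitoring": True,
    },
}


def provision(manifest: dict) -> None:
    """Stand-in for the platform's automated provisioning workflow."""
    for product in manifest["data_products"]:
        print(f"Provisioning pipeline and registry entry for '{product}'")


provision(onboarding_manifest)
```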
Digitalization as prerequisite
The digital characteristics of products, services and business processes play a pivotal role in AI-driven value creation. Missing or suboptimal digital elements may emerge as significant bottlenecks or outright roadblocks to AI integration. In the Age of AI, they constitute technical debt that is now due to be paid.
Products and services are either natively digital or digitalized. From an AI integration perspective either variant is fine, but their digital characteristics do depend on the origin. In general, natively digital products and services offer a somewhat smoother path to full-blown AI utilization.
Digitalization of business processes is about deploying an ERP module of some sort. In the case of fairly standard processes and corresponding ERP modules, the evolution of AI support boils down to the vendor roadmap. Conversely, in the case of business-critical, highly customized processes and modules, AI integration gets more complicated and depends on the ERP module’s flexibility, scalability and customization capabilities.
Leverage from AI is restricted by digital characteristics
AI builds on data. Consequently, AI utilization depends on the ability of products, services and processes to source, process, exchange, store and maintain data. Specifically, the processing of data builds on AI algorithms and architectures – represented by AI models. Depending on the digital characteristics of products, services and processes, leverage from AI is more or less restricted. Understanding these restrictions is essential.
Modularity, portability and reusability as design principles
Ubiquitous AI builds on the principles of modularity, portability and reusability. These principles cut across all design aspects, from the decentralized operating model to engineering details. They apply to business domains, ERP modules, data products, AI models, microservices and containers alike. The exact nature is context-dependent, but the same core principles operate in all of those AI at Scale enablers. Let’s take some examples.
The decentralized operating model is inherently modular, with multiple business domains taking ownership and initiative. By definition, an operating model is about standardized operations across all business domains, specifically when interacting with the platform team. That is, the mode of operation is also portable and reusable. The term Plug-and-Play Framework is itself an attempt to capture the merits of these design principles.
Gartner’s Composable ERP concept is a useful frame of reference when considering business domain needs for modularity, flexibility and adaptability. A single-vendor monolithic ERP would not fly in the context of the Digital Distribution operating model.
Nextdata’s mission is decentralized data at scale, achieved by applying proven software engineering methods to data products in the context of Data Mesh. The Nextdata value proposition is compelling: data products become portable and reusable through containerization and by accessing them through APIs like any microservice. When AI models are considered part of data products – rather than only data consumers – the benefits of modularity, portability and reusability extend to them too!
Data products and AI models accessed via APIs like any microservice make an intriguing value proposition
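As a minimal sketch of that idea, the snippet below exposes a data product through a REST endpoint, using FastAPI as one possible implementation. The route and payload are invented for illustration.

```python
# Minimal sketch of a data product exposed through an API like any
# microservice, in the spirit of the Nextdata example. FastAPI is one
# possible choice; the route and payload are invented for illustration.
from fastapi import FastAPI

app = FastAPI(title="campaign-performance data product")


@app.get("/api/v1/marketing/campaign-performance")
def read_campaign_performance(campaign_id: str) -> dict:
    # In practice this would query the data product's storage layer;
    # a static record stands in for real data here.
    return {
        "campaign_id": campaign_id,
        "impressions": 120_000,
        "conversion_rate": 0.031,
    }

# Packaged in a container, the same service is portable across
# environments; run locally with e.g.: uvicorn data_product:app
```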
Microservice architecture and containerization need not be considered prerequisites for AI at Scale. However, as the Nextdata example shows, they do serve as a useful benchmark for architectural and computing design and for targeted engineering capabilities. In the meantime, “modular monoliths” may serve short- to mid-term needs just fine.
Integration Framework for data products and AI models
The Integration Framework consists of a multitude of enablers used to integrate data products and AI models into enterprise applications. It is a subset of the Plug-and-Play Framework, with a focus on integration specifics. The goal and methods are the same: minimize the engineering-related cognitive load on business domains through platform engineering, extensive utilization of automation and self-service platforms, and support provided by the platform team. In terms of AI utilization, the integration of data products and AI models is where the rubber meets the road. This simple fact underlines the importance of the Integration Framework.
Starting from enterprise application internals as the stage where integration takes place, two situational factors emerge: the technology platform and Software Development Kits (SDKs) – with significant interdependencies between them. Here, technology platform refers to the application type within a specific hardware context. For example, in the case of a digitalized physical product, the enterprise application runs on embedded software. When the product happens to be an edge device, the limitations set by the hardware may be severe, from computing power to storage or even battery. Conversely, a digital service running in the cloud is a completely different playground for data product and AI model integration. The choice of SDKs depends heavily on the technology platform, but in all cases the level of support for business domains is crucial to minimize engineering overhead.
As discussed above, microservice architecture combined with containers offers unbeatable flexibility with self-contained, independently deployable, encapsulated modules. This leads to significant integration benefits. However, the penalty is development overhead through increased complexity and required skillsets.
APIs’ contribution to AI at Scale is significant
Application Programming Interfaces (APIs) play a pivotal role in the Integration Framework. APIs’ contribution to AI at Scale is significant – so much so that an API strategy with an API-First principle in application design is justified. APIs are the primary method of interaction between microservices, data products, AI models and enterprise applications. They facilitate access to batch and real-time data alike and are the connective tissue enabling seamless interoperability. Robust API design caters to scalability – handling large volumes of requests and data transfer – in order to support a large number of versatile AI use cases.
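On the consuming side, an enterprise application might call a domain-owned AI model through its API roughly as sketched below. The endpoint and payload are hypothetical; the timeout and retry handling illustrate the robustness that scalable API design assumes.

```python
# Sketch of the consuming side: an enterprise application calling a
# domain-owned AI model through its API. The endpoint and payload are
# hypothetical; the timeout and retries illustrate the robustness a
# scalable API design has to assume.
import requests
from requests.adapters import HTTPAdapter, Retry

session = requests.Session()
session.mount("https://",
              HTTPAdapter(max_retries=Retry(total=3, backoff_factor=0.5)))


def score_lead(lead: dict) -> float:
    response = session.post(
        "https://api.example.internal/v1/sales/lead-scoring",  # hypothetical
        json=lead,
        timeout=2.0,  # fail fast instead of stalling the application
    )
    response.raise_for_status()
    return response.json()["score"]


print(score_lead({"company_size": 250, "industry": "manufacturing"}))
```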
Finally, Integration Platform as a Service (iPaaS) emerges as an essential Integration Framework component. iPaaS comes with advanced tools and managed services that ease the burden of building and maintaining integration infrastructure for seamless connectivity and interoperability. Capabilities include API management, data access and transformation, pre-built connectors and templates, and security features.
Industrial-grade data, AI and software operations
Data, AI and software engineering practices are foundational to achieving AI at Scale. Rather than being based on improvisation or artisan-like tailoring and tinkering, these operations need to be industrial-grade from design to development and from testing to deployment.
The goal is set as speed, agility and quality at scale. Speed represents short time-to-market, agility is about responsiveness to customer needs and changes in the competitive landscape, while quality is fundamentally about trust by all stakeholders. At scale refers to the ability to discover, develop, integrate and maintain data products and AI models in high enough quantities to enable hundreds of concurrent AI use cases.
To achieve industrial-grade operations defined this way, workflows are to be based on DevOps, DataOps and MLOps practices and methods – XOps in short. Due to the unpredictable and dynamic nature of customer and market needs, these operations are designed to support experimentation, customer feedback collection, and continuous learning and improvement. Organizational design is based on the “you build it, you run it” principle. This creates a product-oriented culture with ownership and accountability.
The relationship between measured XOps outcome and underlying enablers is systemic
Building and maintaining industrial-grade operations calls for verification. For that, well-established DevOps measures are deployed for XOps as a whole: Delivery Lead Time, Deployment Frequency, Mean Time to Restore Service, and Change Fail Rate. Each of these measures is influenced by a complex interplay of digital capabilities and XOps practices. That is, the relationship is systemic. Understanding the dependencies between the measured outcome and the underlying enablers and capabilities is crucial to pinpointing areas for improvement.
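As a sketch of how these four measures might be computed from a simple log of deployments (the record format is invented for illustration):

```python
# Sketch: computing the four measures from a log of deployments.
# The record format is invented for illustration.
from datetime import datetime, timedelta
from statistics import mean

deployments = [
    {"committed": datetime(2024, 5, 1, 9), "deployed": datetime(2024, 5, 1, 15),
     "failed": False, "minutes_to_restore": 0},
    {"committed": datetime(2024, 5, 2, 10), "deployed": datetime(2024, 5, 3, 11),
     "failed": True, "minutes_to_restore": 45},
    {"committed": datetime(2024, 5, 3, 8), "deployed": datetime(2024, 5, 3, 16),
     "failed": False, "minutes_to_restore": 0},
]

lead_times = [d["deployed"] - d["committed"] for d in deployments]
delivery_lead_time = sum(lead_times, timedelta()) / len(lead_times)

period_days = (max(d["deployed"] for d in deployments)
               - min(d["deployed"] for d in deployments)).days or 1
deployment_frequency = len(deployments) / period_days

failures = [d for d in deployments if d["failed"]]
change_fail_rate = len(failures) / len(deployments)
time_to_restore = mean(d["minutes_to_restore"] for d in failures) if failures else 0.0

print(f"Delivery lead time:      {delivery_lead_time}")
print(f"Deployments per day:     {deployment_frequency:.1f}")
print(f"Change fail rate:        {change_fail_rate:.0%}")
print(f"Mean minutes to restore: {time_to_restore:.0f}")
```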
For business domains to deliver on the promise of digital innovation and value capture, ramping up XOps is essential. The Plug-and-Play Framework is there to support business domains in achieving that goal.
Digital capabilities outsourcing and skills acquisition
AI at Scale implementation calls for substantial digital capabilities, in terms of both human capital and platforms/tools. It is not realistic to have all of them in-house at once. In fact, in many cases, outsourcing a significant portion of digital capabilities may turn out to be the optimal permanent solution. The prerequisites and guidelines for effective capabilities outsourcing and skills acquisition are many. Let’s explore them.
First, strategic clarity boils down to two questions: 1) Which digital capabilities are needed, and which of them are missing? 2) Which of them could or should be outsourced rather than built in-house? The capabilities needed connect to the topics discussed in this article: AI use case discovery leading to the development, integration, deployment and maintenance of data products, AI models and enterprise applications – all while minimizing the resulting engineering-related cognitive load on business domains.
What (not) to outsource is discovered through Core versus Context analysis, which distinguishes between core value creation activities and contextual activities. The former enable differentiation and lead to competitive advantage, while the latter are essential in supporting the business but are not directly linked to customer value creation.
Core versus Context analysis provides guidance on what (not) to outsource
Core versus Context analysis utilizes both digital strategy and business strategy. Digital strategy identifies the necessary digital capabilities themselves, while business strategy provides the means to pinpoint core value creation activities in relation to the company’s offering, markets, customers and competitive landscape.
Second, operating model modularity is foundational for effective outsourcing implementation. Clear demarcation lines between modules – i.e. functions and activities – provide practical planning and implementation tools. In addition, Enterprise Architecture (EA) provides a comprehensive blueprint of the organization's processes, systems and technology. EA can be used to identify dependencies and integration points to gain further understanding of outsourcing opportunities and impact.
The duo of Operating Model and Enterprise Architecture provides tools for outsourcing implementation
The operating model elements discussed in this article would cater to outsourcing roughly as follows: ownership of AI use cases, AI models or data products cannot be outsourced. Without ownership, they would not connect tightly enough to domain-specific value creation activities. Therefore, business domains need to acquire in-house skills and capabilities to facilitate ownership, e.g. through Data Product Management. However, ownership does not necessarily imply in-house development.
The Plug-and-Play Framework, by contrast, is a prime candidate for outsourcing. However, due to its central role in AI at Scale implementation, it is vital that outsourced services integrate seamlessly with in-house operations. This maps to careful partner selection and, later, to governance structures and partner management processes that ensure alignment of goals and effective collaboration. Partner selection should not be based on cost or technical expertise alone, but also on cultural alignment and adaptation to the business domains’ operational tempo and innovation cycles. Regular performance reviews apply.
In terms of outsourcing options, XOps appears dualistic
XOps falls between those two extremes and emerges as somewhat dualistic with regard to outsourcing options. As a whole, XOps is an integral part of business domain operations and digital innovation. It is the central vehicle for creating and delivering value, and for experimenting and learning – clearly not the first thing to be outsourced. Certain aspects of XOps, however, can be modularized and outsourced. For instance, routine DevOps tasks like Continuous Delivery pipeline management or standardized DataOps processes could be handled by external partners. MLOps, particularly around AI model deployment and monitoring, might be outsourced given standardized procedures and when outsourcing would not compromise proprietary AI models.
Use hired help to fill in the gaps and to get the ball rolling
Finally, skills acquisition depends on the conclusions about what (not) to outsource. For in-house digital capabilities, the corresponding skills need to be either recruited or trained within the existing organization. In many cases, temporary gap-filling through hired help (consultants) provides a fast and effective way to get started and to maintain momentum – especially in the early phases of the AI at Scale build-up.
For outsourced capabilities, the necessary in-house skills connect to strategic and operational partner management, including partner selection, oversight, relationship management, target setting and governance.
Towards AI Organization
The vision of the AI Organization is about achieving AI at Scale with hundreds of concurrent AI use cases. It depicts a future where AI has been extensively deployed for higher customer value, better customer experience and enhanced operational efficiency. It predicts ubiquitous AI copilots working side by side with human colleagues, with nobody noticing anything special.
To make that vision a reality, a significant amount of digital capabilities will be needed – technical and non-technical alike. Organizational inertia is to be met with continuous communication and through the creation of a strong data culture. The build-up towards the AI Organization needs to be systematic, disciplined and persistent, with clarity on strategic goals and priorities.
All that calls for strategic change management with a hands-on take on digital capabilities, combined with a good understanding of how investments in capabilities lead to productivity gains and financial results. This belongs on the Management Team’s permanent agenda, with guidance expected from the Board of Directors.
First step: Assess constraints that prevent AI at Scale
The path to the AI Organization is long and arduous. The amount of detail is massive. Keeping an eye on the ball at all times, with clarity on objectives and priorities, is difficult. The Theory of Constraints applied to the AI at Scale build-up makes the task more manageable.
A continuous focus on constraints, eliminating the most severe constraint at any given time, provides an effective way to prioritize effort over the entire AI at Scale build-up. This facilitates early wins while optimizing for the long-term whole.
Constraints are bottlenecks or downright roadblocks. In order to eliminate constraints, they first need to be identified and understood. Understanding means clarifying their nature and impact.
In essence, Constraints Assessment creates planning, prioritization and communication tools for strategic change management.
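A lightweight, illustrative way to operationalize the assessment is a constraints register ranked by severity and impact; the areas, names and scores below are invented for illustration.

```python
# Hypothetical constraints register for a Constraints Assessment: rank by
# severity times impact, eliminate the top constraint, then reassess.
# Areas, names and scores are invented for illustration.
constraints = [
    {"name": "No self-service data platform", "area": "Plug-and-Play",
     "severity": 5, "impact": 4},
    {"name": "Key processes not digitalized", "area": "Digitalization",
     "severity": 4, "impact": 5},
    {"name": "Manual AI model deployment", "area": "XOps",
     "severity": 3, "impact": 3},
]


def prioritize(register: list[dict]) -> list[dict]:
    """Most severe constraint first: the one to eliminate next."""
    return sorted(register, key=lambda c: c["severity"] * c["impact"],
                  reverse=True)


for rank, c in enumerate(prioritize(constraints), start=1):
    print(f"{rank}. [{c['area']}] {c['name']} "
          f"(score {c['severity'] * c['impact']})")
```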
Constraints Assessment covers eight areas. Together, they provide a holistic landscape view of the bottlenecks and roadblocks that prevent achieving AI at Scale. The assessment areas are as follows:
Constraints Assessment starts with the question: “What prevents us from achieving AI at Scale?“ It is the first step in securing future competitiveness in an AI-defined business environment.