Whitepaper: AI, Modern cloud technology, Big data management, Industry case studies & Integration architecture design by Sourajit Ghosh (SG)
Sourajit Ghosh (SG)
Chief Expert, MBA, AI strategist & data scientist, SAP AI Blackbelt, CX & CRM advisor & thought leader, Gold Medalist Engineering, Enterprise & Business architect, Management consultant, Advisor startups, IIT-IIM alumni
Architecture Design
This whitepaper explains the architecture design above. Its goal is to give your organization a template for a north-star architecture when it comes to:
1. Leveraging the new paradigms of modern cloud technology
2. Ensuring your architecture factors in the innovations of big data management
3. Designing the landscape to integrate with various applications and systems in a complex, heterogeneous & hybrid enterprise architecture landscape
Business value
Why move toward this modern architecture design? In my experience, customers usually derive these key business values from such a transformation:
1. Drive innovation and activate new business models
2. Build a sustainable and scalable technology landscape
3. Deliver delightful experiences for your customers, suppliers, marketplace & employees
4. Unlock the potential of your IT landscape to drive strategic transformation that supports business needs
5. Achieve faster returns on IT investments aligned with company objectives
6. Differentiate yourself from the competition
7. Attract the best talent to your organization – from software developers to architects and technology leaders
Data strategy
Amid this phenomenal explosion of data and the proliferation of state-of-the-art modern cloud technologies, it is imperative for any organization to start designing an architecture that allows it to accelerate revenue growth and safeguard future profitability, given forthcoming macroeconomic changes, transforming customer expectations, and a highly competitive landscape.
You can well imagine the phenomenal momentum (mass * velocity) at which data is growing and the massive business transformation that will result.
This provides business leaders a significant opportunity for designing their company strategy around big data in this new zettabyte era. Data strategy is no longer a standalone technology topic for the IT organization. How a company uses its enterprise data can bring in significant business value.
Organizations that can harness this explosive data growth and make it operational will create significant business differentiators with respect to their competition. Think about scenarios like:
- Improving employee engagement and retention and helping them achieve their goals by recommending relevant learning experiences based on their existing talent, your company’s strategy, and their emotional experience at the workplace
- Reducing the bullwhip effect in your supply chain and increasing your inventory turns by getting real-time, data-driven visibility of your entire demand and supply chain with predictive insights
- Leveraging big data to drive the entire lifecycle of order management: from generating interest to driving purchase behavior, from order processing and fulfillment to completing downstream processes like logistics, finance, and service
You will also need to harness the powerful synergy of your operational and experience data. The coming together of the operational and experience economy opens amazing growth potential for all organizations.
Enterprise data strategy model driven by business outcomes
Consider this framework below. You need to blend in these three critical themes to design your data strategy:
Business value drivers
Your key business value drivers will have to influence your enterprise data strategy. Business leaders must ask themselves: How can we leverage data to fulfill our business plan? While designing your business value drivers, think about these key themes:
- How can you increase both customer lifetime value and operational efficiency?
- How can you blend in the emotional insights and experience data and use that information to drive operational excellence?
- How can you design your business strategy using data to drive business outcomes?
Big data management
Many organizations have been reactive in this space over the last few years. For example, you might now be living in an enterprise landscape where various technologies support some of these key data-management focus areas, but you are executing them in an ad-hoc, disconnected, and non-strategic manner:
- Data ingestion, replication, and extract-transform-load (ETL)
- Data federation
- Data cataloging
- Master data governance and data quality
- Data pipelining and orchestration
- Distributed big data processing
- Big data databases and data storage
- Cloud platform-as-a-service
- Machine learning and data science
- Analytics
As we progress rapidly in this zettabyte era, it’s critical to build a data intelligence strategy to start shaping a coherent big-data management platform that is sufficiently scalable, flexible, and powerful to take on this new world of big data. Here are some examples of what that would mean:
- Business application transformation: Streamline innovation initiatives around business applications to support enterprise transformation programs.
- IoT ingestion, orchestration, and robotic process automation: Transform IoT event streams into enterprise-ready data, derive actionable insights, and then automate the process by leveraging intelligent robotic process automation.
- Connected data warehousing and predictive analytics: Experience the power of analytics when it is consumed at the moment of experience. Build a multifaceted data warehouse across diverse and distributed data assets and connect it with your applications through live connections.
Business process integration
We live in a world where you now have access to very powerful technology. That said, this sometimes results in individual lines of business (for example, supply chain, marketing, sales) or individual business units deciding on their data strategy and technology enablers in silos. On the surface, this approach may seem nimble, agile, and fast. However, you will soon realize that it does not scale. By operating in silos, you are not just sacrificing operating margin by losing efficiencies of scale; more importantly, you are not harnessing the power of one common enterprise-wide big data platform to power your business processes.
Business process integration is thus not just about joining a process from two applications with an API flow: it’s no longer only an IT topic. Nor is it throwing data into one giant data lake, which makes that data virtually impossible to act on in real time at the point of experience. Business process integration is a business-driven strategy whose synergies can deliver outsized gains for your organization.
Enterprise data ecosystem
As an organization, you have to look holistically at the big picture when it comes to who is most critical for your company. Too often, organizations do not factor the big picture into their enterprise architecture design.
For example, take the concept of “Customer 360”. Many bring only CRM and website data into that design – ignoring ERP, financials, and logistics data. And many simply create dashboards over such a customer 360 without making that view actionable, intelligent, and a dynamically reactive, personalized live system.
Extend that same example to Supplier 360, Employee 360, Competition 360, Asset 360 etc.
And last but not least, design landscapes to factor in industry accelerators, prepackaged industry data, and integration best practices to accelerate your implementation.
For example, consider these architecture designs below for reference.
Business to everyone: Enterprise Architecture Design
Utilities industry: Enterprise architecture design (service, asset & customer experience platform)
Enterprise architecture design: Telecom
Enterprise architecture design: High Tech
Data sources, systems & types
We live in an extremely heterogeneous world of data sources, systems, & types. Consider the landscape of a typical retail organization.
Retail: data sources, systems, & types
Key data sources, systems & types
The integration platform (which we will cover later in this whitepaper) will thus have to support all such kinds of data sources, systems & types:
- Relational databases & NoSQL (non-tabular) databases
- OLTP (online transaction processing) & OLAP (online analytical processing) systems
- ODS (operational data store), DWH (data warehouse), data marts
- In-memory databases
- Excel & CSV files
- Legacy apps
- Enterprise cloud storage, including hyperscalers
- On-premise applications
- Streaming (for example, live IoT data feeds from machines)
- Devices (including mobile & machine data)
- Hadoop, HDFS (Hadoop Distributed File System) – relevant for big data
- Social media and external data systems (for example, economic data, analyst data, etc.)
- Graph databases
Integration platform as a service
While designing your integration platform as a service (iPaaS) landscape, it’s imperative that you consider a holistic strategy. Here are some of the key components to consider:
- Identity & access management, consent & privacy, security
- API design & management
- B2B & EDI management
- Integration catalog & prepackaged templates
- Data orchestration, monitoring, governance
- Data pipeline, ingestion, processing, enrichment
- Data quality & cloud automation
Integration case-study
Consider the following business-to-business value map.
Here are some iPaaS examples you will have to consider while making the above value map actionable from a technology-integration standpoint:
Identity & access management, consent & privacy, security
For the entire lifecycle of the customer/supplier/employee/business partner, you have to provide a robust platform that not only makes registration, onboarding, and self-service easy, but also factors in all legal & IT regulations connected to their consent & privacy – and does so in a highly secure manner in the cloud.
API design & management
You will be working with various systems in this scenario – for example, a B2B portal, CRM, ERP, logistics, websites, mobile apps, and marketing. A centralized API management hub is essential to manage existing APIs, modify them, create new ones, and do so at scale with intelligence & security.
B2B & EDI management
You will need to support various B2B EDI-specific formats, like X12, EDIFACT, and RosettaNet, and transport protocols, including AS2, FTPS, SFTP, and HTTP/S. Beyond this, you will have to support the onboarding, maintenance, and self-service needs of B2B merchants. Monitoring transactions and maintaining the B2B partner directory are critical too.
Integration catalog & prepackaged templates
First, you wouldn’t want to start from ground zero. Leveraging prepackaged content – out-of-the-box integrations between applications, best-practice templates, integration accelerators – is the need of the hour to accelerate time to value. Your iPaaS will also need easy-to-use, flexible catalog management for your APIs. For example, in the B2B value map, if you have an out-of-the-box API package that delivers an “available to promise” process, then not just leveraging that packaged integration but also maintaining it as a catalog entry along with other such business processes is key to success here.
Data orchestration, monitoring, governance
Consider one item in the value map above: “omnichannel pricing visibility”. You will have pricing data coming in from various sources, collated in one system, with intelligence built in another, and leveraged across all channels and systems, both in batch and in real time. Various kinds of synchronous and asynchronous processes will have to be supported. Various users will log in to various systems and can influence changes to the data. And there will be changes to the APIs as well as the data systems. Hence, to scale securely and accurately, a powerful data orchestration, monitoring & governance capability in your iPaaS will help deliver sustainable business value for your company.
Data pipeline, ingestion, processing, enrichment
Given that you will deal with big data, live data streaming, and a variety of data processing needing data enrichment, your iPaaS will have to deliver these capabilities too. Consider a customer logging into the B2B portal above to see their existing service installations. This will include their service contracts, maintenance plans, and service orders, coupled with IoT insights and predictive alerts – your iPaaS will have to support such integration needs.
Data quality & cloud automation
Given your scale of operations – a multitude of applications and integration formats, with various maintenance cycles (spanning cloud and on-premise systems) – you will need an iPaaS that protects and manages integration data quality, and does so at scale with cloud-automation methods, tools, and best practices.
Master data platform
Key components of a master data platform
Master data hub
In a large enterprise, similar master data components exist in various landscapes. Take the example of a cost center and the employee ID of an employee – that master data might exist in both your HR system and your payroll system. Similarly, the customer master will exist in ERP, CRM, e-commerce systems, etc.
Thus, you will need a centralized master data hub to ensure the key elements of the master data table attributes are properly mapped, validated and then orchestrated, distributed and integrated in the relevant systems in the landscape.
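As a toy illustration of the hub's mapping-and-matching step, the sketch below normalizes records from two systems onto a canonical schema and fuzzy-matches them. The field names (`KUNNR_NAME`, `accountName`) and the 0.85 similarity threshold are hypothetical; a real hub would add survivorship rules, validation, and distribution back to the connected systems.

```python
from difflib import SequenceMatcher

def normalize(record, field_map):
    """Map system-specific field names onto a canonical schema."""
    return {canon: record.get(src, "").strip().lower()
            for canon, src in field_map.items()}

def is_duplicate(a, b, threshold=0.85):
    """Fuzzy-match two normalized records on name; require exact email."""
    name_sim = SequenceMatcher(None, a["name"], b["name"]).ratio()
    return name_sim >= threshold and a["email"] == b["email"]

# Hypothetical records: the same customer as seen by two systems
# whose field names differ.
erp_rec = {"KUNNR_NAME": "ACME Corp.", "SMTP_ADDR": "buy@acme.com"}
crm_rec = {"accountName": "Acme Corp", "email": "buy@acme.com"}

erp = normalize(erp_rec, {"name": "KUNNR_NAME", "email": "SMTP_ADDR"})
crm = normalize(crm_rec, {"name": "accountName", "email": "email"})
print(is_duplicate(erp, crm))
```

The same normalize-then-match pattern extends to cost centers, suppliers, or any other master data object the hub governs.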
Master data governance
Data is dynamic. And with many users, systems, and processes working on similar data, data quality suffers. You may be struggling with bad data quality, a lack of data-quality management processes, many duplicates, or incorrect or incomplete data.
With a powerful centralized strategy (and the relevant technology solution), you drive toward a comprehensive master data governance approach. You then have a well-managed strategy and framework for master data management, and your data strategy is aligned with your business strategy and corporate objectives. You will organize and streamline data ownership & stewardship so that business rules are invoked during data lifecycle processes. Eventually, master data governance will become a strategic advantage, allowing your company to be truly data-driven.
Data science platform
Key components of a data science platform
Machine learning & Artificial intelligence coding platform
This is the ability of a platform to leverage modern programming languages like R and Python, along with prepackaged libraries and templates, to execute machine learning & artificial intelligence projects – for example:
- Linear Regression and Preprocessing Concepts
- Logistic and Softmax Regression
- Decision Trees and Ensemble Learning
- Random Forests, Support Vector Machines, and K-Means Clustering
- Neural Networks
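As a minimal illustration of the first item, here is ordinary least squares for a single feature implemented from scratch; the ad-spend numbers are invented, and in practice you would reach for a library such as scikit-learn rather than the closed form below.

```python
def fit_linear(xs, ys):
    """Ordinary least squares for y = a*x + b (closed form, one feature)."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    var = sum((x - mx) ** 2 for x in xs)
    a = cov / var
    return a, my - a * mx

# Hypothetical data: monthly ad spend (k$) vs. units sold.
spend = [1.0, 2.0, 3.0, 4.0]
units = [12.0, 19.0, 31.0, 38.0]
slope, intercept = fit_linear(spend, units)
print(slope, intercept)
```

The other techniques in the list (trees, SVMs, k-means, neural networks) follow the same platform pattern: prepackaged libraries plus your enterprise data.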
Robotic & business process automation
To drive robotic and business process automation, you will have to consider the following key components:
- Business process workflows
- Low-code no-code platforms
- Automation development
- Pre-built automation components and library
- Business process intelligence & governance
- Analytics
- Integration with the data science platform and enterprise landscape
Statistics Insights & Predictions
This is where you start leveraging your ML-AI coding platform and integrate it with your robotic & business process automation to drive business actions.
Consider a use-case of predictive maintenance of machines. The machine data (and the data of the customer who owns the machine), along with the data of the technician who will service it, exist in various systems:
1. ERP
2. Service asset master
3. Edge computing & sensor IoT
4. Order management
5. Field service management
6. Customer service management
7. Logistics management
Your goal would be to get the relevant asset-customer-technician data from the above systems, use your data science platform to predict when the machine is due for maintenance by predicting potential failures, and then dispatch a technician with the right tools to preventively repair the machine onsite.
For this, you would not just have to ensure your data platform delivers the insight – you would have to take the action in the relevant systems. Hence the need for your iPaaS landscape.
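A drastically simplified sketch of that flow is below. The system shapes, the sensor readings, and the vibration-limit rule are all assumptions for illustration; a real implementation would pull this data through the iPaaS and use a trained failure model rather than a fixed threshold.

```python
# Hypothetical, simplified predictive-maintenance flow: join asset,
# sensor, and technician data, flag assets due for maintenance, and
# create a dispatch work order.

SENSOR_READINGS = {                 # from the IoT/edge system (assumed shape)
    "pump-01": [71, 78, 85, 92],    # vibration trend, mm/s
    "pump-02": [40, 41, 39, 42],
}
ASSETS = {                          # from the service asset master
    "pump-01": {"customer": "C100", "site": "Plant A"},
    "pump-02": {"customer": "C200", "site": "Plant B"},
}
TECHNICIANS = [                     # from field service management
    {"id": "T7", "site": "Plant A", "available": True},
]

def needs_maintenance(readings, limit=90):
    """Naive stand-in for a trained model: rising trend crossing a limit."""
    return readings[-1] >= limit and readings[-1] > readings[0]

def dispatch(asset_id):
    """Find an available technician at the asset's site."""
    site = ASSETS[asset_id]["site"]
    for tech in TECHNICIANS:
        if tech["available"] and tech["site"] == site:
            return {"asset": asset_id, "technician": tech["id"]}
    return None

work_orders = [dispatch(a) for a, r in SENSOR_READINGS.items()
               if needs_maintenance(r)]
print(work_orders)
```

In the real landscape, each dictionary above would be an API call brokered by the integration platform, and the resulting work order would be written back into field service management.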
Case study: Data science in Production Planning optimization
Consider a simplistic, classical scenario of production planning optimization for a single plant. The plant’s forecasting department has produced a quarterly forecast for the four quarters of the upcoming planning year. To meet the variable demand, the company can engage in a combination of overtime work and/or subcontracting external workers. Each option has a different cost. The plant also has to maintain an inventory of 250 units.
The goal is to come up with a model that factors in the above constraints and then creates a production plan such that total production cost is minimized.
Following is the data relevant to this case-study:
Now, this problem can be modeled and solved using Excel Solver and linear programming. Example output is below.
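To make the model concrete, here is a sketch with assumed numbers (the case study's actual data table is not reproduced here). It allocates each quarter's demand to the cheapest production source first while holding the 250-unit safety stock; a production-grade version would hand the same constraints to a proper LP solver (e.g., `scipy.optimize.linprog`) instead of this greedy pass.

```python
# Illustrative aggregate-planning sketch. All figures below are
# assumptions for demonstration, not the case study's actual data.

DEMAND = [1500, 1200, 1800, 1400]          # units per quarter (assumed)
CAPACITY = {"regular": 1300, "overtime": 300, "subcontract": 500}
COST = {"regular": 20.0, "overtime": 28.0, "subcontract": 35.0}  # $/unit
SAFETY_STOCK = 250                          # required inventory level

def plan(demand, start_inventory=SAFETY_STOCK):
    """Greedy cheapest-source-first allocation, quarter by quarter."""
    inventory, total_cost, schedule = start_inventory, 0.0, []
    for d in demand:
        need = d + SAFETY_STOCK - inventory   # produce to demand + buffer
        made = {}
        for source in sorted(COST, key=COST.get):  # cheapest first
            qty = min(max(need, 0), CAPACITY[source])
            made[source], need = qty, need - qty
            total_cost += qty * COST[source]
        inventory = inventory + sum(made.values()) - d
        schedule.append(made)
    return schedule, total_cost, inventory

schedule, cost, ending = plan(DEMAND)
print(ending, cost)
```

The greedy pass works here only because the toy model has no holding costs or build-ahead trade-offs; the moment those enter, a real LP formulation is needed, which is exactly the point of the challenges below.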
However, consider now these huge challenges in this production planning optimization in real life:
1.?????What if you want to model across not just one plant – but many plants?
2.?????This is a disconnected system – what if cost changes or availability changes? What if you want to connect this system to your various inventory, human resource, manufacturing systems across all your plants?
3.?????What if you want to bring in other factors which will influence both your supply and demand planning? Think weather, macroeconomics, freight insights, labor market, stock movement, sales pricing, competitive insights, etc?
4.?????And what if you want to connect this model real-time – connected to all your systems and plants – and do this weekly (instead of quarterly)
You are thus moving toward a centralized data science platform connected with your enterprise. What will this big data architecture look like? See the reference design below.
Supply Chain & Manufacturing – digital transformation: Big data enterprise architecture design
In the context of data science, focus on the following boxes under Supply Chain Digital Transformation in the overall strategic architecture design:
ML-AI & Data Science
- Centralized ML-AI & data science + IoT platform
- Vendor-SCM-Customer data platform
Cloud Platform
- Agile, auto-scale, secure, multitenant
- Extensible, business-user friendly
- Microservices
Innovative Tech
- Industrial IoT platform
- Cognitive computing
- Augmented & mixed reality
- Digital twin
- Robotic automation
Integration
- Master data integration
- Process integration
- API hub & management
- Big data integration
You would need a combination of the above working together to truly harness the power of your data science platform and then scale it to the whole enterprise.
Cloud extensibility & side-by-side extension
There is no doubt that you will need a powerful extensibility platform and a coherent IT strategy to manage your extensions. There is always a way to do in-app extensions – writing code intrinsic to the application. For example, adding an extension field to the “material” object with “Seasonal – yes/no” options.
However, in this whitepaper I am exploring more complex extensibility scenarios – extensions built outside the core, on a cloud platform, to keep the core of the parent application clean.
Here are some of the key components you may consider while designing your cloud extensibility platform:
Microservices
We are seeing a shift from traditional monolithic to microservices-driven architectures for applications, along with leveraging microservices for extensibility. This option must be evaluated on a case-by-case basis. The idea is to split your extensible application/program into a set of smaller, interconnected services instead of building a single monolithic application. Each microservice is a small application with its own architecture and business logic. Some microservices expose a REST, RPC, or message-based API, and most services consume APIs provided by other services. Other microservices might implement a web UI.
The value of doing this is that it addresses the problem of complexity by decomposing the application into a set of manageable services. This makes custom code much faster to develop and much easier to manage, maintain, and comprehend. It enables each service to be developed independently by a team focused on that service.
Because developers in a microservices architecture are free to choose whatever technologies make sense for their service, extension development becomes developer-friendly.
Most importantly, microservice architecture enables each service to be scaled and deployed independently, so rapid innovations by continuous deployment of extensions is possible – and while keeping the core clean.
Data & logic modeling
When it comes to creating extensions, you would need to work with both existing data and new logic (by extending existing logic or creating logic from scratch). You will have to take architecture decisions here: sometimes you will need virtual access to the data, and sometimes you will need persistence of the data in the extension application itself.
I will explain this further with a case-study later in this whitepaper.
Intelligent workflows
Workflows have existed in enterprise applications for as long as the applications themselves: when an event happens, the software executes a set of subsequent steps as a workflow.
What is new in this space are the expectations:
- Workflows now need to span multiple applications
- They must be intelligent – not just via manual coding but also by bringing in machine learning
- They are driven by multiple events
- They can work as embedded applications, as extensions, and across lines of business
- They can have multiple layers of business logic underneath
- The data on which workflows operate can come from various sources – some live streaming and some persistent, some structured and some unstructured
Intelligent workflows thus take on a powerful role in your cloud extension platform.
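To ground those expectations, here is a minimal event-driven workflow sketch in Python. The event names, the credit-limit rule, and the event-chaining convention are all illustrative assumptions; a real workflow engine would add persistence, ML-based routing, and error handling.

```python
# Minimal multi-event workflow dispatcher: steps subscribe to events,
# and a step may chain a follow-up event, letting a flow span
# (stand-ins for) multiple applications.

class Workflow:
    def __init__(self):
        self.handlers = {}      # event name -> list of steps
        self.log = []

    def on(self, event):
        def register(fn):
            self.handlers.setdefault(event, []).append(fn)
            return fn
        return register

    def emit(self, event, payload):
        for step in self.handlers.get(event, []):
            follow_up = step(payload)
            self.log.append((event, step.__name__))
            if follow_up:                       # steps may chain events
                self.emit(*follow_up)

wf = Workflow()

@wf.on("order.created")
def check_credit(order):
    # Would call the ERP in real life; hypothetical approval limit.
    if order["amount"] <= 10_000:
        return ("order.approved", order)

@wf.on("order.approved")
def schedule_fulfillment(order):
    order["status"] = "scheduled"

order = {"id": 42, "amount": 5_000}
wf.emit("order.created", order)
print(order["status"])
```

Each handler here stands in for a call into a different system, which is exactly the cross-application span described above.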
Low-code No-code platform
You now have some powerful low-code no-code platforms on which you can build new apps and/or extensions. A good low-code no-code platform will offer you:
- A business-user-friendly experience – people with little to no programming background can easily use the platform’s interface to create apps and/or extensions. This includes visualization of the logic.
- A large set of prebuilt accelerators, packages, and integrations to other microservices and applications
- Enterprise readiness – scalable, and working coherently and securely with your enterprise landscape
Case study on complex extension and app development
Consider this use-case: a field technician has to execute a complex work order onsite to repair a machine installed at a customer’s factory. The work order falls under a special customer contract (which resides in the CRM system). The pricing of the work order, along with special pricing for the key account connected to that factory, resides in a pricing system (probably the ERP). The relevant repair instructions for that machine sit in some asset management or engineering system – and will have to be made available to the field technician at the time of repair.
Data warehouse & business analytics; Cloud Platform infrastructure
3 Vs of Data: Variety, Volume & Velocity
To deliver a holistic approach towards data management, consider the 3 aspects of Big Data:
- Data variety
  - Various structured and unstructured data integrated via different integration methods from a plethora of data systems
- Data volume
  - Large organizations are now sometimes looking at petabytes of data. The new-age cloud platform will have to factor in these huge volumes
- Data velocity
  - Think about the production planning optimization case study earlier in this whitepaper. If you must run that optimization not quarterly but daily, and design your plant production plan accordingly, real-time data velocity must be considered. Or imagine a consumer landing on an e-commerce site: acting on their clickstream pattern and past purchases likewise demands high data velocity
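A small sketch of the velocity point: an incrementally maintained sliding window over a clickstream, so each event can be scored at the moment it arrives rather than in a later batch. The 60-second span and page names are illustrative.

```python
from collections import deque

class SlidingWindow:
    """Keep only events from the last `span` seconds, evicting as we go."""
    def __init__(self, span=60):
        self.span, self.events = span, deque()

    def add(self, ts, page):
        self.events.append((ts, page))
        while self.events and self.events[0][0] < ts - self.span:
            self.events.popleft()            # drop events outside the window

    def views(self, page):
        return sum(1 for _, p in self.events if p == page)

window = SlidingWindow(span=60)
clicks = [(0, "home"), (10, "product/123"), (20, "product/123"),
          (95, "product/123")]   # the last click evicts everything before t=35
for ts, page in clicks:
    window.add(ts, page)

print(window.views("product/123"))
```

The same pattern – bounded state updated per event – is what streaming engines apply at scale to keep velocity manageable.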
Key components in holistic cloud analytics & data platform
Here are some of the critical components which you will need to consider as part of your overall technology architecture landscape to achieve a holistic cloud analytics & data platform:
- Cloud data lake infrastructure
- Data storage
- Data ingestion
- Data governance & lineage
- Data modeling
- Data discovery
- Data process catalog
- Data orchestration
- Data serving & consumption
- Business intelligence
- Reporting & dashboards
- Insights (KPIs, heuristics, predictive)
- Real-time alerts
How does your cloud platform and analytics work with your overall enterprise architecture? Consider this proposed architecture of Data 360 below.
Design architecture: Data 360
Personal Disclaimer
The content expressed in this publication is purely the personal opinion of the author and does not necessarily reflect the official policy or position of the organization the author works for. The information presented in this whitepaper is for general informational purposes only and should not be considered professional advice or any specific implementation or actionable recommendation. Do not rely on this whitepaper for any implementation, software purchase, or software design without doing your own due diligence and evaluation. The case studies presented are purely hypothetical; their purpose is creative ideation, to generate excitement and interest in this topic for the reader's future self-exploration & research. This publication was also crafted with the help of generative AI technology from various LLMs. While the core ideas and content are the product of the author’s own work, sections relating to content creation, editing choices, elaborations, and summarizations are influenced by generative AI and thus may include content from sources not declared in the references, and may contain inaccuracies or biases inherent to generative AI. Note also that the domain of technology & AI is rapidly changing, so the relevance of this whitepaper may change with time. The information in this article is provided in good faith. The author makes no warranty regarding the accuracy or reliability of the content. Any actions taken based on this information are at your own risk. The author does not endorse any products, services, or companies mentioned and is not responsible for any linked third-party content. By reading this, you accept this disclaimer in full.