Is Artificial Intelligence Business Ready? - Part 2
Dr. Stefan Krusche
Managing Director at Dr. Krusche & Partner: Hybrid AI for your Decision Superiority.
In a previous article, we focused on artificial intelligence for human decision making. We asked how to turn AI into a mature business tool and identified three main needs:
- Agile production lines for AI models
- Continuous integration of future innovations
- Structured end-to-end AI business process
In this article, we look into AI production lines, argue in more detail why they are vital for AI in business, and introduce a solution that shows how AI model building & usage can be accelerated and simplified.
AI Models are Business Assets
Decision making affects future and unknown situations. AI provides a likely picture of what these unknown regions will look like.
This is achieved by learning from the known and transferring the results to the near future and unknown situations. Algorithms instruct machines how to learn and models define how to shed light on the unknown.
For data-driven decision making, AI models are vital and represent high-impact business assets. The way we have dealt with these assets in an ever-changing world does not, so far, reflect their importance.
Hand-Crafted AI Models
AI models are manufactured in laboratories, hand-crafted by data scientists. It is often a long road from a trained model to actual forecasting and prediction: barriers originating from heterogeneous technologies & missing interoperability increase time to insight unacceptably, sometimes to infinity.
AI models are made to answer a certain question: Is this ATM transaction fraudulent? What is the demand for electric power within the next 24 hours? What does my best buyer look like? What is the best price? The more questions we ask, the more models we need.
And we should not forget that AI models are not made to last. They age and become less relevant, even if we organize their scheduled retraining.
If we look at AI model building & usage, we are reminded of the beginnings of the automobile: a complex technology made for a few and mastered by even fewer.
Under the hood of AI Modeling
Do not shrink back: this section is not written for data scientists. However, without digging a little deeper, it is hard to understand why we are demanding a second digital transformation with a strong focus on our AI instruments:
Learning from the known means learning from large amounts of historical data: the more extensive the data, the more precise the resulting model can be.
Learning from data is achieved by applying a sequence of purpose-built stages, called a data pipeline: each stage transforms data from one state to another, and at the end of the pipeline stands the AI model.
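To make this concrete, here is a minimal hand-coded pipeline in Python. It is a sketch only: scikit-learn serves as a stand-in for the many AI libraries in use, and the dataset and stages are purely illustrative.

```python
# A minimal, hand-crafted data pipeline: each stage transforms
# the data, and the final stage yields the AI model.
from sklearn.datasets import load_iris
from sklearn.pipeline import Pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.decomposition import PCA
from sklearn.linear_model import LogisticRegression

X, y = load_iris(return_X_y=True)

pipeline = Pipeline([
    ("scale", StandardScaler()),      # stage 1: normalize features
    ("reduce", PCA(n_components=2)),  # stage 2: compress features
    ("model", LogisticRegression()),  # final stage: the AI model
])

pipeline.fit(X, y)               # learn from the known ...
print(pipeline.predict(X[:5]))   # ... and shed light on the unknown
```

Every stage here is selected, written, and wired by hand, which is precisely the effort described next.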
As a data scientist, with every demand for a new model, one has to build new pipelines from the ground up, even if many intermediate stages could be reused. And every data scientist does this on their own in every new project.
And this is just the tip of the iceberg: in many cases one has to build the factories where these pipelines run as well. What only insiders generally know is that data scientists spend more time building and maintaining the tools for AI systems than they do building the systems themselves.
A recent survey of 500 companies by Algorithmia found that expensive data science teams spend less than a quarter of their time training and iterating machine-learning models, which is their primary job function.
And what about smaller companies that never have a chance to hire highly skilled data experts?
Without transforming our AI instruments, the majority of enterprises will not gain any business value from their digital transformation efforts.
But how can we overcome the era of hand-crafted, experts-only modeling and transform AI data pipelines into agile, easy-to-use assembly lines for the rapid production of thousands of models?
Shift Focus on Standardization & Unification
In order to free artificial intelligence from the laboratories and let it evolve into a business tool for the masses, there is an urgent need to shift our focus from algorithms and scientific tasks to standardization & unification:
The standardization & unification of data pipelines, their stages, and their interactions & interfaces will enable the transformation into agile AI assembly lines.
Code-Free & Plug-and-Play
What we need are reusable & configurable AI components that can be orchestrated into comprehensive pipelines without
- having to write a single line of program code,
- maintaining hundreds of AI notebooks, and
- knowing many different underlying AI libraries.
Agile AI in business demands a plug-and-play mechanism with pre-built AI plugins that
- abstract from complex & diverse technologies,
- are based on standardized & unified APIs,
- can be seamlessly combined with other plugins, and
- support model building & usage alike (a minimal interface sketch follows this list).
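What could such a standardized plugin contract look like? The minimal sketch below is a Python illustration under our own assumptions; the names (AIPlugin, configure, apply) and the chaining helper are invented for this article, not an existing API.

```python
from abc import ABC, abstractmethod
from typing import Any, Dict, Iterable

class AIPlugin(ABC):
    """Hypothetical unified contract that every AI plugin adheres to."""

    @abstractmethod
    def configure(self, params: Dict[str, Any]) -> None:
        """Accept a declarative configuration instead of code."""

    @abstractmethod
    def apply(self, data: Any) -> Any:
        """Transform incoming data; a model-building plugin returns
        a model, a model-usage plugin returns predictions."""

def run_pipeline(plugins: Iterable[AIPlugin], data: Any) -> Any:
    # Because all plugins share one interface, any sequence of
    # them can be chained without glue code.
    for plugin in plugins:
        data = plugin.apply(data)
    return data
```

The point is not the specific signatures but the principle: one shared interface makes plugins freely combinable by configuration alone.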
Declarative Artificial Intelligence
Equipped with a Lego-like construction kit for artificial intelligence, we are able to declare, without coding, how each plugin is configured, which plugins are best for which stages, and finally which plugins have to be connected to form an AI pipeline.
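A toy version of this idea in Python, with invented plugin names, registry, and configuration values: the pipeline is pure declaration (data, not code), and a small engine turns it into something executable.

```python
# Declarative AI: the pipeline is data, not code.
# Plugin names and configs below are illustrative assumptions.
PIPELINE_SPEC = {
    "name": "demo-pipeline",
    "stages": [
        {"plugin": "normalize", "config": {"factor": 10.0}},
        {"plugin": "threshold", "config": {"cutoff": 0.5}},
    ],
}

# Pre-built plugins: configurable stages behind one interface.
PLUGIN_REGISTRY = {
    "normalize": lambda cfg: (lambda xs: [x / cfg["factor"] for x in xs]),
    "threshold": lambda cfg: (lambda xs: [x > cfg["cutoff"] for x in xs]),
}

def build_pipeline(spec):
    """A tiny pipeline engine: turn the declaration into an
    executable chain of configured plugin stages."""
    stages = [PLUGIN_REGISTRY[s["plugin"]](s["config"])
              for s in spec["stages"]]
    def run(data):
        for stage in stages:
            data = stage(data)
        return data
    return run

pipeline = build_pipeline(PIPELINE_SPEC)
print(pipeline([3.0, 7.0]))  # [False, True], no pipeline code written
```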
Declarative approaches have been known in data processing for a while:
In the early days of business rule processing, experts had to implement software applications for every ruleset. As a result, the number of requests for new rulesets quickly exceeded production capabilities.
Now we declare business rules without coding and leverage pre-built rule engines to extract data that match these rules.
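A toy version of that pattern in Python (the rules, operators, and records are invented): the rule engine stays generic, and only the declared rules change.

```python
# Business rules as declarations: the engine is generic, the rules
# are data. Records and rules are invented examples.
RULES = [
    {"field": "amount",  "op": "gt", "value": 10000},
    {"field": "country", "op": "eq", "value": "unknown"},
]

OPS = {"gt": lambda a, b: a > b, "eq": lambda a, b: a == b}

def matches(record, rules):
    """A record matches if any declared rule fires."""
    return any(OPS[r["op"]](record[r["field"]], r["value"]) for r in rules)

transactions = [
    {"amount": 50,    "country": "DE"},
    {"amount": 25000, "country": "DE"},
]
print([t for t in transactions if matches(t, RULES)])  # flags the 25000 one
```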
The Data Mining Group defined the Predictive Model Markup Language (PMML) to ease the exchange of declarative data mining & statistical models between different technologies.
And finally, in the areas of Bioinformatics, Medical Imaging, and other data-intensive fields such as Astronomy and High Energy Physics, we observe approaches to define open standards for data analysis pipelines.
What is still missing for artificial intelligence, however, is a common approach that combines Lego-like plugins with declarative building instructions for AI pipelines, and a pipeline engine that generates AI pipelines on demand without writing a single line of code.
Cloud Data Fusion
Ending up with a wish list is simple, even if it aggregates long-standing experience with existing Enterprise AI solutions. We wanted to implement a next-generation Enterprise AI platform covering all of these requirements, but we did not want to start from the ground up and reinvent many wheels again.
We finally ended up with Google’s Cloud Data Fusion. At the heart of Google’s new data integration service is CDAP.
CDAP (formerly the Cask Data Application Platform) is exactly the declarative pipeline technology we were looking for.
Data processing pipelines can be built on top of pre-built plugins, leveraging a declarative approach and supported by an easy-to-use point-and-click interface.
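Under the hood, such a pipeline is itself a declaration. The sketch below approximates its shape as a Python dictionary; it is simplified, the real CDAP format is JSON, and field names and plugin properties may differ in detail.

```python
# Approximate shape of a CDAP / Data Fusion pipeline declaration,
# rendered as a Python dict for readability (the real format is JSON).
# Simplified sketch: names and properties may differ in detail.
pipeline = {
    "name": "customer-ingest",
    "artifact": {"name": "cdap-data-pipeline", "scope": "SYSTEM"},
    "config": {
        "stages": [
            {"name": "Read", "plugin": {
                "name": "File", "type": "batchsource",
                "properties": {"path": "..."}}},
            {"name": "Clean", "plugin": {
                "name": "Wrangler", "type": "transform",
                "properties": {}}},
            {"name": "Write", "plugin": {
                "name": "BigQueryTable", "type": "batchsink",
                "properties": {}}},
        ],
        # Connections wire the configured stages into a pipeline,
        # the same declarative pattern sketched above.
        "connections": [
            {"from": "Read", "to": "Clean"},
            {"from": "Clean", "to": "Write"},
        ],
    },
}
```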
Enterprises can simplify and accelerate their data integration tasks, reducing the time required to less than 20% of what is regularly needed without CDAP.
This is great, and that is why we put Cloud Data Fusion at the heart of our thinking. And no, we are independent and do not work for Google.
Artificial Intelligence with Plugins
Cloud Data Fusion is primarily intended to accelerate data integration, but on top of its declarative pipeline technology we can do more, and we did.
We extended its existing plugin approach to cope with artificial intelligence and built 200+ standardized plugins covering the full spectrum of machine intelligence: from deep learning and machine learning to natural language processing, time series analysis, and more.
Charged with these extra standardized plugins, Google Cloud Data Fusion takes a huge step toward becoming a next-generation Agile Enterprise AI solution for the masses.
Artificial intelligence is freed from the laboratories and now ships with agile assembly lines for rapid AI model production and usage.
What comes next?
Business environments are never at rest, and stakeholders constantly evolve their relationships with enterprises. It is vital for Enterprise AI to keep pace.
The next article focuses on the continuous integration of future AI algorithms & libraries and sketches an approach to how this can be implemented today.