Data, Artificial Intelligence and Cloud Trends for 2020 and Beyond

As we rapidly approach 2020, let us look at some of the key Data, Artificial Intelligence and Cloud trends to watch.

Data platforms today are being re-architected for the cloud native era, cloud has become the core of data strategy to accelerate from idea to experimentation, and Artificial Intelligence has started delivering reasonable value to some of its early adopters.

Below are my top 5 predictions for 2020

Disclaimer: If you are looking for jazzy keywords like quantum, AI explainability, AI bias, blockchain, etc., then you are going to be disappointed :)

Cloud Native Data Applications

Cloud native technologies are rapidly moving beyond stateless applications to support complex data-driven applications. Frameworks like Apache Spark, Kafka, SQL engines and a few databases have already been ported, or are being ported, to cloud native technologies like Docker and Kubernetes.

We will see more and more data applications getting seamlessly ported to cloud native frameworks. The architecture of these technologies has to be re-imagined, rather than doing a lift and shift, for them to work efficiently with Docker and Kubernetes.

Why is it important?

  • Hybrid and multi-cloud ready
  • All parts of the enterprise application converge onto a single infrastructure (data and applications)
  • Better dependency management of packages and seamless migration from development to production

You can see my short video on dependency management, and how cloud native technologies help solve it, in the References section below.
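
To make the idea concrete, here is a minimal sketch of what a cloud native data application can look like: a PySpark session pointed at a Kubernetes cluster, so executors run as pods alongside the rest of the enterprise workloads. The API server URL, container image and namespace below are placeholder assumptions; the Spark-on-Kubernetes configuration keys themselves are standard.

```python
from pyspark.sql import SparkSession

# Minimal sketch: a Spark data application scheduled on Kubernetes.
# The master URL, container image and namespace are hypothetical placeholders.
spark = (
    SparkSession.builder
    .appName("cloud-native-data-app")
    .master("k8s://https://my-k8s-api-server:6443")                  # Kubernetes API server (assumed)
    .config("spark.kubernetes.container.image", "my-registry/spark-py:3.0.1")  # assumed image
    .config("spark.kubernetes.namespace", "data-apps")               # assumed namespace
    .config("spark.executor.instances", "2")                         # executors run as pods
    .getOrCreate()
)

# Executors are launched as Kubernetes pods; the job itself is plain Spark code
df = spark.range(1_000_000)
print(df.selectExpr("sum(id)").collect())

spark.stop()
```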

Edge Analytics/Edge AI

With billions of IoT devices around the world, many playing a part in core business processes, enterprises have been capturing tons and tons of information. In 2020 we can see this dark data, currently lying untapped, being analyzed and used to fuel new data products.

Algorithms/models built from this huge pile of IoT data will be moved closer to the edge devices to reduce the latency of the decision process.

If you are looking for details around edge analytics, below is a short video on it.

You can also look at my other related video (reducing model size for the edge) and an insurance IoT use case in the References section towards the end of the article.
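
As a rough illustration of shrinking a model for edge deployment, the sketch below applies TensorFlow Lite post-training quantization to a trained Keras model. The model file name is a hypothetical placeholder; the converter API is standard TensorFlow.

```python
import tensorflow as tf

# Load a trained Keras model (the path is a hypothetical placeholder)
model = tf.keras.models.load_model("sensor_failure_model.h5")

# Post-training quantization: reduces model size and speeds up inference
# on low power / low compute edge devices
converter = tf.lite.TFLiteConverter.from_keras_model(model)
converter.optimizations = [tf.lite.Optimize.DEFAULT]
tflite_model = converter.convert()

# The resulting flat buffer can be shipped to the edge device and run
# with the TensorFlow Lite interpreter
with open("sensor_failure_model.tflite", "wb") as f:
    f.write(tflite_model)
```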

Hybrid Cloud

Hybrid cloud adoption was slower than expected in 2019, in spite of rapid investments by major cloud vendors and product organizations. In 2020 we will see adoption of hybrid cloud accelerate, along with products that enable seamless connectivity, centralized monitoring, security and load balancing, among others.

Many enterprises will invest in hybrid cloud to leverage the best of private and public cloud, helping organizations keep infrastructure costs under control while at the same time accelerating and scaling innovation.


Prescriptive Analytics (From What to Why)

While predictive analytics will continue to be at the core of many data-driven decisions, in 2020 we can see mature enterprises prototyping and deploying prescriptive analytics in their decision making.

Prescriptive analytics goes beyond predicting the future. It provides the best course of action among various options to enhance the decision outcome.

Prescriptive analytics explains why an outcome is likely to happen and recommends what to do about it; the real-world results are then fed back into the model to learn and re-prescribe, improving prediction accuracy and producing better decisions over time.
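
A minimal sketch of the difference: a predictive model estimates the outcome, and the prescriptive step searches over candidate actions and recommends the one with the best predicted outcome. The features, actions and synthetic data below are purely illustrative assumptions.

```python
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor

rng = np.random.default_rng(42)

# Synthetic training data (purely illustrative):
# features = [tenure, usage, discount_offered], outcome = revenue retained
X = rng.random((500, 3))
y = 10 * X[:, 1] + 5 * X[:, 2] - 8 * X[:, 2] ** 2 + rng.normal(0, 0.1, 500)

# Predictive step: learn how features and actions relate to the outcome
model = GradientBoostingRegressor().fit(X, y)

# Prescriptive step: for one customer, evaluate candidate actions
# (discount levels) and recommend the one with the best predicted outcome
customer = [0.6, 0.4]
candidate_discounts = [0.0, 0.1, 0.2, 0.3, 0.4]
scenarios = np.array([customer + [d] for d in candidate_discounts])
predicted = model.predict(scenarios)

best = candidate_discounts[int(np.argmax(predicted))]
print(f"Recommended discount: {best}, expected outcome: {predicted.max():.2f}")
```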

Augmented Analytics for Data and Model Management

We will see increased adoption of statistical and machine learning techniques for data and model management. Some of the areas where augmented analytics will play a major role are data quality, data security, smart missing value imputation, master data management, metadata management and model monitoring, among others.
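
For example, "smart" missing value imputation can use a model that learns from the other columns instead of a simple mean fill. The sketch below uses scikit-learn's KNNImputer on a small illustrative dataset; the column names and values are assumptions.

```python
import numpy as np
import pandas as pd
from sklearn.impute import KNNImputer

# Illustrative dataset with missing values (values are made up)
df = pd.DataFrame({
    "age":    [25, 32, np.nan, 51, 46],
    "income": [40_000, 55_000, 61_000, np.nan, 98_000],
    "tenure": [1, 4, 5, 12, np.nan],
})

# KNN-based imputation: each missing value is estimated from the
# rows that look most similar on the observed columns
imputer = KNNImputer(n_neighbors=2)
imputed = pd.DataFrame(imputer.fit_transform(df), columns=df.columns)
print(imputed)
```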

Augmented analytics will also play a key role in data operations, allowing applications to burst out ahead of possible spikes or to automatically spot and react to failures/anomalies in real time.
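
A rough sketch of the "spot anomalies automatically" part: an Isolation Forest trained on recent operational metrics flags unusual new observations (for example, a latency spike) for the data operations pipeline to react to. The metrics and values here are illustrative assumptions.

```python
import numpy as np
from sklearn.ensemble import IsolationForest

rng = np.random.default_rng(0)

# Illustrative operational metrics: [requests_per_sec, avg_latency_ms]
normal_traffic = np.column_stack([
    rng.normal(200, 20, 500),   # typical request rate
    rng.normal(50, 5, 500),     # typical latency
])

# Train on "normal" history, then score new observations in (near) real time
detector = IsolationForest(contamination=0.01, random_state=0).fit(normal_traffic)

new_points = np.array([
    [210, 52],    # looks normal
    [205, 400],   # latency spike -> should be flagged
])
flags = detector.predict(new_points)   # -1 = anomaly, 1 = normal
print(flags)
```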

There are already pockets of implementation happening in this space within enterprises today. We will see new products in the market that automate many data management, data preparation and model monitoring tasks and help enterprises accelerate their analytics journey.

References

Docker and Kubernetes (Cloud Native technology) for Data Science

Model Quantization to reduce model memory and CPU need when deploying on low power/compute edge devices

Usage Based Insurance - IOT business case


Sudip Dutta

Oracle ERP Cloud (Oracle Fusion) Technical Solution Architect, Oracle Integration Cloud, Oracle ERP Cloud Financials Transformation

4 yrs

Usage based insurance case study is very innovative. Thank you for sharing a nice blog. Keep posting.

Naveen S.

AZURE | AWS | GCP | IBM | IOT | DRLOps | MLOps | GitOps | DevOps | Machine Learning | Reinforcement Learning | Deep Learning | CV | NLP | Transfer Learning | STGNNs | Blockchain | Steganography

5 yrs

Hi Srivatsan, Google's federated learning algorithm is a very basic form of unsupervised learning (just averaging, similar to an unsupervised Keras mean/average), which does not improve value significantly. If an organization happens to embrace unsupervised learning algorithms like sparse coding, ICA or consistent feature analysis on top of these edge-deployed models "FROM THE CLOUD", it would be a game changer for future tech and will lead to more economic value. The unsupervised learning algorithms indicated above (sparse coding, ICA or consistent feature analysis) are going to drive continuous cognitive value improvements (a slightly similar analogy or metaphor to Kubernetes and Docker). I am already aware of quantization, which is basically turning a heavy model into a lightweight model (via pruning) to adjust to customer constraints of low compute/memory, etc. Kindly correct me if I am wrong. Thank you. Best regards, Naveen.

Naveen S.

AZURE | AWS | GCP | IBM | IOT | DRLOps | MLOps | GitOps | DevOps | Machine Learning | Reinforcement Learning | Deep Learning | CV | NLP | Transfer Learning | STGNNs | Blockchain | Steganography

5 yrs

I may be wrong, but subsequent to the 5G rollout (which reduces network latencies), I guess building unsupervised learning algorithms like sparse coding, ICA or consistent feature analysis on top of these low power/compute models (which are nothing but a blind copy of all, or a significant portion (via pruning), of the learned weights and biases, based on memory and compute constraints) will continuously improve the cognitive value of the model. Kindly correct me if I am going wrong somewhere, Srivatsan. Best regards, Naveen (an ex-employee of Cognizant :) )
