Data mobility issues will derail digital transformations
The Information published an article last week on how, as companies rack up huge bills for cloud services, one of the least understood expenses is the cost of moving data from on-premises data centers to the cloud and between cloud providers. These data transfer costs ran up to about 7% of the cloud bill (a little less than $50M) for Apple in 2017.
Most digital transformations at large, established organizations have cloud computing and AI/machine learning at their heart. The latest innovations in applications, machine learning, and advanced analytics are mostly delivered as cloud-native applications that run on some cloud - AWS, Azure, Oracle, or Google Cloud Platform. Capabilities like elasticity, the ability to stand up environments quickly, and reduced operational complexity make cloud environments a natural choice for these transformations.
On the other hand, most of these organizations' mission-critical and sensitive data sits in on-premises data centers, in their data warehouses, databases, and sometimes even in mainframes. That data has to be close to those new applications in the cloud to accelerate the creation of new services, products, and experiences. But this data, often on the order of hundreds of terabytes or even petabytes, is far too expensive to move to a cloud vendor, and that is what drives the data transfer costs. Further, a single cloud vendor is unlikely to meet every need for most enterprises, and that is when this gets really complicated.
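To make the scale of the problem concrete, here is a back-of-the-envelope sketch of what a one-time transfer of a few hundred terabytes can cost. The per-gigabyte egress rate below is an illustrative assumption, not any particular vendor's actual pricing, and real bills vary by tier, region, and destination.

```python
# Back-of-the-envelope egress cost estimate.
# ASSUMED_EGRESS_RATE_PER_GB is a hypothetical rate for illustration only;
# actual cloud egress pricing is tiered and vendor-specific.
ASSUMED_EGRESS_RATE_PER_GB = 0.09  # assumed $/GB for internet egress

def egress_cost_usd(terabytes: float,
                    rate_per_gb: float = ASSUMED_EGRESS_RATE_PER_GB) -> float:
    """Estimate the cost of moving `terabytes` of data out of a provider."""
    return terabytes * 1024 * rate_per_gb  # 1 TB = 1024 GB

# A one-time move of 500 TB at the assumed rate:
print(f"${egress_cost_usd(500):,.0f}")  # roughly $46,080
```

Even at this simplified flat rate, a single 500 TB migration lands in the tens of thousands of dollars, and repeated movement between clouds multiplies that quickly.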
Moreover, with about 40 billion connected devices expected to be online by 2025, the lack of more efficient and secure data transfer infrastructure will mean most of these devices cannot deliver an integrated, intelligent experience to the end user. Edge intelligence is certainly an emerging capability, but without efficient integration with the core it will lead to higher costs and a sub-optimal user experience. I'm reminded of Peter Levine's End of Cloud Computing talk from a few years ago.
This is also why I'm stoked about what we are working on at Molecula. Molecula is reinventing the Data Virtualization category by abstracting the access (query) layer from the persistence layer for all of an organization's data, across multiple sources, formats, and locations - truly bringing performance, portability, and control to enterprise data through a Software Defined Data Infrastructure. If you are looking to make your hybrid/multi-cloud analytics and applications more performant while lowering the risk and cost of data mobility, talk to us. We'd be honored to unlock the potential of all your data and help you win in the Intelligence Era.