To Truly Leverage Your Data, You Must Optimise Your Data Storage
Bernard Marr
Data is the fuel of industry and commerce in the information age. Insights and analytics help us answer questions about how our businesses should operate and who our customers are with ever-increasing accuracy. Companies can use artificial intelligence (AI) to put products and services in front of the right people when they are making buying decisions. Robots and automation are driving efficiency in every business process, from manufacturing to logistics to HR. And the global network of connected devices we refer to as the Internet of Things (IoT) – from smartphones to self-driving cars – means machines can communicate with each other to solve problems with no input from us.
Every single one of these game-changing developments is powered by one thing – the vast and ever-increasing flow of digital data that we are generating and are able to capture, store and analyze.
It’s clear that some companies have gone on to revolutionize their industries – and the way we live – by harnessing this flow of data, creating new services for us that make our lives easier or better. This includes tech giants with their search engines, communications tools, and e-commerce, as well as more specialized solutions covering everything from ride sharing to streaming entertainment, booking holidays, and dating.
Unfortunately, not every business has had the same success. Industry studies frequently report that a majority of the data businesses generate goes unused, and many organizations are still struggling to manage, let alone monetize, their data streams.
Here I'm going to take a look at one of the first hurdles companies have to overcome – storage. It seems a simple one, but a lack of strategy for overcoming it can lead to a lot of headaches further down the line, as data volumes continue to grow and decisions need to be made about which data is important, and when.
Cloud services offer near-unlimited capacity for organizations to store as much information as they need, but there are complicating factors. Some data may be too sensitive or carry too high a regulatory burden to host off-site. Some data may need to be accessed instantly from anywhere in the world, while some may simply require archival for legal reasons. And data may need to be audited regularly to make sure it is still relevant and hasn't become outdated or possibly even illegal due to changing regulatory frameworks around the world. Not knowing where your data is, how many copies you have, or how to access it at any given time can severely impede your ability to carry out these essential functions.
For the most valuable, insight-rich data, the fastest and most highly available storage systems are a necessity. Modern business analytics operations require the ability to move and sort large volumes of data to provide business users or customers with the responsive, push-button functionality they expect from services today. And it all has to be backed by encryption and security, because trust is everything – no one wants to use a service that endangers or exposes them by handling their data unsafely.
Intelligent Data Storage
Today’s most advanced data storage systems, such as the IBM FlashSystem family, store information on solid-state non-volatile media to achieve the best possible speed, resilience, and security. They also take advantage of AI technology, including machine learning tools, to smartly manage the way data is stored and accessed, further increasing the speed of access and minimizing the chance of errors or data loss that could impact your business. For example, data that is predicted to be accessed more frequently will be queued and ready to go when it is needed, whereas data that is likely to be less mission-critical may be flagged for transfer to a less accessible but more secure environment, such as tape storage, for archival, or even deleted if storing it may create further problems. To make these predictions, IBM relies on insights from over two exabytes (two billion gigabytes) of data it has under management.
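To make the tiering idea concrete, here is a toy sketch of that kind of placement decision. The thresholds, field names, and rules below are illustrative assumptions of mine, not IBM's actual models, which learn these patterns from real access data:

```python
# A toy illustration of predictive storage tiering: hot data stays on
# fast flash, cold-but-required data goes to tape, and data that is
# neither needed nor required becomes a deletion candidate.
# The thresholds and fields are made up for illustration only.
from dataclasses import dataclass

@dataclass
class DataBlock:
    name: str
    predicted_daily_accesses: float  # e.g. output of a trained model
    retention_required: bool         # e.g. a legal or archival obligation

def choose_tier(block):
    if block.predicted_daily_accesses >= 100:
        return "flash: hot tier, queued and ready to go"
    if block.retention_required:
        return "tape: secure archival tier"
    if block.predicted_daily_accesses < 1:
        return "flag as deletion candidate"
    return "flash: standard tier"

for block in [DataBlock("sales_dashboard", 5000, True),
              DataBlock("2014_audit_logs", 0.2, True),
              DataBlock("tmp_export", 0.0, False)]:
    print(f"{block.name}: {choose_tier(block)}")
```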
Resilience is another key requirement that any business wanting to get serious about data needs to figure out. If your essential internal and customer-facing operations are all built around acting on data-driven insights (which, of course, should be the aim), they can't grind to a halt because of problems with data flow or infrastructure issues. This might mean ensuring data is constantly backed up across cloud, on-site, and legacy systems, all while maintaining regulatory compliance. IBM provides this for its FlashSystem customers through its FlashCopy technology, which allows production data to be rapidly copied and replicated. This means that in situations where data integrity is absolutely vital, two or more identical copies can be kept continuously synchronized in separate physical locations and restored with virtually nothing lost, should an unexpected disaster happen.
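To illustrate the general principle – and only the principle; this is a simplified sketch of mine, not how IBM's FlashCopy actually works – here is what keeping two sites in lockstep looks like when a write is only acknowledged once every copy has stored it:

```python
# A toy model of synchronous replication: a write succeeds only once
# every replica has stored it, so separate sites stay identical.
# Illustrative sketch only -- not IBM's FlashCopy implementation.
class Replica:
    def __init__(self, site):
        self.site = site
        self.store = {}  # key -> value, standing in for real storage

    def write(self, key, value):
        self.store[key] = value

def replicated_write(replicas, key, value):
    """Apply the write to all replicas; acknowledge only if all succeed."""
    for replica in replicas:
        try:
            replica.write(key, value)
        except OSError:
            return False  # in practice: roll back, retry, or fail over
    return True

primary, secondary = Replica("Site A"), Replica("Site B")
if replicated_write([primary, secondary], "order:42", "paid"):
    assert primary.store == secondary.store  # the two sites never diverge
```

If a write to either site fails, nothing is acknowledged, so a restore from the surviving copy loses virtually nothing – the property described above.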
Coping With Very Fast-Moving and Constantly Changing Datasets
One organization that has tackled the problem of implementing the infrastructure needed to deal with truly fast-moving data is the UK Met Office. Its data is used by governments to plan for changing weather, by supermarkets to react to seasonal trends, and by researchers investigating climate change. For this to happen, it needs to ingest and analyze 300 million weather-related data points every day and make them available as insights to its customers. In fact, it does it all twice – to eliminate the risk of data flow being interrupted during development.
To do this, it has developed a hybrid cloud strategy built on IBM FlashSystem. Although its original assessment was that flash storage might be prohibitively expensive, it turned out to be a cost-efficient solution thanks to the high level of compression flash enables. This lets the Met Office build the kind of high-performance data infrastructure needed to push its data and insights from its in-house servers to the public cloud and on to its customers.
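To see why compression changes the economics, some back-of-the-envelope arithmetic helps. The prices and the 4:1 compression ratio below are hypothetical figures of mine, not Met Office or IBM numbers – the point is simply that the cost that matters is per gigabyte of logical data stored:

```python
# Illustrative arithmetic only: all figures below are assumptions,
# not Met Office or IBM pricing.
flash_cost_per_gb = 0.50   # hypothetical raw flash price, $/GB
disk_cost_per_gb = 0.15    # hypothetical raw disk price, $/GB
compression_ratio = 4.0    # hypothetical inline compression on flash

# With 4:1 compression, each physical gigabyte of flash holds four
# logical gigabytes, so the effective price per logical GB drops.
effective_flash_cost = flash_cost_per_gb / compression_ratio
print(f"Effective flash cost: ${effective_flash_cost:.3f}/GB logical")
print(f"Raw disk cost:       ${disk_cost_per_gb:.3f}/GB")
```

In this made-up example, $0.50/GB flash effectively stores logical data at $0.125/GB, undercutting the disk price – the same kind of shift the Met Office found in its own assessment.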
Another example of a data strategy adapting to cope with ever-increasing volumes and workloads needed for today’s applications comes from the Roman Catholic Church – perhaps not the first place we would think to look!
The Archdiocese of Salzburg needed a solution to more effectively provide services such as support and community outreach to its congregation, as well as to preserve and provide access to its massive archive of historical documents and literature, some of which are over 1,000 years old.
By migrating to solid-state, non-volatile storage and leaving behind mechanical disk-based storage, with its slower access speeds and higher failure rate, the archdiocese improved response times by a factor of 10 to 20 while also strengthening security and encryption. Thanks to the higher availability of data and a better understanding of its storage system through AI analytics, the church is now embarking on a project to make its fascinating historical records accessible via the cloud.
As data becomes an increasingly central part of an organization's operating assets, it's vital not to underestimate the importance of making the right decisions when it comes to storage. It's no longer a simple choice between cloud and on-premises, with a hybrid approach often seen as the best way of realizing those all-important efficiencies. Every business needs to consider storage a core component of its data strategy, just as it would data acquisition, analytics, and actioning.
Thank you for reading my post. Here at LinkedIn and at Forbes I regularly write about management and technology trends. To read my future posts simply join my network here or click 'Follow'. Also feel free to connect with me via Twitter, Facebook, Instagram, Slideshare or YouTube.
About Bernard Marr
Bernard Marr is a world-renowned futurist, influencer and thought leader in the field of business and technology. He is the author of 18 best-selling books, writes a regular column for Forbes and advises and coaches many of the world’s best-known organisations. He has over 2 million social media followers and was ranked by LinkedIn as one of the top 5 business influencers in the world and the No 1 influencer in the UK.