The future of data in Covid times
Michael Schemel
Technology & AI Executive at UnternehmerTUM | Young Leader at United Europe | Strategic Advisor at DAVINC·E | AI Group Member at Bitkom | Advisor at BioExoTec (Cancer Diagnostics)
The Covid-19 pandemic has created a unique period of cooperation and unity. Governments are stepping up to take care of their people, and those people are looking after one another. Companies are doing their best to support their employees and give back to the community at an organizational level. One of the crucial components driving change behind the scenes is the technology and data that underpins it.
Ten or fifteen years ago, life in lockdown would have been far less comfortable. As businesses were forced to close their doors and people moved to remote working, our lives went almost entirely digital. With its heavily automated supply chain, Amazon was still able to fulfill fast home delivery, Netflix and Spotify used algorithms to keep us entertained, and AI-powered logistics systems meant we still had access to food.
Source: McKinsey
It is the world of data, or Big Data, that enabled us during the crisis and allowed business continuity. According to Statista, the Big Data market is forecast to grow from $42 billion in 2018 to $103 billion by 2027. Consumer preference for digital channels only serves to accelerate the market as we become better connected and data becomes more accessible to consumers, businesses, and all other stakeholders.
A data-driven future
The Covid-19 pandemic has taught businesses that they need to be more agile, resilient, and empathetic in perhaps the most catastrophic of ways. As consumers flock online, businesses without any data assets struggle to understand how their audiences behave amidst a “new normal.” Conversely, those with rich customer data have had a distinct advantage since the onset of the virus in understanding the behavioral and emotional drivers behind decisions.
Investment in data through artificial intelligence (AI) has shifted from being a nice-to-have to a business imperative within the space of months. The disruption to business as usual processes exposed gaps where existing technology, processes, and human resources could not scale up to meet the demand introduced by the pandemic.
Data is the critical driver for AI and intelligent automation. That fact isn’t new, but it has taken a global pandemic to push businesses into action. The American agricultural company Land O’Lakes is an excellent example of this in action.
The onset of Covid-19 meant that Land O’Lakes urgently needed a view of the impact the crisis was having on its supply chain. With legacy systems and processes spanning more than 60 countries, the situation could easily have overwhelmed the company. Instead, it invested in software that connects all supply chain data, such as shipping, procurement, retail, and customer information, to develop the vital insights that kept Land O’Lakes in control at a testing time.
Going forward, businesses have to align their data using a similar strategy to remain efficient in the digital ecosystem. Any business that thought its customers would never go online can no longer stand by that belief.
Demystifying the cloud
Something that Covid-19 has served to do is debunk several myths around data and the cloud. For example, before the pandemic, enterprises would often fail to launch data projects over fears of cloud governance and security. Where cloud data centers are out of their control, CTOs and CFOs would lose sleep thinking about who might have access to sensitive corporate data.
Without access to onsite networks and with the growth in digital data, organizations are now having to invest in cloud technology as a means of scaling up. What this is proving is that data in the public cloud is just as secure as, if not more secure than, traditional data centers. Cloud providers like AWS and Microsoft Azure have vast experience in managing security and can scale to meet storage needs and combat new external threats. Post Covid-19, businesses will see that cloud technology makes it easier to protect data than conventional legacy systems.
The other trend that links to the move to the cloud is the accelerated adoption of SaaS. SaaS can support your entire data supply chain and remove the need for technical in-house resources and software developers. Organizations that realize the flexibility and speed of a SaaS-based data strategy move ahead of the static competition. This is especially important for SMEs as they look to keep pace with the big-budget technology enterprises.
Source: https://dzone.com/articles/how-to-develop-successful-cloud-based-saas-applica
Focusing on external data
During Covid-19, there has been an accelerated trend towards using external data. Outside factors are causing such mass disruption that internal data alone cannot give an accurate picture. Siloed past activity isn’t enough to predict the future, meaning companies are looking elsewhere to understand consumer trends and behavior.
Source: https://www.idc.com/getdoc.jsp?containerId=US46333320
Organizations are embracing data as a service (DaaS) solutions, which are similar to SaaS but focus on outsourcing data storage and integrations. DaaS provides the agility and scalability to ingest external data into the business. While there are inherent risks of bias and inaccuracy with data you do not own, a robust quality process can unlock the value of information from other sources. With 86% of senior leaders reporting difficulty in hiring data talent, DaaS and data marketplaces solve a real problem.
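The quality process mentioned above can be as simple as a validation gate that external records must pass before they are merged with internal data. A minimal sketch, assuming a hypothetical external feed and illustrative field names (not any specific DaaS provider's API):

```python
# Hypothetical quality gate for externally sourced records.
# Field names ("region", "mobility_index") are illustrative only.
REQUIRED_FIELDS = {"region", "date", "mobility_index"}

def passes_quality_gate(record: dict) -> bool:
    """Reject external records that are incomplete or implausible."""
    if not REQUIRED_FIELDS.issubset(record):
        return False
    # A range check guards against obvious inaccuracy in data you do not own.
    return 0.0 <= record["mobility_index"] <= 1.0

external_feed = [
    {"region": "DE", "date": "2020-11-02", "mobility_index": 0.41},
    {"region": "FR", "date": "2020-11-02", "mobility_index": 1.7},   # out of range
    {"region": "IT", "date": "2020-11-02"},                          # incomplete
]

clean = [r for r in external_feed if passes_quality_gate(r)]
```

Only the complete, plausible record survives the gate; the rest are held back for review rather than silently polluting internal datasets.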
Post Covid-19, data collaboration will be a theme, especially with the benefits of doing so during the pandemic in finding rapid solutions to the problem.
Changing your data strategy
Companies are moving away from conventional data strategies. Instead, the approach needs to center on creating a best-of-breed stack built around a central data hub. Server-to-server integrations connect the hub to all of the technology solutions. The data hub ingests everything from your data warehouse and external sources to create a single source of truth.
The data hub can run several tasks in parallel, such as machine learning algorithms or creating insight dashboards. If the entire technology stack is sourcing data from the same place, you ensure consistency throughout the enterprise and lower the risk of error.
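The single-source-of-truth idea can be sketched in a few lines: records from several sources are keyed consistently and merged into one view that every downstream consumer reads from. This is a minimal illustration with hypothetical source names, not a production design:

```python
# Minimal sketch of a central data hub: ingest records from several
# sources and expose one merged customer view that all downstream
# consumers (dashboards, ML jobs) read from, ensuring consistency.
class DataHub:
    def __init__(self):
        self._store = {}  # single source of truth, keyed by customer_id

    def ingest(self, source_name: str, records: list) -> None:
        for rec in records:
            entry = self._store.setdefault(rec["customer_id"], {})
            entry.update({k: v for k, v in rec.items() if k != "customer_id"})
            entry["last_source"] = source_name  # provenance for auditing

    def view(self, customer_id) -> dict:
        return self._store.get(customer_id, {})

hub = DataHub()
hub.ingest("warehouse", [{"customer_id": 1, "name": "Acme", "orders": 12}])
hub.ingest("external_feed", [{"customer_id": 1, "segment": "enterprise"}])

profile = hub.view(1)  # merged view combining both sources
```

Because the dashboard and the machine-learning job both call `view()`, they can never disagree about a customer's profile, which is the consistency benefit described above.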
Data strategies post-Covid-19 should drive the right technology investments. This means enterprises should put their efforts into collecting data that offers business value and then accelerating the relevant technology to support that journey. Data hubs can support the rapid exchange of data that organizations need to serve their customers effectively.
Source - https://www.talend.com/resources/customer-360-data-hub/
The future of data
Experian believes that there are three data trends closest to fruition in 2021.
- Data standardization will be critical to unlocking the value of data. It will drive the use of marketplaces and DaaS, which in turn makes data more available. Gartner predicts that by 2022, 35% of large organizations will be either sellers or buyers of data via formal online data marketplaces.
- Responsible use of data will be at the forefront of data strategy to operationalize AI, machine learning, and deep learning models.
- Data stewards will return to the business after many were deemed surplus to requirements pre-Covid-19.
As well as having access to more data and utilizing external sources, Experian says that using it in the right way is the only way to show its actual value. People are becoming more motivated to share data following Covid-19, and if safeguards are put in place to provide long-term protection, that trust is a huge driver for innovation.
The pandemic demonstrates the need for flexibility in business approaches to data. Industries need to work with regulators to ensure frameworks are in place that support an environment for innovation.