Why are even large corporates struggling to implement big data solutions?

Have you ever met the CEO of a large corporate who is NOT talking about how valuable their data are and how they will focus on big data applications? I haven’t!

But have you ever met a CEO who can showcase how they are using big data applications to build their business? Very rarely!

Big data is like teenage sex: everyone talks about it, nobody really knows how to do it, everyone thinks everyone else is doing it, so everyone claims they are doing it... (Dan Ariely)

So why are even large corporates with millions to spend building applications struggling to deliver on their promises?

The problem is not a lack of data. Every large company I know of possesses terabytes of data from every part of their organisation. Instead, the problem is the utilisation of that data. Data are typically scattered across more than 50 different applications (CRM, ERP, mail, intranet, etc.), and integrating that data across the applications is easier said than done.

An example of a superficially simple problem that turns out to be not so simple to solve is that unique identifiers differ across systems. So I might be ‘Nicolaj H Nielsen’ in one system, ‘Nikolai Nielsen’ in another system, and ‘Nicolaj Højer Nielsen’ in a third. How do I know whether the three instances of ‘Nicolaj/Nikolai’ are actually the same person? And the same applies to other IDs such as email addresses – are [email protected] and [email protected] actually the same person?
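To make the problem concrete, here is a minimal sketch of fuzzy identifier matching using Python’s standard-library difflib. The names, systems, and the 0.6 threshold are purely illustrative assumptions, and real record-linkage tools use far richer signals (addresses, phone numbers, behavioural data) than string similarity alone.

```python
from difflib import SequenceMatcher


def similarity(a: str, b: str) -> float:
    """Return a 0..1 similarity ratio between two case-normalised strings."""
    return SequenceMatcher(None, a.lower(), b.lower()).ratio()


# The same person, as recorded in three hypothetical systems.
records = [
    "Nicolaj H Nielsen",      # e.g. the CRM
    "Nikolai Nielsen",        # e.g. the ERP
    "Nicolaj Hojer Nielsen",  # e.g. the mail system
]

# Compare every pair. A high ratio *suggests* the same person,
# but cannot prove it -- which is exactly why naive matching fails.
THRESHOLD = 0.6  # illustrative cut-off, not a recommendation
for i in range(len(records)):
    for j in range(i + 1, len(records)):
        score = similarity(records[i], records[j])
        match = "likely match" if score > THRESHOLD else "no match"
        print(f"{records[i]!r} vs {records[j]!r}: {score:.2f} ({match})")
```

Even this toy version shows the dilemma: set the threshold too low and you merge different people; set it too high and you miss genuine duplicates.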

The same goes for GDPR. Under GDPR, you have the right to ask a company to show you what data they have on you, what they do with it, and why they are storing it. But how can the company provide a valid answer to such a request, when your data are stored across multiple systems with no single interface? 

I sent such information requests to five corporates which I know store my data (two banks, one insurance company, one leasing company, and a telco). After five weeks, only one of them managed to come back to me with an answer! I am sure they really want to do it faster and better (to avoid potential fines related to GDPR), but they simply don’t have the overview and tools to do it.

The result for corporates that want to build big data applications (and comply with GDPR) is that they hire a lot of data scientists with the goal of building cool new applications, but the data scientists end up spending 80 per cent of their time collecting, cleaning and integrating data across the various systems.

And this is the part of the job they like the least. See the graph below.

The solution: CluedIn.

I think the problem is only getting worse as more and more data become available but remain siloed across many different systems. That’s why I joined CluedIn two years ago as an advisor and shareholder. Even then, CluedIn already had an amazing team with a clear vision of how to solve the problem.

CluedIn’s solution is to provide a data layer that automatically integrates data across over 150 different applications and then uses machine learning techniques to extract meaning from the mess. In this way, CluedIn allows highly paid data scientists to spend most of their time on what they enjoy the most and what generates value for the corporates hiring them – that is, building big data applications – and not on the boring task of integrating and cleaning data.

Do you want to know more?

Feel free to join the webinar, where CluedIn founder Tim Ward gives his take on how companies can use the GDPR regulation as a stepping stone to transforming their company into one that is more systematically data-driven.

Join webinar via this link: https://register.gotowebinar.com/register/2187247877391690242
