The Data Dream of CTOs and CDOs
Photo by Sharosh Rajasekher on Unsplash


How are CTOs and CDOs dealing with Excel in the age of APIs?

We have all undoubtedly faced this situation in a board review meeting: one department reports a specific KPI at one level, while another department presents a different figure for the same KPI.

Any of the following reactions can then occur:

— How the hell can we have such different numbers?

— Where are you getting your data from?

— Let’s ask our data specialist to join the meeting.

An old banking dream about data systems was to “only” add one layer on top of all the independent solutions built across departments, and ultimately provide global, centralized data access.

Most Chief Technology Officers, Chief Data Officers, and even Chief Operating Officers still face the same challenge across many industries. They want to amend or retire legacy systems and change ineffective data-management habits, both of which indirectly multiply data sources. Legacy slows down the data-centric strategy they have in mind.

During my days in banking, I remember building pricing tools in Excel/VBA for our sales force, so they could provide fast pricing refreshes for complex products (e.g. structured products). Not only were most salespeople reluctant to take on responsibility for pricing validity, but they also accumulated errors as they fiddled with the template files, trying to enhance their features.

Table of Contents:

1. A single data source of truth or the data dream of CTOs and CDOs.

2. Do corporates really want to kill Excel data usage?

3. Is “Optimum Latency” the real killer of Excel data?


1. A single data source of truth or the data dream of CTOs and CDOs.

Photo from Unsplash

The value of data has not been questioned for quite some time now. “Companies need to become data-driven or perish” has become a well-accepted statement. Those who don’t view their company’s data as an asset may well face trouble in the coming years. However, in the field, the management decisions taken to embrace this reality vary across industries, depending on how strongly leaders believe that data is not just a by-product.

Hiring a data specialist alone is not going to put a company ahead in the race, though. Time and effort should be spent building a shared vision of the bigger picture and developing a data strategy.

A classic goal should be to lay the foundation for API-consumable data across the company, not only for trendy AI projects. Why? Mainly because APIs ensure structured and accurate access to your data, which in turn fosters a single source of truth. CTOs and CDOs are there to enable the efficient use of data as an organizational and strategic asset. It is up to them to create the infrastructure and culture necessary to reach this goal.
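
To make this concrete, here is a minimal sketch in Python of what “API-consumable data” can look like: one endpoint serving a KPI from a single central store, so every department, dashboard and spreadsheet reads the same figure. The Flask framework, the database file and the table layout are illustrative assumptions, not a description of any particular company’s stack.

# Minimal sketch of a single-source-of-truth KPI endpoint.
# The database file and table name are hypothetical.
import sqlite3
from flask import Flask, jsonify

app = Flask(__name__)

@app.route("/kpi/<name>")
def get_kpi(name):
    # Every consumer (dashboard, Excel workbook, data science notebook)
    # calls this endpoint instead of keeping its own copy of the figure.
    conn = sqlite3.connect("warehouse.db")  # hypothetical central store
    row = conn.execute(
        "SELECT value, as_of FROM kpis WHERE name = ?", (name,)
    ).fetchone()
    conn.close()
    if row is None:
        return jsonify({"error": "unknown KPI"}), 404
    return jsonify({"kpi": name, "value": row[0], "as_of": row[1]})

if __name__ == "__main__":
    app.run(port=8000)

The point is not the framework but the contract: one governed place answers the question “what is this KPI right now?”, and everything else consumes it.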

To illustrate this, let’s examine the genesis of a company whose founders are trying to answer a customer pain point. The Minimum Viable Product is built to solve that pain point. As the company grows, so does the CTO’s responsibility to find new ways to scale the solution’s capabilities. I have encountered many examples that prove the point above and show how some tech companies are addressing it:

· Fintech startups appearing 10–15 years ago

Once their MVP was validated, they needed a more robust solution to deal with a high volume of usage. They suddenly faced the expensive challenge of building the next version of their product, focused entirely on an efficient API-based architecture.

· Challenger banks (or neobanks)

They decided from the get-go to build technology aligned with a genuinely data-driven core strategy, so they would not have to rebuild once the business accelerated. Some chose to leverage Banking-as-a-Service platforms; others built their own fully API-focused architecture.


2. Do corporates really want to kill Excel data usage?

Photo by Sammy Williams on Unsplash

In short: NO. They understand the pros of using Excel to manipulate data. Excel is still the most widely used tool for managing data, for some very good reasons, mainly its flexibility and ease of use.

What they do want to kill is Excel-based data usage done independently in different corners of the company. They want to prevent clever users from building new analytics on raw data that does not come from a single, common source of truth. This becomes a real challenge when legacy data systems are involved.

Simply put, a legacy data system is an infrastructure built to centralize data in a very inflexible way. Very often it was initially built for one specific area of the company, and it does not answer the need for efficient data consolidation and consumption across the whole organization. This is where Excel files start flying from one end of the company to the other, as users grow tired of wasting time on an “outdated” system. At some point, gathering data and working on your models in Excel becomes faster and more accurate. And, inevitably, this ends in the situation I described in the introduction.

So, on the one hand, you don’t want to slow down creativity; on the other hand, you need to ensure that the data is easily accessible to all, not only to data scientists or quants. Who knows what feature or function could arise from a business colleague with supposedly no data background? Business teams are the ones facing market realities, and they can come up with very good ideas to solve specific problems.

The culture and habits around your data are the real targets. Don’t shoot the messenger because he is playing with data using a flexible tool. Tools are just a vehicle to support this data-driven culture.

It is a challenge to adopt the perfect tool, as each company has its own specificities and legacy systems. CTOs are spoiled for choice. In the end, however, they should target the tools most appropriate to their ambitions. Very often these tools will add flexibility to legacy systems, ahead of an eventual migration to a modern, full-fledged platform.

Additionally, don’t forget the upside of a data-driven culture: it is a key driver for attracting the talent that will spearhead continuous innovation.


3. Is “Optimum Latency” the real killer of Excel data?

Photo by George Huffman on Unsplash

With the rise of APIs, companies can easily access meaningful data for their business that is generated outside the company. The better they digest this massive amount of data, the smarter the decisions they can make. Hence the rise of data science departments (or “data hub” teams) in many companies, which filter and analyze this data efficiently to help management make fast and accurate decisions.

One month ago, Guido van Rossum, creator of the famous Python programming language (one of the most liked programming languages in the world, according to Stack Overflow), announced that he is working on doubling the speed of the language. Competition to deliver fast computation is strong, and the announcement aims to address one of Python’s main weaknesses compared to languages like C++ or Julia.

As detailed in Deloitte’s report ‘Machine data revolution: Feeding the machine’: “end users have less and less patience for the kind of latency that legacy systems and data models often deliver. The optimum latency time between click and desired response is less than 50 milliseconds — any longer and users become irritated by the delay and make executive decisions themselves”.
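
As a rough illustration, here is a small, hedged sketch that times a call to a data endpoint and compares it with the 50-millisecond budget quoted above; the endpoint URL is an assumption for the example, not a real service.

# Quick latency check against the "optimum latency" budget.
# The endpoint is hypothetical; point it at your own internal API.
import time
import requests

ENDPOINT = "http://localhost:8000/kpi/net_new_assets"  # hypothetical
BUDGET_MS = 50

start = time.perf_counter()
resp = requests.get(ENDPOINT, timeout=2)
elapsed_ms = (time.perf_counter() - start) * 1000

print(f"HTTP {resp.status_code} in {elapsed_ms:.1f} ms "
      f"({'within' if elapsed_ms <= BUDGET_MS else 'over'} the {BUDGET_MS} ms budget)")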

This kind of requirement is clearly not something Excel will deliver on its own any time soon. But there are solutions at your disposal to work around this limitation and empower your Excel workbooks with external code. Have a look at How to boost Excel data management with APIs via Python, R, Rust… for more on this topic.
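
For readers who want something tangible, here is a hedged sketch of such a bridge in Python, using the requests and openpyxl libraries to refresh an Excel workbook from a hypothetical internal KPI endpoint instead of maintaining the figures by hand; the URL, KPI names and workbook layout are assumptions for the example.

# Refresh an Excel snapshot from an internal API so the spreadsheet
# stays a front-end rather than becoming a second source of truth.
# Endpoint and KPI names are hypothetical.
import requests
from openpyxl import Workbook

KPI_API = "http://localhost:8000/kpi"  # hypothetical internal endpoint
KPI_NAMES = ["net_new_assets", "churn_rate", "nps"]

wb = Workbook()
ws = wb.active
ws.append(["KPI", "Value", "As of"])  # header row

for name in KPI_NAMES:
    resp = requests.get(f"{KPI_API}/{name}", timeout=5)
    resp.raise_for_status()
    data = resp.json()
    ws.append([data["kpi"], data["value"], data["as_of"]])

wb.save("kpi_snapshot.xlsx")

The design choice is simple: the workbook is regenerated from the API on demand, so nobody has to reconcile hand-maintained copies before a board meeting.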

Who said being a CTO or CDO is an easy job?!


Thank you for reading!

You can connect with me on LinkedIn.

Happy to answer any queries that you may have.

Louis Dewavrin.

Avery Michaelson

Portfolio Manager at Sea Point Capital | Founding Partner of Longitude Solutions | Founder & CEO of UCapture

2y

Thanks for sharing, Louis!

Dermot Carroll

Head Of Solutions specializing in Intelligent Automation at Virtual Operations

3y

I like this article! I've always thought that if an organisation's infrastructure is the heart and blood vessels of the body (the organisation), the blood (data) is what should be pumped around it. This needs to be high quality and timely, otherwise compromises happen. Thinking of the data lifecycle, in my opinion, once data has "landed" in Excel it is for informational purposes only, and should never be re-injected into live production systems. There are definitely better methods than Excel for data transport.

As your picture nicely shows, it is not corporates that want to kill Excel usage, it is those of us who have emerged from Stockholm syndrome, with Excel as the hostage taker!

Jerome Combes, London Business School MBA

Finance Director, EMEA Regional Treasury at Kenvue

3y

Great read Louis, thanks for sharing!
