Decoding the Real Value Behind Your Analytics Initiatives

Why is data an asset? A practical guide to quantifying the value of analytics initiatives in your organisation


In today's rapidly evolving business landscape, growth-stage organizations are increasingly recognizing the value of harnessing data. Nevertheless, the journey towards effective data utilization is often clouded by questions about the strategy, effort, and true value of data initiatives. Navigating this complex terrain requires a framework that breaks the process into digestible chunks; there are no shortcuts.


Why is Data an Asset?

Data is increasingly recognized as a pivotal asset, primarily because of the significant economic value it can generate. A "data asset" is any piece of information an organization holds that can be used to achieve its goals: everything from databases, customer lists, and financial records to proprietary algorithms and software.

To understand this, let's revisit the textbook definition of an asset:

It is anything (tangible or intangible) that can be used to produce positive economic value

The key term here is 'use.' Take the example of a house purchased for rental purposes. The initial stages involve meticulous selection of developers and locality, and careful budgeting. However, the true value of this asset emerges not merely from its ownership but from how it's utilized — finding tenants, furnishing, and maintaining the property to enhance its yield.

Similarly, establishing a data infrastructure is just the starting point. The actual economic value materializes through the strategic use of this data. How a team leverages data to optimize processes and operations is crucial. You begin to see quantifiable differences in operational efficiency, which gradually reflect in the P&L statements.

Yearly Return vs Cost

As time progresses, the accumulated data becomes increasingly valuable. It unveils trends and shifts in customer behaviors, offering insightful clues for strategic decision-making. The lifespan of data as an asset is virtually infinite, given its evolving nature and the continuous insights it offers.


Evaluating the Effectiveness of Your Analytics Initiative

A common query from founders and management before starting their data journey is about the return on investment (ROI). As we established, the crux of the answer lies in the effective use of data. We will follow a sequence of steps, beginning with identifying the gaps. The diagram below helps you visualise the steps to take before starting a data project. We will go through each step in detail.

Decision Flow Diagram


1. Identifying Use Cases

The first step involves listing all departments within your organization, such as:

  • Sales and Marketing
  • Product Design and Development
  • Customer Service and Deployment
  • Finance and Accounting
  • Procurement

Following this, distribute a form to each team to pinpoint areas needing improvement. The APQC Process Classification Framework serves as an excellent cross-industry guide to identify all functional processes.

A snippet of APQC Retail Process Classification Framework (PCF)


Using the framework as a reference, encourage your teams to delve deep into each aspect of their operations. Gather all potential use cases, but remember to prioritize those that have the most significant impact on the key performance indicators (KPIs) of your business.

Use cases for the marketing division
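As a sketch of this prioritization step, the collected use cases can be kept in a simple scoring sheet and ranked by their estimated effect on the KPIs they touch. All names and scores below are hypothetical placeholders, not benchmarks:

```python
# Hypothetical scoring sheet: rank collected use cases by their
# estimated impact on the KPI each one targets (scores are illustrative).
use_cases = [
    {"name": "Customer segmentation", "dept": "Marketing",
     "kpi": "conversion rate", "impact_score": 8},
    {"name": "Invoice reconciliation", "dept": "Finance",
     "kpi": "days sales outstanding", "impact_score": 5},
    {"name": "Supplier scorecards", "dept": "Procurement",
     "kpi": "unit cost", "impact_score": 6},
]

# Highest estimated impact first.
for uc in sorted(use_cases, key=lambda u: u["impact_score"], reverse=True):
    print(f"{uc['impact_score']}  {uc['name']} ({uc['dept']}; KPI: {uc['kpi']})")
```

Even a rough ordering like this keeps the long tail of "nice to have" requests from crowding out the use cases that move headline KPIs.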


2. Impact Analysis

The next crucial step is to determine how these use cases can drive value, primarily in terms of two key business metrics: revenue and cost.

To understand the impact of each use case, we can employ guesstimate methods or industry benchmarks. Because these methods vary from company to company, a detailed treatment of guesstimation deserves a post of its own.

Consider the example of a Customer Segmentation use case. If a company has already invested in tools for customer segmentation, the impact will be measured by the difference between the data ingested into the tool and the data available in your warehouse. However, if your company is starting from scratch, the impact of segmentation will likely be more substantial.

Total impact on P&L by each use case.


The snippet above shows how a company with an ARR of $10M can benefit from implementing a data mart for the Marketing division. 1

A similar method must be followed for all the use cases and departments. Try to break the percentages down into actual dollar terms to check whether the impact is realistic. Summing these numbers then gives the collective impact on revenue and cost.
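The percentage-to-dollar translation and summation described above can be sketched as follows. Every figure here is a hypothetical placeholder (the ARR, cost base, and uplift percentages are invented for illustration, not benchmarks):

```python
# Translate each use case's estimated percentage change into dollar
# impact, then sum the totals per P&L driver (revenue vs cost).
ARR = 10_000_000          # annual recurring revenue, $ (hypothetical)
ANNUAL_COST = 6_000_000   # annual operating cost, $ (hypothetical)

# (use case, driver, estimated % change) -- illustrative guesses only
use_cases = [
    ("Customer segmentation", "revenue", 0.02),
    ("Campaign attribution",  "revenue", 0.01),
    ("Churn early-warning",   "revenue", 0.015),
    ("Procurement analytics", "cost",   -0.01),  # negative = cost saved
]

totals = {"revenue": 0.0, "cost": 0.0}
for name, driver, pct in use_cases:
    base = ARR if driver == "revenue" else ANNUAL_COST
    dollars = base * pct
    totals[driver] += dollars
    print(f"{name:22s} {driver:8s} {pct:+.1%} -> ${dollars:+,.0f}")

print(f"Total revenue impact: ${totals['revenue']:+,.0f}")
print(f"Total cost impact:    ${totals['cost']:+,.0f}")
```

Seeing "+2%" become an absolute dollar figure is exactly the sanity check the paragraph above recommends: if the number looks implausible for your business, the estimate needs revisiting.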

For someone looking to generate a detailed report of the impact, one can drill down and map how each KPI is impacted by the solution.

An example of how parameters are linked to each other. Profit is linked to revenue and operating costs, which is further linked to other downstream parameters.


For the sake of simplicity, we will avoid this complex structure and focus only on the core drivers of Revenue and Cost.


The graph above shows how each department's use cases add to the ROI of the total project. The more use cases you derive from the data, the higher the yield on the asset.


3. Identify Data Sources

Having assessed your use cases and their impact, it's time to determine the necessary data sources for building the data product. One effective approach is to map the data models required for each use case.


Data Assets mandatory for each use case


Once you've completed this mapping, identify common data sources shared across multiple use cases.


Data Asset Priority


In the above scenario, transactional data and Google Analytics are the most common datasets across use cases. This helps deployment teams prioritize data asset development.
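A minimal way to derive this priority list is to count how many use cases depend on each source. The mapping below is hypothetical (the use case and source names are invented for illustration):

```python
# Hypothetical mapping of use cases to the data sources they require.
# Sources shared by the most use cases are built first.
from collections import Counter

use_case_sources = {
    "Customer segmentation": ["transactions", "google_analytics", "crm"],
    "Campaign attribution":  ["google_analytics", "ad_spend"],
    "Churn early-warning":   ["transactions", "support_tickets"],
    "Revenue forecasting":   ["transactions", "google_analytics"],
}

# Count how many use cases each source unlocks.
priority = Counter(src for srcs in use_case_sources.values() for src in srcs)
for source, count in priority.most_common():
    print(f"{source:18s} needed by {count} use case(s)")
```

In this toy example, transactional data and Google Analytics each unlock three use cases, so they top the build queue, mirroring the scenario described above.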


4. Analyze Costs

Costs associated with a data project can be categorized as follows:

  • Infrastructure Costs
  • Deployment & Maintenance Costs
  • Team Costs

These costs are highly customizable and dependent on your organization's pace and the number of use cases you wish to implement.

Infrastructure costs encompass recurring expenses for ETL pipelines, data warehouses, and visualization tool licenses. For cost-conscious startups, open-source tools like Metabase, Superset, or Lightdash are valuable alternatives.

Deployment and maintenance costs come into play if you opt to outsource part of the deployment process to a data consulting firm. Their expertise can help you avoid common pitfalls when selecting and deploying the right data stack.

Team costs cover data analysts, data scientists, and data engineers who work on use cases, translating insights into actionable information for your operations team. This is where the true business value is extracted.
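The three cost buckets can be rolled up into a simple annual model. Every figure below is a placeholder for demonstration; actual costs depend entirely on your stack, team size, and pace:

```python
# Illustrative annual cost model; every figure is a placeholder.
annual_costs = {
    "infrastructure": {            # recurring tool/licence spend
        "etl_pipeline": 12_000,
        "warehouse": 24_000,
        "bi_tool": 6_000,
    },
    "deployment_maintenance": {    # e.g. outsourced consulting
        "consulting": 30_000,
    },
    "team": {                      # analysts/engineers working the use cases
        "analyst": 80_000,
        "engineer": 110_000,
    },
}

total = sum(v for bucket in annual_costs.values() for v in bucket.values())
for bucket, items in annual_costs.items():
    print(f"{bucket:25s} ${sum(items.values()):>9,}")
print(f"{'total':25s} ${total:>9,}")
```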



It's vital to continually monitor costs versus value over the long term. If the value derived post-implementation doesn't justify the costs after a few years, it's time to reevaluate your strategy.
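A toy year-over-year check of cumulative value against cumulative cost makes this monitoring concrete; the break-even year is where value finally overtakes spend. All numbers below are invented for demonstration:

```python
# Toy year-over-year check: cumulative value vs cumulative cost,
# flagging the break-even year (all figures are placeholders).
annual_value = [30_000, 70_000, 110_000, 140_000]   # value realised per year
annual_cost  = [120_000, 60_000, 60_000, 60_000]    # spend per year

cum_value = cum_cost = 0
for year, (v, c) in enumerate(zip(annual_value, annual_cost), start=1):
    cum_value += v
    cum_cost += c
    status = "break-even" if cum_value >= cum_cost else "under water"
    print(f"Year {year}: value ${cum_value:,} vs cost ${cum_cost:,} -> {status}")
```

If the "under water" years stretch on with no crossover in sight, that is the signal to reevaluate the strategy, as noted above.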


5. Capital Budgeting

Assessing a project's viability often involves conducting an NPV (Net Present Value) and IRR (Internal Rate of Return) analysis. These metrics help determine whether it's worth allocating capital to build data capabilities at this stage. 2


NPV and IRR analysis


If the NPV is greater than 0, it's advisable to proceed with the project. In cases where multiple initiatives compete for limited capital, compare their NPVs and prioritize those with the highest values.
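A minimal NPV/IRR sketch, with hypothetical cash flows (a $150k upfront build followed by four years of net value from the use cases) and an assumed 12% cost of capital, might look like this:

```python
# Minimal NPV/IRR sketch for a phased data project.
# All cash flows and the discount rate are hypothetical.

def npv(rate, cash_flows):
    """Discount cash_flows[t] at (1 + rate) ** t; t = 0 is today."""
    return sum(cf / (1 + rate) ** t for t, cf in enumerate(cash_flows))

def irr(cash_flows, lo=-0.99, hi=10.0, tol=1e-7):
    """Bisection search for the rate at which NPV crosses zero."""
    for _ in range(200):
        mid = (lo + hi) / 2
        if npv(lo, cash_flows) * npv(mid, cash_flows) <= 0:
            hi = mid    # sign change in [lo, mid]: root is there
        else:
            lo = mid
        if hi - lo < tol:
            break
    return (lo + hi) / 2

# Year 0 build cost, then net value realised in years 1-4 (illustrative).
flows = [-150_000, 40_000, 60_000, 80_000, 90_000]
rate = 0.12  # assumed cost of capital

print(f"NPV @ {rate:.0%}: ${npv(rate, flows):,.0f}")
print(f"IRR: {irr(flows):.1%}")
```

With these placeholder numbers the NPV is positive at 12%, so the decision rule above would say proceed; swapping in your own phased cash flows lets you compare competing initiatives on the same basis.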

To break the project into smaller sub-sections, use the data asset priority list and implement it in phases. This reduces the risk of a large upfront investment. Once the value is realised and the operational team is proficient with the dashboards, you can chart the roadmap for setting up the remaining data assets.

In conclusion, data is undeniably a valuable asset in today's dynamic business landscape. Its potential for generating economic value is substantial, but realizing that potential requires a strategic approach and effective utilization. The true worth of data emerges through intelligent and purposeful application.

To embark on a successful data journey, organizations should follow a systematic framework that begins with identifying relevant use cases and assessing their impact on critical business metrics such as revenue and cost. The right data sources and efficient cost management are essential components, and capital budgeting analysis can help determine the feasibility of data initiatives.






Footnotes

1 The numbers in the spreadsheet are hypothetical; they are not benchmarked against any industry or business and are included for illustration only.

2 The implementation costs shown are not representative of actual costs; the numbers are included to demonstrate the concept.



To read my original post and interact with me on this topic, subscribe to my Substack blog.


