Digital Conversations #1: “Whose numbers are right?!” (Multiple Report Variations)

Digital transformation is being hindered because organizations fail to identify and correct persistent types of data errors that affect the accuracy of business insights.

  • Business users frequently encounter discrepancies in the results between different versions of reports that are supposed to be the same
  • It is difficult for organizations to identify the causes of report inconsistencies, so they go with a “best guess” and the errors remain unfixed
  • In today’s distributed systems landscapes, IT teams lack the control, resources and tools to work with business users to find permanent fixes to these problems


The London Whale

In 2012, a financial analyst in J.P. Morgan’s London office was building value-at-risk (VaR) models in Excel when an inaccurate copy and paste from an existing Excel report into a new one produced numbers that dramatically understated the company’s trading risk; that error was a primary factor in a subsequent $6.2 billion trading loss. While this may seem like a unique occurrence, it is by no means so: a Google search on reporting errors that caused significant impacts for businesses, governments, military forces or individuals turns up countless additional examples.

Simply put, numbers matter immensely and are relied upon constantly to support decisions of all kinds in every organization. However, as Gregg Easterbrook observed:

Torture numbers, and they’ll confess to anything

As a result, the accuracy and reliability of the reports used within an organization are a constant concern. Inaccurate reports represent a large barrier to delivering the trust needed for business users to embrace the data-driven culture required to operationalize digital transformation.


The (Digital) Conversation: Multiple Report Variations

So how do business and technical users recognize this problem when it is occurring within their own organizations? What follows is the conversation that plays out in the separate yet parallel IT and Business universes within an organization when report discrepancies are being experienced.

[Image: the parallel Business and IT conversation]

Although the conversation for each type of data/analytics problem is different, all share the same essential result: business users perceive that the organization has widespread inaccurate and unreliable data that is not being fixed by IT (“our data is terrible”), while the IT group concludes that lines of business (LOBs) are making decisions and creating digital assets independently, which leads to problems they do not understand (“they don’t know what they’re doing”).


The Digital Divide

When meeting with business users to discuss their ability to leverage data to deliver business insights, we often lead with a very simple question:

Do you go into meetings in which two different people have the “same report” but different numbers?

We have asked hundreds of business users this question and have never – not even once – received a negative response. Quite simply, the type of conversation highlighted above is a routine occurrence and is the rule rather than the exception.

Clearly this is not a good thing. If information is not accurate or trustworthy, how can a solid analysis be performed and effective decisions be made? Each determination made based on the wrong information presents a risk to an organization, and the accumulated risk of ongoing occurrences of this issue across LOBs represents a significant exposure.

So, if this is a known pain point across every organization, what is preventing a solution to these problems? The simple answer is that business users are not collaborating effectively with technical specialists to identify the flaws that create this dilemma, and thus they cannot perform lasting fixes. This is not due to a lack of desire to address these problems but rather to prevailing organizational obstacles that get in the way.


The Collaboration Fog

Deux hommes se rencontrent bien, mais jamais deux montagnes
(Two men may meet, but never two mountains)
- French proverb

When problems persist and strong, active mechanisms for communication are lacking, the resulting disconnect gives rise to frustration and blame. DigitalOpz refers to this as a “Collaboration Fog”: a situation in which two different people or groups are in close proximity but cannot see their way through to each other. It is characterized by the following factors:

  • Two or more groups share the same concern about a given challenge or barrier;
  • Each group has a different perspective on what is causing the problem;
  • Collaboration pathways are not clearly established to enable a joint resolution.

The counter-productive result is that initial complaints transition into attitudes of acceptance as people adapt to what seems like an unfixable issue.

In the case of multiple report versions, the reporting problems fade into the background over time. They are blended into the single, broad bucket of data quality, accuracy and trust issues without specifically defined causes, reinforcing perceptions among business users that the quality of data available to them is poor.

The IT term for the causes of data problems experienced by business users is “data quality dimensions.” For each identified data quality issue, several different data quality dimensions may be at the root of the problem. And as data volumes and applications continue to grow dramatically over time, the lines between different data quality issues become blurred and one issue becomes indistinguishable from another.

This is why it is crucial for data practitioners and business users alike to recognize the underlying causes at play. If these are identified properly, participants can target the appropriate solution path and establish long-term fixes. If not, the organization’s ability to leverage data for digital transformation will be significantly impeded.


Key Barriers to Success

There are three primary barriers to resolving report discrepancies and producing an authoritative set of numbers:

  1. When report results are discussed in meetings, there are typically time pressures driving the analysis (e.g. end-of-month numbers or an executive presentation); there is simply no time for business users to investigate the discrepancies, so they fall back on the report that has been relied on previously.
  2. The process to investigate report inaccuracies is normally not well-defined, and few organizations have dedicated business or data analysts available to action these requests, so the mechanism to perform such an analysis is not readily available.
  3. With multiple source systems, different departmental reporting tools, and conflicting versions of a report – including edited copies saved locally by users – the “Root Cause Analysis” required to identify the problem is often difficult and time-consuming.

Even for issues that are identified and correctly documented, it is a tall order for an organization to determine the root cause, correct the error, and establish a single certified report that can be reliably produced on an ongoing basis.

Now that we understand why it is challenging to identify and fix discrepancies between different versions of the same report, let us turn our attention to a related question: why is it so difficult to establish a common report in the first place?


The Technical Details

Without going into the intricacies of creating an analytics report, suffice it to say that building a report involves several steps, and therefore several potential points at which discrepancies can occur. At a high level, creating a new report involves three major steps:

  1. A business user identifies the information that is needed to answer a specific business question that supports a business goal, including definition of the pertinent Key Performance Indicators, or KPIs, which are metrics used to assess progress and results. These requirements are documented and passed to IT.
  2. IT finds the required data in the source database(s) and copies it into a data warehouse or lake, where it is combined, filtered, and enriched to meet the requirements. To improve performance, the data is then copied into data marts and made available to the reporting tools, known as Business Intelligence (BI) or analytics tools.
  3. Business/data analysts use a BI tool to create the report, identifying the relevant and available data and writing queries that perform calculations to populate the report. Once the data resides in the report, further calculations, filters, grouping and sorting may be performed to produce the final results.

Unfortunately, each of these steps may contain unintentional flaws or bugs that will result in incorrect numbers and lead to future business confusion, as the sketch below illustrates. Moreover, the data itself can be incorrect or missing.
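
To make these steps concrete, here is a minimal, purely illustrative Python sketch of a report pipeline. Every name in it (the orders records, load_to_mart, build_report, the KPI label) is hypothetical and simply stands in for real source systems, warehouse loads and BI queries; the point is only that each stage applies its own logic, so a flaw at any stage changes the final numbers.

```python
# Hypothetical sketch of the three report-building steps.

# -- Step 1: the business requirement, expressed as a KPI definition --
KPI = "total_revenue"          # sum of order amounts for the reporting period
REPORTING_PERIOD = "2023-Q1"   # invented period label

# -- Source system: raw order records (stand-ins for a source database) --
orders = [
    {"order_id": 1, "period": "2023-Q1", "amount": 1200.00, "status": "complete"},
    {"order_id": 2, "period": "2023-Q1", "amount":  450.00, "status": "cancelled"},
    {"order_id": 3, "period": "2023-Q1", "amount":  800.00, "status": "complete"},
    {"order_id": 4, "period": "2022-Q4", "amount":  999.00, "status": "complete"},
]

# -- Step 2: IT copies, filters and enriches the data into a "data mart" --
def load_to_mart(rows, period):
    """Keep only the requested period. A bug here (e.g. forgetting to
    exclude cancelled orders) silently changes every downstream report."""
    return [r for r in rows if r["period"] == period]

# -- Step 3: a business/data analyst queries the mart in a BI tool --
def build_report(mart_rows):
    """The report-level calculation. If the analyst filters differently
    than the mart load did, two 'identical' reports will disagree."""
    valid = [r for r in mart_rows if r["status"] == "complete"]
    return {"kpi": KPI, "period": REPORTING_PERIOD,
            "value": sum(r["amount"] for r in valid)}

mart = load_to_mart(orders, REPORTING_PERIOD)
print(build_report(mart))  # {'kpi': 'total_revenue', 'period': '2023-Q1', 'value': 2000.0}
```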


Illogical Logic

First, let us turn to the establishment of the business logic upon which the report will be based. The central issue is well captured in this quote from the musician James Deacon:

What you see depends not only on what you look AT, but also on where you look FROM

People in different roles and business groups define their needs differently, even when they think they are speaking about the same thing. A common example can be found in the difficulty that organizations regularly experience with the seemingly straightforward sales revenue report. This is a clear numbers exercise, so a safe assumption is that all that is required is a simple accounting calculation to establish a sum. However, depending on what one is looking AT (product line, sales group), where one is looking FROM (country, region, LOB), or even what period of time (12 months, or 13 four-week periods), the logic applied can produce substantially different numbers.

Here is one example of a filtering discrepancy:

A company may sell products to external customers but also offer internal employees the option to purchase the same products at discounted rates, such as a telco selling mobile phones and wireless services. If a sales executive were to request a revenue report, that person would only be drawing data from external customer purchases, as that is what the sales group is measured on. On the other hand, were a finance executive to require the “same” total revenue report, it would include both the purchases from external customers and those from internal employees.

One can imagine a situation in which both executives come to a quarterly meeting armed with the numbers produced by their report, only to discover that the numbers do not match. Ironically, both reports are correct; it is their differing definitions of the Business Term “customer” that are to blame.
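
Here is a minimal sketch of that scenario, with invented figures and a hypothetical purchases dataset: both totals are computed correctly, and the only difference between them is which definition of “customer” each report’s filter encodes.

```python
# Hypothetical purchase records; customer_type distinguishes external
# customers from internal employees buying at a discount.
purchases = [
    {"customer_type": "external", "amount": 700.00},
    {"customer_type": "external", "amount": 550.00},
    {"customer_type": "internal", "amount": 120.00},  # employee purchase
    {"customer_type": "internal", "amount":  95.00},  # employee purchase
]

# Sales view: "customer" means external customers only (what sales is measured on).
sales_revenue = sum(p["amount"] for p in purchases
                    if p["customer_type"] == "external")

# Finance view: "customer" means anyone who paid, internal or external.
finance_revenue = sum(p["amount"] for p in purchases)

print(f"Sales report total revenue:   {sales_revenue:,.2f}")    # 1,250.00
print(f"Finance report total revenue: {finance_revenue:,.2f}")  # 1,465.00

# Both totals are computed correctly; they differ only because the two
# reports embed different definitions of the business term "customer".
```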


Everybody’s Doing It!

To paraphrase the quote from the previous section: what you create depends not only on what data is needed, but also on the efforts of IT to get the data out of the databases and of business analysts to create the report. Two people with the exact same design but different ways of executing the plan will end up with different results. When we apply this thought to the creation of data-driven reports, the core challenge is that analytics tools and reports are constantly growing and accumulating at the corporate and LOB levels across every organization. Here is one eye-opening statistic that bears this out:

Large enterprises have an average portfolio of 364 applications, 56% of which are managed by shadow IT

Source: Productiv State of SaaS Sprawl 2021

When it comes to different analytics or BI tools, every tool employs its own query format, even when the intent and function are similar. As a result, even if the underlying calculations and data sets are the same, each report may present results in a slightly different way, as the example below shows.
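
As a hedged illustration (no particular BI product is implied, and the line items are invented), the same data and the same stated intent, totalling the line items, can display differently if one tool rounds each line to cents before totalling while another rounds only the displayed total:

```python
# Same underlying line items feed both hypothetical tools.
line_items = [10.006, 20.006, 5.006]  # e.g. amounts after a currency conversion

# "Tool A" convention: round each line to cents, then total.
tool_a_total = round(sum(round(x, 2) for x in line_items), 2)

# "Tool B" convention: total the raw values, round once at display time.
tool_b_total = round(sum(line_items), 2)

print("Tool A shows:", tool_a_total)  # 35.03
print("Tool B shows:", tool_b_total)  # 35.02
```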

It would be unreasonable, if not impossible, to expect technical specialists within the IT group to have the collective skills to track and manage queries across the different tools purchased at the LOB level. The fact that these tools are often purchased by business groups without consulting IT or applying a consistent assessment methodology renders the inventory of reports nearly impossible to manage or support.

What can be done?

In a digital utopia, each organization would have a single centralized data repository containing an authoritative set of data and would establish a standard analytics application for use by all departments across the enterprise. Each new report would be certified, and formal processes for access, proposed changes, and audits would be in place. Most readers will emit a longing sigh at this alternate reality, despite many previous vendor attempts and claims to have created such a universe. We all know that with the demand for information, the speed of business and the resulting proliferation of reports – especially in a “self-service” world full of departmental reporting tools – this remains a dream.

That does not mean that there is no path forward. Admittedly, given the sprawling and unmanaged inventory of reports within virtually every organization – including legacy reports, departmental reporting silos and localized copies – it would be futile to attempt to fix this problem retroactively. Be that as it may, an inability to correct previous outputs does not mean that the process cannot be updated and corrected moving forward. What that will require, however, is a step back to establish effective collaboration processes that value long-term accuracy and controls over speed. This guiding principle is eloquently captured in an African proverb:

If you want to go fast, go alone. If you want to go far, go together.

This requires the organization to re-assign responsibilities around data processes and usage so that ownership and accountability are clear. It will also make the internal communications aimed at finding and fixing reporting errors simpler and more achievable. The overall goal of this effort should be to ensure that business groups take ownership of their data and its quality, while IT experts remain the owners of the databases, lakes, development code and systems.

The solution being described naturally calls for the creation of a pragmatic, business-centric digital governance program. Such a program can achieve quick business wins and grow organically by focusing on the most important and business-critical reports. Once results are demonstrated and trust is built, support and investment will surely grow.
