Total Cost of Ownership of an Analytics Program (Part 2) - Reckoning the Costs: The Future is Here
Cost Overview
In our last article we discussed the benefits of an Analytics Program. The takeaway from that piece is that the benefits are vast and will consistently eclipse the costs of a holistic program, if that program is implemented properly. Here we will address the costs in a manner suited to building a business case.
TCO: Total Cost of Ownership
Too often, software is evaluated by comparing license costs. If an organization intends to engage in a true Analytics Program, the license costs of the enabling software are likely to constitute no more than 10% of the Total Cost of Ownership. Most of the hard cost is in the time and talent of your people, and while choosing the right path to minimize those costs is critical, the most important impact on the business case is the functionality (and therefore value) lost by stumbling into certain pitfalls.
Scope of an Analytics Program
In order to accurately describe the costs of an Analytics Program, we should first describe the scope of what is to be achieved. An Analytics Program allows a department (in this case IT) within an organization to do three things:
- Optimize decisions made regarding its own operations (Data Warehouse)
- Communicate the value of its contributions to the strategic goals of the organization as a whole (Dashboard)
- Supplement the data across the organization to support critical projects (Data Lake)
Let us place this in contrast to the reporting function of a particular application. Most applications have rudimentary reporting functionality, typically the ability to extract data into spreadsheets or text files for further manipulation elsewhere.
For all practical purposes, this type of extraction cannot be automated for repeated use and is a shaky foundation for the activities above. Plenty of organizations do it anyway; the manpower involved is titanic and prone to single-point-of-failure collapse. Making the right choices in your platform can return tens, hundreds, even thousands of your people to high-value activity rather than pulling data.
Bear in mind also that this is being written in 2021, before we get to the real Philip K. Dick stuff (or wait, I guess we already did most of that). Some of these things will date very quickly, but most have been around at least the quarter century that I have been professionally aware of them.
The Basics
At a minimum you will need the ability to create the following content items for the applications that are in scope:
- Reports
- Dashboards
- Calculations – KPIs, Metrics, custom business attributes
In the subsequent sections, we will discuss the relevant cost considerations and functionality limiting factors for each.
The Philip K. Dick Stuff
I would not argue that one should sacrifice the good on the altar of the perfect, so bear in mind the existing skills within your organization when considering what is truly on the table. If you've never seen a report breaking down breached incidents by vendor, it is probably not realistic to expect streaming predictive analytics tomorrow. However, it is worth considering that you will want it one day, and depending on who you are, that day may be tomorrow.
- Predictive Analytics
- Live/Streaming Analytics
- Data Lake Integration
Find the right fit. If you're in an organization under $5B in revenue, these are not necessarily priorities; the top priority is likely ingesting and integrating your critical operational data into a common repository. If you're an organization under $1B, your first priority is likely instrumenting your infrastructure consistently to produce that data. We will discuss the proper foundations; they are available to organizations of any size and can be grown into over time.
Reports
Let us first consider what a report is, or at least should be. Fundamentally, a report should be something that enables an audience member or an audience as a whole to:
- make a better decision
- make a decision more quickly
Another way to think about this is to say a report should help a person or group perform a task more quickly or accurately.
I believe these two definitions draw a stark line between useful and useless reports (how many reports have you seen that do not fit this definition, and what did you do with them?). They also set up a pretty clear set of tasks required to build a report and identify the people that need to be involved.
To remember the elements of an effective report utilize the helpful mnemonic, TOPHAT:
- Target: Identify the audience
- Optimize: Understand the decision/task
- Parse: Obtain supporting data
- Harmonize: Arrange report to support decision
- Automate: Run the report repeatedly
- Transport: Deliver the report
Let’s also state here a principle from Agile. Reduce the number of handoffs! If one person can do each of the items above, you will have a much shorter path to value than if you need to involve 7-10 different people.
Target: Identify the Audience
Who is making the decision? Who is performing the task? In an ideal world, a member of that audience has requested the report, but that is not always the case (think executive mandate). If no one has stepped forward and volunteered, then the one doing the mandating is going to have to tap a volunteer on the shoulder. Let us call that volunteer, regardless of their method of promotion, the Report Champion.
Quite often, there are similar audiences across the organization faced with similar decisions or tasks. Consider whether the platform supports serving multiple audiences with the same report through parameterization or if separate copies of a report must be maintained.
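A minimal sketch of what parameterization looks like in practice, using Python and SQLite; the incidents table, its columns, and the two audience groups are hypothetical stand-ins for whatever your platform exposes:

```python
import sqlite3

# Hypothetical demo data standing in for an ITSM application's incidents.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE incidents (id INTEGER, assignment_group TEXT, sla_breached INTEGER);
    INSERT INTO incidents VALUES
        (1, 'Network', 1),
        (2, 'Network', 0),
        (3, 'Desktop', 1);
""")

# One report definition; the audience is a parameter, not a copy.
REPORT_SQL = """
    SELECT assignment_group,
           COUNT(*)          AS total_incidents,
           SUM(sla_breached) AS breached
    FROM incidents
    WHERE assignment_group = ?
    GROUP BY assignment_group
"""

for audience in ("Network", "Desktop"):  # many audiences, one maintained report
    print(audience, conn.execute(REPORT_SQL, (audience,)).fetchall())
```

One definition to maintain instead of one per audience is the whole TCO argument in miniature: every change is made once rather than n times.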
Perhaps obvious, but worth noting: If you cannot identify the audience for a report, spare yourself the pain. A report without an audience is a toy without a child – they can be cool to build, but may highlight life choices to reconsider.
TCO Considerations:
- Operational Management Involvement
- Report Duplication vs. Parameterization
Optimize: Understand the Decision/Task
This is where you're going to need that audience member. Ideally, this is someone who makes this decision or performs this task every day (or at least repeatedly) but is lacking certain pieces of information. If they can state the value of having that information, at what level of detail, and with what degree of timeliness, all the better. Laying this out is absolutely critical, yet it is the most commonly ignored step of report building.
TCO Considerations:
- Operational SME involvement
- Business analyst to translate into technical/data requirements
Parse: Obtain the Supporting Data
This is where it can get dicey. Having identified what information is required from a business perspective to properly support a decision, all we need to do now is go find it. In essence, this involves three steps:
- Identify the application
- Navigate the data model
- Retrieve with query language
The key is that you need people with all three skills, ideally including someone who understands the business operations. If you have a reporting interface that can access multiple applications, abstracts the data model, and interacts with the data in a user-friendly way, one person may be able to do all of this. Otherwise, you need the aforementioned audience member, a functional SME to identify the application, an application SME who can navigate the data model of the target application, and a SQL developer to stitch that data together. Remember that Agile thing about handoffs? Keep an eye on overspecialization.
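For illustration, here is a small sketch of the second and third steps (Python and SQLite; the incidents/vendors schema is hypothetical). The business asks for "breached incidents by vendor", but the application stores vendors in a separate table, so someone has to know the model well enough to stitch it back into business terms:

```python
import sqlite3

# Hypothetical schema: vendors live in their own table, keyed by vendor_id.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE incidents (id INTEGER, vendor_id INTEGER, sla_breached INTEGER);
    CREATE TABLE vendors   (vendor_id INTEGER, vendor_name TEXT);
    INSERT INTO incidents VALUES (1, 10, 1), (2, 10, 0), (3, 20, 1);
    INSERT INTO vendors   VALUES (10, 'Acme Networks'), (20, 'Globex Desktop');
""")

# The query-language step: navigating the model back into business terms.
rows = conn.execute("""
    SELECT v.vendor_name, SUM(i.sla_breached) AS breached_incidents
    FROM incidents i
    JOIN vendors v ON v.vendor_id = i.vendor_id
    GROUP BY v.vendor_name
""").fetchall()
print(rows)  # [('Acme Networks', 1), ('Globex Desktop', 1)]
```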
TCO Considerations:
- Functional SME to identify target applications [often not required]
- Level of data model knowledge of target application(s) [required]
- Query development [T-SQL, Oracle SQL, APIs if required, or GUI]
Harmonize: Arrange Report to Support Decision
In order to build and maintain reports, you're going to need someone trained on the reporting interface. The key consideration here is whether the Analytics Program utilizes an industry standard reporting interface (Power BI, Tableau, a rapidly diminishing list of others) or application-specific reporting interfaces (e.g. Remedy Smart Reporting, ServiceNow Performance Analytics, SolarWinds, etc.).
Another consideration is who will be doing the report building. This can be the Report Champion, or it can be a dedicated report builder (DRB). The Report Champion has the operational experience and eliminates the need for requirements sessions if they can be empowered to build the report themselves (industry standard reporting tools, user-friendly interaction with the data model). A DRB is more likely to maintain report standards and to handle the report building itself more quickly. There are pros and cons to either path.
Consider the benefits of utilizing an industry standard reporting interface. For the Report Champion, it increases the likelihood that training will not be required, and if training is required, it produces a skill they can use across the organization and throughout their career.
For a DRB, the industry standard is almost required, as their mandate will surely include multiple applications. This role is often an external hire, so it is also necessary to consider how large the community of developers skilled in that reporting interface is. The less common the skill, the more expensive those developers are to hire and the longer they take to find.
TCO Considerations:
- Report specialist or Report Champion
- Handoffs drive delay, lost information, rework
- Reporting interface – Industry standard or Application Specific
- Training – standard or specific
- Existing community of developers
Automate: Run the report repeatedly
Ad hoc is important, but most ad hoc questions turn into repeated reporting requirements. If the report is truly supporting a decision, that decision will come up again, and often immediately. That means the data needs to be refreshed. Generally, the worst case is that the refresh comes from an extract of the production system that is manually formatted or uploaded to the reporting interface. The best case is that the report updates from a near-real-time mirror of production data optimized for reporting performance.
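To illustrate the difference, here is a minimal sketch of an automated refresh in Python. The table, file path, and schedule are assumptions for illustration; the point is that the extract-format-deliver loop runs without a human in it:

```python
import csv
import sqlite3
from datetime import datetime, timezone

EXTRACT_PATH = "breaches_by_vendor.csv"  # hypothetical landing spot the report reads

def refresh_extract() -> None:
    # Stand-in for a near-real-time reporting mirror; in reality this is fed
    # by your ETL or replication, never the production transaction system.
    conn = sqlite3.connect(":memory:")
    conn.executescript("""
        CREATE TABLE breaches_by_vendor (vendor_name TEXT, breached INTEGER);
        INSERT INTO breaches_by_vendor VALUES ('Acme Networks', 3), ('Globex Desktop', 1);
    """)
    rows = conn.execute(
        "SELECT vendor_name, breached FROM breaches_by_vendor"
    ).fetchall()
    conn.close()
    with open(EXTRACT_PATH, "w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow(["vendor_name", "breached"])
        writer.writerows(rows)
        writer.writerow(["refreshed_at", datetime.now(timezone.utc).isoformat()])

refresh_extract()
# Scheduled from cron/Task Scheduler (e.g. every 15 minutes), no human ever
# formats or uploads this extract again:
#   */15 * * * * /usr/bin/python3 refresh_extract.py
```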
TCO Considerations:
- Impact of extract or report run on the production system (avoidable with reporting strategy)
- Time to extract data (automatable)
- Time to format report (automatable)
- Timeliness of data (minimizable)
The TCO considerations here are all of the employee hours spent preparing extracts and formatting reports, the performance degradation in the production system caused by poor extract construction, and the decreased value of information that is out of date. Most of these costs can be reduced to zero with a properly implemented analytical platform. Non-automation costs are sizable because the decisions made today are typically made tomorrow as well.
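To put rough numbers on "sizable", a back-of-the-envelope calculation with entirely hypothetical figures:

```python
# The daily cost of manual refresh compounds because today's decision
# recurs tomorrow. All figures below are illustrative assumptions.
reports = 200                # reports refreshed by hand each day
minutes_per_refresh = 30     # extract + format + distribute, per report
loaded_rate_per_hour = 75.0  # fully loaded cost of an analyst hour (USD)

daily_hours = reports * minutes_per_refresh / 60
annual_cost = daily_hours * loaded_rate_per_hour * 250  # ~250 working days

print(f"{daily_hours:.0f} staff-hours/day, ~${annual_cost:,.0f}/year")
# -> 100 staff-hours/day, ~$1,875,000/year
```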
Transport: Deliver the report
Finally, let’s talk about how reports are delivered to the audience. I use the word transport because I remember an early day in my career when I did not consider the method of delivery. I copied some JCL into the top of my SAS report that I did not fully understand and inadvertently requested a hard copy of my report. Additionally, I neglected to get a record count and did not fully appreciate the size of my data set. After submitting, I did not see the option for .csv download but was very pleased to receive a pallet of 1960s printer paper three days later.
Reckon the costs
We've highlighted areas of cost impact for a single report. Now consider the breadth of the Analytics Program. Hopefully, there are hundreds of reports that can be used to improve operations. Each is an opportunity to create enormous value for a relatively small investment in analytics. Bear in mind, however, that the breadth of implementation multiplies the impact of each of the above considerations in platform choice and direction. One should also consider how Out of the Box (OOB) content defrays some of the full implementation impact, either by eliminating the need to create a report or by accelerating its creation with similar content that can be used as a template (a rough sketch of this arithmetic follows the list below).
Multiplicative TCO Considerations:
- The per-report costs above, multiplied across reports 1 through n
- % of need covered by OOB content
- % of need accelerated by OOB templates
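Here is the rough sketch promised above. Every input is hypothetical; it exists only to show how per-report costs multiply and how OOB coverage discounts them:

```python
def program_build_cost(n_reports: int, cost_per_report: float,
                       pct_covered_oob: float, pct_accelerated: float,
                       acceleration_factor: float = 0.5) -> float:
    """Per-report cost x report count, discounted by OOB coverage.

    pct_covered_oob:     fraction of need eliminated outright by OOB content
    pct_accelerated:     fraction merely accelerated by an OOB template
    acceleration_factor: share of full cost a templated report still incurs
    """
    from_scratch = n_reports * (1 - pct_covered_oob - pct_accelerated) * cost_per_report
    templated = n_reports * pct_accelerated * cost_per_report * acceleration_factor
    return from_scratch + templated

# 300 reports at $4,000 each; 40% covered OOB, 30% accelerated by templates:
print(program_build_cost(300, 4_000, 0.40, 0.30))  # 540000.0, vs 1,200,000.0 from scratch
```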
To be continued . . .
As reports are our first topic (and the backbone of an Analytics Program), let it be said that the points raised here and the structure of the argument will resonate through most of the items that follow, both futuristic and mundane (and we will discuss both in our next article).
Northcraft Analytics was created to be the foundation of your modern Analytics Program for IT, but the principles behind its creation would serve for any department. These TCO considerations have driven the design of the product from its inception over a decade ago.
- Built on industry standard Microsoft products
- User friendly interaction with data model for each covered application
- Pre-built dashboards, reports, metrics, ETL for each covered application
- Over a decade of experience with engaged clients has produced operationally relevant content
If you are considering implementing an Analytics Program for your organization, please contact us. We are happy to provide a TCO consultant at our expense to assist you with creating a business case to get your project funded. If you let the opportunity pass you by, then all those moments will be lost in time, like tears in rain.
Ridley Scott, please forgive me.
Postscript: It had not occurred to me before researching this article, but apparently the winning formula for '80s genre film success was to have Rutger Hauer hold a bird of some sort. See the stone-cold classic Ladyhawke, with further support for the theory from A Breed Apart.