Tail Wagging the Dog - Historical Data & Unique Visitor Counts
Adam Greco
Analytics industry veteran. Product Evangelist @ Amplitude. Helping teams build better products. Author of the definitive book on Adobe Analytics. Ex-Salesforce, Ex-Omniture.
In recent weeks, I've run into a few situations in which organizations have been held back by their historical digital analytics data. While historical data can be important, in this post I will share some perspectives on why I don't think your organization should be held hostage to it.
Scenario #1 - YOY Data
Year-over-year digital analytics data is a staple in our industry. Most organizations like to see how the current time period compares to the same period the previous year. This is especially true in the retail industry. But how much weight should you put on YOY data?
Many years ago, when I joined Salesforce to head up its digital analytics team, the digital analytics implementation was a bit of a mess. Most of the people within the organization had lost faith in the analytics data and the overall implementation. After my initial assessment, I told my boss that I wanted to restart the entire implementation from scratch. He thought that was a bit too radical, and one of his main concerns was that people would lose year-over-year data. I argued that the data from the previous year was inaccurate anyway, so why should we forgo a better analytics implementation in the future just because people wouldn't be able to compare against last year's inaccurate data? For me, having a fresh start and an analytics implementation that people trusted far outweighed the loss of untrusted year-over-year data. And I told him we would have year-over-year data - in a year!
But what if your historical data is relatively accurate and you decide you want to make a change to the way you collect data or change to a new digital analytics tool? In these cases, I often see organizations that are afraid to change tagging logic or switch to a new digital analytics tool because it will negatively impact historical data. To me, this is akin to the tail wagging the dog. Why should an organization hold back on making things better or more accurate simply because it will make historical data look bad?
Imagine your organization is using a digital analytics tool that has poor adoption, low self-service, and users who have little faith in the implementation. While changing tools is not always the solution, sometimes a fresh start is needed. Maybe changing tools affords the opportunity to collect less data, but collect it better. Or maybe seeing a new tool will psychologically demonstrate to users that things are different from the past.
What if the analytics team identifies better ways to combat bots or identifies new logic that makes tracking marketing channels more accurate? Should you hold off on those improvements out of fear that it will distort year-over-year comparisons?
More importantly, what will you actually do differently with your website/app if this year's data is vastly different from last year's? I am not asking this question facetiously, and I encourage you to add a comment below with examples of how YOY data has made a difference at your organization.
In my opinion, the goal of digital analytics is to use data to drive improvements in digital products and experiences. So what changes would you make based on year-over-year data? Suppose this month's data is 10% lower than the same month last year. What are you going to do differently? If product A is up 10% and product B is down 10%, are you going to stop trying to improve product A and focus only on product B? Probably not. You probably want both to go up! Is your digital analytics platform the only indicator that YOY metrics are going up or down? Hopefully not, since digital analytics platforms should not be the system of record for your most important metrics.
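To make that concrete, here is a minimal sketch of the kind of YOY comparison being discussed; the product names and monthly totals are entirely made up for illustration:

```python
# Minimal sketch: computing YOY deltas for hypothetical monthly totals.
# All numbers and product names are illustrative, not real data.

this_year = {"product_a": 110_000, "product_b": 90_000}
last_year = {"product_a": 100_000, "product_b": 100_000}

for product, current in this_year.items():
    prior = last_year[product]
    yoy_pct = (current - prior) / prior * 100
    print(f"{product}: {yoy_pct:+.1f}% YOY")
# product_a: +10.0% YOY
# product_b: -10.0% YOY
```

The sign of the delta tells you that something changed, but not what to do about it - both products presumably still warrant improvement work either way.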
In addition, there are many things that could have caused the YOY figures to differ. Perhaps you ran different campaigns? Maybe a new competitor entered the scene and took away market share? Maybe your customers bought new versions of your own products instead of last year's? After the COVID-19 pandemic, almost all organizations lost access to reliable year-over-year data, and guess what - the world went on! So if you have an opportunity to improve the way you collect data, or if you need to switch to a new tool, I don't think preserving year-over-year data should be high on the list of reasons not to make the change (see my full advice on choosing new analytics tools here).
Scenario #2 - Unique Visitor Counts
Along similar lines, in recent weeks, I have encountered a few organizations that are considering changing digital analytics tools, but are concerned about the reporting of unique visitors. To be fair, both of these organizations are media organizations that report unique visitor counts to investors. Major changes to these unique visitor counts could impact share prices and financials.
Both organizations are considering moving to a digital analytics tool that has a more accurate way of counting unique visitors. It's easy to show how each tool computes unique visitors and why the counts differ. But in both cases, the new unique visitor counts would be lower than the historical ones. So what would you do? Would you take the hit and report to the Street that a new, more accurate counting method is being used? Or would you identify the delta in unique visitor counts and continue reporting the inflated number?
While this can be a challenging situation with potentially negative ramifications, my vote would be to report the most accurate number possible and explain the difference. I might look at trends to see whether there is a reliable way to approximate the old unique visitor count, and report both numbers for a period of time until internal and external stakeholders understand and are comfortable with the new methodology.
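For the "report both for a period of time" approach, here is a minimal sketch of how a bridging factor might be estimated from a parallel-run period. All figures are hypothetical, and this is one possible approach rather than a prescribed methodology:

```python
# Minimal sketch of "bridging" unique visitor counts during a tool
# migration. Assumes both tools ran in parallel for a few months so a
# ratio between the two counting methods can be estimated.
# All figures are illustrative.

parallel_old = [1_200_000, 1_150_000, 1_250_000]  # old tool, monthly UVs
parallel_new = [1_000_000,   960_000, 1_040_000]  # new tool, same months

# Average ratio between the two methods over the parallel-run window.
bridge = sum(o / n for o, n in zip(parallel_old, parallel_new)) / len(parallel_old)

# After the cutover, report the accurate new-method count, plus (for a
# transition period) an old-method-equivalent estimate for continuity.
new_count = 1_020_000
print(f"New method (reported):        {new_count:,}")
print(f"Old-method equivalent (est.): {round(new_count * bridge):,}")
```

The key design choice is that the accurate number becomes the number of record on day one; the old-method equivalent is explicitly labeled an estimate and retired once stakeholders are comfortable.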
But at the end of the day, this is another situation where I feel that holding on to historical data shouldn't stop you from doing what is right going forward.
Final Thoughts
In the digital analytics industry, there will always be situations in which you can improve your analytics data or implementation. These improvements will likely have impacts on historical data and trends. In these situations, you will have to make deliberate choices about the future. How will you handle these situations?
My advice is to not let the historical data tail wag the dog!
Multi-Solution Adobe Director/Architect | Full Stack IT Strategist | Adobe 3x Certified Architect AEP (WIP), AEM, Analytics | Ex-Publicis Sapient | IIT Roorkee | MSVV
2mo
I agree with the advice. Equally, I have first-hand experience of how hard it can be to convince stakeholders. I used to work for a workforce management software company. Our software calculated the total hours and dollars and sent that data to payroll companies that would print the paychecks. During the parallel phase, the client noticed that our numbers were different from their current numbers. We had to explain that their current numbers were incorrect and had been incorrect for years! We're talking about a Fortune 500 company here. It was a hard pill for them to swallow. But we sat them down, explained our calculations, and they finally agreed to accept our numbers moving forward, albeit with a bit of egg on their face. :)
Product coach: discovery, metrics and experimentation for trios and product teams || 1:1 coach for Lead & Senior Product Managers || Miss Impact Mapping || Keynote Speaker || Maker
7mo
I would make the decision depend on their relationship with their investors.
Experienced Analytics Leader, Thinker, and Doer
7 个月"The sooner you make the change, the sooner your historical data (including your unique visitor counts) will be good to go." I've used the "you will have YOY... in a year" line myself on more than one occasion! The cautionary note is that if this discussion is happening every three months ("Hey, you keep telling me I'll just have to wait a year, but then something changes and the counter gets reset!"), then one of two things is happening: 1) maybe the organization needs to pick their tooling and stick with it for longer, or 2) the organization is in such a dynamic environment that external factors are causing those resets...in which case maybe there should be a discussion about the usefulness of putting a bunch of weight on the use of an annular time horizon.
...and then even after you've improved the accuracy of your data, another issue, akin to cookie consent management, may come along and require an adjustment that itself impacts counting and YoY comparisons. Probably better to operate on the basis that your analytics KPI data will, more often than not, require some explanation, just as quarterly earnings figures do.