The Digital Analytics Evolution - A Founder’s Perspective
The Evolution of Digital Analytics
Digital analytics plays a crucial role in any organisation operating digital channels. Initially, the main focus of digital analytics was to streamline marketing expenses, which required a solid understanding of traffic sources and their respective conversion rates. In response, solutions such as Google Analytics and Adobe Analytics were introduced to the market to fill the technology gap.
However, the digital analytics landscape has evolved. The spotlight has shifted from merely optimising marketing expenditure to enhancing the overall digital product experience. This progression has carved out two prominent categories in the market: Product Analytics and Experience Analytics. Product Analytics is primarily focused on WHAT users do on our websites and landing pages, which is why tools such as GA4 and Amplitude are event-based. But understanding WHY users struggle in the first place is just as important.
This is where Experience Analytics comes into the picture. Tools like Insightech take this a step further, offering features like session replay, click maps, and journey maps to provide an in-depth contextual understanding of user experience and how it impacts conversions.
Here’s a diagram outlining how the market has evolved:
The Build-Measure-Learn Framework
As the founder of an analytics startup, I’m a firm believer in the Build-Measure-Learn methodology, which we utilise for the step-by-step development of our product and business processes. This approach is a fantastic tool, providing a proven pathway for continuous improvement in organisations of any size. The role of digital analytics in this methodology is undeniably central, particularly in the Measure and Learn phases.
Product Analytics tools have emerged as a crucial pillar in the digital analytics world. They offer an infrastructure that collects data and measures product effectiveness. Such insights can include how many users registered last week or who updated their contact information in the past month. However, to truly understand why users may not be performing certain actions within a product, it's vital to leverage session replays provided by experience analytics solutions. These tools can illuminate potential product-related issues that might be proving problematic for users.
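To make the 'Measure' side concrete, here is a minimal sketch of event-based measurement. The event names and the in-memory log are illustrative assumptions of mine; real tools such as GA4 or Amplitude collect and store these events for you.

```typescript
// Minimal sketch of event-based product analytics (illustrative only).
interface ProductEvent {
  name: string;      // e.g. "user_registered", "contact_info_updated" (hypothetical names)
  userId: string;
  timestamp: Date;
}

// Count distinct users who fired a given event in the last N days.
function countUsers(events: ProductEvent[], eventName: string, days: number): number {
  const cutoff = Date.now() - days * 24 * 60 * 60 * 1000;
  const users = new Set(
    events
      .filter((e) => e.name === eventName && e.timestamp.getTime() >= cutoff)
      .map((e) => e.userId)
  );
  return users.size;
}

// "How many users registered last week?"
const log: ProductEvent[] = [
  { name: "user_registered", userId: "u1", timestamp: new Date() },
  { name: "user_registered", userId: "u2", timestamp: new Date() },
  { name: "contact_info_updated", userId: "u1", timestamp: new Date() },
];
console.log(countUsers(log, "user_registered", 7)); // -> 2
```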
What about A/B testing tools? They can significantly streamline the 'Build' phase by pinpointing the most effective strategies, ensuring that limited resources are invested wisely to maximise ROI. However, I've observed a common pitfall many organisations fall into - they often plunge headfirst into crafting solutions via A/B testing campaigns without fully understanding why users are struggling or experiencing friction in the first place. This leap can unfortunately lead to wasting significant time and resources on ill-fated A/B testing campaigns. What’s missing is an insights layer to amplify the ‘Learn’ phase.
As you can see in the diagram above, insights are a key part of the story when following the Build-Measure-Learn framework.
Challenges
Product Analytics tools can be incredibly valuable on the measurement side, but they often require substantial planning and implementation effort to track the intended data effectively. While this sounds ideal in theory, the practical execution can be quite daunting.
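To illustrate the hand-wiring a tracking plan involves, here is a rough sketch. The `analytics.track` stub and the event names are stand-ins I've made up for a generic product-analytics SDK, not any specific vendor's API.

```typescript
// Stand-in for a generic product-analytics SDK call (illustrative only).
const analytics = {
  track: (name: string, props: Record<string, unknown> = {}) =>
    console.log("track", name, props),
};

// With a manual tracking plan, every interaction in the spec needs its own
// hand-written event call, and every new feature means more instrumentation.
function onSignupSubmit(plan: string): void {
  analytics.track("signup_submitted", { plan });
}

function onCheckoutError(message: string): void {
  // Miss one of these calls and that scenario simply isn't measured.
  analytics.track("checkout_error", { message });
}

onSignupSubmit("pro");
onCheckoutError("card_declined");
```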
In my experience, many solutions tend to deteriorate once the technical expert on the team moves on, or when adequate resources are not committed to maintaining data quality. Furthermore, these solutions typically only cover the data points set out in the initial plan for common user conversion journeys.
As we already know, real-world user interactions are far more complex and rarely this straightforward. Consequently, large data black holes can appear, making it harder to objectively 'Learn' from what we've measured. Experience Analytics takes a different approach by capturing experience-level data for all users, enabling digital teams to reproduce these experiences exactly with session replays.
At Insightech, we go one step further by making these replays interactive and searchable, so it's easier to quantify conversion impact.
How Insightech helps you both measure and learn
At Insightech, we built a comprehensive solution to the above-mentioned challenges with our interactive session replay feature. But what exactly is session replay, and how can you use it to lift conversion rates on your website?
Session replays inside Insightech track the full user experience, without random sampling. This includes all error messages, mouse movements, click locations, and much more. This data is accessible for any page on your website and is displayed as a click map and scrolling heatmap in the UI.
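To give a feel for what experience-level data of this kind might look like, here is an illustrative sketch; the type names and fields are my own assumptions, not Insightech's actual schema.

```typescript
// Hypothetical shape of autocaptured experience data (not Insightech's schema).
type ReplayEvent =
  | { kind: "click"; selector: string; x: number; y: number; at: number }
  | { kind: "scroll"; depthPct: number; at: number }
  | { kind: "error"; message: string; at: number };

interface PageSession {
  sessionId: string;
  url: string;
  events: ReplayEvent[]; // replayed in order to reproduce the experience
}

// Aggregating clicks by element across sessions is what powers a click map.
function clickCounts(sessions: PageSession[]): Map<string, number> {
  const counts = new Map<string, number>();
  for (const s of sessions) {
    for (const e of s.events) {
      if (e.kind === "click") {
        counts.set(e.selector, (counts.get(e.selector) ?? 0) + 1);
      }
    }
  }
  return counts;
}

const sessions: PageSession[] = [
  {
    sessionId: "s1",
    url: "/checkout",
    events: [
      { kind: "click", selector: "#pay-button", x: 320, y: 540, at: 0 },
      { kind: "error", message: "card_declined", at: 1200 },
    ],
  },
];
console.log(clickCounts(sessions)); // Map { "#pay-button" => 1 }
```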
With a single line of code, you're able to capture this data out of the box without having to create separate events for each action you'd like to track. This makes it much easier to ship new changes to the website when you're in the 'Build' phase.
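The idea behind a single-tag install is that a handful of global listeners capture interactions once, so no per-feature instrumentation is needed. The sketch below is a simplified illustration of that idea under my own assumptions, not Insightech's implementation.

```typescript
// Simplified sketch of "capture everything once" autocapture (illustrative only).
function initAutocapture(send: (event: object) => void): void {
  document.addEventListener("click", (e) => {
    const target = e.target as HTMLElement | null;
    send({ kind: "click", tag: target?.tagName ?? "unknown", x: e.pageX, y: e.pageY, at: Date.now() });
  });
  window.addEventListener("error", (e) =>
    send({ kind: "error", message: e.message, at: Date.now() })
  );
  window.addEventListener("scroll", () =>
    send({ kind: "scroll", depthPct: Math.round((window.scrollY / document.body.scrollHeight) * 100), at: Date.now() })
  );
}

// Ship new pages or components and they are captured automatically,
// because nothing above references a specific feature or event name.
initAutocapture((event) => console.log(event));
```

A real tag would also batch and transmit this data to a collection endpoint; the point here is simply that the 'Build' side never has to change for tracking to keep up.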
Another nice effect is that when changes are made to your website, they're reflected in your replays. Essentially, everything your users see in their session is captured and faithfully reproduced, no matter what you do on the 'Build' side. Additionally, digital experience data is captured and accessible in the data lake, so it's possible to measure any scenario without massive design and implementation headaches.
If this sounds too good to be true, yes, there are trade-offs. Capturing everything upfront delivers speed in implementing tracking solutions and flexibility in utilising the data. However, as with any data lake project, the volume of data captured can be large, and you might worry about the costs. It's worth noting that in today's maturing big data technology market, a "track-everything" approach is not only feasible but also offers a significant return on investment.
I will continue sharing my thoughts on building digital success with analytics, and on my journey of helping people achieve that with my startup. Feel free to subscribe to my newsletter and send me your feedback.
Founder, SaaS Pimp and Automation Expert, Intercontinental Speaker. Not a Data Analyst, not a Web Analyst, not a Web Developer, not a Front-end Developer, not a Back-end Developer.
The build-measure-learn approach you describe comes from W. Edwards Deming. He called it plan-do-check-act. Lean Six Sigma calls it define-measure-analyse-improve-control, or DMAIC. Regardless of what you call it, it's the same idea. The trouble with the approach is that senior management can misinterpret it as indecision: you progress without knowing what the exact solution will be, or being able to predict what the ROI improvement will be. Their gold standard is bold leaps of faith, without realising that for every success story they can tell you, there are thousands that failed. Few will have the courage to tell these stories of failure, so people can develop a biased view of how successful these leaps of faith really are. The iterative approach is more honest because failing, especially failing early, fast and often, lets you find the solution and deliver value faster.