On metrics and analytics
Photo: Myriam Jessier via Unsplash

Insights on using data. TL;DR:

  • How to work with data without expertise in data
  • AI and NoCode Tools
  • Data Collection and Analysis
  • Dashboards and Data Visualization

In this issue, I am talking about tracking metrics and analytics. The old adages of “you are what you measure” and “what gets measured gets done” ring true to some extent. I found that my customers often tracked things that were readily available or easy to measure. At AWS, we tried to understand what the goals were and what needed to be done to track them. This often involved coming up with our own metrics and building our own tools.

How I think about metrics

I’ll start again with what I learned at AWS over the past three years. The organization operates on data, with people striving to make data-driven decisions at every level. We had a Weekly Business Review (WBR) with the key metrics in our team, which was rolled up into a Monthly Business Review (MBR). Those, in turn, were synthesized into a Quarterly Business Review (QBR) at the business-unit level. In practice, an observation you pointed out, a data point you collected, and a recommendation you made at the team level could go all the way up to the CEO, Adam Selipsky.

Cedric has written extensively about metrics at Amazon, and Christine describes what the WBR looked like in practice. The practice combines standard reporting formats with a process that guides people through them. Initially, it is quite intimidating: you have to gather data on the fly throughout the week, then see how your individual observations connect to other people’s views. There is a lot of number crunching involved.

I track my own metrics using ChatGPT, Notion, and Zapier, but that’s because I’m cheap. If you can spend a bit more, the tools below will speed up your data scraping, extraction, analysis, and visualization. The WBR-MBR-QBR system is great for catching signals across the organization and getting an early hint of what’s going on. I started using a variation of it to reflect on my work consistently.
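To make the roll-up mechanic concrete, here is a minimal sketch of aggregating weekly metrics into monthly and quarterly views with pandas. The file name and columns are made up for illustration; this is not how Amazon’s tooling works.

```python
# Minimal sketch: rolling weekly metrics up into monthly and quarterly views.
# The file name and columns (week_start, signups, revenue) are hypothetical.
import pandas as pd

weekly = pd.read_csv("weekly_metrics.csv", parse_dates=["week_start"])

# WBR view: one row per week, as collected.
wbr = weekly.set_index("week_start").sort_index()

# MBR view: sum the weekly numbers per calendar month.
mbr = wbr.resample("MS").sum()

# QBR view: the same idea, per quarter.
qbr = wbr.resample("QS").sum()

print(qbr.head())
```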


Making data-driven decisions

I see a lot of AI tools emerging to support data-driven decisions. Take Tara, for example, which uses project management data (among other sources) to determine how engineering teams are performing and to recommend improvements. If you look at its interface, the word “allocations” alone should be enough to strike fear into the heart of any engineer. This is a good example of what not to do when leveraging data. You need to know what outcome you are aiming for and then decide what data will help you make the right decisions.

Tara takes a broken model of allocating engineering time across projects and tasks, then makes that model worse by playing into it. The problem with data-driven decision making is that people tend to pick the data to fit a decision they already made. Using data to explore options is much harder and needs some understanding of data science. Luckily, that’s no longer necessary in many cases.

[Screenshot: Tara]

AI and NoCode to the rescue

I am not a data analyst; Excel alone is enough to give me a headache. That is becoming less of a problem: you can already generate formulas with tools like GPT Excel or Ajelix, and I assume Excel, Sheets, and Numbers will soon have AI built in. Before long, you won’t need to dive deep into office tools at all. I wrote a piece about leveraging technology to run product management as a one-person show.
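If spreadsheet formulas are what cause the headache, the same kind of conditional aggregation a generated SUMIFS would do is a one-liner in pandas. A quick sketch with hypothetical columns:

```python
# Sketch: the pandas equivalent of a typical SUMIFS-style spreadsheet formula.
# The columns (region, month, revenue) are made-up example data.
import pandas as pd

df = pd.DataFrame({
    "region":  ["EU", "EU", "US", "US"],
    "month":   ["Jan", "Feb", "Jan", "Feb"],
    "revenue": [1200, 1500, 900, 1100],
})

# Spreadsheet equivalent: =SUMIFS(revenue, region, "EU")
eu_revenue = df.loc[df["region"] == "EU", "revenue"].sum()

# Or the whole breakdown at once:
by_region = df.groupby("region")["revenue"].sum()
print(eu_revenue, by_region, sep="\n")
```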

Data Collection and Web Scraping

You don’t need to be a data scientist or an expert programmer to get data from public sources online. The AI news site Unity has an overview of data extraction tools and data analysis tools. In comparisons and listings, it’s hard to determine what you need initially. The tools I’d pick from those lists are Octoparse for scraping and Monkeylearn for analytics. The reason is simple: both are NoCode and have a low barrier to getting started. It’s better to start today than delay decision-making for marginal improvements.
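If you are curious what these tools do under the hood, scraping a public page takes only a few lines of Python. The URL and CSS selector below are placeholders; check a site’s terms and robots.txt before scraping anything.

```python
# Minimal scraping sketch with requests + BeautifulSoup.
# The URL and the CSS selector are placeholders for illustration.
import requests
from bs4 import BeautifulSoup

url = "https://example.com/pricing"  # hypothetical public page
html = requests.get(url, timeout=10).text

soup = BeautifulSoup(html, "html.parser")

# Collect the text of every element matching the (hypothetical) selector.
prices = [el.get_text(strip=True) for el in soup.select(".price")]
print(prices)
```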

Using GenAI tools for data analysis

Tech.co outlines four ways to get ChatGPT 4 Plus for free. MS Bing, Co-Pilot, Perplexity, and Merlin use the same LLM; the difference is in the user interface. I’ve been using Gemini and Claude, which perform similarly, so there is no reason to pay $20 per month unless you want to build custom GPTs. Sites like Geeky Gadgets compare the different models, though for most intents and purposes the slight performance differences are negligible.
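As a concrete example of using one of these models for lightweight analysis, here is a sketch that sends a summary of a metrics table to an LLM and asks for observations. It assumes the OpenAI Python SDK (v1) with an API key in the environment; the file name and model name are placeholders, and the same pattern works with Gemini’s or Claude’s SDKs.

```python
# Sketch: asking an LLM for a first-pass read of a small metrics table.
# Assumes the OpenAI Python SDK (v1) and OPENAI_API_KEY set in the environment.
# The file name and model name are placeholders.
import pandas as pd
from openai import OpenAI

df = pd.read_csv("weekly_metrics.csv")
summary = df.describe().to_string()  # send a summary, not the raw data

client = OpenAI()
response = client.chat.completions.create(
    model="gpt-4o-mini",  # placeholder; any chat-capable model works
    messages=[
        {"role": "system", "content": "You are a concise data analyst."},
        {"role": "user", "content": f"Here is a summary of my weekly metrics:\n{summary}\n"
                                    "List three notable observations and one risk."},
    ],
)
print(response.choices[0].message.content)
```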

Dashboards for Metrics

I used Tableau while working at GE and PowerBI at McKinsey and AWS. Tools built 15 years ago are the software equivalent of clay tablets and chisels for writing. They work and have evolved to some extent, but compared to the new entrants, they have not aged well. Have a look at Cyfe or Geckoboard as dashboards for your purposes. They have sleek interfaces, are easy to use, and come with templates for most cases. Because they are not bloated (yet), I find them easier to use right away.

[Screenshot: Geckoboard]
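If your needs are small enough, a throwaway dashboard in plain code can also do the job. Here is a minimal matplotlib sketch with made-up numbers; it is not a substitute for Cyfe or Geckoboard, just the do-it-yourself version.

```python
# Sketch: a throwaway two-panel metrics dashboard in matplotlib.
# The data below is made up for illustration.
import matplotlib.pyplot as plt

weeks = ["W1", "W2", "W3", "W4"]
signups = [120, 135, 128, 160]
churn_pct = [2.1, 2.4, 1.9, 1.7]

fig, (ax1, ax2) = plt.subplots(1, 2, figsize=(10, 4))
ax1.plot(weeks, signups, marker="o")
ax1.set_title("Weekly signups")
ax2.bar(weeks, churn_pct)
ax2.set_title("Churn (%)")
fig.tight_layout()
plt.show()
```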

Data Visualization

I learned over the years that you need to visualize your data and embed your insights in storytelling. Dashboards are not that: they provide oversight, but they don’t tell the story. Flourish does a great job at visual storytelling. You may have seen its animated bar charts showing changes in urban population or climate online. Flourish was bought by Canva a while ago; Canva is great for visualizing ideas, though it has become very cluttered. An easier-to-use alternative is Snappa.

[Flourish visualization]
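Those animated “bar chart race” visuals can also be rebuilt in code if you want full control. Below is a minimal sketch with matplotlib’s FuncAnimation and fabricated data; Flourish does the same thing without any code.

```python
# Sketch: a tiny "bar chart race" with matplotlib's FuncAnimation.
# The cities and population numbers are fabricated for illustration.
import matplotlib.pyplot as plt
from matplotlib.animation import FuncAnimation

years = [2000, 2010, 2020]
population = {            # millions, made-up values
    "City A": [5, 8, 12],
    "City B": [7, 9, 10],
    "City C": [3, 6, 14],
}

fig, ax = plt.subplots()

def draw(frame):
    ax.clear()
    values = {city: series[frame] for city, series in population.items()}
    ranked = sorted(values.items(), key=lambda kv: kv[1])  # smallest at the bottom
    ax.barh([c for c, _ in ranked], [v for _, v in ranked])
    ax.set_title(f"Population (millions), {years[frame]}")

anim = FuncAnimation(fig, draw, frames=len(years), interval=1000, repeat=False)
plt.show()
```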

Building your own data hub

I’ve been having fun exploring different tools and building my own dashboards. Ideally, this should be completely automated at some point, as I don’t want to spend much time on it. I found that most tools available to people are adequate for gathering, analyzing, and visualizing data at a high level. Google has Looker Studio, while Microsoft has Fabric; both are available within their respective office ecosystems.
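As a sketch of what “completely automated” could look like at the smallest scale: a script that fetches one number and appends it to a CSV, scheduled via cron or Task Scheduler. The fetch_metric() function below is a placeholder for whatever source you actually query.

```python
# Sketch: the smallest possible automated collector, appending one row per run.
# Schedule it with cron / Task Scheduler; fetch_metric() is a placeholder.
import csv
from datetime import date
from pathlib import Path

def fetch_metric() -> int:
    """Placeholder: replace with a real call (API, scrape, export, ...)."""
    return 42

def append_row(path: str = "metrics.csv") -> None:
    file = Path(path)
    is_new = not file.exists()
    with file.open("a", newline="") as f:
        writer = csv.writer(f)
        if is_new:
            writer.writerow(["date", "value"])
        writer.writerow([date.today().isoformat(), fetch_metric()])

if __name__ == "__main__":
    append_row()
```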

I’d love to hear your take on this and see examples of how others track and analyze their metrics. Most people do this for their workouts and their finances, but I haven’t seen it applied to other areas like relationships, vacation planning, and so on.


This edition was slightly different from the previous ones. I’d love to get your comments to improve this newsletter. Please let me know what you liked, what could be improved, and what you wish for.

Sascha Brossmann

Fractional CPO/VP/Head of Product, Advisor & Coach for Startups and Scaleups (B2B SaaS Platforms 0-1-scale) | Keynote Speaker | Community Builder

8 months ago

Have you already had a look at https://observablehq.com/?

Godwin Josh

Co-Founder of Altrosyn and Director at CDTECH | Inventor | Manufacturer

8 months ago

Harnessing data-driven decision-making involves delving into metrics and analysis. NoCode and AI tools offer a dynamic approach to navigate this landscape, allowing for streamlined processes. You talked about metrics and NoCode in your post. In the realm of AI-powered analysis, considering scenarios with sparse or unstructured data, how would you technically leverage NoCode tools to enhance decision-making, ensuring robust insights? Your thoughts on tailoring these approaches for intricate situations, such as optimizing resource allocation in resource-constrained environments, would be insightful.
