What do Lord Kelvin and Peter Drucker have in common?
Probably the most famous quote from management guru Peter Drucker is:
If you can’t measure it, you can’t manage it. – Peter Drucker (1909-2005)
However, the scientist Lord Kelvin beat him to the punch and called out a similar principle even earlier:
If you can’t measure it, you can’t improve it. – Lord Kelvin (1824–1907)
For one thing, this shows once again that physicists usually beat everyone else to exciting insights about how the world works. :-)
Lord Kelvin also formulated the second law of thermodynamics, which postulates that everything will eventually end in unstructured chaos anyway, but that’s another story, so let’s not get distracted.
While I wholeheartedly agree with the above principle about measuring, I would extend it to:
If you don’t know what you want to manage, you’re wasting your time measuring. Likewise, if you’re not committed to doing what it takes to improve a metric, you might as well not bother measuring it at all.
All right, after that motivational downer, I want to reflect a little bit on metrics and reporting, what we should measure, and how we should think and talk about those numbers.
How to think about metrics
Why we care about metrics
Impact and outcomes (Output metrics) – In all we do, we prioritize and target our energy on doing a few things that we believe have the most impact on a given customer or business outcome. There are many things we decide not to do to keep that focus. Once we’ve done what we set out to do, we’d better know whether our beliefs and assumptions were right (i.e., whether we can build upon them) or wrong (i.e., what we can learn from them). Metrics help us track whether our actions lead to the anticipated outcomes. They help us identify where we need to course-correct.
Defects and actions (Input metrics) – As hinted above, not all plans work as anticipated. Looking at the right (input) metrics helps us see where things don’t develop according to plan and prediction. Once we are aware of those areas, we can assess impact and develop strategies to fix the issues. Input metrics are typically leading indicators, and while we care about the effects and outcomes, input metrics are where we can learn why things don’t quite work and take appropriate action. Here is a quick read-up on input versus output metrics: https://www.dhirubhai.net/pulse/focus-inputs-alfons-staerk/
Early warning (Health metrics) – Last but not least, metrics help us avoid being blindsided. Like a canary in the coal mine, a good set of ‘early warning’ metrics helps us avoid discovering an issue through an escalation and instead identify it proactively ourselves. No one wants to get an angry email from a customer or their boss.
When you develop the set of metrics that you want to track for your product and program, you want to make sure to track all three categories. Each of them is equally important and serves a specific purpose. However, you need to think about how you use them strategically and intentionally.
How we track and report metrics
Outcome metrics (impact and outcomes) are the ultimate goal, but they typically lag and move slowly. You want to present them to leadership and stakeholders, but do you want to do it weekly? Do they change fast enough? What’s the right forum for them (quarterly, monthly, or weekly reports)? If you present them weekly for reference, are they the data that you want to draw attention to every week?
Input metrics are critical to managing your product and program. They are leading indicators and most often change weekly. You do want to look at input metrics (defects and opportunities) weekly, but you also want to make sure you look at and focus on the ones where you would consider taking action. If you are not willing to take action or ask for support to take action, it’s just noise and distracts everyone. Make input metrics actionable, or think harder about what the right actionable input metric would be! Last but not least, some input metrics are noisy. If that is the case, think about how you can report them differently to separate noise from a real trend (a minimal smoothing sketch follows below). Metrics are all about learning, not about showing that you have many numbers.
Health metrics are critical, and you should look at those weekly or even daily. They are your insurance that you are not caught on the wrong foot. However, by definition, they should be very dull and not change much. No news is good news! If you have a story to tell about your warning indicators every week, then there is a more fundamental issue at hand. With that, in most cases, those metrics are something you and your team need to look at very frequently; however, you don’t want to report them to a broader group frequently (e.g., through regular reports). Instead, those are the metrics that, while not looked at by a large group regularly, should kick off an immediate heads-up to your leadership when you see things going sideways.
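To make the point about noisy input metrics concrete, here is a minimal sketch in Python of one common option: report a trailing average alongside the raw weekly number so reviewers can see the trend without the week-to-week jitter. The metric values and the four-week window are illustrative assumptions, not a prescription for your program.

```python
# Minimal sketch: smooth a noisy weekly input metric with a trailing average
# so the report separates a real trend from week-to-week noise.
# The numbers and the 4-week window are illustrative assumptions.

weekly_defects = [42, 55, 38, 61, 47, 52, 70, 66]  # hypothetical weekly counts

def trailing_average(values, window=4):
    """Average of the last `window` values ending at each week."""
    smoothed = []
    for i in range(len(values)):
        start = max(0, i - window + 1)
        chunk = values[start:i + 1]
        smoothed.append(sum(chunk) / len(chunk))
    return smoothed

smoothed = trailing_average(weekly_defects)
for week, (raw, avg) in enumerate(zip(weekly_defects, smoothed), start=1):
    print(f"Week {week}: raw={raw}, 4-week avg={avg:.1f}")
```

Reporting both columns keeps the raw data honest while letting the smoothed line carry the call-out; the window length is a judgment call that depends on how noisy the metric really is.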
How to talk about metrics
For me, the most frustrating experience in metric reviews is a sea of data with no apparent focus or structure. In those cases, it takes me a while to catch on to the slide structure, and by the time I have, I have missed the call-outs. The second worst thing is hearing the same call-out on the same data, which didn’t change anyway, every week. The third most frustrating thing is a call-out that isn’t related to the data on the slide – it utterly confuses me every time. Bonus frustration: a new slide whose structure needs to be explained instead of being self-evident.
Slides are stories. They need to be able to speak for themselves without additional explanation. The stories they tell need to engage and focus on the ‘news.’
Let’s start with the simple thing – the presentation.
Visual presentation – Make it digestible
As we think about how we can turn slides into stories and data into statements, we need to give focus to presentation. A sea of data is not a story; it’s a distraction. One hundred rows don’t convey insight, but chaos. Data that isn’t organized along a logical flow doesn’t provide a signal; it only adds noise.
The flow of your data – First of all, think about the right logical flow of your data. What is the correct organizational principle that will guide the viewer along and help them make their findings? Often this is obvious (e.g., funnel steps or input metrics that feed into an output metric), but give it a hard thought. Having the right structure is the difference between a strong slide and story, and a weekly struggle to get through your WBR section.
Help drive focus – Most times, less is more. What data is needed? What data would you take action on? What data is critical versus supplemental, and how can you visually highlight the critical data? Can you bold specific data? Can you visualize a funnel structure in how you present your data? Make it easy for the viewer to see for themselves what you can see in the data. Also, make the hard choice not to show data that doesn’t matter. We’re all proud of all the data we can find; however, focus wins the game in a presentation every day. Metrics meetings are crisp and focused presentations of the state of the union, not word-search puzzles.
Be clear what you talk about – When you talk about something that is not on the slide, be clear about it before you get into your story. Try to avoid that situation though – if you launched something new and it doesn’t show in the data yet, then talk about it once it does. When you talk about something on the slide, make sure to refer to where the data is – searching for the needle in the haystack is no fun in a fast-paced meeting.
Data with Intent! – WHY? SO WHAT?
We don’t talk about the data just because we have it. We have an intent! Be sure to talk about that intent. Why should I care? Why does this particular data matter? You know it, but I don’t, so please explain it to me!
Here are some categories that usually lead to exciting stories about data. Don’t feel you have to tell each of them every week; tell a story when things have progressed or changed in a meaningful way.
Progress – We all want to know if we are making progress against our goals. A key to seeing that on a weekly or even monthly basis is to have a ramp plan. If we want to achieve a particular goal by the end of the year, where should we be this month, and next month? How close are we to that ramp goal, and if there is a gap, what can we do to close it? An output metric without a ramp plan is useless! We need to know whether we can feel good or should be worried. These call-outs are typically related to the outcome metrics we committed to. (A minimal ramp-plan calculation is sketched below, after these categories.)
Learnings – What did we learn from the data? Are there any surprises or positive trends that we didn’t anticipate? This is where input metrics come in. Focus on the differentials, though; don’t repeat the same insight every week (without taking action). This is where we should talk about surprises and experiments and their related learnings and outcomes. Focus on what’s new; don’t tell the same story again and again without changing the game from one meeting to the next (if you decide something is not worth managing, remove the metric from the official deck). Most of the learnings come from the input metrics that we are tracking.
Attention needed – Sometimes, trends turn in the wrong direction without warning, and without us having done something specific to anticipate that development. Those are the canaries in the coal mine. Be sure to call out when such things happen. Also, be sure to have some insights on what happened, or at least a plan and a timeline for how to get those insights. These alerts are essential data points, not bad news. They tell us when we need to focus on something. Take them as an opportunity to fix something early on, before it gets bad. However, don’t wait for a pre-scheduled meeting if you notice that health metrics erode – kick off an email thread with your leadership immediately and take action!
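As promised under Progress, here is a small Python sketch of the ramp-plan arithmetic: interpolate monthly targets between a baseline and an end-of-year goal, then report the gap against the actual value for the current month. The baseline, goal, and actual numbers are hypothetical, and the linear ramp is just one simple assumption; your program may ramp differently.

```python
# Minimal sketch: derive monthly ramp targets toward an end-of-year goal and
# report the gap for the current month. All numbers are hypothetical.

baseline = 1000       # metric value at the start of the year (assumed)
year_end_goal = 4000  # committed end-of-year target (assumed)
months = 12

# Linear ramp: where should we be at the end of each month?
ramp_targets = [
    baseline + (year_end_goal - baseline) * m / months
    for m in range(1, months + 1)
]

current_month = 5   # e.g., reporting at the end of May
actual = 2050       # hypothetical actual value this month

target = ramp_targets[current_month - 1]
gap = actual - target
print(f"Month {current_month}: target={target:.0f}, actual={actual}, gap={gap:+.0f}")
```

Even this trivial calculation gives the weekly or monthly review a reference point: the call-out becomes “we are 200 behind the ramp, and here is how we close it” instead of a bare number with no context.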
A lot of the follow-up questions from leaders usually poke into one of the above areas. By looking at the categories in that framework, thinking through your story along those lines, and presenting it succinctly, you will convey that you are on top of your game. You will show your confidence, ability, and success, instead of being caught off-balance.
Last but not least:
Don’t try to fill the space/time – If you don’t have a story in any of the above areas, don’t make one up. Power usually lies in not giving in to the temptation to fill empty space. Just say that nothing happened to the data that is worthy of a call-out, and make a short statement about what changes you expect to see shortly and why (if you don’t anticipate any, then your program is dead, and the slide should be removed). Relatedly, there is no rule that you have to have three call-outs. I would much rather hear one strong call-out and finding than three repetitive ones.
Learning from the data can be fun if we let it be! Use metrics, reports, and reviews as an opportunity to learn about your space and tell compelling stories to leadership.
#Metrics #DataDrivenDecisions #Management