Maximizing Tech ROI: The Metrics Every CEO and CTO Needs
from Analyticsverse

Is your technology team a black box? Let’s change that.

Your technology team is likely one of your company’s most significant investments, but how do you know you are getting your money’s worth? Metrics can provide critical insight into the current reality and the way forward, and into how your technology team’s performance impacts overall company performance.

In today’s challenging investment and economic climate, ensuring your organization maximizes its impact is critical. Historically, technology teams have been difficult to measure objectively, but thanks to some brilliant recent research, that’s no longer true. Every CTO and CEO ought to be having a conversation grounded in objectively measured, benchmarked metrics: metrics that cover the results, the leading indicators that show how those results are likely to move, and the levers you can pull to move them in the right direction.

In organizations where I’ve moved the conversation beyond “trust me,” ¯\_(ツ)_/¯, or custom in-house metrics, and centered discussions of technology team performance on industry-standard metrics, the relationship between technology and the rest of the organization has radically improved. Transparency builds trust. I’ve driven this transformation in organizations I’ve managed; you can drive the same transformation in yours.

Are you confident in your technology team’s performance? Do you know exactly where to focus your investments for maximum payoff?

The Power of DORA

The first step in developing a metrics program for your technology team should focus on macro-level metrics for the whole team that encapsulate the entirety of the product development process. Metrics that reflect the results of your technology team’s effort are effectively a speedometer and warning lights, giving you critical telemetry for your performance. Just as importantly, these metrics allow comparison across organizations, companies, and industries.

The industry-standard metrics of choice for this function are a group of four metrics popularized by the DevOps Research and Assessment (DORA) organization. These metrics are backed by significant research linking them to technology team and overall company performance. Moving these metrics in the right direction doesn’t just lead to better results for technology teams; it leads to better results for your company.

Two metrics reflect your organization’s speed: change lead time and deployment frequency. Both bring a Toyota Production System/Lean approach to software and product development. At a high level, change lead time tracks the purely technical, engineering side of code deployments: the elapsed time between the code being complete and the code running in your production environment, in use by your users. It reflects how quickly you move code from “inventory in progress” to “shipped finished product.” Code isn’t adding value for your customers or business until it is in your customers’ hands, and minimizing this metric delivers more value to customers more quickly. Deployment frequency is the best proxy for batch size, another lever for speeding value creation and a focus of the Lean movement. Small batches move more quickly through the development process and carry lower risk in many ways, and more frequent deployments indicate smaller batches.
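As a minimal sketch of how these two speed metrics are computed (the data model here is hypothetical; your deployment pipeline’s records will look different), both fall out of a list of per-deployment timestamps:

```python
from datetime import datetime
from statistics import median

# Hypothetical deployment records: (code_complete, deployed_at) timestamps.
deployments = [
    (datetime(2024, 5, 1, 9, 0), datetime(2024, 5, 1, 11, 30)),
    (datetime(2024, 5, 2, 14, 0), datetime(2024, 5, 2, 15, 0)),
    (datetime(2024, 5, 3, 10, 0), datetime(2024, 5, 4, 10, 0)),
]

# Change lead time: elapsed time from "code complete" to "in production".
lead_times = [deployed - complete for complete, deployed in deployments]
median_lead_time = median(lt.total_seconds() for lt in lead_times) / 3600  # hours

# Deployment frequency: deployments per day over the observed window.
first = min(deployed for _, deployed in deployments)
last = max(deployed for _, deployed in deployments)
window_days = max((last - first).days, 1)
deploys_per_day = len(deployments) / window_days
```

The median (rather than the mean) keeps one pathological deployment from dominating the lead-time figure.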

Two metrics measure the quality and safety of the process and systems. The first is the change failure rate, the percentage of changes made in production that introduce unwanted behavior (bugs and defects) requiring further work to remediate. The other is mean time to resolution (MTTR), which measures the time between an issue being introduced in your production environment and that issue being fully resolved. These metrics have a straightforward impact on your business, your customers, and your team’s productivity. Time spent rolling back changes, rolling forward with additional changes to remediate a failed change, and investigating and remediating production issues is bad for your customers, and the unplanned work and rework distract your team from moving the roadmap forward with new products and features.
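A similarly minimal sketch, again assuming a hypothetical change log rather than any particular tool’s schema, shows how both quality metrics are computed:

```python
from datetime import datetime
from statistics import mean

# Hypothetical change log: each production change, whether it failed, and
# (for failures) when the issue was introduced and when it was resolved.
changes = [
    {"failed": False},
    {"failed": True,
     "introduced": datetime(2024, 5, 2, 9, 0),
     "resolved": datetime(2024, 5, 2, 10, 0)},
    {"failed": False},
    {"failed": True,
     "introduced": datetime(2024, 5, 6, 13, 0),
     "resolved": datetime(2024, 5, 6, 16, 0)},
]

# Change failure rate: share of production changes requiring remediation.
failures = [c for c in changes if c["failed"]]
change_failure_rate = len(failures) / len(changes)

# MTTR: average elapsed time from introduction to full resolution, in hours.
mttr_hours = mean(
    (c["resolved"] - c["introduced"]).total_seconds() / 3600 for c in failures
)
```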

The Myth of Speed vs. Quality

Decades ago, the technology industry operated under the assumption that speed and quality were tradeoffs: speed led to low quality, and shipping high-quality products took time. The research behind DORA shows that linkage no longer holds; the two tend to move together. Teams that deploy frequently and quickly tend to have fewer quality issues, and those issues are easier to fix and less impactful for customers. The opposite is also true: teams that deploy slowly and infrequently tend to have more issues that take longer to investigate and remediate. Slow, infrequent deployments are bigger, more complicated changes; they are high risk, more likely to fail, and more complicated and time-consuming to debug when they do fail.

The mechanisms teams have most readily available to increase speed (automation and small change sizes) are the exact mechanisms that lead to fewer issues (automation) and faster recovery time (small changes).

How Do You Measure Up? The Importance of Benchmarking

For CEOs, DORA metrics provide a few significant advantages. First, thanks to the DORA organization, these terms have standard definitions that are consistent across organizations and industries. Leveraging that standardization, DORA publishes yearly benchmarks for all four metrics in its “State of DevOps” report, defining the range of values for high-, medium-, and low-performing teams. The comparison makes it obvious where your organization is excelling and where it needs to improve. Measuring your technology team against the benchmark gives you and your peers an objective perspective on how your team is performing in an absolute sense.
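A sketch of that benchmarking comparison might look like the following. The tier cut-offs here are illustrative placeholders only, not the actual State of DevOps figures; substitute the ranges from the current report:

```python
# Illustrative benchmark tiers for one metric (change lead time, in hours).
# These cut-offs are placeholders -- swap in the bands published in the
# current "State of DevOps" report before using this for real.
LEAD_TIME_TIERS = [
    ("high", 24),     # under roughly one day
    ("medium", 168),  # under roughly one week
]

def classify_lead_time(hours: float) -> str:
    """Map a measured change lead time onto a benchmark tier."""
    for tier, upper_bound in LEAD_TIME_TIERS:
        if hours < upper_bound:
            return tier
    return "low"
```

The same pattern applies to the other three metrics; each gets its own set of published bands, and the output is a simple high/medium/low label you can track quarter over quarter.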

As I’ve said, the goal is to be data-informed, not data-driven. These metrics should be the start of the conversation, not the end. The realities of your specific environment (industry, product features, customers, company lifecycle) will impact your team’s realistic peak performance target. The realities of how you deploy your team’s resources will impact the pace of change and improvement you can expect. Seeing these metrics compared to the benchmarks can lead to reconsidering how you deploy your resources and where you invest your team’s time.

The most important comparison isn’t with other organizations; it is with your past. How is your team trending over time, and how are the four metrics moving compared to each other? What goals are you setting for your technology team to improve these metrics? What projects have you prioritized that are expected to have a material impact on these numbers?

Beyond DORA: Custom Metrics for Unique Challenges

These four DORA metrics bring significant transparency to your technology team’s performance, but they aren’t going to answer all the questions about your team’s performance and impact. In every environment where I’ve installed a DORA metrics program, we added metrics reflecting our unique priorities, challenges, and opportunities. System or product uptime is a common addition, as is tracking defects in the production environment. As you investigate the sources of poor performance, you’ll find your own metrics to add to this list; create a standard definition, start tracking performance over time, and add it to the conversation about technology team performance.

Metrics-Driven Prioritization

Teams I’ve managed have used these metrics to help our peers across the company, especially in Product Management, understand the necessity and payoff of investing in process and technology-driven initiatives. Investing capacity to improve these four metrics significantly impacts all the new feature and product work that comes after it.

These metrics become the language we use to evaluate the impact and prioritization of projects. The payback period becomes the lingua franca of the evaluation process, helping us identify the most effective projects and finding a balance between short- and long-term projects with valuable impacts. These metrics also become the yardstick by which we measure if our changes had the desired results.
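As an illustration of that payback-period arithmetic (the figures below are hypothetical), the calculation itself is simple: up-front cost divided by recurring benefit.

```python
def payback_period_weeks(upfront_cost_weeks: float, weekly_savings_weeks: float) -> float:
    """Weeks until an investment's recurring savings cover its up-front cost."""
    return upfront_cost_weeks / weekly_savings_weeks

# Hypothetical example: an automation project costing 3 engineer-weeks that
# saves 0.5 engineer-weeks of toil per week pays for itself in 6 weeks.
payback = payback_period_weeks(3.0, 0.5)
```

Projects with short payback periods, even unglamorous ones, tend to win prioritization conversations once everyone is using the same arithmetic.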

Do you have a metrics program in place? How has it changed how you manage your technology team? Does the entire company have visibility into the tracking metrics? Do the metrics impact your goal-setting process?

Leading vs. Lagging

But understanding the DORA metrics is just the beginning. In our next installment, we’ll dive deep into the leading indicators that truly drive your tech team’s performance. Stay tuned…

I’d love to hear from fellow CTOs and CEOs. What metrics are transformative for you? How do you ensure alignment with business goals?


Dean Hinnegan-Stevenson

Enabling Enterprises to Harness the Power of ML and AI

1y

Appreciate you highlighting that organizations should compete with their "past selves" and not other orgs. Too often I've seen teams go number chasing once they have their DORA numbers. Once you've got them, it's more like getting a yearly check-up from a doctor: you can focus on the areas that will improve your health!

Stuck in storyland:( Measuring DORA on stories (tasks), when it takes multiple dependent stories for the customer to get usable software.


Not a CTO but really interesting stuff!

Abhi Dhar

Global Technology & Digital Leader | Board Director, TransUnion CIBIL | Former CIO/CTO TransUnion, CIO/CDO Walgreens | Angel Investor | Strategic Advisor

1y

Right on. Definitely agree with Leon Chism

Brad Jaehn

VP, Customer Intelligence and Experience at Caesars Entertainment. Former VP of Product at McDonalds and Gogo. Cornell Alum. Founder with a Highly Successful Exit. Mental Health Advocate.

1y

Please start selling posters that say “The Myth of Speed vs. Quality”
