2023 DORA Report
This is the latest issue of my newsletter. Each week I cover the latest research and perspectives on developer productivity. Subscribe here to get future issues.
This week I’m summarizing the newly released 2023 State of DevOps Report. For those unfamiliar, DORA (DevOps Research and Assessment) is a long-running research program focused on helping engineering teams get better at delivering software. Each year, DORA publishes a report based on its research into which capabilities drive software delivery and organizational performance.
I also interviewed Nathen Harvey (who leads DORA at Google) on my podcast this week, so if you’d prefer to listen instead of read, you can find the full interview here. I’ve incorporated some of what I learned from Nathen into today’s summary.
Key takeaways from this year’s State of DevOps report
The DORA research program tries to understand the relationship between different ways of working and relevant engineering outcomes. The “outcomes” the researchers look at fall in two categories: organizational performance and the wellbeing of developers.
While many people are familiar with DORA because of the four measures used to assess software delivery performance, we can see from the model above that these metrics are actually just one part of a broader program to drive meaningful improvements. The real substance of DORA’s research lies in the capabilities.
The annual report is conducted through an industry survey that is promoted online; this year, the report had nearly 3,000 respondents. Prior to each survey, the DORA team determines which outcomes it wants to measure against (e.g., organizational performance, team performance, and employee wellbeing), as well as any additional research questions to explore as part of the annual study.
Here are my key takeaways from this year’s report:
Teams that focus on the user have higher organizational performance?
One of the main findings in this year’s report revolves around “user-centricity,” which refers to how well a team understands its customers and whether it takes action on customer feedback. Nathen mentioned that the research team was inspired to explore this topic due to the growing industry interest in platform engineering. In essence, platform engineering teams sometimes don’t understand their users well enough, and end up building things developers don’t need or use. The research team wanted to understand whether user-centricity was a driver of performance for both internal- and external-facing teams.
To study user-centricity, participants were asked about how well their team understands the needs of users, how aligned the team is toward meeting user needs, and whether user feedback is incorporated when prioritizing work.
The study revealed that teams with a strong user focus have 40% higher organizational performance. Here’s the research team’s advice for internal- and external-facing teams to apply this finding:
Quality documentation amplifies the impact of other capabilities
Documentation refers to the internal written knowledge that people use day-to-day. To study the impact of this topic on performance, the researchers measured the degree to which documentation is reliable, findable, updated, and relevant. Then, they calculated one score for the entire documentation experience.
Documentation is interesting because it amplifies the impact of other capabilities on organizational performance. For example, the study found that quality documentation amplified the impact of continuous integration on organizational performance by 2.4x, continuous delivery by 2.7x, and reliability practices by 1.4x.
Quality documentation was found to positively impact individual job satisfaction and productivity as well.
The technical capabilities that impact performance
DORA’s research also explores whether specific technical capabilities have an effect on the following performance measures. I’ll include them here as they’re defined in the report:
Teams that have loosely coupled architecture (also called loosely coupled teams in the report) are able to make significant changes to their systems without involving other teams. This enables them to move faster, and as shown in the table, it’s the only capability with an effect on all of DORA’s performance measures.
In our conversation, Nathen also pointed out the significant impact of code review speed on software delivery performance. “We saw that speeding up code reviews led to 50% higher software delivery performance… If your code reviews are already fast or maybe even non-existent, don't try to make them faster. That's not your bottleneck. But where your code reviews are slow, I think you have a big opportunity there.”
Other notable changes since previous reports
1. Team performance is a new construct introduced in the report. In the past, DORA’s research focused on organizational performance. Nathen explained that organizational performance is measured by asking respondents questions such as how profitable their organization is and whether it’s meeting business goals; however, some practitioners can be disconnected from those outcomes. Team performance is closer, but still beyond an individual’s remit.
2. MTTR was replaced with Failed Deployment Recovery Time. MTTR has caused some confusion in the community: is the “M” for mean or median? Additionally, the report notes that practitioners seeking to learn more from failures, such as those in the resilience engineering space, are moving past MTTR as a reliable measure for guiding learning and improvement.
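The mean-versus-median ambiguity matters in practice: incident data is typically skewed by a few long outages, so the two statistics can diverge sharply. Here’s a minimal sketch illustrating the difference (the recovery times are invented for illustration):

```python
import statistics

# Hypothetical recovery times (in hours) for five failed deployments.
# One long outage skews the distribution, as real incident data often does.
recovery_times = [0.5, 1.0, 1.5, 2.0, 24.0]

mean_ttr = statistics.mean(recovery_times)      # 5.8 hours
median_ttr = statistics.median(recovery_times)  # 1.5 hours

print(f"Mean TTR:   {mean_ttr:.1f} h")
print(f"Median TTR: {median_ttr:.1f} h")
```

Depending on which “M” a team reports, the same five incidents read as either a nearly six-hour or a 90-minute recovery time, which is exactly the kind of ambiguity the renamed metric sidesteps.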
3. The way Change Failure Rate is measured changed. In previous years, respondents were presented with six options (0-15%, 16-30%, etc.). This year respondents were presented with a slider so that they could select any value between 0% and 100%. The metric now provides more precision in the answer.
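The precision gain is easy to see: under the old answer options, two teams with very different failure rates could land in the same bucket. A small sketch (the bucket edges follow the old six-option scheme; the example rates are made up):

```python
# Old-style survey buckets for Change Failure Rate.
BUCKETS = [(0, 15), (16, 30), (31, 45), (46, 60), (61, 75), (76, 100)]

def bucketize(rate_pct: float) -> str:
    """Map an exact change failure rate (%) to the old survey bucket."""
    for low, high in BUCKETS:
        if rate_pct <= high:
            return f"{low}-{high}%"
    return "76-100%"

# Two quite different teams collapse into the same answer...
print(bucketize(2.0), bucketize(14.0))  # both report "0-15%"
# ...whereas the slider records the exact 2% and 14% values.
```

The slider removes this lossy step, so year-over-year comparisons of the metric can pick up smaller shifts.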
4. “Elite” reemerged as a cluster in the benchmarks. Last year, there wasn’t an elite category that emerged in the research, so it wasn’t included in the report.
Final thoughts
DORA’s annual reports are a great resource for understanding the practices and trends that are benefiting other organizations. I’m always eager to learn about the metrics and how they’re measured, and this year I found the insight about user-centricity especially interesting.
A special thanks to Nathen Harvey for generously sharing his time in the community and on my podcast to discuss his team’s research and how it might be successfully applied. I always look forward to reading DORA’s research and am glad to see the program continuing strong.
That’s it for this week! If you’re interested in reading a guide for running an internal survey to identify problems impacting developer productivity, send me a connection request with the note “guide.”