DORA 2024 - usage of Artificial Intelligence

The influential DORA group have recently released their 2024 “State of DevOps” report. In my previous article I dived into the DORA report on Software Delivery Performance.

In this article I will look at the usage of Artificial Intelligence. This is a new topic added for the DORA 2024 report, and it gives an interesting insight into how this exciting field is developing.

our survey strongly indicates that AI has produced an unignorable paradigm shift in the field of software development (DORA State of DevOps 2024)

Estimates suggest that leading tech giants will invest approximately $1 trillion in the development of AI over the next five years.

Changing working practices: https://agileplays.co.uk/the-risks-of-local-optimisation/

How much is AI being used?

Developers are increasingly looking to AI to help them perform their role more effectively.

76% of respondents are using AI in one or more daily responsibilities.

AI now appears to be well established as a tool. The numbers do not vary greatly between industry sectors. There's some indication that data scientists have slightly higher usage and hardware engineers somewhat lower.

There is some variation by size of organisation, with higher adoption in smaller businesses. This is likely because the rapid pace of change in the AI industry is better tolerated in smaller organisations: compliance and procurement tend to move faster than in larger businesses, which makes it easier to keep up with the frequent introduction of new tools and models.

Where is AI being used?

The two most common use cases are very distinct.

AI is used most in writing code (75%) and summarising information (71%).

Summarising information plays to the strengths of GenAI language models. A high adoption rate for code writing may be more of a surprise. However, the driver here seems to be the embedding of AI tools into IDEs: since 73% of respondents use an IDE with embedded AI, this fairly closely matches the AI adoption figure for code writing.

The degree to which the AI features are used is probably harder to assess. As the report notes, it can be a challenge to distinguish between frequency of usage and depth of usage. It is possible to use a tool frequently without being reliant on it.

Why are developers using AI?

The answer here seems to be somewhat disappointing. Rather than there being a clear usage model, the biggest driver seems to be that AI is new. Developers want to keep up with new technology and to experiment with AI to avoid being left behind.

"if you don’t use it, you will be left behind quite soon" (DORA State of DevOps 2024)

That is not to say that AI isn't bringing benefits, as we shall see, but the initial motivation for adoption seems to be avoiding falling behind.

The biggest driver in AI adoption for coding seems to be fear of missing out

Are developers seeing a benefit?

Those using AI generally reported productivity increases, although we should be careful to recall that some of those not using AI may have abandoned it because they did not see improvement. Continued use will generally imply some value is being gained.

67% of respondents reported that AI had caused at least some productivity improvement.

The graph below shows the split of observed improvement. It seems clear that benefits for now are primarily viewed as small, although a small fraction of users see major gains.

AI productivity improvements - State of DevOps 2024

Do developers trust the output?

With AI generating code, it is interesting to see how the code is being used. A key factor here is how much the developer trusts the code generated by the AI.

If trust is low, developers will use AI primarily for examples, and are less likely to put AI-generated code into production. If trust is high, developers will feel confident that they can directly use the code in the product.

There seem to be a substantial number of developers who doubt the output of AI tools.

39% of respondents reported little or no trust in the code

The profile of developer trust levels is shown below. It's clear that there is still a long way to go before most developers will be comfortable having AI tools directly generate product code.

the degree to which respondents reported trusting the quality of AI-generated code was generally low (DORA State of DevOps 2024)

AI trust by developers - State of DevOps 2024

How is AI improving developer roles?

Across society as a whole, one of the key topics being discussed around Artificial intelligence is the impact on jobs. There is significant anxiety in the general population about this. The DORA report included an investigation of the perception by developers of how AI will affect their roles in the future.

In general the data shows a belief that increasing AI adoption increases organisation-level performance. It also increases team-level performance, although to a lesser degree. If we dive into these increases, we see that they are due to improvements in:

  • Flow - how much focus a person tends to achieve during development tasks.
  • Productivity - the extent an individual is effective and efficient in their work, creating value and achieving tasks.

The combination of improved Flow and Productivity also tends to increase job satisfaction. This might be expected, as people value their ability to successfully contribute to a wider goal. In Daniel Pink's model, this increased job satisfaction would come primarily from "Purpose" and a certain level of "Mastery".


What is AI not improving?

Respondents saw an improvement in Flow and Productivity, but AI was not having a positive effect on every area. They felt that, by contrast, AI was having little impact on:

  • Time spent doing toilsome work - the percentage of an individual’s time spent on repetitive, manual tasks that offer little long-term value.
  • Burnout - encompassing its physical, emotional, and psychological dimensions, as well as its impact on personal life.

And the results also showed that AI was having a negative impact on:

  • Time doing valuable work - the percentage of an individual's time spent on tasks that they consider valuable.

These are very interesting results and suggest a level of concern over what work AI will be picking up.

The suggestion is that using AI, developers will deliver more value and generate more impact and that this will aid job satisfaction.

However, the mix of work being undertaken by developers does not improve as a result. Developers spend the same amount of time on repetitive, manual work, with the same risks of burnout.

Although developers achieve more, they spend less of their time on valuable work

DORA hypothesise that the net effect of AI is to allow the developer to do valuable work faster. This means that developers now have more time than they can fill with valuable work, and so fill their time with lower value work.

We should be clear that this is an interpretation of the data, and other interpretations are possible. However, DORA do appear to be failing to address an obvious conclusion - the elephant in the room as far as AI is concerned.

If developers can do valuable work faster but are limited by the amount of valuable work available, the obvious solution is to employ fewer developers.

Work most improved by AI

DORA looked at which types of work have been improved by AI adoption. The highest category here shows a high level of improvement:

  • Documentation quality - The perception of internal documentation in terms of its reliability, findability, updatedness, and ability to provide support.

This is not a great surprise. The ability of large language models to manipulate text is well known and using this to create and manage documentation is an obvious improvement area.

Work moderately improved by AI

A more moderate level of improvement was observed in a number of areas:

  • Code Quality - The level of satisfaction or dissatisfaction with the quality of code underlying the primary service or application in the last six months.
  • Code Review speed - The average time required to complete a code review for the primary application or service.
  • Approval speed - The typical duration from proposing a code change to receiving approval for production use in the primary application or service.
  • Technical debt - The extent to which existing technical debt within the primary application or service has hindered productivity over the past six months.
  • Code complexity - The degree to which code’s intricacy and sophistication hinders productivity.

Although the improvement was only at a moderate level, these results are extremely positive. There are many areas here related to the quality of deliverables which are positively impacted by the use of AI.

We should be a little cautious, as the data does not tell us why these areas improved. Where we see areas like review speed improving, it is unclear whether AI is directly driving better processes, whether AI is improving documentation and helping teams understand the code, or whether AI is reducing code complexity and that is the root cause.

Work negatively impacted by AI

Two areas are reported as being negatively affected by the move to increased AI usage:

  • Delivery throughput (moderately worse)
  • Delivery stability (highly worse)

These two negatively affected areas should be taken very seriously. At the start of the article, I emphasised how these have been the key DORA metrics for DevOps improvement for the last ten years. To many people these are the standard measures for good practice in DevOps.

Both, especially stability, are being negatively impacted by AI adoption. Again the DORA report shows the data, not the underlying root causes.

We can reasonably hypothesise that the rush to AI adoption is disrupting some of the processes which have been adopted over the last decade.

The move to AI adoption needs to be well balanced by the basics of DevOps good practices.

AI for products

An increasing number of organisations are choosing to investigate or incorporate AI into their products.

81% of organisations have increased incorporation of AI into products

I found this data surprising. Adoption of AI as a tool is well under way, especially with integration into IDEs putting this onto people's desktops. But such a high figure of integration of AI into products was not what I expected.

Looking into the data, this is rather disappointing. It clearly shows that the main driver is not the value which AI can add to the product. Instead, the driver appears to be the value of AI as a marketing headline.

Everyone wants an "AI enabled" badge on their product

Related to this is the fear of missing out and of competitors gaining market share by adopting earlier. Interestingly, this does not suggest that actual product value is the main driver.

Indeed, the data does not suggest that product performance is actually positively impacted by the adoption of AI. However, product performance correlates with overall team and organisational performance, which are improved.

If we consider this, it implies that the higher performing teams are using AI effectively to improve their products. If the average product performance is not increasing, then the lower performing teams are probably damaging product quality by the rush to introduce AI.


Good practices

DORA has some recommendations for how best to approach this introduction of AI into the organisation. As with their general DevOps practice, they recommend an experimental approach based on continuous learning, looping around the steps below:

  • Measure the current baseline state.
  • Propose an improvement hypothesis.
  • Plan the improvement.
  • Implement the planned improvement.

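As a minimal illustration of what this measure-and-learn loop might look like in practice (the metric values, names and code below are my own hypothetical sketch, not taken from the DORA report), a team could record a baseline of the four classic DORA delivery metrics before rolling out an AI tooling change, then compare them against a follow-up measurement:

```python
from dataclasses import dataclass

@dataclass
class DeliveryMetrics:
    """One measurement of the four classic DORA delivery metrics."""
    deploys_per_week: float       # delivery throughput
    lead_time_hours: float        # lead time for changes
    change_failure_rate: float    # fraction of deployments causing a failure
    restore_time_hours: float     # time to restore service after a failure

def relative_change(baseline: DeliveryMetrics, follow_up: DeliveryMetrics) -> dict:
    """Relative change in each metric between baseline and follow-up."""
    fields = ("deploys_per_week", "lead_time_hours",
              "change_failure_rate", "restore_time_hours")
    return {
        field: (getattr(follow_up, field) - getattr(baseline, field))
               / getattr(baseline, field)
        for field in fields
    }

# Hypothetical experiment: baseline before introducing an AI coding assistant,
# follow-up one quarter after the planned improvement was implemented.
baseline = DeliveryMetrics(deploys_per_week=5, lead_time_hours=48,
                           change_failure_rate=0.15, restore_time_hours=4)
follow_up = DeliveryMetrics(deploys_per_week=6, lead_time_hours=40,
                            change_failure_rate=0.18, restore_time_hours=5)

print(relative_change(baseline, follow_up))
# e.g. throughput up ~20%, lead time down ~17%, but failure rate and restore time worse
```

The point is not the code itself but the discipline: without a recorded baseline, a team cannot say whether an AI rollout has helped or hurt its throughput and stability.
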
However, incremental approaches alone are not enough to manage a significant change on the scale of Artificial Intelligence. This should be backed with a central corporate approach.

it is also abundantly clear that there are plenty of potential roadblocks, growing pains, and ways AI might have deleterious effects (DORA State of DevOps 2024)

The organisation should be clear on the mission and policies for AI adoption. Employees should understand the AI mission, the goals, and the adoption plan to achieve them. Organisations need to focus both on the vision and on the specific policies to address areas of concern such as code and tools. Clarity and openness can reduce concern and help keep the teams focussed on delivering value.

It is also important to identify the specific opportunities and risks around AI adoption. There are potential drawbacks identified in the report, such as reduced time on valuable work or the impact on delivery stability and throughput. There are also areas where decisions may not be opportunity-driven, such as including AI into products. Understanding how AI can be beneficial and also where it raises risks allows the organisation to support the learning process and to translate those learnings into action.


Jay Alphey


I help scaling tech organisations with systems and structures to achieve repeatable delivery.

If you want to discuss how I can help your organisation, DM me.

This article is also available on the AgilePlays Web site at https://agileplays.co.uk/dora-2024-usage-of-artificial-intelligence/

Nick Gardner

Data Science for Finance at ARM and Small Business Owner

Nice article. Great to see you’re well. I'd like to share my experience on “Why are developers using AI?”: Quite frankly, it just saves time and provides a 'checkpoint' when other expertise is unavailable. Experience informs the prompt given to the pre-trained transformer (usually the architecture for a large language model), which then sources a generic answer to the problem I’m working on. From there, I can refine the solution as needed. In my opinion, the model provides a response that represents the 'mean' of the distribution of possible answers. When you think about it, this makes sense given the self-attention mechanism. The answer itself isn’t necessarily optimal, but it takes care of the 'grunt work' (for lack of a better term).

Great article Jay Alphey with some very interesting insights. Have you come across anything similar in a broader operations context? How could or is AI being used across business operations to drive similar improvements and engagement?
