Scaling the Sprint Review
Luiz Quintela
Lean and Agile Consultant, Trainer, Product Manager, AI for Productivity, OCM
You are probably asking yourself why I decided to write about the Sprint Review. Well, I happened to be attending a review just two days before I wrote this. Besides, it is a well-known fact that I spend a lot of time speaking about product ownership topics; I did at least six webinars about them in 2019.
So, I wrote a bit about the Sprint Review for single teams and for groups of teams, based on my last webinar in 2019. Parts of this article may seem obvious to many, but I am often surprised by how not-obvious it is to so many people.
The Sprint Review event is the venue where the development team demonstrates the results of the Sprint and responds to questions from the audience.

The Product Owner explains which product backlog items have met the definition of done and which have not, and tracks the total work remaining toward a goal. The stakeholders and/or users provide feedback that may affect the product backlog.
The Sprint Review, when done right, is a great example of inspect and adapt. Feedback and other factors, such as ever-changing market conditions, may cause the Product Owner to add, delete, or change backlog items and their order of importance. In some cases, an empowered and competent Product Owner may even decide to pivot away from a product.
I recommend the following when I teach new Product Owners and Scrum Masters. The Sprint Review exists to gather feedback so you can inspect and adapt. Heavily technical demos, endless PowerPoint decks, and overly long events are on the avoid-at-all-costs list. They cause key people to skip the event, or to attend and merely pretend to be interested, and then you have a serious dysfunction.
That is why it is critical to run any demo from the perspective of the end users and/or stakeholders and to demonstrate business value. Yet I often see Sprint Reviews focusing on the output instead of the outcome. I was a developer for a long time, and it took me a while to understand that and to learn how to demo from the right perspective for the audience that must attend the Sprint Review.
The Product Owner, the development team, and the Scrum Master should take notes on anything and everything the audience offers as feedback. I cannot emphasize this enough.

They must combine and summarize their notes, and the Product Owner must then read the summary back to all present. The Product Owner must not rely on sending emails with the summary or a PowerPoint; odds are nobody will read them.
One of my best “war stories” was an organization of about 16 people, including a Scrum Master and a Product Owner, who would create beautiful PowerPoint presentations that almost nobody ever read. The same organization also had 74 primary metrics with a punitive focus! Speaking of waste creation and how not to run a development department…
Even when the demo is from the correct perspective, you must avoid controlled demos! Let the attendees actually “pilot” what you demo! You get a ton of valuable feedback when people interact with what you created instead of just looking at it.
The need for a demo that people can interact with was highlighted to me a few years ago when a CEO told me during a break that it was obvious that the work was not done, because it was being run from a developer’s laptop and he could not “play” with it. In fact it met the definition of done and was ready for deployment, but we had not considered how differently people perceive things than we do.
Before I get into scaling the Sprint Review, I would like to note a few more key dysfunctions that must not happen. A Product Owner who ignores feedback, or acts like he or she knows more than the end user, will cause havoc. Years ago, while mentoring and training a group of teams, I had the opportunity to witness one of those cases.

The teams were working on a new customer support system, and they did develop an awesome system that went through several Sprint Reviews.
Eventually it went live, and I got a message that a “rebellion” had started on the floor where the customer support teams were located. A few hours later someone told me that the system changed their workflow, and nobody had consulted them. The Chief Product Owner knew better than the users!
Every time I tell this story, I hear something like “You are making this stuff up!” or “Really?! Who does that?!”
Unfortunately, it is true and it happened. A great deal of time, people, and resources went to waste. It seems that some people are very good at creating waste instead of doing their best to get rid of it.

Understanding your users’ needs is key for a Product Owner. That is why at the very beginning of this text I mentioned competent and empowered!
You are probably curious why I have written this much without mentioning anything about how to scale the Sprint Review. Repeat after me: “If you can’t Scrum, you can’t scale it!”
Let us now discuss scaling the Sprint Review. I have another “war story.”
I was helping an organization with their scaling effort. They were doing relatively well, but after I attended their Sprint Review, I noticed a few things directly in line with what I just wrote about.
They had 12 teams in the group, and every team had 20 minutes to do their review. Even with short breaks, you end up with close to a five-hour event. Assuming the audience can survive that, there was the typical problem that if you give people 20 minutes, they will use all of it, and they will likely not use it well. Not to mention that one team would come, do their demo, and leave, then the next team, and so on.
The day after the review, I suggested to the Chief Product Owner that we cut each team’s time to 10 minutes. He was a tad hesitant, worried the teams would complain, so we went with 15 minutes for the next Sprint Review. Running experiments is critical to good Scrum, and fortunately I was working with a very competent Chief Product Owner.
Progress! We had just cut an hour out of it. Note that the Chief Product Owner did not tell the teams any of the things I mentioned above about how to demo. We just wanted to experiment and see what would happen by making the event shorter.
The following Sprint Review came, and we noticed that the average demo ran about 7 minutes, including a few questions from the audience. The next step was to limit the demo to 10 minutes per team. Not only did we do that, but we also told the teams that each demo had to “connect” their work to the other teams’. We did not tell them how to do it; we just proposed it.
The next Sprint Review took less than two hours, and the demos were great and from the right perspective. In a month we went from close to five hours to less than two! That allowed all teams to see the work as a whole and understand how things come together.
As I write this, the teams are improving their demos by recording a full system demo and making it available to anyone who missed the event. They are also working toward environments that allow people to “test drive” the potentially shippable product increment.
The group of teams not only cut the time spent on demos, they also shifted the focus to the end user’s journey through the system. A shorter event left time to discuss many other important things during the Sprint Review. The overall quality of the review improved, and so did the outcome. All of it because they realized that more output does not translate into better outcomes.
I emphasize that we never told the teams how to demo or how to coordinate their demos. We issued a “challenge” and they did a great job meeting it.
We just used Scrum basics: timeboxing, self-management, experimenting. Had we used a heavy-handed, prescriptive, “safe” (wink-wink) approach, we would have robbed the teams of the opportunity to communicate and collaborate. As they mature in their agile journey, we will continue to see improvement.