Who puts the V in your MVP?

We’ve been doing it since before the beginning.

In 1943 Donald Michie and Jack Good were working at Bletchley Park on a machine known as the Heath Robinson, after the cartoonist who drew outlandish contraptions. They were attempting to break the Lorenz cypher used by the German high command, an even greater challenge than the Enigma cypher broken by the team which included Alan Turing.

The machine was known as a Heath Robinson because of its complex and unlikely appearance: paper tapes running at high speed around an apparatus called a ‘bedstead’, connected to a maze of wires, mechanical relays and, crucially, a few electronic valves. Getting the Heath Robinson to work reliably was an enormous challenge, such a challenge that it led to the creation of Colossus, the first programmable electronic digital computer, by Tommy Flowers and his team.

The story of Colossus is fascinating, but there are still lessons to be learnt from the creation of the Heath Robinson. After the war, Michie wrote that when he and Good were first trying to get the machine to work, their supervisor, Max Newman, was under enormous pressure to show operational results. At the same time, they knew that the most valuable use of the machine was, counterintuitively, not to decrypt messages directly, but to discern statistical patterns within the coded traffic. Understanding such patterns would take time, but it would greatly speed up subsequent decryption.

Michie wrote that, ‘Once he had laid it down, Max Newman was not someone that in his senses a person would continue to oppose,’ and that while he and Good would ‘go through the motions’ during each day shift, ‘many evenings were spent in a clandestine ghost shift, with one or two volunteer Wrens and an engineer,’ resulting in ‘tabulations, statistical summaries and empirical rules,’ that made further success possible.

Most of our work does not have the same urgency and importance as that carried out by the pioneers of cryptography and computing at Bletchley Park. However, I think that many of us working in technology have been in a similar position to Michie and Good, where we are being pressured to show ‘operational results’ at the expense of building foundational capabilities into our system.

For example, reflect on the Minimum Viable Products (MVPs) that many teams are asked to produce as their first tangible deliverables. How viable are these products really? The project sponsors demanding such products typically expect viability to be demonstrated through visible features and user experience. The people building these products, by contrast, know that viability also depends on the ability to manage deployments, to run tests, and to remain stable in production for longer than the duration of a demo.

These other characteristics of viability are often neither visible to nor valued by non-technical sponsors, but are nevertheless built into the product by conscientious product teams. Often this work is not in the plan, but the team will do it anyway, partly because it is the right thing to do, and partly because they know how much work it will save them in the future. They work the equivalent of Michie’s clandestine ghost shift in order to make their products truly viable. (If you find a software engineer working late on a problem, I am willing to bet that it is more likely to be a piece of automation, a way to improve performance, or an improvement to code readability and efficiency than a new visible feature.)
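To make that concrete, here is the flavour of such ghost-shift work: a minimal post-deployment smoke test, sketched in Python. Everything in it (the service URL, the latency threshold, the retry count) is hypothetical and purely illustrative; it is one plausible shape for the invisible work, not a prescription.

```python
"""A post-deployment smoke test: the kind of 'invisible viability'
work that rarely appears on the plan. All names and thresholds are
hypothetical, for illustration only."""

import sys
import time
import urllib.request

HEALTH_URL = "https://staging.example.com/health"  # hypothetical endpoint
MAX_LATENCY_SECONDS = 2.0                          # illustrative threshold
RETRIES = 3


def check_health(url: str) -> float:
    """Hit the health endpoint once; return latency or raise on failure."""
    start = time.monotonic()
    with urllib.request.urlopen(url, timeout=MAX_LATENCY_SECONDS) as resp:
        if resp.status != 200:
            raise RuntimeError(f"unexpected status {resp.status}")
    return time.monotonic() - start


def main() -> int:
    # Retry a few times so a transient blip does not fail the deployment.
    for attempt in range(1, RETRIES + 1):
        try:
            latency = check_health(HEALTH_URL)
            print(f"healthy in {latency:.2f}s (attempt {attempt})")
            return 0
        except Exception as exc:
            print(f"attempt {attempt} failed: {exc}", file=sys.stderr)
            time.sleep(1)
    return 1  # a non-zero exit fails the pipeline stage that runs this


if __name__ == "__main__":
    sys.exit(main())
```

Wired into a deployment pipeline, a script like this will never feature in a sponsor’s demo, but it is precisely what allows the demo to keep working after everyone has gone home.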

Organisations which aim to be self-conscious innovators often set up ‘skunk works’ teams: teams of expert specialists who are allowed to operate with fewer constraints than other teams, in the hope that they will deliver new ideas and disruptive capability. Sometimes these teams succeed, sometimes they fail, and sometimes they merely demonstrate that with more resources, more attention and fewer constraints, you can go faster than you otherwise would.

I think that we should recognise the other form of skunk works: the unrecognised work that development teams do to make their products truly viable, despite the pressure to do otherwise. Furthermore, we should take the opportunity to teach project sponsors why these invisible features matter: why their outcomes (better user experience; codes broken faster) are dependent on attributes that they can’t see (working pipelines; statistical insights).

A report released after the war lists the challenges experienced by the Heath Robinson: slipping and broken tapes, malfunctioning relays and, right at the end, with elegant understatement, ‘Over emphasis on (temporarily meager) operational results at the expense of research work’. We should make sure that we cannot say the same, or anything equivalent, about our own projects.

(Views in this article are my own.)

Sukhvinder Aujla

Enterprise Architect at Workday

5 months

I used to work for Rational Software, which had the Rational Unified Process, an iterative process focused on proving the architecture early. That meant front-loaded cost/effort, which was always a much "discussed" point in comparison to an MVP, where architecture seemed to be secondary in favour of features.

Ricardo T.

Lead Product Manager | Speaker | Trainer

5 months

Great view. I often battle to explain that what works is not necessarily viable.

Hemant Patil

Architecting your data in the Cloud | Enterprise Blueprints | Part of Bain and Company | Chief Data Architect

5 months

What a superb (and for me, extremely topical) article, David Knott!! For some reason, the term "MVP" is ascribed to product development. But I think "viability", which in old-school terms would be called ITIL Alignment or Non-Functional design, is relevant to pretty much all software. If it is going to be mission critical (and yes, customer-facing products ARE that by their very nature), then it needs to be designed for what you call invisible viability. Keep 'em coming, David Knott!!

A great insight as usual. I am a massive fan of the MVP approach, but it often gets interpreted as a 'loose' specification for delivery, and that is clearly not the intent. It is about delivering usable, not perfect, as that delivers early value and gives the user the ability to better understand their next steps. I appreciate the 'skunkworks' value add, but at the same time it is not unusual for coders to over-engineer in order to future-proof. To use Nicola's analogy, the brief is to create a Model T but the engineer interprets that as a Rolls Royce, embedding effort and resources that may never benefit the objective or the financials of the project.

David Parsons

Data Architect

5 months

Great article as always. I would say that the saving grace of building software systems now, as opposed to say 20 years ago, is that there are recognised patterns in place to deliver the non-functional elements of a system (unit testing, CI/CD pipelines, high availability, scalability, security), and the major cloud providers offer robust implementations of these. An architect can pick the right components to deliver the hidden but essential elements of a system and then focus on the user-facing features. At least that's the theory!
