I just won't shut up about BDD (spoiler: I love it)
Adrian Pickering
Anyone who has worked with me in recent years is probably quite aware that I am passionate about behaviour-driven software development. If you are too, then read no further. If you're not yet fully convinced, maybe some of these words will offer a little insight.
In computing terms, I'm quite old. I was programming in the early 1980s and ever since, not a week has gone by without me at least tinkering in code. Much of this is commercial - paid work - but a hefty amount is for the sheer pleasure of creating something.
I can remember how I churned out (mostly) games in my first couple of years. I followed a process called "What has just occurred to me". First, I would write the title screen. Maybe add scrolling credits (with just my name). Then I might craft the first screen, or maybe the player sprite - in those days a two-colour, eight-by-eight-pixel unanimated block.
Perhaps the music and sound effects came next.
I put all my up-front effort (and therefore enthusiasm) into the bits that were in the front of my mind. I coded to my immediate expectations. This process saw some very ropey reinterpretations of Pitfall! and Pac-Man and even the odd racing game.
This is not wholly unlike a little-known practice called "Bug-driven development" - not the BDD I am writing about today.
Briefly, bug-driven development declares everything a bug, including unwritten functionality, until it isn't any more. Let's say you wanted a hospital patient-processing application. Before you have an app for that, you have a bug: you can't see a list of patients. Each feature is treated like this until there are no more bugs to be found. A normal triage takes place and work gets delivered accordingly. To me, this is the very spirit of good agile - the most iterative means of delivering the highest immediate value. Even the name is quite logical: to a user there really is no difference between a defect and a missing feature, and spending energy attempting to classify and differentiate them may well be considered wasteful. Agile wasn't a thing in 1981, so I'm going to claim my seven- or eight-year-old brain's natural management approach was ahead of its time.
By around my third year of programming, I had been instructed that I was doing it all wrong: I should work out the game mechanics and all the boring bits first. Get the nitty-gritty out of the way. Understand how everything fits together, write it all down, and only then start to write the code. All of the difficulty would be over and done with before I wrote my first GOSUB. Programming wasn't quite as much fun, but more of what I started got finished. Actually, that's not entirely true - more of what I started to code got finished. I often gave up on the design work long before I started crafting the source. This, you may be aware, is a cousin of what we today call Waterfall: phased delivery of requirements, design, implementation and testing. Waterfall is a good fit sometimes, but it really isn't (and should not be) the default methodology any more. Iterative failures occur much sooner (and therefore less expensively) than failures in "horizontal" Waterfall projects. By this I mean that when a project is going to fail catastrophically anyway, you are much better off if it fails before all the money is spent. The aerospace and insurance industries, with thoroughly investigated, fixed requirements, seem to make the best job of this approach, but it can spell disaster for retail or marketing, where ideas may not be as tightly defined or unchanging.
Waterfall lived on in my world well into the start of this century, and projects failed often. Pretty much everyone in the industry anticipated that their efforts would be delivered late or would not even do what the customer really wanted. There are numerous reasons for this, but the most addressable is that mistakes accumulate: the sooner a problem can be found, the less has been built on top of it. Let's say somebody misunderstood the client's desires in some subtle but significant way and, even though it was listed in a bullet point on page 408 of a 700-page PDF, nobody picked up on it until Customer Acceptance Testing. The solution is rejected and the engineers start again from the very beginning. The erroneous interpretation was used to shape architectural decisions, which dictated technology choices, which drove recruitment, which affected implementation… It's all rubbish.
If this were at least an agile development exercise, the failure would likely have been picked up much sooner, when the functionality in question was first delivered. Product owners, if the chosen flavour of agile employed them, would hopefully have had sufficient comprehension for it not to have got that far, as long as the issue lay in the understanding and not in the articulation.
Behaviour-driven design (or behaviour-driven development, I really don't mind what the second D represents) offers a beautiful, elegant alternative.
Features are distilled into the behaviours they exhibit and these are formally articulated in a language called Gherkin. Gherkin is a particularly ingenious invention because the grammar helps us to structure our thinking but the vocabulary is whatever you want it to be. In short, paths through a system (or component parts of an application) are described consistently in terms of "under these circumstances, when we try doing this, we can see the outcome".
Here is an example from Wikipedia:
Given that a customer previously bought a black sweater from me
And I have three black sweaters in stock
When he returns the black sweater for a refund
Then I should have four black sweaters in stock
Programmers have been writing tests like that forever, only theirs look more like this, also taken from Wikipedia:
@Test
public void testSumPositiveNumbersOneAndOne() {
    Adder adder = new AdderImpl();
    assert(adder.add(1, 1) == 2);
}
There's a very important difference between the first test and the second: the first is inclusive - anyone in the business can read and critique it - while the second is a garbled mess to almost everyone outside the development team.
What makes Gherkin so useful is the way we are asked to author our scenarios: each line contains exactly one piece of pertinent information or action. This matters because the constraint structures your thinking and makes the behaviour straightforward to automate as a test. It becomes executable specification: programmers take the BDD documents (feature files) and turn them into exacting automated tests.
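To make that mapping concrete, here is a minimal sketch in plain Java - with no Cucumber or SpecFlow dependency, and a hypothetical Inventory class invented purely for illustration - of how the sweater scenario's Given/When/Then lines drive executable code:

```java
// A minimal plain-Java sketch of the sweater scenario.
// The Inventory class is hypothetical, invented for illustration.
class Inventory {
    private int blackSweaters;

    Inventory(int initialStock) {
        blackSweaters = initialStock;
    }

    void acceptReturn() {
        blackSweaters++; // a refunded sweater goes back into stock
    }

    int stock() {
        return blackSweaters;
    }
}

public class SweaterReturnTest {
    public static void main(String[] args) {
        // Given: a customer previously bought a black sweater from me
        // And: I have three black sweaters in stock
        Inventory inventory = new Inventory(3);

        // When: he returns the black sweater for a refund
        inventory.acceptReturn();

        // Then: I should have four black sweaters in stock
        if (inventory.stock() != 4) {
            throw new AssertionError("expected 4, got " + inventory.stock());
        }
        System.out.println("Stock after return: " + inventory.stock());
    }
}
```

In a real BDD toolchain, each Given/When/Then line would instead match an annotated step-definition method, so one piece of glue code is reused by every scenario that shares the same phrasing.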
This really is a Very Big Deal. Executable specification, if actually executed, offers so much more than PDFs and DOCXs. It never becomes obsolete - or at least, as soon as the solution and the specification diverge, the continuous integration pipeline, part of your developers' code-commit process, will sound an alarm telling you that something has failed, what that something is, and exactly how it failed. All of a sudden, your specification is proof both that the product has been built right and that the right product has been built.
For the developers, the homogeneity of BDD makes understanding the requirements easier, and tracking down edge cases and special circumstances all the less troublesome. What is more, particularly in Service-Oriented Architecture (SOA) applications, the lines of code behind the scenes that automate the Gherkin serve as worked examples of how to consume APIs or program against and extend the product.
The grammar of Gherkin also lends itself to edge-case discovery, meaning that product owners and business analysts are led towards describing behaviours that otherwise may not have occurred to them. Gherkin accepts parameters, or arguments, to permit heavy reuse of behaviours, and it's quite easy to create macros to generate all of the permutations. With relatively little effort (compared to traditional analysis and test approaches), it is possible to test and prove 100% of behaviours. In the real world, this happens mainly when creating services or APIs rather than when testing entire applications, although products like Selenium and Microsoft CodedUI let us plug our tests into user interfaces for true end-to-end BDD.
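For example, the sweater scenario above can be parameterised with Gherkin's Scenario Outline keyword, where each row of the Examples table runs as a separate test (the numbers here are purely illustrative):

```gherkin
Scenario Outline: Returned sweaters go back into stock
  Given I have <initial> black sweaters in stock
  When a customer returns <returned> black sweaters for a refund
  Then I should have <final> black sweaters in stock

  Examples:
    | initial | returned | final |
    | 3       | 1        | 4     |
    | 0       | 2        | 2     |
    | 10      | 5        | 15    |
```

This is how a handful of lines can cover a whole family of behaviours: one outline, many concrete scenarios.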
BDD isn't quite all things to all men, but it's as close as I've seen in the software development world:
- Product owners get validation that the implementation matches the requirements
- Business analysts have a mechanism to aid behaviour discovery
- Documentation consumed by programmers is more accessible and readable
- Programmers also make (and therefore offer) worked examples of APIs
- Testers enjoy extensive automation - reliable, repeatable, cheap-to-run tests that free them to concentrate on high-value activities
- Customers see a high-quality product that fits their needs
If you are not already practising BDD, I urge you to dip your toe in the welcoming, warm water today. It is probably the single best change you can make towards reliable software delivery.
Picture by https://unsplash.com/@jonathanpielmayer