Testing Smarter with Michael Bolton
This interview with Michael Bolton is part of our series of “Testing Smarter with…” interviews. Our goal with these interviews is to highlight insights and experiences as told by many of the software testing field’s leading thinkers.
Michael Bolton is a consulting software tester and testing teacher who helps people solve testing problems that they didn’t realize they could solve. He is the co-author (with senior author James Bach) of Rapid Software Testing, a methodology and mindset for testing software expertly and credibly in uncertain conditions and under extreme time pressure.
Read the full interview here: Testing Smarter with Michael Bolton
Excerpts from the interview:
Hexawise: What testing concept(s) do you wish more software testers understood?
Michael: Oh dear. [laughs] That’s an awfully long list, and it varies from day to day.
I wish that more testers understood that test cases are not central to testing. To be an excellent tester means to explore, to experiment, to study, to model, to learn, to collaborate, to investigate. To discover. None of these activities, these performances, are captured in test cases. But testers keep talking about test cases as being central to the work, as though recipe books and written ingredient lists were the most important things about cooking.
I wish more testers understood that testing is not about “building confidence in the product”. That’s not our job at all. If you want confidence, go ask a developer or a marketer. It’s our job to find out what the product is and what it does, warts and all. To investigate. It’s our job to find out where confidence isn’t warranted; where there are problems in the product that threaten its value.
Hexawise: I see that you’ll be presenting at the StarEAST conference in May. What can you share with us about what you’ll be talking about, and what gave you the idea to talk about it?
Michael: I’ll be giving two tutorials. The first is about critical thinking for testers. We describe critical thinking like this: “thinking about thinking, with the intention of avoiding being fooled”. That’s central to our work as testers. Testers think critically about software to help our clients make informed decisions about whether the product they’ve got is the product they want.
Many people treat certain ideas as Grand Truths about testing. But many of the claims that people make—especially some of the testing tool vendors—are myths, or folklore, or simply invalid. People often say things that they haven’t thought about very deeply, and some of those things don’t stand up very well to critical scrutiny. That’s one of the reasons I started developing and teaching this class in 2008. I was dismayed that testers and other people in software development were accepting certain myths about testing unskeptically.
Not only that, but testers and teams allow themselves to be fooled by focusing on confirmation, rather than challenging the software. So, in the class, we talk about the ways in which words and models can fool us. In a safe environment, it’s okay—and even fun—to be fooled, and to figure out how not to be fooled quite so easily the next time.
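To make that contrast concrete, here is a minimal sketch, not from the interview: the cart function and both tests are invented for illustration, showing the difference between a check that merely confirms the happy path and one that challenges the product.

```python
# Toy function under test (invented for illustration).
def add_to_cart(cart, item, quantity):
    """Add `quantity` of `item` to a cart represented as a dict."""
    cart[item] = cart.get(item, 0) + quantity
    return cart

# Confirmatory check: repeats the developer's expectation on the happy path,
# so it can only tell us what we already believe.
def test_confirms_happy_path():
    assert add_to_cart({}, "book", 1) == {"book": 1}

# Challenging check: probes an input the spec is silent about. It fails
# against the toy implementation above, exposing a problem (a negative
# quantity is silently accepted) that the confirmatory check never looks for.
def test_challenges_the_product():
    cart = add_to_cart({}, "book", -3)
    assert cart["book"] >= 0, "cart accepted a negative quantity"
```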
On Tuesday, I’ll present a one-day class called “A Rapid Introduction to Rapid Software Testing.” RST (the methodology) is focused on the mindset and the skill set of the individual tester. It’s about focusing testing on the mission, rather than on bureaucracy and paperwork. We sometimes joke that RST (the class) is a three-day class in which we attempt to cover about nine days of material. [laughs] In “A Rapid Introduction to Rapid Software Testing”, I try to do the three-day class in one day. It is a rapid introduction, but we’ll be able to explore some of the central ideas.
Read the full interview here: Testing Smarter with Michael Bolton
Read previous interviews:
- Testing Smarter with Mike Bland
- Testing Smarter with Alan Page
- Testing Smarter with Dorothy Graham
- Testing Smarter with James Bach