Can focus groups work for usability testing?
Best practice says that usability testing should be done one-to-one.
But let's be pragmatic: this isn’t always possible.
We were recently planning usability testing of a digital service prototype we'd built for one of our clients. But the client had already scheduled focus group sessions, and we had to fit our usability testing into that group format.
So, how do you conduct robust usability testing on a digital service with six participants, one room, and two facilitators?
How do you gather valid insights when you can't delve deeply into each user's individual experience?
In order to get the most valuable research for our client, we decided to 'lean in' to the focus group format rather than fight it.
In the space of three 40-minute sessions, we validated the IA, content, UI and discoverability of an entire digital service...
Here's how.
1. Leverage the power of group discussions
One thing group sessions have going for them is the dynamics of group discussion.
Granted, this can be a downside too. We've seen many sessions dominated by one or two overly vocal participants.
But when facilitated well, these discussions can be powerful.
We've run usability testing where participants have completed a task on a digital service prototype, then come together to discuss their experience.
Throw the prototype up on a projector and when a participant offers some feedback, walk through it with the group.
We've found this can spark participants' memories. Often others will offer feedback or insight into their experiences that they wouldn't have otherwise voiced.
2. Test your content
Use this simple highlighter method to test whether your content works for your users.
Content testing is a great activity for groups, as it can be completed individually and reflected upon afterwards.
This means that participants record their own responses to content before the group discussion influences them.
When using our content testing method, we find it's best if not every participant is looking at the same piece of content. In a group of six participants, we only test two or three pieces of content across the group.
This means you get to test more of your content, but also that participants don't feel that they're being compared to others in the group.
You don't want your users to feel like they're being tested on their comprehension skills! After all, they're testing the content writer's skills, not their own.
Card sorts are a great use of group settings too. And users love them! We find card-sort activities bring some nice high-energy vibes to the room if everyone is getting a bit quiet.
3. Bring all the devices
Aim to have a device for every participant in the group session, even (or especially) if it's their smartphone.
Even if you can't observe every participant's actions as they test your service/product, being able to have everyone participate and report back is valuable.
We have experimented with having a device for every two participants, asking one to test the product while the other observed, then swapping roles for the next task.
This worked okay, but it effectively outsourced the role of 'researcher' to the observing participant. This made some participants uncomfortable, as if they were 'testing' their partner.
It also relies on participants knowing what to watch for. For example, do they notice how long the user takes to find the link they're looking for? Are they watching for how the user interprets that particular menu icon? Are they following the user's eyes and noting behavioural nuances (like squinting)?
All in all, we found it easier to have each participant use a device and then embark on a facilitated group discussion, with the aid of a projector/screen.
4. Record EVERYTHING
(With the appropriate consents, of course.)
You may not be able to hold one-to-one facilitated sessions, but appropriate recording of your group sessions can go a long way.
And the best research is applied. Sharing what you've discovered is key to making sure that the product/service reflects the research findings. This is much easier when you've recorded everything!
If possible, record:
- audio of the whole session
- screen captures, or just photos, of key moments in each participant's journey with the product
We also found that having someone on hand to take notes throughout the session is a huge benefit. They can record key participant quotes and insights, but also help answer any questions. Assistance with the logistics of getting devices up and running is always a help too (no matter what we do, it's an issue!).
Compromise on method without compromising on results
So while we always recommend one-to-one usability testing and contextual inquiry, at the end of the day, pragmatism must win.
And in the case of this particular digital service, a research compromise didn't result in a service compromise. We managed to use the group discussion format to rapidly test the architecture, content, flow and usability of our prototype.
Have you ever conducted usability testing in a focus group session? What worked for you?