Tips for Usability Documentation

Over the years of teaching usability classes and providing feedback on deliverables, I’ve amassed a list of actionable recommendations for improving participant screeners, usability discussion guides, and research reports. To put that list to good use, I’m sharing it here in the hope that some of you find it helpful. Enjoy!

Screener

  • Use clear formatting and indentation to make the screener scannable
  • The screener should start with an overview of the study, including the expectations for participants (a one-hour one-on-one interview, a two-hour focus group, a ten-day diary study, etc.)
  • Screening requirements should be as cut-and-dried as possible; judgment calls can lead to unqualified participants
  • Screening requirements should list the number of people desired per behavior/demographic
  • If you have too many behavioral requirements, some will have to become “nice-to-haves”
  • Provide a clear study schedule, with at least 15 minutes between sessions and a lunch break for the moderator (see the scheduling sketch after this list)
  • There should be a one-to-one correspondence between screening requirements and screening questions
  • Number the screening questions to facilitate conversation
  • Provide multiple-choice answers to screening questions wherever possible
  • Put instructions to the recruiter next to each answer choice
  • Mark open-ended questions as such; otherwise, recruiters may treat the answer choices as multiple choice
  • Instead of percentages, use hard numbers to indicate how many people you want; percentages can be misinterpreted
  • Determine a respondent’s technological savvy by asking about their frequency of online purchases and mobile app use
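
If it helps to see the schedule math worked out, below is a minimal Python sketch. The one-hour sessions, 15-minute buffers, 9:00 start, and hour-long lunch are placeholder values to swap out for your own study.

    from datetime import datetime, timedelta

    # Placeholder values; adjust to match your own sessions and moderator needs
    SESSION_MINUTES = 60   # one-hour one-on-one sessions
    BUFFER_MINUTES = 15    # minimum gap between sessions for notes and reset
    LUNCH_MINUTES = 60     # moderator lunch break

    def build_schedule(first_session="9:00", before_lunch=3, after_lunch=3):
        """Return (start, end) time strings for each session in one study day."""
        current = datetime.strptime(first_session, "%H:%M")
        schedule = []
        for block, count in (("morning", before_lunch), ("afternoon", after_lunch)):
            for _ in range(count):
                end = current + timedelta(minutes=SESSION_MINUTES)
                schedule.append((current.strftime("%H:%M"), end.strftime("%H:%M")))
                current = end + timedelta(minutes=BUFFER_MINUTES)
            if block == "morning":
                # Swap the last buffer for the lunch break
                current += timedelta(minutes=LUNCH_MINUTES - BUFFER_MINUTES)
        return schedule

    for i, (start, end) in enumerate(build_schedule(), start=1):
        print(f"Participant {i}: {start}-{end}")

With the defaults, that prints six one-hour sessions between 09:00 and 17:00 with a lunch break after the third, which makes it easy to sanity-check that a proposed schedule actually fits in a day.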

Discussion Guide

  • Rely on bulleted text over paragraph text to draw stakeholders into the content
  • Convey the study goals and methodology concisely and without UX jargon
  • The methodology section should quickly convey how a usability study works, since not all stakeholders will know
  • Number tasks and questions to facilitate conversation
  • Include the expected path and resulting page for each task so you can determine pass/fail
  • Make sure usability tasks don’t include the name of the link or button participants will use; keep them non-leading
  • Tasks should cover a single activity with a clear stopping point; split compound tasks into two
  • Softly convey tasks; for example, “You’ve been asked to…” instead of “You need to…”
  • Including a goal for each task facilitates conversation with stakeholders
  • Use the time after participants complete the task to ask a few relevant follow-up questions
  • Generic follow-up questions lead to generic answers; make questions specific to a topic
  • Mix quantitative Likert scales with qualitative open-ended questions (the task sketch after this list shows one way to combine them)
  • “Are you familiar with” can put participants on the defensive; consider instead “What is your understanding of…”
  • “How easy or difficult was the task?” or “What was your overall impression?” can be too open-ended for some participants; center participants by using Likert scales and probing on their answers
  • Don’t ask the same exact set of questions for all tasks – participants will get bored
  • Avoid yes/no questions; instead of asking if something is useful, ask what’s useful or not
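
To show how those pieces hang together, here is a sketch of a single discussion-guide task captured as structured data in Python. The scenario, paths, and questions are invented examples, not wording pulled from any real study.

    # One discussion-guide task as structured data. Every value below is a
    # made-up example; the point is that each numbered task travels with its
    # goal, non-leading scenario, expected path, pass/fail endpoint, Likert
    # scale, and topic-specific follow-ups.
    task_3 = {
        "number": 3,
        "goal": "Learn whether participants can locate order history unaided.",
        "scenario": "You've been asked to check on an order you placed last week.",
        "expected_path": ["Home", "Account menu", "Order history"],
        "success_page": "Order history",  # resulting page used to mark pass/fail
        "likert": "On a scale of 1 (very difficult) to 5 (very easy), how easy or difficult was it to find your order?",
        "follow_ups": [
            "What did you expect to see on this page that isn't here?",
            "Walk me through how you decided where to click first.",
        ],
    }

    # Print a moderator-friendly view of the task
    print(f"Task {task_3['number']}: {task_3['scenario']}")
    print(f"  Goal: {task_3['goal']}")
    print(f"  Expected path: {' > '.join(task_3['expected_path'])} (pass = reaching {task_3['success_page']})")
    print(f"  Scale question: {task_3['likert']}")
    for question in task_3["follow_ups"]:
        print(f"  Follow-up: {question}")

The same structure works just as well in a plain document; the value is in keeping every numbered task paired with its goal, expected path, and follow-up questions so the guide is easy to discuss with stakeholders.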

Report

  • Ensure the report can stand alone as documentation of the study; it should include study goals, methodology, participant screening requirements, and an explanation of what was studied
  • Convey what you learned at a high level in the exec summary; save the details and how you got there for later
  • Avoid harsh language in the report; part of our job as researchers is to soften the blow of receiving bad news
  • Use bold text to make important phrases stand out
  • Avoid passive voice (e.g., participants were asked, participants were able to, the task was accomplished, etc.); this skill takes years to hone, but it can really improve a person’s business writing
  • Synthesize the data and report in aggregate; don’t just report on what individual participants did or said
  • Report only on what you can observe; instead of “participants did not know,” write “participants expressed confusion about,” or similar
  • Generally, avoid reporting how many people (one or many) asked for something or surfaced a finding; focus on the findings themselves. Some stakeholders may want to see this data, though, so this suggestion will depend on your context
  • It’s best to pair findings and recommendations on the same page to facilitate discussion as you present the report
  • Include participant quotes to support findings
  • If using color-coded boxes and arrows to convey findings, include a legend at the beginning
  • Avoid crossing arrows; they add visual complexity
  • Never put full participant names in the report or attribute data to a named participant
  • Tables with Likert scale results can take up a lot of space; use graphs instead (see the charting sketch after this list)
  • Include a prioritized action item list at the end of the report
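
For the Likert point above, here is a rough matplotlib sketch of the kind of graph that can replace a results table. The task names and counts are made up purely to illustrate the format.

    import matplotlib.pyplot as plt

    # Hypothetical Likert results (participant counts per rating) for three tasks
    tasks = ["Task 1: Find order history", "Task 2: Update address", "Task 3: Apply promo code"]
    labels = ["Very difficult", "Difficult", "Neutral", "Easy", "Very easy"]
    counts = [
        [1, 2, 1, 3, 1],  # Task 1
        [0, 1, 2, 3, 2],  # Task 2
        [3, 2, 2, 1, 0],  # Task 3
    ]

    fig, ax = plt.subplots(figsize=(8, 3))
    left = [0] * len(tasks)
    for i, label in enumerate(labels):
        values = [row[i] for row in counts]
        ax.barh(tasks, values, left=left, label=label)
        left = [l + v for l, v in zip(left, values)]

    ax.set_xlabel("Number of participants")
    ax.invert_yaxis()  # keep Task 1 at the top
    ax.legend(ncol=len(labels), loc="upper center", bbox_to_anchor=(0.5, -0.35))
    fig.savefig("likert_results.png", dpi=150, bbox_inches="tight")

One stacked horizontal bar per task shows the full response distribution in a fraction of the space a table needs, and it reads at a glance while you walk stakeholders through the report.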

Rob Fitzgibbon

UX Design Strategist

10 months ago

Great stuff Dan!!

Jennifer Aldrich

Senior Content Designer at TikTok

10 months ago

Thanks for sharing this, Dan!
