Guerrilla UX: Uncovering user insights with Agile Research Methods
Yvonne Doll
Leading Global Design Teams | AI Design | User Research | Mentor | Design Direction
I recently had the great pleasure of speaking at the Lesbians Who Tech & Allies conference in NYC. I spoke on a topic that is near and dear to my heart: Agile user research. I've implemented flexible, nimble research processes for a variety of organizations, from those with large research budgets to those with next to no research budget at all. The key issues that derail user research initiatives are much the same in both.
I think the puffer fish is a perfect analogy for agile user research. Often, the issues we expect to loom large in users' minds turn out to be minor, while the seemingly small concerns can inflate into much bigger problems than we anticipated. Agile user research helps us quickly identify and respond to these surprises, allowing us to adjust our approach accordingly. The puffer is also an expert at identifying and responding to risks instantly.
So what makes the puffer sad?
We're all familiar with the classic graphic of the product development cycle, where research is just one step. While the endless loop reminds us that a product is never truly finished, I envision it more like a pufferfish—constantly spiking with small, targeted micro-research studies throughout the development process. These micro-studies provide our development team with continuous, actionable insights, allowing us to reduce risk while keeping pace with ongoing development.
In user research, it’s essential to keep your RESEARCHER HAT ON when dealing with internal feedback. Your goal is to reduce confirmation bias and the false consensus effect while still valuing stakeholders' expertise. Treat their input as a valuable data point, but frame it in context: their perspective represents a small, unrepresentative slice of the audience, because they know far more about the product than typical users ever will.
Involving stakeholders directly in research helps shift their mindset from confirming biases to uncovering insights; hearing from users firsthand is far more impactful than hearing it secondhand. Use micro-studies to "show, don’t tell," and present findings to stakeholders in terms of KPIs, OKRs, and technical constraints so they see you as a valuable partner who keeps business goals at the core of the research.
The Eisenhower Matrix is a time-management tool that helps prioritize tasks by categorizing them into four quadrants based on urgency and importance, allowing you to focus on what truly matters. I've co-opted it as a decision-making matrix for how much or how little research to do in any given scenario. Going back to stakeholder buy-in, it's important to know when to be dogmatic about research and when to be pragmatic. Our goal is to never impede development unless we uncover a huge risk.
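As a loose illustration, the matrix can be as simple as a lookup table. This is a minimal sketch: the quadrant labels and recommendations below are my own placeholders, not part of the Eisenhower Matrix itself or a formal framework.

```python
# Illustrative only: a made-up mapping from (risk, urgency) to how much
# research effort to spend. Labels and recommendations are placeholders.
RESEARCH_EFFORT = {
    ("high", "blocking"):     "Pause: run a moderated micro-study before building",
    ("high", "not blocking"): "Schedule a rapid unmoderated test this sprint",
    ("low",  "blocking"):     "Lean on existing data and in-app feedback, ship, monitor",
    ("low",  "not blocking"): "Park it for a future blue-sky research project",
}

def recommend(risk: str, urgency: str) -> str:
    """Return a pragmatic research recommendation for a feature decision."""
    return RESEARCH_EFFORT.get((risk, urgency), "Clarify the risk before deciding")

print(recommend("high", "blocking"))
```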
When assessing the risk of any given feature, here are five key questions to ask yourself:
Data is a beast and language is a virus (bonus points if you get that reference), and if we are not data scientists it can get tricky to ensure we are acting on solid hypotheses based on data alone. Data tells us what, but not why. So what are we to do if we are researchers and don't have access to data scientists?
There are breadcrumbs in your data that can provide you with low-cost, low-risk hypotheses to test in production. Here are just a few to get you started:
For example, imagine that 3% of your audience frequently returns to their cart without visiting other pages on your site; this could indicate "price check" behavior. To test this hypothesis, you could send a discount to this segment. Another scenario is users adding items to their cart but never starting the checkout process; one hypothesis is that they’re using the cart as a "save for later" tool with no immediate intention to buy. In this case, consider adding a "save for later" feature on the product page (but avoid placing it on cart pages).
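If you want to pull segments like these out of raw event data yourself, here's a minimal sketch in Python. The file name, column names, and thresholds are all assumptions for illustration; swap in whatever your analytics export actually looks like.

```python
# A minimal sketch, assuming a hypothetical analytics export ("events.csv")
# with columns user_id, page_type ("cart", "product", "checkout", ...), and
# timestamp. Column names and thresholds are illustrative, not a real schema.
import pandas as pd

events = pd.read_csv("events.csv", parse_dates=["timestamp"])
events["is_cart"] = events["page_type"].eq("cart")
events["is_checkout"] = events["page_type"].eq("checkout")

per_user = events.groupby("user_id").agg(
    cart_visits=("is_cart", "sum"),           # how often they hit the cart
    total_visits=("page_type", "size"),       # all page views
    started_checkout=("is_checkout", "any"),  # did they ever reach checkout?
)
per_user["other_visits"] = per_user["total_visits"] - per_user["cart_visits"]

# Hypothesis 1: "price check" behavior -- repeated cart visits, little else.
price_checkers = per_user[(per_user["cart_visits"] >= 3) & (per_user["other_visits"] <= 2)]

# Hypothesis 2: cart as "save for later" -- items carted, checkout never started.
savers = per_user[(per_user["cart_visits"] >= 1) & ~per_user["started_checkout"]]

print(f"Price-check segment: {len(price_checkers) / len(per_user):.1%} of users")
print(f"Save-for-later segment: {len(savers) / len(per_user):.1%} of users")
```

Once you know how big each segment is, the discount test or "save for later" experiment above gives you a cheap way to confirm or kill the hypothesis in production.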
Traditional user research methods, whether moderated or unmoderated, often feel like trying to navigate a cruise ship down a narrow passage: slow, lumbering, and disruptive. Here are some ways to make these methods more agile.
• Micro-investigations: Test small features for quick, actionable insights.
• In-app feedback: Keep it contextual. Ask only about the feature the user is on or a task they just completed; users can only reliably recall the last few screens.
• Qualitative interviews: NO FISHING EXPEDITIONS! Only ask questions when you know how you will use the answer; if you are unsure how the data can be acted upon, hold that question for a deeper blue-sky research project. Focus on specific user actions or behaviors, limit sessions to 30 minutes, and clearly define the problem you are trying to solve.
• Rapid prototyping: Quickly test prototypes with users to validate designs before committing.
• Usability studies: Try "send and play": rapid remote tests with fewer participants, focused on high-impact issues. Send a script, ask users to record their screen while they walk through the tasks, and have them talk out loud as they go.
• AI tools like Grain, Dovetail, Thematic, or ChatGPT can expedite qualitative analysis. Beyond identifying themes, they can handle sentiment analysis, emotion detection, clustering and automatic tagging, natural language processing, and anomaly detection, saving your research team a lot of time (see the sketch below).
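If you don't have access to those tools, you can get a rough feel for what the clustering and auto-tagging step does with a few lines of off-the-shelf Python. This is an illustrative stand-in using TF-IDF and k-means, not how Grain, Dovetail, or Thematic actually work under the hood, and the snippets are invented.

```python
# Illustrative sketch: cluster interview verbatims into rough themes with
# TF-IDF + k-means. The snippets below are made-up examples; this is a
# stand-in for what dedicated research tools do, not their actual pipeline.
from sklearn.cluster import KMeans
from sklearn.feature_extraction.text import TfidfVectorizer

snippets = [
    "I kept going back to the cart to double-check the total price",
    "I wasn't sure whether my items would still be there tomorrow",
    "Checkout asked for way too much information up front",
    "I just wanted to save a few things to look at later",
    # ...more verbatims from interviews or in-app feedback
]

vectors = TfidfVectorizer(stop_words="english").fit_transform(snippets)
labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(vectors)

# Print each verbatim next to its theme label for a quick manual review.
for theme, text in sorted(zip(labels, snippets)):
    print(f"theme {theme}: {text}")
```

A human researcher still needs to name the themes and judge whether the clusters are meaningful; the automation only takes care of the grouping grunt work.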
Look at you, you made it to the bottom of the page! Huzzah! Thanks for reading. Would love to hear your comments below!