Applying AI in UX research: context matters
Image: Plato's allegory of the cave (from Pavel Samsonov's UX Collective article, discussed below)


Practitioners praise some efficiency gains in process tasks, but are skeptical about the real value in analysis and insight gathering, despite the many marketing claims.

Human-centered research and design consultancies are increasingly asked by clients to do their work much faster, using AI tools (such as those listed by BOI, the Board of Innovation) as shortcuts.

Does this approach make sense at all? Let's zoom in on UX research.

Survey of UX researchers shows efficiency gains and low trust

In August 2023, the US-based company User Interviews conducted an online survey of 1,093 UX researchers (mainly US-based, we presume, although this is not clarified) on how they use AI in their work. The survey explored which AI tools researchers are using and what aspects of the research process they are automating, as well as ethical concerns, benefits and shortcomings.

77.1% of the researchers queried are using AI in at least some of their work, but nearly all of that use is for process activities: scheduling, screening, recruiting, transcription, translation, and editorial support.

Efficiency gains (and time savings) are considerable. Telling clients that they use AI also, somewhat bizarrely, improved the researchers' professional credibility.

But few UX researchers use AI for insight gathering and analysis itself, and those who tried were struck by "the low-quality, inaccurate, surface-level, or incomplete outputs". Even qualitative coding produced poor results, the survey showed.

"Many people also said that AI needs so much human review (described by one respondent as “hand holding”) to catch errors that its limitations undermine any efficiency gains."

Ethical concerns such as biased outputs and negative socioeconomic outcomes (e.g. discrimination and unemployment) were also a recurring theme in the responses.

In general, the researchers in the User Interviews survey seemed to have very low trust in AI, and many wished AI solution providers were more transparent about their data protection policies (or lack thereof).

What do industry experts say?

First a caveat: there are a lot of marketing pieces and practical guidance articles. They explain what AI-powered tools could do for your company, and usually end with some considerations and concerns to take into account. We will not dwell on them, as they are easy to find and not particularly insightful.

In a well-written and in-depth piece from April 2023, UX strategist Greg Nudelman analyses the value of AI for UX research and sees four dimensions in how UX techniques will be affected by AI: (1) those that will likely see full automation, (2) those that will be radically augmented, (3) those that will become increasingly valuable, and (4) those he files under "AI bullsh*t", to which he devotes an entire section.

He argues that there are certain UX skills that AI will have a hard time understanding and simulating, and that these will actually increase in value: core skills (aka dealing with humans); workshop facilitation; formative research, field studies, ethnography, and direct observation; vision prototyping; and augmenting the executive strategy.

The bullshit section zeroes in on "AI applications to UX research that are far-fetched, oversold, and over-complicated, or just fail to grasp the rudimentary principles of UX Design", such as: AI strategic analysis tools that claim to replace humans in coming up with novel ideas and business use cases; AI heuristics analysis replacing user research and design; AI acting as users for the purposes of usability research; and tools that claim to build your personas using AI.

In their July 2023 article, Feifei Liu and Kate Moran of the Nielsen Norman Group focus on the issues and limitations of AI-powered tools for UX research, stressing the importance of a skeptical approach. In particular, they present the results of their evaluation of four new AI-powered UX-research tools: three AI insight generators, which summarize user-research sessions based solely on the transcripts of those sessions and suffer from the fact that they cannot take in context, and one collaborator tool, which acts like an insight generator except that it can accept some contextual information from the researcher, but, according to Liu and Moran, still has similar limitations.

The limitations were many: the tools can't process visual input, often give extremely vague summaries and recommendations, have a very limited understanding of context, lack citation and validation, suffer from unstable performance and usability issues, and exhibit bias.

If you prefer something more polemical, consider the savory and very recent UX Collective piece by Pavel Samsonov, entitled No, AI user research is not “better than nothing”—it’s much worse (from which we copied the image above), in which he argues that algorithm-driven design will lose companies a lot of money.

Context matters.


See also our previous article Human-centered design beyond the mere "what customers want"
