Keeping Analytics Solutions in Check with Customer Needs
Brian T. O'Neill
I help B2B AI and analytics product leaders remove sales and usability friction with UX design. | Host: Experiencing Data podcast | Founder: Data Product Leadership Community | MIT Sandbox startup advisor
AI and Machine Learning Are Not a Panacea for Underused Analytics Services
Ears, Eyes and Empathy Guide the Best MVPs
This article was originally published on Designing for Analytics.
Since AI, predictive, and prescriptive analytics are big right now, there is a tendency for companies to “want” to use this technology and sprinkle it into their marketing language as well. Boards and executives are worried they’ll be left in the dust if they don’t have a clear AI strategy. The reality is that advanced analytics investments have no value if the customers for whom the solution was intended can’t or won’t use the analytics to make decisions.
The people won't just come because you built it using the latest tech.
Putting aside “lab” projects, in which a data science or analytics group may be searching for a good business problem to solve from a pile of data: if you’re considering developing a new predictive capability to address a user need or business problem, you may be able to test it for potential utility, usability, and value without making a huge technology investment as your first step. You can put together a set of mocked-up designs (a prototype) based on a logical customer workflow and use it to inform how, or whether, to proceed.
Now, I am not a data scientist, and there may be other reasons to engage in advanced analytics or data science “practice” projects in your business such as developing a team, its domain knowledge, or its technical skillsets. However, this article assumes that the particular project in front of you came with some stakeholder expectation of ROI at the end.
The Power of Design Prototypes to Uncover Need
Let’s assume you’re taking a high-touch analytics workflow, which we will refer to as (W), and the business now wants to leverage machine learning to reduce the manual analysis process the customer currently uses. While you may not know all the steps your end user performs to get through (W) today, you know that your customer follows a “recipe” that they’ve been using for some time to evaluate the presented data and form an opinion—a prediction—about how they might take action in the future. You’ve heard this recipe is pretty involved and requires substantial “tool time” and eyeball analysis from the customer in order to form their own prediction/decision.
Your users also know this domain well and live in it every day. Nonetheless, your analytics team thinks it has the data and resources to make a prediction, derived from an advanced analytical method like machine learning, that would simplify the UX and allow the customer to make a more informed decision. On top of that, the team also believes a predictive analytics solution might be able to factor in new data points that, to date, the user has not been able to consider when going through (W).
So, is there value in jumping into the data science, engineering, and implementation effort to find a new and improved way to help the customer with (W)?
Maybe.
It depends on what you care about.
The Risks of Tech First, User Experience Second
Let’s look at some risks of jumping in without intimately knowing the customer’s problem space:
- Users may not adopt/engage with the solution, meaning the business doesn’t see outcomes from the technology investment (77% of the time, this is still a problem as of early 2019 - ouch)
- Building a solution to a problem that doesn’t exist, or isn’t that important in the eyes of the person using the service
- You may need a lot of dependent technology in place before you can even do predictive analytics (e.g., data engineering, data acquisition, architecture, etc.)
- Technical debt may be accrued that could slow down future iterations or, worse, become harder and harder to let go of given the sunk cost
- The business has less appetite to spend the same amount of resources on version two once version one finally goes out the door. As such, the pressure to get v1 right is higher.
There’s another way though.
Using Eyes, Ears, and a Design Prototype to Inform the Required Tech
Using the [totally free of cost!] body parts above your neck, you can spend a little time, and almost no money, interviewing your customer about their current process, problems, needs, and attitudes about the data they're working with. You can create a basic workflow document sometimes called a "journey map" to map out the users' recipe for (W) as they perform it today. Pay special attention to bookends: understand what requires them to fire up their browser/tool to use your current analytics, and on the back end, understand the exit points and conditions under which an outcome or decision is made. Entire articles can be written on these previous sentences, but the point is, you're not starting this analytics project with a technical implementation.
It starts with people.
Once you've gotten your head around the problem space, you can create a set of mockups (a design prototype) and use them to help define the problem space and uncover the true, latent needs of the customer. This is your first MVP. Show the users how this new predictive analytics-driven solution might work. Involve them in the creation. Use the designs not as a final solution, but as a means to help you discover what’s really needed, and what might not be. Don't forget those bookends. As a general rule, with regard to the visual fidelity of your prototype, my suggestion is this: the more data and information you need to show, the more relevant the visual fidelity becomes to the success of the final design. Don't let the fidelity of the prototype stop you from collecting information.
With one-on-one customer conversations and evaluations, and your low-cost design prototype in hand, let’s look at some of the benefits you might get by focusing on the UX up front. You can:
- Learn if customers value the predictions you are hoping to show. To do this, ask them to perform the relevant tasks comprising (W) using your MVP. “Does this information help you accomplish (W)? How so? Is any information missing? Is anything surprising, and if so, what?”
- Determine the conditions under which the customer would actually take action on the prediction, if that's the desired outcome. Practice active listening and observation here; you may learn whether there is additional information you need to provide before the customer will actually use the service. Your goal is to identify friction and problems that hinder them from taking action on the insights, not just to “inform the model.” Take special note that “gaps” in your design may be solvable without using the latest tools and tech. You may not need that time machine and flux capacitor just yet.
- Determine if the users trust the prediction, especially if it’s showing something “unusual” to somebody who’s been using a “tried and true” manual process for 20+ years and may know a lot more about the domain than you do. Do they need to know how the system generated it? If they’re suspicious, what type of evidence might they need to trust the system more? (Quick plug: stay tuned to my podcast Experiencing Data for upcoming episodes on UX and Explainable AI).
- Learn what the user would do next with a software-based prediction. Ask: “Where would you click/go next [in the tool][IRL]? Who would you tell? Would you take action right away? If not, why not and at what point do you have what you need to proceed?” You may also want to observe whether users try to validate the prediction using their recipe (esp. if your technology is proposing to replace this process).
Let’s re-iterate why we’re doing this: it’s all about designing for outcomes, not the outputs.
If your analytics team usually works in a silo and tends to see no “uptick” in user engagement after releasing new analytics features or capabilities, then chances are, you aren’t getting to know the customers and their problems well enough. You might be creating analytics and "doing data science," but you aren’t providing what they really need: decision support. And when there is no decision support occurring, then there is no positive business outcome or ROI.
Fortunately, there’s a better way to make analytics serve the people who are supposed to be leveraging them to make decisions. Let's look at a more concrete analytics scenario.
Scenario: Data-Driven or User-Driven Customer Retention Analytics?
Let’s assume you work at a company with one or more subscription-based businesses. You work on analytics and data science projects. Your analytics team believes that it would be possible to help your company’s retention team by providing them with an analytics tool that could generate a list of at-risk customers to call and “warm up” around the time of their annual renewal.
Now, this team is quite talented once they get on the phone, but currently, it takes a fair amount of manual CRM reporting, website analytics assessment, and email marketing analytics review for them to determine who they might want to call. In other words, while they trust it, their recipe for making a list is laborious, so they don’t spend a lot of time making calls each day.
Your analytics group—using the retention team’s recipe and your team’s access to additional customer data and technical skills—believes it could predict a much more accurate list of customers truly at risk of not renewing. However, you estimate it is going to take six months to wrangle the data, test the models, and develop a decent enough tool for the end user to try out. Unfortunately, the requisite data lives in five different systems spread across the business, but with the right engineering investment, it should be possible to get it all into one place so your team can start working on the modeling and generate a much more accurate call list.
So, is this project ready for your IT/engineering/data science group to implement?
For a design-driven organization, it might not be.
Why?
If the Customer and Business Problem Isn't Clear, the Analytics Will Fail to Deliver
Design-driven companies talk to stakeholders and customers. They ask a lot of questions and seek to elucidate the problem space. The design-driven company–through its time talking to the retention team and reviewing prototypes–has come to learn that the team’s biggest challenge right now isn’t the accuracy of who they are calling when they do manage to get on the phone. In this scenario:
- The business problem is the low volume of outbound calling—a tactic that the business knows works to retain customers.
- The UX problem is the high tool effort and analysis required to generate a list of names to call.
Is this an advanced analytics problem? Maybe not. The retention team HATES dealing with the CRM tools and website analytics, but they don’t mind making phone calls. They just need to be able to pull a report that doesn’t take them 25 clicks and five different data sources to cross reference.
To reiterate: the customer problem right now isn’t in the quality of the list of names; it’s the level of effort required to generate the list using their existing process and recipe. The outbound retention team is good at dealing with unhappy people on the phone, and helping those customers see the reason to keep paying for your products and services.
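To make that concrete, here is a minimal sketch of what a first, non-predictive iteration could look like, assuming the retention team’s recipe can be written down as simple rules over data the business already exports. The file names, columns, and thresholds below are hypothetical placeholders, not the team’s actual recipe.

```python
import pandas as pd

# Three hypothetical exports the team already cross-references by hand.
crm = pd.read_csv("crm_accounts.csv")        # account_id, renewal_date, support_tickets
usage = pd.read_csv("web_usage.csv")         # account_id, logins_last_90d
email = pd.read_csv("email_engagement.csv")  # account_id, opens_last_90d

accounts = crm.merge(usage, on="account_id").merge(email, on="account_id")

# The team's "recipe," written down as explicit rules instead of a model.
renewing_soon = pd.to_datetime(accounts["renewal_date"]) <= pd.Timestamp.today() + pd.Timedelta(days=60)
low_usage = accounts["logins_last_90d"] < 5
disengaged = accounts["opens_last_90d"] == 0
has_friction = accounts["support_tickets"] >= 3

# One report instead of 25 clicks: today's call list, sorted by renewal date.
call_list = accounts[renewing_soon & (low_usage | disengaged | has_friction)]
call_list.sort_values("renewal_date").to_csv("todays_call_list.csv", index=False)
```

A plain report like this, refreshed daily, removes the clicks and the cross-referencing without any modeling work; the predictive version can come later, if and when list quality (rather than effort) becomes the bottleneck.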
Had you also conducted some light research with the business stakeholders, you might have learned that the retention department’s VP wants their staff on the phone as much as possible (goal time), not in the analytics tools and CRM researching who to call (tool time). Even if the VP’s staff doesn’t always retain a customer, or finds out a customer wasn’t even at risk to begin with, the VP has stated that “time-on-phone” is a KPI for their retention team.
“Even if we’re not always talking to an unhappy customer, there is inherent value in our company doing periodic outbound calls to customers. This is time well spent.”
So, what are the meaningful differences in approaches here?
Design Puts the Actual Problems in Focus
In the design-driven strategy, the solution is oriented around addressing the customer pain and achieving a business outcome, without imposing a particular analytics solution too early.
You can hear the stakeholder in the background. “If you can help my team generate these lists of names more easily, they’ll get on the phone more often, leading to more customer retention in the short term. That’s a great win for the business, and a lot less effort for us too.” This team already has a strong idea of what data is required to pull a good list.
In the data-driven strategy, the team is favoring a particular implementation—predictive analytics in this case—before understanding the customer’s problem sufficiently. As such, the initiative may fail to bring an ROI to the business soon enough (or potentially, at all).
The analytics team is “trying” to use predictive analytics to create a “better list.” Their problem space is defined by “list quality.” But, that’s not the retention team’s problem. The retention team’s domain knowledge and recipe for identifying at-risk customers is just fine; it’s just very laborious to put into practice on a regular basis.
Remember: this project wasn’t about “practicing” advanced analytics techniques; there was an assumption that a business outcome—improved customer retention—would be realized.
Had you picked the design-driven strategy, you might have found that a particular skill set (e.g. “data science”) wasn’t even necessary to solve the first iteration of this problem. Yet this strategy seems more likely to produce an ROI and solve an actual customer problem, in less time. In fact, you may have accomplished several things with this strategy:
- You helped another department spend more time doing what they do best (being on the phone), enabling business outcomes sooner (more retention)
- You obtained useful domain knowledge from the horse’s mouth about their process for customer retention, and about what data may correlate highly with attrition, so you can inform future iterations that leverage more advanced analytical methods. When it's time to improve the quality of the list, your team has some good information on which to build (a starting point is sketched just after this list).
- The product/analytics/data science team has now shown value and has built trust in the organization with a new department. Instead of being an “analytics factory” or a lab for data science, they’re now known as “a group that gives me tools to do my job better.”
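When that future iteration does arrive, a minimal starting sketch might look like the following, assuming you have last year’s accounts with the recipe’s data points plus a 0/1 churned label; every column name here is a hypothetical stand-in. Before committing to a full modeling effort, you could simply check which of those data points actually move with attrition:

```python
import pandas as pd

# Hypothetical history: the recipe's data points plus whether each account churned (0/1).
history = pd.read_csv("last_years_accounts.csv")

candidate_features = ["logins_last_90d", "opens_last_90d", "support_tickets"]

# Simple correlation of each candidate data point with churn, strongest first.
correlations = history[candidate_features + ["churned"]].corr()["churned"].drop("churned")
print(correlations.sort_values(key=abs, ascending=False))
```

Even a crude pass like this tells your team which data to prioritize when it is time to build the more accurate, model-driven list.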
Concluding Thoughts
While the opportunities to leverage advanced analytics techniques such as machine learning are much more accessible today, make sure it’s the right tool for the job by understanding the problem space before you implement anything. Analytics are supposed to facilitate decision support and create positive business outcomes. Whether it’s predictive analytics, AI, or whatever the next hyped tech is, if the human in the loop can’t use it, won’t use it, or doesn’t trust it, then it doesn’t matter.
That voice in the baseball flick Field of Dreams said, “If you build it, they will come.” The recent numbers say otherwise: Gartner says, "80% of analytics insights will not deliver business outcomes through 2022," and "80% of AI projects will 'remain alchemy, run by wizards' through 2020."
On the other hand, consulting firm McKinsey reported in 2018 that design-driven companies are significantly outperforming their counterparts with 32% higher revenue growth and 56% higher total shareholder return in a study of 300 companies. These companies have learned that design is a strategic advantage that not only prioritizes user experience, but leads to better business outcomes.
If your data product or analytics group feels like it's contributing to the [ongoing] Gartner stats and you'd rather be seen in the McKinsey report, feel free to contact me about how I can help you use design to improve your analytics. You can also join my Insights mailing list for more articles like this.