Building Customer Service Chatbots
Arte Merritt
Gen AI Strategic Partnerships & Marketing; Entrepreneur & Executive in Generative AI and Conversational AI
Advice from the experts
Customer service is one of the key use cases for automated chatbots. Enterprises are moving more to automated solutions, not just for reducing costs, but for providing better customer experiences.
I had the pleasure of speaking with three experts in customer service chatbots who provided valuable insights and tips for building automated digital assistants.
The panel included:
- Loren Lacy, Group Product Manager, Conversational AI Digital Platforms, Intuit
- Michael Haisten, Principal Consultant, Intelligent Self Service Solutions, Jacada
- Mira Lynn, Manager, Conversational AI, GoDaddy
Why build a chatbot?
Chatbots can be an integral component of a customer service strategy. The panelists highlighted the efficiencies of automated experiences, as well as the improvements to the overall customer journey. The goal is to help users get the information, or help, they need as quickly as possible, so they can get back to whatever they were doing.
Some users prefer self-serve
Our panelists agree that chatbots are not an either/or when it comes to customer service strategy - they are an “and” - an additional channel to provide to users. Some users prefer self-service through conversation, or even help articles, while others may want to interact with a person. It is important to provide users with options - the channels they prefer.
At GoDaddy, Mira found there were different use cases and behaviors in their chatbot versus their help-article search. She attributes this partly to the learned behavior of the different interfaces. Users of search tend to use keywords and truncated phrases, see what comes back, and keep modifying their queries based on the results. With conversational interfaces, however, users enter the whole statement of what is happening. They asked specific questions for which help articles did not yet exist. It came down to troubleshooting - users of the chatbot were troubleshooting, whereas users of the help search tended not to be. GoDaddy found some content is better suited to one channel than the other.
Gain efficiencies through automation
A benefit of automated chatbots is helping users get the answers they need in the quickest way possible. Rather than reading a help article, a chatbot can help walk a user through a solution, step by step, through questions and answers, in a natural way.
At Intuit, Loren’s team wants to go even beyond the step-by-step approach, and have the digital assistant take the action on the user’s behalf. If the digital assistant can complete the action, the customer is back up and running as quickly as possible.
When it comes to live-agent chat, the chatbot can help route the user to a person more efficiently as well. The chatbot can ask clarifying questions and pass along the context to the human agent. As Michael points out, this is a form of partial automation - collecting information that can be passed to the live agent to shorten the conversation.
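As a rough illustration of this kind of partial automation - not any particular vendor's implementation - the Python sketch below collects a few clarifying answers and attaches them to the hand-off. Names such as `HandoffContext`, `collect_context`, and `hand_off_to_agent` are hypothetical.

```python
from dataclasses import dataclass, field

# Minimal sketch of "partial automation": the bot collects a few clarifying
# answers up front, then hands the whole context to a live agent so the
# customer does not have to repeat themselves. All names are hypothetical.

@dataclass
class HandoffContext:
    customer_id: str
    topic: str
    answers: dict = field(default_factory=dict)

CLARIFYING_QUESTIONS = {
    "billing": ["Which invoice is this about?", "Is the charge a duplicate or an unknown amount?"],
    "login": ["Which email do you use to sign in?", "Do you see an error message?"],
}

def collect_context(customer_id: str, topic: str, ask) -> HandoffContext:
    """Ask the clarifying questions for a topic and record the answers.
    `ask` is whatever function sends a prompt to the user and returns a reply."""
    ctx = HandoffContext(customer_id=customer_id, topic=topic)
    for question in CLARIFYING_QUESTIONS.get(topic, []):
        ctx.answers[question] = ask(question)
    return ctx

def hand_off_to_agent(ctx: HandoffContext) -> None:
    """Placeholder for the live-chat transfer: the agent sees everything the
    bot already gathered, which shortens the human part of the conversation."""
    print(f"Transferring {ctx.customer_id} to an agent with context: {ctx.answers}")
```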
Getting started
Building conversational interfaces can be challenging. It is hard to know all the things a user may say, or how they may say them.
Leverage live-chat insights
A common strategy amongst the panelists to get started is to look at the live-chat channel, if you have one. Live chat would be the most analogous to an automated chatbot - in how users interact and the types of questions they may ask.
Live chat not only provides a large corpus of training data, but also insights into which use cases to handle. Michael recommends taking a phased approach: start with a few use cases that handle the bulk of the volume, and layer on more from there.
Intuit followed a similar approach by looking at their phone data. They found cases with high resolution rates and short handle times, without a lot of back-and-forth - things that were easier to automate - and started with those.
Afterwards, Intuit applied text analytics to the voice transcripts to figure out what customers were asking, and how they were phrasing their questions. They further augmented this data through crowd-sourcing tools - for example, asking “how would you ask for your tax refund status?” and seeing how people phrase these requests. Both Loren and Mira highlighted the importance of steps like these to remove internal bias, as the team may use terms, or phrases, the average user does not.
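A minimal sketch of this kind of phrase analysis, assuming the transcripts are already available as plain text (this is an illustration with toy data, not Intuit's actual pipeline):

```python
from sklearn.feature_extraction.text import CountVectorizer

# Illustrative only: surface the most common 2- and 3-word phrasings in a set
# of transcripts, to see how customers actually word their requests.
transcripts = [
    "where is my tax refund",
    "i want to check my refund status",
    "how do i check the status of my refund",
]

vectorizer = CountVectorizer(ngram_range=(2, 3), stop_words="english")
counts = vectorizer.fit_transform(transcripts)
totals = counts.sum(axis=0).A1  # total occurrences of each phrase

top = sorted(zip(vectorizer.get_feature_names_out(), totals),
             key=lambda pair: pair[1], reverse=True)[:10]
for phrase, n in top:
    print(f"{n:3d}  {phrase}")
```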
Mira also recommends talking with the live agents themselves as a resource - not only to find out the most common requests, but also to learn what the agents would like the chatbot to handle earlier, upstream, and pass along in the context.
After launch it is important to monitor the interactions and iterate to tune the model. Launching the chatbot is only the beginning. As Loren mentions, launch is “day zero.” It is important to look at the data - how users are interacting and how the chatbot is responding - and quickly fix the issues. Otherwise, people will lose confidence in the digital assistant and go elsewhere - including calling live-agent support, which defeats the purpose of the chatbot.
Supporting multiple channels
Choosing which channels to support is important. The panelists all mentioned being where the customers are, and offering users choice.
GoDaddy and Intuit started with text-based digital interfaces for their chatbots, as those tended to be easier, and expanded from there based on where users were interacting.
Michael also recommends looking at where your customers are and starting there, before expanding to other channels. He recommends having a unified conversational AI strategy that can be used across channels - for example, making use of a development framework that enables adding support for multiple channels in a phased approach. Sometimes he sees companies taking a siloed approach by channel, or department, which ends up resulting in a lot of rework.
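One common way to avoid that per-channel rework is to keep the conversation logic channel-agnostic and write thin adapters per channel. Here is a minimal sketch of that idea, with hypothetical names and no particular framework implied:

```python
from typing import Protocol

# Sketch of a unified strategy: one shared bot core, thin per-channel adapters.
# Names like BotCore and WebChatAdapter are illustrative, not a real framework.

class Channel(Protocol):
    def send(self, user_id: str, text: str) -> None: ...

class BotCore:
    """Channel-agnostic logic: map an utterance to a reply."""
    def handle(self, utterance: str) -> str:
        if "refund" in utterance.lower():
            return "Let me check on that refund for you."
        return "Can you tell me a bit more about what you need?"

class WebChatAdapter:
    def send(self, user_id: str, text: str) -> None:
        print(f"[web-chat -> {user_id}] {text}")

class SmsAdapter:
    def send(self, user_id: str, text: str) -> None:
        print(f"[sms -> {user_id}] {text}")

def on_message(core: BotCore, channel: Channel, user_id: str, utterance: str) -> None:
    channel.send(user_id, core.handle(utterance))

# The same core serves both channels; adding a new channel is one small adapter.
core = BotCore()
on_message(core, WebChatAdapter(), "u1", "Where is my refund?")
on_message(core, SmsAdapter(), "u1", "Where is my refund?")
```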
Both Intuit and GoDaddy have dedicated conversational AI teams. In the case of Intuit, the team is further broken down by product (i.e. Quickbooks and TurboTax) and channel. While the majority of the work is common, there are some pieces unique to each channel. For example, with the chatbot, users are in a logged-in state, so there is information known about them and exactly where they are in the product. With voice, the starting point is the user’s phone number; a lookup can be done, but there may be additional steps to authenticate the user before proceeding.
Context is key
Incorporating personalization and keeping track of context can be incredibly useful in helping customers.
Intuit incorporates personalization and context to provide a better experience. For example, if a user is having a problem connecting their bank account, and then clicks the digital assistant, the chatbot can start with, “I see that you had a problem connecting your bank information, is that what you are trying to get help on?” As Loren points out, making use of context instills more confidence in the chatbot.
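A minimal sketch of a context-aware greeting along these lines, assuming the product logs a "most recent event" per user (the event names and fields here are hypothetical):

```python
# Illustrative sketch: open the conversation with context from the user's most
# recent in-product event. Event names and fields are hypothetical.
RECENT_EVENTS = {
    "user-42": {"type": "bank_connection_failed", "bank": "Example Bank"},
}

GREETINGS = {
    "bank_connection_failed": (
        "I see you had a problem connecting your bank information. "
        "Is that what you'd like help with?"
    ),
}

def opening_message(user_id: str) -> str:
    event = RECENT_EVENTS.get(user_id)
    if event and event["type"] in GREETINGS:
        return GREETINGS[event["type"]]
    return "Hi! What can I help you with today?"

print(opening_message("user-42"))  # context-aware opener
print(opening_message("user-99"))  # generic opener when no context is known
```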
GoDaddy is similarly incorporating context. Mira recommends making use of any context about the user that might be helpful. While there may once have been trepidation about personalization seeming “creepy,” the industry is past that: it is all in the service of getting customers what they need - answering their question, or helping fix their issue before they even know it is broken.
Generating awareness
It is important to make it easy for users to find the automated chatbots, and to provide choice as well - between automated chat, live agent, and help documentation.
Both GoDaddy and Intuit offer the automated chatbot directly in the customer service path - through “contact us.”
In the case of GoDaddy, they start the user with the automated assistant, but still give the user the choice of a live agent, as it is important to let the customer decide. With IVR, the flow is a little different, as it is more difficult to offer options: the user starts with automation and is connected to a human only if they still need help.
Intuit similarly offers the chatbot through “contact us,” and provides a means for users to invoke help as well. They run experiments between their help documentation and digital assistant to figure out where to surface each, and which interface is performing better. If a user wants to call, Intuit uses a conversational framework to gather more information in order to route the user to the right person - rather than dropping them into IVR and having them re-answer questions.
Intuit also provides additional information to help the user decide - for example, showing the hours of operation, or the wait times for a live agent. It all goes back to giving the user options and letting them decide which is best for them.
Michael recommends the same approaches as GoDaddy and Intuit. An important piece he highlighted is to make it clear where the self-service chatbot is, so users do not have to hunt for it. For example, in one situation he saw, the company hid the chat button when agents were busy, which meant users could not use the self-service automated option either. If users do not see the chatbot, they may not come back looking for it later, as they may not know it even exists.
Educating users
Educating users on what the chatbot can, or cannot, do is important for providing a good user experience. Letting users know up front, as part of the welcome message, what the chatbot can do gives them a helpful guideline. Handling fallbacks gracefully is important as well, to help users get back on the “happy path.”
Intuit has an opening statement listing some of the things the digital assistant can do. They worked really hard on handling the fallback state as well - including integrating with their help system. If the chatbot does not know what to do, the utterance is sent to the help system, and the chatbot suggests some related articles that may be able to help. They incorporate a feedback mechanism as well, to see if the user found the articles helpful or not.
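A minimal sketch of such a fallback handler, where the help-search call is a stand-in for whatever help system you integrate with (the function names are hypothetical):

```python
# Sketch of a fallback handler: when no intent matches, send the utterance to
# help-article search and ask whether the suggestions helped. The search
# function below is a placeholder, not a real help-center API.

def search_help_articles(query: str, limit: int = 3) -> list:
    # Placeholder: in practice this would call your help-center search API.
    return [f"Article about '{query}' #{i + 1}" for i in range(limit)]

def handle_fallback(utterance: str) -> dict:
    articles = search_help_articles(utterance)
    return {
        "text": "I'm not sure I can do that directly, but these articles might help:",
        "articles": articles,
        # The feedback prompt drives the "was this helpful?" signal mentioned above.
        "feedback_prompt": "Did any of these answer your question?",
    }

print(handle_fallback("my 1099 import keeps failing"))
```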
In addition, Michael suggests adding buttons for some of the more common Intents in the greeting - to help serve as “guardrails” around what the chatbot can do.
To better handle error situations, GoDaddy has been experimenting with contextual fallbacks, instead of a default fallback. This way the chatbot can at least indicate it knows generally what the user is talking about, and may be able to ask additional questions to help get back on track, or perhaps even understand that the user is switching topics.
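A minimal sketch of a contextual fallback, assuming the bot tracks a rough "current topic" for the session (the topics and prompts here are illustrative):

```python
from typing import Optional

# Sketch of a contextual fallback: instead of one generic "I didn't get that,"
# acknowledge the topic the user seems to be on and ask a narrowing question.
CONTEXTUAL_FALLBACKS = {
    "domains": "It sounds like this is about your domain. Is it about renewing, transferring, or DNS settings?",
    "email": "It sounds like this is about email. Are you having trouble sending, receiving, or signing in?",
}
DEFAULT_FALLBACK = "Sorry, I didn't quite get that. Could you rephrase?"

def fallback_reply(current_topic: Optional[str]) -> str:
    return CONTEXTUAL_FALLBACKS.get(current_topic, DEFAULT_FALLBACK)

print(fallback_reply("domains"))  # topic-aware nudge
print(fallback_reply(None))       # generic fallback when there is no context
```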
Measuring success
Success can come in many forms. It may be helping users get past where they are stuck, get an answer quickly, or even get routed to a live agent efficiently.
There are two key metrics for Intuit: (1) the interaction rate - how often do people accept the offer to interact with the chatbot; and (2) the escalation rate - how often do users opt to talk to a live, expert specialist. Related to the second is containment - if the user does not escalate, that can be a good sign that the chatbot was able to help the user.
Intuit also keeps track of “do no harm” metrics. They could drive for super-high containment, but that may make people so frustrated that they cancel their service. They do not want containment at the expense of users' subscriptions or success in filing their taxes, or whatever the underlying product metrics are.
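As a rough sketch of how these rates might be computed from session logs (the field names are hypothetical, and containment should always be read alongside the "do no harm" product metrics):

```python
# Illustrative metric computation from session logs. Each record notes whether
# the chatbot was offered, whether the user engaged, and whether they escalated.
sessions = [
    {"offered": True, "engaged": True, "escalated": False},
    {"offered": True, "engaged": True, "escalated": True},
    {"offered": True, "engaged": False, "escalated": False},
]

offered = [s for s in sessions if s["offered"]]
engaged = [s for s in offered if s["engaged"]]
escalated = [s for s in engaged if s["escalated"]]

interaction_rate = len(engaged) / len(offered)    # accepted the offer to chat
escalation_rate = len(escalated) / len(engaged)   # asked for a live expert
containment_rate = 1 - escalation_rate            # stayed with the chatbot

# Pair these with product outcomes (cancellations, filing success) so that
# containment is never pursued at the user's expense.
print(f"interaction: {interaction_rate:.0%}, escalation: {escalation_rate:.0%}, "
      f"containment: {containment_rate:.0%}")
```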
GoDaddy is moving in a similar direction. They measure a helpfulness rating at the end of the content to understand what may not be working. It is an indicator of areas they may need to work on - in a way, they are looking at driving “unhelpfulness” down. They are also looking at the entire customer-service view to measure the chatbot’s role in the overall experience.
Michael pointed out an interesting additional metric - customer effort. For example, one of Jacada’s customers in the insurance industry measures and compares the average time the chatbot spent helping a customer resolve an issue versus the time for a live agent - with the goal that the automated experience will be shorter.
Impact of Covid-19
While the global Covid-19 pandemic affected our panelists’ companies in different ways, there was a common strong belief in the value of conversational interfaces.
As a solution provider, Jacada initially saw an increase in demand for customer service chatbots. Travel, insurance, retail, and healthcare companies were facing a crunch - they could not handle the volume and needed a way to automate. Given the short timeframes to launch, however, some of the experiences were less conversational and more menu based. The interesting impact, though, is how companies’ views opened to including automated, conversational solutions as part of their long-term plans.
Intuit already has a virtual workforce and did not suffer from lack of capacity to handle customer requests. They did, however, see a huge surge in Paycheck Protection Program (PPP) questions as they are a provider, and the digital and voice assistants were absolutely critical in helping users get information and their questions answered. Given the systems they have in place to handle crises, they were able to get the information and content in place quickly.
GoDaddy helps businesses get online. Given the global situation, they saw a huge volume of people wanting to do so. While the automated, conversational AI has been able to help users with that, they have not seen users become more likely to use the automated solution because of Covid-19. It has, however, changed the conversations internally at GoDaddy about the importance and value of automation for their customers.
The future: pro-active chatbots that complete tasks
There are interesting opportunities ahead for chatbots - including enabling digital assistants to be proactive, and to complete tasks, in addition to answering questions.
Both Intuit and GoDaddy are looking to expand the use of their automated digital assistants further into their products, and make them more proactive in understanding the customer journey, and reaching out when help is needed. They want to increase the automated functionality as well. Instead of telling users what to do, they want to enable the chatbot to solve the task for the user.
Michael sees advancements in machine learning helping make chatbot experiences better and easier to develop. One area in particular is using machine learning and automation to improve the feedback loop of the chatbot - understanding what users are saying and knowing what is needed to fix the response, without requiring as much human involvement - i.e. "automating the automation."
The chatbot space continues to grow and the underlying technologies are constantly improving. It is exciting to learn how the panelists are incorporating chatbots and where the industry is going in the future.