Gyst Case Study - Customer Success Story #2
Customer Success Story - Healthcare


"We reduced caller input errors by 20 percent and increased engagement in the IVR by 36 percent”

- Director of Customer Service


The customer

This health insurer is committed to putting health first – for its teammates, its customers, and the company itself. Through its insurance services, it makes it easier for the millions of people it serves to achieve their best health – delivering the care and service they need, when they need it. These efforts are leading to a better quality of life for people with Medicare or Medicaid, families, individuals, military service personnel, and communities at large.


The problem

Calls coming into the contact center are directed to a voice self-service application that attempts to handle patient claims, provider benefit inquiries, and insurance premium quotes for both service providers and the individuals (members) insured under their plans. The application receives more than 10,000 calls per day.

Since members call the application relatively infrequently, they tend to be less skilled at navigating the call script and less inclined to learn how to use the IVR. Many will opt for a human agent as soon as they encounter reprompt/retry messages, when the IVR prompts become cognitively challenging, or when they feel self-service is becoming unproductive for them.

Providers, on the other hand, call the application several times per day and are generally calling for a specific, well-defined purpose such as a benefits coverage or claims inquiry. They know from past experience that the IVR is the fastest way to answer their questions and that dealing with an agent may actually take longer.

Part of the problem this customer faced was handling the wide range of caller types that makes up its daily call volume.


The solution

During the initial proof of concept (PoC) with the customer, we implemented our technology to determine what effect dynamically and automatically adjusting the audio playback rate of voice prompts in their IVR would have on voice self-service performance. We ran A/B tests on more than 20,000 phone calls over a one-week period and used Gyst Analytics to collect data on caller behavior as it relates to engagement within the voice application. Existing voice prompts were speed-adjusted in direct relation to individual caller skill.

During the trial, audio playback speed levels of 100, 110, 114, 117, and 119 percent were used; subsequently, we tried 106, 112, 115, 118, and 121 percent. A level of 100 indicates the normal playback rate of the audio, 110 represents 110 percent of normal, and so forth. Audio was adjusted in accordance with the detected skill level of each caller at each conversation turn in the voice application.
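To make the tiering concrete, here is a minimal sketch of how a per-caller playback rate might be selected at each conversation turn. The tier names, the mapping from tier to rate, and the function names are illustrative assumptions only; they are not Gyst's actual algorithm or API.

```python
# Illustrative sketch only: map a detected caller skill tier to a prompt playback rate.
# The tiers, thresholds, and rates below are assumptions for illustration.

# Playback rates from the second trial configuration (percent of normal speed).
RATE_BY_TIER = {
    1: 106,  # least skilled / least engaged callers: close to normal speed
    2: 112,
    3: 115,
    4: 118,
    5: 121,  # most skilled callers: fastest prompts
}

def playback_rate(skill_tier: int) -> int:
    """Return the prompt playback rate (percent) for a caller's skill tier."""
    return RATE_BY_TIER.get(skill_tier, 100)  # unknown tier: play at normal speed

def prompt_duration(original_seconds: float, skill_tier: int) -> float:
    """Estimate how long a speed-adjusted prompt takes to play."""
    return original_seconds * 100.0 / playback_rate(skill_tier)

# Example: a 12-second prompt played to a tier-4 caller.
print(playback_rate(4))                     # 118
print(round(prompt_duration(12.0, 4), 2))   # about 10.17 seconds
```

In a sketch like this, the skill tier would be re-evaluated at every conversation turn, so a caller who starts struggling can be stepped back toward normal speed mid-call.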

See the PoC results on over 20,000 phone calls on our web site here.


