Measuring Customer Satisfaction (CSAT)

A colleague was recently looking for insights on CSAT metrics and advice on calculations. Like so many things in business and in life, the answer is situational: first determine what you are trying to learn, and then how you will learn it.


For Customer Satisfaction (CSAT), there are two common ways an Enterprise Software Support organization measures it. One is a transactional survey specifically about Support, typically triggered upon case closure. The other is some variation on a semi-annual tell-us-about-our-company broad customer survey. I will start by discussing transactional Support-only post-case-closure surveys.


Side note: there are many methodologies companies use for similar feedback, including “Net Promoter Score (NPS)”, “Customer Effort Score”, “Gap analysis”, and others. All are valid for various purposes. This paper does not intend to say that CSAT is the best or only approach to get customer feedback. The discussion here is relevant to other similar practices.


A five-point scale

In my experience, a 5-point scale is appropriate. 1 (“Highly Dissatisfied”) and 2 (“Dissatisfied”) are negative scores (“DSAT”). 3 (“Neutral”) is OK, but you should always try to do better. 4 (“Somewhat Satisfied”) and 5 (“Very Satisfied”) are satisfactory. Build a scorecard of the results for your organization, count the number of 4s and 5s (satisfied customers), and divide by the number of responses to get your percentage of satisfied customers. Also take the average of the scores to learn your overall satisfaction rating. You might also want to measure "top box" (only 5s) for internal recognition.


Say you had 10 results, with scores of: one @ 1, one @ 3, two @ 4, and six @ 5. Your eight 4s and 5s put you at 80% customer satisfaction. Your overall satisfaction score is 4.2 (somewhat satisfied). And you have 60% “top box” – your most-satisfied customers.
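The scorecard math above is simple enough to automate. Here is a minimal sketch in Python (the function name and output fields are my own, not from any particular survey tool):

```python
from statistics import mean

def csat_scorecard(scores):
    """Summarize a batch of 1-5 CSAT responses."""
    n = len(scores)
    satisfied = sum(1 for s in scores if s >= 4)   # 4s and 5s
    top_box = sum(1 for s in scores if s == 5)     # 5s only
    return {
        "responses": n,
        "csat_pct": 100 * satisfied / n,
        "average": mean(scores),
        "top_box_pct": 100 * top_box / n,
    }

# The example above: one 1, one 3, two 4s, six 5s
print(csat_scorecard([1, 3, 4, 4, 5, 5, 5, 5, 5, 5]))
# → {'responses': 10, 'csat_pct': 80.0, 'average': 4.2, 'top_box_pct': 60.0}
```

The same function works whether you are scoring one agent's week or a whole organization's quarter; only the list of scores changes.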


What is a “good score”? Well, that depends a lot on your industry and your history. Whatever number you get when you start surveying, you can and should always try to learn from it and drive your scores up. If you have “80% satisfaction”, as above, drive for 85%, and if you are at 60%, drive for 70%.


Ask good questions

Here are some sample questions based on what my company has used. The first is the "overall" score, which we focus on; the rest give us insight into that number. We try to differentiate the Support experience from the Product experience: a bug may upset a customer, but we want to know how well we handled it. As with anything around surveys, forms, and reports, only ask and measure what you care about and will act on. The more questions you ask, the less likely you are to get responses. Though the example below is a little long, around five questions is usually appropriate.


  • How was your overall Support Experience? [1-5]
  • Primary reason for overall rating? (text)
  • TSE Professionalism [1-5]
  • TSE Technical Skills [1-5]
  • Ease of resolving my issue [1-5]
  • Quality of product [1-5]
  • Time to resolution [1-5]
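If you build your own survey tooling, it helps to keep the question list as data rather than hard-coded form fields, so questions can be added or dropped without code changes. A minimal sketch, with illustrative field names of my own choosing:

```python
# Each question: an id for reporting, the text shown, and an answer type.
SURVEY = [
    {"id": "overall",         "text": "How was your overall Support Experience?", "type": "scale"},
    {"id": "overall_reason",  "text": "Primary reason for overall rating?",       "type": "text"},
    {"id": "professionalism", "text": "TSE Professionalism",                      "type": "scale"},
    {"id": "tech_skills",     "text": "TSE Technical Skills",                     "type": "scale"},
    {"id": "ease",            "text": "Ease of resolving my issue",               "type": "scale"},
    {"id": "product",         "text": "Quality of product",                       "type": "scale"},
    {"id": "ttr",             "text": "Time to resolution",                       "type": "scale"},
]

def validate(response):
    """Check a response dict against the survey definition."""
    for q in SURVEY:
        ans = response.get(q["id"])
        if ans is None:
            continue  # unanswered questions are allowed
        if q["type"] == "scale" and ans not in range(1, 6):
            raise ValueError(f"{q['id']}: expected 1-5, got {ans!r}")

validate({"overall": 5, "overall_reason": "Fast, friendly fix"})
```

Keeping the definition in one place also means your scorecard reports and your survey form can never drift out of sync on question ids.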


Responding to results

What should you do when you receive a survey back from a customer? If a 4 or 5 (satisfied) comes in with no or few comments, it is good to log the scores and share them with internal staff, but there is usually no compelling reason to reach back out to the customer unless they were recovering from a "get well plan." If there are comments calling out something special somebody did, it is nice to contact the customer and thank them for the positive feedback.


A negative survey (1 or 2) should result in an investigation of the case in question, followed by customer outreach, especially if their reasoning is not clear or obvious. There are several reasons a customer might respond with a dissatisfied response:

  • Wrong button: it sounds funny, but I’ve seen it multiple times. A customer hit 1 when they meant 5. That’s because some surveys put the better scores on the far left, and others on the far right. Or, if there are just numbers without words or icons, it can get confusing as to worst vs. best. In these cases, if you can get confirmation of their intent and your systems allow it, it is OK to update the official record with the intended (higher) score.
  • Invalid response: I have had situations where my customer’s customer has had a problem and somehow found their way to my Support team. Our agent correctly and politely directed them back to our customer’s Support team. We subsequently received a negative CSAT survey because we “refused to help them and sent them away.” My personal belief is that a survey from somebody who is not entitled to Support should not be counted. However, you should still ensure that your agent was as diplomatic as possible, and you may want to still reach out to the end user to ensure they understand where to turn for future assistance.
  • Upset with product, not service: There are usually two reasons for a negative product rating. One is a customer who is upset with a bug or other service interruption. Not all incidents are resolved quickly enough, and not all bugs are addressed in a timely manner, if at all. Even bugs which get resolved may take time to properly investigate the root cause and verify the resolution. The other is a customer who dislikes a product design issue or is requesting a feature. Many companies may direct the customer to an “Ideas” web page to submit feature requests, but those are often perceived as a black hole. Depending on the value of the customer to your company (customers are often placed in tiers based on revenue or strategic value), this may require outreach from either the account team, product management, or an executive sponsor for the account. “Thank you for reporting this to us and sharing your views. We will let you know if or when this is included in a production release.”
  • Legitimately upset with service: This is your learning opportunity and why you are sending surveys in the first place. Is this a skills issue with the agent (soft skills or technical skills), or is it revealing a process or tools failure? Negative surveys are bad, but if you don’t use them to improve your service delivery then you have missed the main point of the exercise.
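The outreach rules above can be sketched as a simple triage function. This mirrors the guidance in this section; the action strings and parameter names are illustrative, and a real workflow would route these into your CRM or ticketing system:

```python
def triage(score, has_comments=False, on_get_well_plan=False):
    """Suggest a follow-up action for a returned survey (1-5 scale)."""
    if score <= 2:
        # Always investigate negatives first; outreach then sorts out the
        # cause (wrong button, invalid response, product vs. service, etc.)
        return "investigate case, then contact customer"
    if score >= 4:
        if on_get_well_plan:
            return "contact customer to confirm recovery"
        if has_comments:
            return "thank customer for the positive feedback"
        return "log score and share internally"
    return "log score; look for ways to move neutral to satisfied"  # score == 3

print(triage(5, has_comments=True))  # → thank customer for the positive feedback
print(triage(1))                     # → investigate case, then contact customer
```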


Broad surveys

For surveys like a semiannual survey of your full customer base, try to limit the number of questions (20-30 is a good target), and make sure each area (Support, Sales, Product, whatever else is important to you) has some representation. The guidelines around a five-point scale and how to measure each area are the same as for the transactional survey discussed earlier. For higher response rates, I've worked for companies that offered small rewards, such as gift cards or company swag, for responding. A critical factor here is that the company must take action based on the feedback received and, if possible, be visible about having done so.


Setting SLAs (Service Level Agreements)

SLAs are a contractual agreement between a vendor and a customer about the vendor’s performance. These tend to be around system availability, Support case metrics, and other performance factors. In some cases, there may be a penalty the vendor pays to the customer for failure to meet these criteria. I have written more on SLAs here: https://www.dhirubhai.net/pulse/slas-wish-lists-miles-goldstein/.


You should never set an SLA on customer feedback (i.e., CSAT) - it's way too easy for a customer to say, "I gave you a 1, therefore you owe me money." SLAs should be based on objective, measurable performance, such as system availability and case resolution time.


As for what you should set any SLAs to, I would start by looking at your history. If you are already able to respond to high-priority issues within an hour, resolve most cases within a day, and have 99.9% uptime, then those are good places to start. If you cannot do any of those things (yet), then be cautious committing to them contractually. And beware of penalty-based SLAs!


Note that while you should not base customer SLAs on CSAT numbers, you can use them for internal staff goals and recognition (see https://www.dhirubhai.net/pulse/setting-goals-support-staff-miles-goldstein).


End with a quote

There is an old saying, “You can please some of the people all of the time, you can please all of the people some of the time, but you can’t please all of the people all of the time” (attributed to John Lydgate and apparently used by President Lincoln). That may be so, but don’t let it stop you from trying.
