XM Weekly Inspiration | 12092022

Hello.

Welcome to a new week!

To retain existing customers and acquire new ones, businesses look for ways to improve, and one of the key channels for improvement inspiration is customer feedback.

Described as information coming directly from customers about the satisfaction or dissatisfaction they feel with a product or a service, customer feedback is extremely important because it provides a customer's perspective on what, why, where, when, who and how to:

  • Improve the quality of products or services
  • Improve the quality of people (profiles, skills & knowledge)
  • Improve the quality of processes & policies
  • Improve the quality of tools & systems

Whether the feedback received is an opinion, a complaint, an observation from a walkthrough, a product/service usability issue, a software bug or a product feature request, customer feedback is either direct or indirect, and either solicited or unsolicited. While there are several channels through which businesses receive feedback from customers, my experience management (XM) inspiration this week focuses on surveys, a channel that is quite popular with businesses seeking to gather specific information about how customers feel about their experience, conduct market research or gauge customer expectations.

Speaking of surveys, as a customer who has patronized quite a number of brands this year, I get requests to participate in surveys quite a bit, and I make it an obligation to share my thoughts with these businesses so they can improve (as applicable) and I don't have to switch providers.

From the email surveys I have checked out this year, I have observed mistakes that businesses make when designing and/or deploying customer feedback surveys. Interestingly, the emails I get are always nicely worded thanking me for my patronage and asking me to share my experience with them so they can improve. This gets me every time so I can't help myself. I am always more than happy to share information that will help businesses improve and stay in business.

After clicking on the link in the email, what next? Let's walk through my feedback survey expectations.

I expect the survey questions to be:

  • SPECIFIC i.e. targeted at an experience I had with a specific product, service or a touchpoint at a specific time
  • RELEVANT i.e. applicable to me as a customer - don't send me a survey that has nothing to do with me and/or my history; demonstrate customer understanding
  • TIMELY i.e. don't send me a survey 6 to 12 months after I buy from you because I wouldn't remember a thing
  • RATABLE i.e. make sure that I can read your questions, qualify my experience and rate my emotions or perception of my experience.
  • ACTIONABLE i.e. show me that you want more than routine single/multiple choice questions - this tells me you want more detail so you can act

With the above in mind, it's time to share with you my take on the common mistakes I have identified.


Common Customer Feedback Survey Mistakes Businesses Make

From my experience as a customer and a customer experience consulting practitioner working with B2C and B2B clients, there are 10 common mistakes businesses and professionals make when designing and deploying customer feedback surveys and they are listed below.

  1. Deploying a survey. Yes, deploying a survey in the first place is the first mistake. Do you really need to deploy a survey just for the sake of it, or are there other ways you can get feedback from your customers, especially when you typically record zero to low survey response rates? There are other ways to get customer feedback, such as ethnography, social media reviews, one-on-one interviews, live chats, online opinion sites, etc. At my company, we rely more on one-on-one interviews, especially after completing projects with B2B clients, so we are not just hearing what they are saying, we are seeing it as well and can tell the impact of our actions on their overall experience.
  2. Not having an objective. This happens all the time, most recently with my cable television provider. I was asked to rate my experience with customer service and, after that, asked why I didn't accept a promotional offer. Each survey should have one objective. In some cases, you can kill multiple birds with one stone by deploying an end-to-end CX survey that takes into cognizance your customer's entire experience, for example, a new customer's onboarding experience from sales to support, but it should still flow around one objective: getting feedback from your customer about his/her overall experience with your brand.
  3. Sending an irrelevant survey. I experience this mostly with banks. There was an instance when I was sent a survey requesting information about my most frequent transaction with the bank and my channel of preference, even though I had not used the account since I opened it 6 to 8 months before getting the survey. As a customer whose account was probably dormant, I should not have received this email survey in the first place.
  4. Asking for information you should already have. This is connected to common mistake 3: asking a customer to tell you their most frequent transactions, especially for a bank. You should have this information. If memory serves me right from my 10+ years of experience working in 3 banks, every bank can tell or has access to customer information, transaction types, etc. If any bank is unable to easily access information on a customer's preferred touchpoint, it's time to think about omnichannelizing (yes, I just had to create the word) the kind of experience you give your customers.
  5. Sending a survey to a customer with an unresolved query. Ever experienced this? That's the impact of routine surveys automated in CRM systems, and it has happened to me quite a number of times. Businesses I have patronized have a habit of sending me a feedback survey after I contact support, even when my issue has not been resolved and an agent has promised to get back to me. When this happens to me, I always give a poor rating, and this makes a mess of customer feedback data, especially when it forms a part of an agent's quality assurance (QA) performance appraisal.
  6. Beginning a feedback survey with an NPS question. This is a common mistake I also see global brands make. A business sends a nicely worded email, thanking me for my patronage and soliciting my feedback to help it improve and when I click on the link I read the question -- "how likely are you to recommend xxx to family and friends?" This tells me that the business is not interested in improving and is prioritizing its marketing efforts or focused on ticking its NPS box. What do I do when I read this as the 1st question? I close the survey.
  7. Frequency of the feedback survey. From a post-transaction perspective, this common mistake is about the number of times you ask a customer for feedback especially when the customer buys from you frequently. This is where some level of intelligence needs to be applied. If I make money transfers daily, do I need to give feedback on my fund transfer experience daily?
  8. Having too many questions. There is a difference between a study and a customer feedback survey. While the latter should have a limited number of questions, you would generally expect the former to include a lot more. This is why it is important to integrate customer feedback management into the applicable stage of a customer's journey and store the data, so that at the end of the day you don't need to ask a lot of questions - you can connect the dots by merging customer feedback data and operations data across all stages. For example, instead of asking a customer 15 to 20 questions 6 months after buying, using and sharing a product, you can combine feedback data from the customer's buying experience, support experience and loyalty data. This way, when you engage, you streamline and reduce the number of questions, focusing on the ones that will help you connect the dots.
  9. Having too many answer options to one question. Have you ever seen 7 to 10 answer options to a survey question asking you to rate your overall satisfaction? Well, I have, and this is another common mistake. Here is an example of answer options to an overall customer satisfaction rating question: extremely satisfied, very satisfied, satisfied, somewhat satisfied, fairly satisfied, neutral, dissatisfied, somewhat dissatisfied, fairly dissatisfied, very dissatisfied, extremely dissatisfied, etc. What this tells me is that you are trying to skew the answer in your favour. I once had a B2B client that used this type of answer option, and although their satisfaction scores were always high, repeat-sale performance was poor. After we streamlined the answer options - reducing them to 5 - we got a score that was more reflective of the state of CX in the business.
  10. Questions and answer options that are disconnected from a customer's experience. I have seen businesses design surveys from their own perspective and not from the customer's perspective #insideout. Survey questions are then so operational that customers have a hard time connecting the questions and answer options with what they experienced. One thing that features in this mistake is the use of jargon, and usually this is when I know that the person who designed the questions is far removed from my actual experience as a customer.
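Mistakes 5 and 7 above are really automation problems: a CRM fires a survey after every interaction with no guard rails. As a minimal illustrative sketch (not any specific CRM's API - the customer record fields here are hypothetical), a survey trigger could check for open tickets and a cooldown window before sending:

```python
from datetime import datetime, timedelta

def should_send_survey(customer, now=None):
    """Decide whether to trigger a post-interaction feedback survey.

    Guards against two of the mistakes above: surveying a customer
    with an open support ticket (mistake 5) and over-surveying a
    frequent customer (mistake 7).
    """
    now = now or datetime.utcnow()

    # Mistake 5: never survey while a query is still unresolved.
    if customer.get("open_tickets", 0) > 0:
        return False

    # Mistake 7: throttle to at most one survey per cooldown window,
    # no matter how often the customer transacts.
    cooldown = timedelta(days=30)
    last = customer.get("last_surveyed")
    if last is not None and now - last < cooldown:
        return False

    return True
```

The exact cooldown and fields would depend on your journey map and data model; the point is simply that the trigger applies some intelligence instead of firing unconditionally.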

In conclusion, making these mistakes negatively impacts the quality of feedback you get, which in turn impacts the quality of the improvements you make to your products, people, processes, policies, tools and systems. I will also add that the quality of a customer feedback survey reflects how well a business knows its customers and is aware of its customers' journeys.

There you have it! These are the 10 common customer feedback survey mistakes I have observed that businesses make.

Are you making any of these mistakes? I hope this article gives you a reason to stop making these mistakes and THINK DIFFERENTLY about how you manage customer feedback in your organization.


Have a good one!

deBBie akwara (Pan Africanism CX Advocate)

CEO, Niche Customer Experience Group

Helping Businesses in Africa Identify & Fix Gaps in Customer Experience to Increase Customer Profitability.

Gustavo Imhof

Author of Transformative Letter | Product @ TestGorilla | Career Coach with 250+ success stories | Senior Product Manager | Customer Discovery Expert | #1 International Best-Selling Author

2 years ago

I need to disagree with you on the one re starting with NPS on the survey. The mistake is actually NOT starting with your primary measure on your survey. In any survey ever, the cleanest and most reliable datapoint is the one collected first, followed by the second, third, etc., due to the spillover effect from previous questions. Let's say I decide I want NPS for my hospitality brand but follow the advice highlighted and put it last. The customer will be asked about cleanliness, service, quality of room service, amenities, etc. When the NPS question comes, it will be heavily weighted towards those parts of the experience because they will be top of mind. If you run analytics, you will conveniently see that you correctly predicted the drivers of the experience. Turn it on its head and start with the KPI. You will notice that those attributes have a far smaller predictive power. Why? You didn't condition them, and you have a much-closer-to-the-truth answer from them. The global brands that start their survey with CSAT, NPS or CES are in the minority that get it right and have more robust data to play with.
