Navigating the Path to CES Implementation: My Journey and Insights

In my previous article, I delved into Customer Satisfaction (CSat) as a crucial metric for measuring Customer Experience (CX). However, there's another significant metric organisations can consider: the Net Promoter Score (NPS), whose suitability varies with objectives, sector, and organisational stage. Alongside CSat and NPS, the Customer Effort Score (CES) is another commonly used CX metric.


Customer Effort Score (CES) is a metric derived from a customer satisfaction survey that measures how easy a product or service is for customers to use. A Customer Effort Score reflects the amount of effort a customer had to exert to use a product or service, find the information they needed, or get an issue resolved.


There are three different formulas, or methods, commonly used to calculate CES:

Option 1: Example: If 100 people responded to your Customer Effort Score survey, and the total sum of their scores amounts to 700, your CES score is 7 (out of 10).


Option 2: Example: If 100 people responded to a survey, 60 of them responded positively and the remaining 40 responded negatively. By subtracting the 40% negative answers (40/100 × 100) from the 60% positive answers, you get a CES score of 20%.


Option 3: Example: If you have opted for a 7-point scale and 60 respondents out of 100 have chosen a positive score (5, 6, or 7), then the CES will be 60% (60/100 × 100).

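The three formulas above can be sketched in a few lines of Python. This is an illustrative sketch only: the response data is made up, and the positive/negative threshold (scores of 5 and above on a 7-point scale) is an assumption taken from the Option 3 example.

```python
def ces_average(scores):
    """Option 1: mean score, reported out of the scale maximum."""
    return sum(scores) / len(scores)

def ces_net_percent(scores, positive_threshold):
    """Option 2: % positive responses minus % negative responses."""
    positive = sum(1 for s in scores if s >= positive_threshold)
    negative = len(scores) - positive
    return (positive - negative) / len(scores) * 100

def ces_positive_percent(scores, positive_threshold):
    """Option 3: % of respondents choosing a positive score."""
    positive = sum(1 for s in scores if s >= positive_threshold)
    return positive / len(scores) * 100

# 100 hypothetical responses on a 7-point scale: 60 positive (5-7), 40 negative.
responses = [7] * 20 + [6] * 20 + [5] * 20 + [3] * 25 + [2] * 15

print(ces_average(responses))               # mean score out of 7 -> 4.65
print(ces_net_percent(responses, 5))        # 60% - 40% -> 20.0
print(ces_positive_percent(responses, 5))   # -> 60.0
```

Note that the same responses yield very different-looking numbers under each method, which is why it matters to pick one formula and stick with it when tracking change over time.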

As for what constitutes a good CES score, it's essential to recognise that there's no universally accepted industry standard or benchmark. Rather than comparing against external figures, the emphasis should be on internal growth and improvement.


By conducting periodic surveys for the same tasks and analysing how scores evolve over time, you can gauge progress and aim for continual enhancement in customer experiences. Thus, your standing in CES scores is more about your journey of improvement than about meeting any external standard.
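In practice, "benchmarking against yourself" is just comparing each survey wave for the same task against an earlier baseline. A trivial sketch, with made-up quarterly averages:

```python
# Hypothetical average CES (Option 1 style, out of 7) per quarterly wave
# for the same task -- the numbers are illustrative, not real data.
waves = {"Q1": 4.2, "Q2": 4.6, "Q3": 5.1}

baseline = waves["Q1"]
for quarter, score in waves.items():
    change = score - baseline
    print(f"{quarter}: CES {score:.1f} ({change:+.1f} vs Q1)")
```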


Now the key question at this juncture is: how do I start this journey at the ground level? So today I am going to share my hands-on experience and the challenges I faced in implementing CES.


My journey in kickstarting the implementation of CES began with a focus on data management, particularly at the Call-To-Action (CTA) level. Recognizing that CES questions are typically triggered by specific actions, I delved into understanding how these CTAs and events were structured at the backend.


Diving into this revealed significant gaps in the existing management of CTAs and events. These included inconsistencies in data recording, inefficient tracking of customer interactions, and a lack of standardised procedures and nomenclature for event handling. Addressing these gaps became imperative, as they formed the fundamental groundwork required for setting up CES effectively. Consequently, I had to temporarily pause the CES initiative to concentrate on streamlining this aspect.


The process of addressing these gaps and streamlining CTA/event management became a pivotal narrative in itself, highlighting the intricate journey of preparing the groundwork for CES implementation. For a detailed account of this effort, you can explore the narrative provided here.

Once the CTAs and their respective nomenclature were streamlined, my CES journey started.

  1. Mapped merchant journeys across dashboards.
  2. Identified the relevant journeys for different products.
  3. Created 5-point CES questions for each CTA.
  4. Designed qualitative follow-up questions based on the CES score chosen.
  5. Included not only "how easy or difficult" questions but also questions on other aspects, such as usefulness.
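Steps 3 and 4 above can be pictured as a simple per-CTA configuration. This is a minimal sketch of one possible representation; every CTA name, event name, and follow-up wording here is a hypothetical illustration, not the author's actual setup.

```python
# One hypothetical CES question, tied to a CTA and its backend event.
ces_question = {
    "cta": "submit_settlement_report",         # hypothetical CTA name
    "event": "settlement_report_submitted",    # hypothetical backend event name
    "question": "How easy or difficult was it to submit your settlement report?",
    "scale": [1, 2, 3, 4, 5],                  # 5-point CES scale
    "follow_ups": {
        "low": "What made this difficult for you?",   # for scores 1-2
        "high": "What worked well for you?",          # for scores 4-5
    },
    "extra_aspects": ["usefulness"],           # beyond ease/difficulty
}

def follow_up_for(question, score):
    """Pick the qualitative follow-up based on the CES score chosen."""
    if score <= 2:
        return question["follow_ups"]["low"]
    if score >= 4:
        return question["follow_ups"]["high"]
    return None  # neutral scores get no follow-up in this sketch

print(follow_up_for(ces_question, 2))  # "What made this difficult for you?"
```

Keeping the CTA, the triggering event, the scale, and the follow-up logic together in one record is what makes the later program-management steps (approvals, launch dates, backend coordination) tractable.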


The entire journey of implementing CES revolves around the meticulous orchestration of survey creation and deployment, treating it not merely as a task but as a comprehensive program-management endeavour. These are the process and steps I went through:

  1. It starts with thorough documentation: The process begins with detailed documentation, outlining every aspect of the survey program. This includes Call-To-Action (CTA) items, associated event names, stakeholder responsibilities, and approval processes for survey questions.
  2. Then comes stakeholder engagement: Identifying stakeholders responsible for different aspects of the survey dashboard is crucial. Each stakeholder needs to be involved in the process, from question approval to the alignment of survey launch dates.
  3. Followed by Alignment on Dates and Updates: Ensuring alignment on survey launch dates and any updates regarding features is essential. This involves constant communication and coordination to avoid discrepancies and ensure timely execution.
  4. Then design the placements, i.e. the CTA after which, and the juncture at which, each survey question should appear: Deciding where survey questions appear matters both for backend reference and for user experience. It involves strategic coordination with the backend team so they can time the launch accordingly, and it ensures the right placement to gather relevant data without disrupting the user flow.
  5. Regular process alignment and checks: The entire process requires alignment at every stage, from data input to scheduling surveys according to stakeholder priorities. Regular checks are necessary to ensure data accuracy and program integrity.
  6. Being prepared for technical challenges and decision-making: Anticipating and addressing technical issues demands foresight and decisive action. This may involve troubleshooting technical glitches, optimising survey performance, and adapting to evolving requirements. The key here is to work from day one with a team that has visibility into, and clarity on, your vision, so that they understand the delicacy of the matter and can support you when needed.
  7. Above all, a participant-centric approach: Amidst the technical intricacies, your key role as a researcher is to maintain a participant-centric focus. Considering factors like timing, frequency, and question types ensures a positive respondent experience and enhances data quality.
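The participant-centric guards in step 7 (timing and frequency) often boil down to a small gate that runs when a CTA event fires. A hedged sketch, where the cooldown length and per-quarter cap are illustrative assumptions rather than recommended values:

```python
from datetime import datetime, timedelta

# Illustrative guardrails -- tune these to your own respondent base.
COOLDOWN = timedelta(days=30)   # don't re-survey the same merchant too soon
MAX_PER_QUARTER = 2             # cap total surveys per respondent per quarter

def should_show_survey(last_surveyed_at, surveys_this_quarter, now):
    """Return True if a CES survey may be shown for this event."""
    if surveys_this_quarter >= MAX_PER_QUARTER:
        return False  # frequency cap reached
    if last_surveyed_at is not None and now - last_surveyed_at < COOLDOWN:
        return False  # still within the cooldown window
    return True

now = datetime(2024, 6, 1)
print(should_show_survey(None, 0, now))                   # True: never surveyed
print(should_show_survey(datetime(2024, 5, 20), 1, now))  # False: within cooldown
```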


At its core, implementing a Customer Effort Score (CES) survey isn't just about crafting questions or formulas; it's a comprehensive process that spans initial planning, execution, and analysis. As a researcher, success hinges on clarity, decisiveness, and the ability to garner buy-in from stakeholders. This requires a skillful blend of strategic planning, clear communication, and an unwavering commitment to generating actionable insights.


Moreover, it necessitates the capacity to rally support from individuals across the organisation, even those outside one's direct reporting lines. In essence, it demands a holistic skill set that goes beyond traditional research abilities.


Reflecting on my own journey in implementing CES initiatives, I've found that this multifaceted approach is not only effective but essential for success. I hope these insights prove valuable as you navigate your own path in this realm.

