Are you "client focused"? Can you show it?
Nathan Hutchison
Helping APAC tech organisations deliver better value and experience with compassionate leadership and accountability
I can’t remember any organisation I’ve worked with ever saying they aren’t client focused, or that they don’t put the customer at the centre of everything they do. Whenever I’ve dug into it, though, there’s been little to no measurement to show this, with most of what is measured being quantitative metrics of team performance.
Typical measures:
- Response time > are we measuring whether the response we provide is useful and understandable? Do we have to continually revalidate because we didn’t capture the need the first time?
- Billable time > are clients more concerned with how much we bill them, or with getting good service, value and the outcomes they need?
- Resolution time > do they understand the resolution provided, and did it help them get their jobs done? Did we get it right the first time?
- Uptime > it may be up, but is the service or app fit for purpose and performing as needed?
- Project > it could be on time and on budget, but did it help them achieve the outcome they needed?
One way to address this is to couple experience level agreements (XLAs) with service level agreements (SLAs). Customer satisfaction (CSAT) is usually our starting point, but it is specific to a single service interaction, is often captured well after that interaction has finished, and generally has low uptake. Net promoter score (NPS) gathered from a stakeholder may carry bias from the stakeholder relationship, a bias that isn’t shared across the rest of either organisation.
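For reference, the standard formulas behind these scores are simple. A minimal sketch (the survey responses are invented for illustration):

```python
def nps(scores):
    """Net promoter score on a 0-10 scale:
    % promoters (9-10) minus % detractors (0-6)."""
    promoters = sum(1 for s in scores if s >= 9)
    detractors = sum(1 for s in scores if s <= 6)
    return round(100 * (promoters - detractors) / len(scores))

def csat(scores):
    """CSAT: share of respondents choosing the top ratings (4-5 on a 1-5 scale)."""
    return round(100 * sum(1 for s in scores if s >= 4) / len(scores))

# Hypothetical survey responses.
print(nps([10, 9, 8, 7, 6, 3]))  # 2 promoters, 2 detractors of 6 -> 0
print(csat([5, 4, 4, 2, 3]))     # 3 of 5 satisfied -> 60
```

Note that both compress a lot of nuance into one number, which is exactly why they miss the experience behind the interaction.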
Few seem to be using customer effort score (CES), which could help drill down into how the service level targets you’re measuring are actually perceived and experienced by end users. It may help identify where expectations don’t line up, or where we need to make improvements. Without understanding the experience, we’re leaving a lot to assumptions and risking the watermelon effect: metrics that look green on the outside while the actual experience is red on the inside.
Some examples could be:
- How responsive is our team? > Response time
- Are you happy with your system/app performance? > Uptime
- How likely are we to get it right the first time? > Resolution time
- Was the project successful? Are you adopting and using it, and do you know how? > Project/change
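Scored numerically, CES follow-up questions like the ones above reduce to a simple average. A minimal sketch, assuming the common 1–7 "how easy was it" scale (the responses are invented):

```python
def ces(scores):
    """Customer effort score: average agreement with a statement like
    'the team made it easy to resolve my issue', on a 1-7 scale (7 = very easy)."""
    return sum(scores) / len(scores)

responses = [7, 6, 5, 7, 2]  # hypothetical per-ticket follow-up answers
print(round(ces(responses), 1))  # 5.4
```

The low outlier (2) is where the value lies: paired with the matching SLA metric, it points at the interactions where the target was met but the experience wasn’t.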
Are you using experience level agreements (XLAs), and how are you capturing them? Do you have any examples of using XLAs? How have you found the uptake and response?
“You can’t improve what you don’t measure.” The key is getting the right measurements, not greenwashing or box-ticking, particularly in the assurance space. I’ve seen a lot of problems with the depth of what is being measured.