How to Find Metrics for Your L&D Project
Jess Almlie
Learning & Performance Strategist. I help learning/talent leaders and teams stop taking orders and start working more strategically, intentionally, and with measurable impact.
This is the second in a series of weekly articles about measuring the impact of L&D. Be sure to subscribe to ensure you receive them all.
------------------------------------------------------------
Last week's article focused on Measure First, Design Second: the idea that we need to figure out which metrics we are trying to impact in the business before we start designing a learning experience. Ideally, those are business metrics that already exist.
But how do we find those metrics? Where do we even start?
We need to know or find the metrics that already exist in the business and then understand which apply to this particular learning experience. But sometimes, that can be a bit tricky.
If you don't already have access to metrics across the business, are working with a new team, or haven't established a business partnership that includes exchanging metrics with the stakeholder(s) involved in this project, you will need to start with some questions. Ideally, you would ask these questions in your first few meetings.
Start with questions about team metrics in general. In my experience, most stakeholders are not used to tying specific interventions to their metrics and cannot answer a point-blank question about the metrics they are working to impact. We need to guide them consultatively by asking about metrics in general before we get to specific learning outcomes.
Here are some questions to use with your stakeholder, starting with overall measurement questions, then narrowing to a specific learning intervention question, and inserting additional probing questions as needed along the way. Finally, there are a few bonus questions for scenarios where you encounter a metrics-savvy stakeholder. I use these questions in roughly this order.
1). Overall measurement question: How do you currently measure the performance of team members? Most teams, organizations, and companies have a way to measure the performance of their team members. Some are more detailed than others.
Here are a few examples that I have encountered over the years. Of course, the terms and metrics differ by team/organization, and this is not a comprehensive list.
Contact center agents: Talk time, speed of answer, idle time, quality scores (accuracy and customer service), ability to find answers quickly, number of support calls to supervisors, customer satisfaction surveys, etc.
Sales team members: Requests for proposal submitted, finalist presentations, sales closed, revenue generated.
Retail associates: Sales generated, number of items sold, customer satisfaction surveys, number of store credit cards opened, etc.
Other team members: Customer satisfaction (CSAT) scores, task turnaround time, employee retention, open requisitions, open service tickets, accuracy measures (accounting, data entry, etc.).
Leadership: Overall team measurement (the aggregate of their team's performance metrics), progress on strategic initiatives, employee retention, 360 feedback surveys, etc.
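A side note for those who like to keep things organized: once stakeholders start sharing metrics like these, it helps to capture them somewhere you can reference later. A shared spreadsheet is plenty; the sketch below is just one lightweight way to picture it, using hypothetical roles and metrics drawn from the examples above rather than any prescribed taxonomy.

```python
# A minimal sketch of a metrics inventory built from stakeholder conversations.
# The roles and metrics are illustrative examples only, not a standard list.
team_metrics = {
    "contact center agents": ["talk time", "speed of answer", "quality scores",
                              "customer satisfaction surveys"],
    "sales team": ["RFPs submitted", "finalist presentations",
                   "sales closed", "revenue generated"],
    "retail associates": ["sales generated", "items sold",
                          "store credit cards opened"],
}

# Quick reference when scoping a learning project for a given team.
for metric in team_metrics["sales team"]:
    print(metric)
```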
2). Overall measurement question: Using your performance management process, how do you know who to promote or how to assign raises? This question takes an even bigger-picture perspective, but it is one that most leaders need to consider regularly. The answer will tell you the type of performance that is rewarded most visibly on the team.
3). Specific learning intervention question: Regarding the learning project we are discussing, how do you envision your team performing differently after they complete it? Or, what will be different after completion? You can point back to the metrics the stakeholder shared earlier if they need a prompt. For example, "Do you envision this will increase quality scores?"
Many stakeholders may still note something vague, like "I just want them to communicate better." At this point, we ask, "How will communicating better make them a more successful team member?" or "How will communicating better impact customers?"
The answers to this question are key in tying the learning experience back to a tangible metric that already exists. If you don't get an answer right away, don't be discouraged! Sometimes this does take several layers of probing questions.
The remaining two questions are helpful, but answers can be difficult to get from stakeholders. Not because they don't want to help, but because most haven't thought about learning in this way before. They came to you because they have a business pain and want you to solve it. It is possible to create a learning program based on the measures you identify in the first three questions, which is why these last two are "bonus" questions. If you can get answers to them, you will be more likely to determine an assumed isolated impact of your learning experience. I haven't been able to do this very often, but I still have a dream!
4). BONUS question for learning scope: What are some of the other factors that impact performance, outside of team member skills and knowledge? Or, What could get in the way of this working? Here you are beginning to get a sense of the degree to which a learning intervention can make the desired changes. Factors such as missing or slow technology, resources, process efficiency, coaching from supervisors, and incentives/rewards often impact performance.
Even if you don't get an answer, this question can serve as another way to reinforce the narrative that a learning experience cannot solve all problems and perhaps plant a seed for later conversation or further performance analysis.
5). BONUS question to determine learning impact: Taking all factors into account that could impact <desired change> (e.g., better communication), what percentage do you think learning will be able to change? This is the ultimate question for determining the potential isolated impact of learning alone, and even if you get an answer, it likely won't be scientific. However, if you can agree on a percentage with the stakeholder, you can adjust any change in the metrics after the learning intervention by that percentage, acknowledging that other factors are contributing as well. This question works particularly well with strategic initiatives tied to larger projects in which many changes are being made at once and not all of them are tied to learning.
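To make that adjustment concrete, here is a minimal sketch of the arithmetic, assuming you and the stakeholder have agreed on a percentage. The function name, the metric, and the numbers are hypothetical, purely for illustration; the point is simply that the agreed share scales the observed change in the existing business metric.

```python
# A minimal sketch of the "agreed percentage" adjustment described above.
# All names and numbers are hypothetical illustrations.

def estimate_learning_impact(metric_before: float,
                             metric_after: float,
                             agreed_learning_share: float) -> float:
    """Return the portion of a metric change attributed to learning,
    using the share (0.0 to 1.0) agreed upon with the stakeholder."""
    total_change = metric_after - metric_before
    return total_change * agreed_learning_share

# Example: quality scores rose from 78 to 86 after the initiative, and the
# stakeholder agreed that roughly 40% of any change could be credited to learning.
attributed = estimate_learning_impact(78.0, 86.0, 0.40)
print(f"Change attributed to learning: {attributed:.1f} points")  # about 3.2 points
```

Even a rough number like this gives you something to report alongside the full change in the metric, as long as you are transparent that the percentage was an agreed assumption rather than a measured isolation.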
Finding metrics within the business can be difficult, especially if you haven't asked these questions previously. Stakeholders may be surprised and not prepared to share. However, if you begin asking these questions and gathering metrics regularly, they will become more accustomed to it. It's part of "training stakeholders to work with learning teams."
Once your work comes full circle and you can tie performance results back to your work, stakeholders will understand and be more willing to continue sharing. It's part of what establishes you as a trusted business advisor vs. an order-taker.
What happens if you ask these questions and the needed metrics simply do not exist? I've certainly run into this scenario as well. We will explore that topic next week. Stay tuned!
----------------------------------------
Want help determining your metrics or how to design with them in mind? My company, Learning Business Advisor Consulting, is here for you! Message me to talk about how we can work together to improve your impact!
Founder & CEO, Group 8 Security Solutions Inc. DBA Machine Learning Intelligence
8 months ago: I'm thankful for your post!
Lead Learning & Leadership Consultant
1 year ago: Just a brilliant series of articles Jess Almlie, M.S.! Thank you for sharing your knowledge :)
Senior Director of Learning Experience at SAP | Past President A2ATD (Ann Arbor Association for Talent Development)
1 year ago: This is fantastic, Jess! These questions are so simple but so profound. I'm saving this for future use!
Director of Learning and Development at Scott Humphrey
1 year ago: Thanks for sharing! One of the most challenging parts of our role in L&D is measurement, and these are some great questions to guide conversation with stakeholders to unpack what successful measurement can look like.
MSc HRO Candidate at LSE | Talent Strategy & Leadership Advisory @ Gartner | Change Management | Performance Consulting | Psychometrics & Assessment Centres | Ex-Deloitte
1 year ago: These are some great questions, Jess. For a recent project, I worked with stakeholders over several weeks to understand their current performance metrics and how they correlate to the learning activities we have on the program. In some cases there was a direct correlation; in others, our learning activities led to an indirect improvement in that metric, so we decided to track it anyway. Do you think a direct correlation is a necessity, or should indirect impact also be accounted for? I love the isolating impact part and will surely introduce it into my conversations moving forward :)