3 Things We Learned from Analysing Corporate Learning
Panda Training Oy | We Sensemake Learning | https://panda-training.com/

It's a good time of year to look back at what we've done so far and at what we've learned about learning after roughly six months of analysing training effectiveness. Perhaps unsurprisingly, the data is very revealing: it showed us that common sense isn't always common practice. Here are our top 3 insights from 2018!

(1) Relevance is king

Again and again we see a direct correlation between the relevance of a training to employees' work and the perceived learning. It's very understandable: if people feel that the topic is relevant to their work (or life), they are more likely to be motivated to learn, and subsequently more likely to apply the knowledge in their work. Unfortunately, checking the perceived relevance before sending everyone to the training isn't common practice, and we've encountered cases where up to 50% of the learners didn't feel the training was relevant to them. What are the implications, you may ask? In such cases we could claim that half the training budget is essentially wasted.

Upcoming cases will also allow us to connect relevance directly to the post-training learning that employees report 3-6 months after the training.

(2) NPS

NPS stands for Net Promoter Score, a management tool used to gauge the loyalty of a firm's customer relationships. If you've never heard of NPS before, the Wikipedia article is a good starting point. A couple of months ago we re-posted an article about NPS as a metric. Read it if you haven't yet; it's very insightful.

The NPS question itself is very confusing. It reads: "How likely are you to recommend this training to your colleague or friend?" To illustrate our point, here is what one of the learners told us in their feedback:

"I was confused about the scale in this question. I ended up answering 5 to this question. I have and will recommend the training for my colleagues and friends. However there are many people that I cannot or will not recommend it too, but I still feel that I will recommend the training to more people than what a 5 is worth."

The second big problem with NPS is the science behind it. Unlike other metrics, which are reported as means, NPS is calculated by subtracting the percentage of detractors (those answering 0-6) from the percentage of promoters (those answering 9 or 10), while passives (7s and 8s) are disregarded entirely. To see why this is a bad idea, imagine that in one group 10 people give a training a 6, while in another group 10 people give another training a 0. In the NPS universe there is literally no difference between those answers: both groups get the same NPS score.
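To make the arithmetic concrete, here is a minimal Python sketch (purely illustrative; the nps helper and the two made-up respondent groups below are our own example, not part of any actual survey tooling):

    def nps(scores):
        """Net Promoter Score: % promoters (9-10) minus % detractors (0-6).
        Passives (7-8) count toward the total but are otherwise ignored."""
        promoters = sum(1 for s in scores if s >= 9)
        detractors = sum(1 for s in scores if s <= 6)
        return 100 * (promoters - detractors) / len(scores)

    group_a = [6] * 10  # ten lukewarm respondents, all answering 6
    group_b = [0] * 10  # ten outright hostile respondents, all answering 0
    print(nps(group_a))  # -100.0
    print(nps(group_b))  # -100.0, identical despite very different groups

A simple mean would at least tell the two groups apart (6.0 versus 0.0); NPS collapses them into the same number.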

Another reason NPS is a bad metric is that the best research questions ask about past behaviour, not future behaviour. It's hard for people to say what they will do in the future, and quite often these kinds of questions are meaningless.

That being said, for now we have decided to keep NPS in the questionnaire because of how ubiquitous it is. In the future, however, we are likely to get rid of it: it's a metric that may seem to bring value to our clients while potentially doing harm.

(3) Unmet expectations

Another learning of ours is again very simple, but it may be useful for professionals to have proof of it: unmet expectations wreak havoc. A last-minute change of training content, when people's minds are already set on the original, leads participants to see every aspect of the training in a worse light than it deserves, even when the substitute is better than the original.

What this means is: either deliver what you promised or don't deliver at all. Or don't promise.

Bonus: IT trainers

We've observed a general consistency between a trainer's expertise and their skill at delivering that knowledge (how good an educator they are). However, that does not seem to hold in many IT trainings: participants often score the trainer's delivery of knowledge considerably lower than in other trainings, and lower than they score the expertise of IT trainers. This may say something about the IT training industry as a whole: where a hard skill is present, the educational side of the work, including presentation and facilitation skills, often fades away. It shouldn't, as learners expect a higher quality of knowledge transfer. The potential is huge and often easy to harness, since the practical aspects of IT are easy to practice and learners frequently praise learning-by-doing methods.

As you can see, many of our learnings are common sense, but they aren't necessarily common practice. It's one thing to be aware of them; it's another to know what works and what doesn't. That's exactly what we do at Panda Training. Get in touch if you want to know more.

Dima Syrotkin, CEO at Panda Training Oy


Chris James | Founder | Excel Trainer

Do you or your colleagues struggle to use Excel in an automated way? I help people develop their spreadsheet skills.

5y

"Common sense isn't always common practise" true in my experience with this. These insights are thought-provoking and important for us in the training word - to condider and re consider.

Stefana Sopco

Marketing Manager @PortXchange [B-Corp] | Decarbonizing Ports | 2x FortyUnder40 | Top 100 tank storage influencers | Women in Tech | WISTA NL | Role Model Diversity & Inclusion | Animal Lover

5y

Interesting article, Dima. We've also been working on a set of free resources and tools intended to support companies dealing with learning at scale. Yourlearningvalue is a test that gives you insights into your corporate learning approach in under 60 seconds! Unfortunately, the learning gaps in companies can affect their revenue in ways that sometimes offer no turning back. Let me know what you think about the test. www.yourlearningvalue.com
