Taking humanity out of the equality equation

Unconscious biases are always going to be present when undertaking a recruitment process. What can be done to ensure fair representation and broaden diversity?

Deep-rooted biases are hard to disregard

It’s all well and good to say that coaching individuals on their unconscious biases can help achieve a more equal and balanced workforce. Unfortunately, as humans we always have the power of choice, and that includes acting on our own biases, however minor they might be.

This happens a lot in recruitment: one bad experience with an individual can colour how you perceive similar candidates in the future. This is an example of experience-based learning.

Unconscious bias isn’t present in babies; through associative learning, it develops between the ages of 17 months and 6 years. It then becomes an integral part of our personalities and shapes our beliefs.

This got me thinking: if these biases are ingrained over a lifetime, how long would it take to unlearn them? As a society we generally don’t want to wait for change, so what are some quick wins that broaden diversity without compromising fairness?

Taking steps to overcome biased processes

Whilst researching this topic I came across a field where diversity is a real issue: Physics. A case study carried out in Ireland revealed that Physics is male-dominated, and its fellowship awards reflect this, with a split of 75% men to 25% women. The scientific community wanted to address the imbalance without affecting equality or fairness, and came up with a brilliant idea that removes human bias from the equation completely.

“Removes human bias from the equation completely”

Each university in Ireland is allowed to submit 6 students to apply for a fellowship in Physics, but few female applicants were coming through, and of those who did apply, men were predominantly successful. So how do you maintain male applications whilst increasing female applications, and guarantee a fair assessment of every application without bias toward either sex?

At this point you have 2 options:

Option 1) Split the existing pool, allowing 3 women and 3 men to apply. This might look fair, but it halves the number of men who get the opportunity to apply, which risks pushing male applicants away from the field altogether!

Option 2) Keep the number of male applicant slots at 6, so their likelihood of applying successfully is unchanged, and add 6 new slots available only to female applicants.

Option 2 guarantees equal opportunities for both sexes without negatively affecting fairness for male applicants.

The universities took this further and removed the judges’ ability to act on their biases by assessing all applications blind, awarding the fellowships on merit alone. This simple change made the entire process fair and equal, and resulted in a 50–50 split of fellowships awarded between men and women. Everyone wins, and no one is better or worse off than anyone else.
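The blind-assessment step can be sketched in a few lines of code. This is a minimal illustration, assuming each application is a simple record; the field names below are my own invention, not anything from the actual Irish fellowship process:

```python
# Sketch of blind review: strip identity fields so judges score on merit alone.
# Field names ("name", "gender", etc.) are illustrative assumptions.

IDENTITY_FIELDS = {"name", "gender", "age", "photo"}

def anonymise(application: dict) -> dict:
    """Return a copy of the application with identity fields removed."""
    return {k: v for k, v in application.items() if k not in IDENTITY_FIELDS}

applications = [
    {"name": "A. Smith", "gender": "F", "publications": 4, "statement": "..."},
    {"name": "B. Jones", "gender": "M", "publications": 2, "statement": "..."},
]

# Judges only ever see the anonymised pool.
blind_pool = [anonymise(app) for app in applications]
```

The key design point is that anonymisation happens before anything reaches a reviewer, so there is simply no identity information available to be biased about.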

Now this is a very niche setting, but I believe it has real-world applications.

An opportunity for Artificial Intelligence to prove its worth within recruitment

For volume hiring in recruitment campaigns, and especially at graduate level where companies set a fixed number of application slots, this approach would start delivering fairness and equality at grass-roots level and support recruiting purely on merit.

This is also an area where applying AI within a recruitment process could aid fairness and equality by removing the human factor from the “application sifting”. AI-based algorithms could sift through applications and deliver an interview shortlist without a human, and their potential bias, influencing the decision.
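As a hedged sketch of what such automated sifting might look like, assuming already-anonymised, structured application records and an entirely made-up scoring rule (real systems use far richer models than this):

```python
# Illustrative "application sifting": rank anonymised applications by a
# merit score and return the top-k shortlist. The features and weights
# below are assumptions for the sketch, not any real vendor's algorithm.

def merit_score(app: dict) -> float:
    # Reads only merit-relevant fields; identity fields are never consulted.
    return 2.0 * app["publications"] + 1.0 * app["test_score"]

def shortlist(apps: list[dict], k: int) -> list[dict]:
    """Return the k highest-scoring applications."""
    return sorted(apps, key=merit_score, reverse=True)[:k]

pool = [
    {"id": 1, "publications": 4, "test_score": 70},  # score 78.0
    {"id": 2, "publications": 2, "test_score": 95},  # score 99.0
    {"id": 3, "publications": 1, "test_score": 60},  # score 62.0
]

top_two = shortlist(pool, 2)  # applicants 2 and 1, in that order
```

Because the scoring function never touches identity fields, the shortlist is the same regardless of who the applicants are, which is exactly the property the blind fellowship process achieved by hand.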

Although this sounds futuristic, it is not fiction but reality. A recent PwC report outlines how L’Oréal has implemented an AI system to do just this. The system has made hiring “10 times faster, it increased retention by 25 percent and 25 percent more applicants are interviewed”, all whilst reducing human interaction in the hiring process.

People will inevitably be involved in the recruitment process, and biases could potentially be acted on at this point. The only way to minimise the effect is to ensure a diverse, equally represented interview panel is involved in the final hiring decision.

This is policy-led and is actually a very easy process to implement and manage with the ATS systems currently available.

I’d love to hear if this, or something similar, has been implemented in your hiring process: how did you manage the implementation, and did you experience any push-back?
