Legal Ruling on AI in Recruitment: A Potential Game-Changer
PeopleWeek S.A. - Human Resources Software
HR and Enterprise Collaboration Software company based in Switzerland.
With all that has happened in international politics and business in the past 10 days, not to mention a global IT meltdown and Joe Biden’s historic decision yesterday, those of you interested in recruitment may well have missed a potentially significant legal ruling last week on the use of AI in recruitment. On 15th July, in a landmark case, a federal judge in California denied a well-known global HR software provider’s attempt to dismiss a class action lawsuit alleging that its AI-powered job screening software perpetuates biases. U.S. District Judge Rita Lin ruled that the HRIS provider could be treated as an employer under US federal anti-discrimination laws because it performs screening usually done by its clients. The lawsuit, filed by Derek Mobley, claims he was rejected for over 100 jobs because of his race, age, and mental health. This is the first class action against AI screening software and may set a precedent for how the law treats AI in hiring.
In a recent article about the potential of AI in HR [Blog Q1], I noted that recruitment is the area of HR where AI (and Machine Learning) has been used most so far, but that it is not without controversy. I pointed out that whilst many organisations now use AI to screen applicants, critics are concerned about data privacy risks and about AI systems trained on past recruitment data unintentionally perpetuating biases, leading to unjust and discriminatory outcomes for specific applicant demographics. This goes to the heart of Derek Mobley’s case.
I first started looking into the use of Machine Learning in recruitment about 6 or 7 years ago. I was interested in deploying tools that would make the candidate filtering process more efficient AND more objective. For example, I was concerned about senior hiring managers (those who ultimately decide who will join their department) having a propensity to hire people who “look like them”, and that this starts with filtering out applicants who do not “look like them”. When I say “look like them”, I am referring to candidates with a very similar educational background, meaning they graduated from a small, elite group of universities and studied a narrow range of subjects. As you might expect, applicants selected for interview on such specific criteria (which are not a prerequisite for success in the role) tend to have very similar demographics. This type of approach is clearly discriminatory and, unfortunately, is prevalent across countries, industries, and companies.
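To make this concrete, here is a minimal, hypothetical sketch (in Python with pandas, not PeopleWeek code) of how this pattern could be surfaced from application data: compare the share of applicants from a narrow “elite” university list in each hiring manager’s overall pool with that group’s share among the candidates the manager invites to interview. The column names, manager IDs and university list are all invented for illustration.

```python
import pandas as pd

# Hypothetical list of "elite" universities used by the filtering pattern described above
ELITE_SCHOOLS = {"University A", "University B"}

# Invented application data: who applied, where they studied, and who was invited to interview
applications = pd.DataFrame({
    "hiring_manager": ["HM1", "HM1", "HM1", "HM2", "HM2", "HM2"],
    "university": ["University A", "State College", "City University",
                   "University B", "University A", "Technical Institute"],
    "invited_to_interview": [True, False, False, True, True, False],
})

applications["elite"] = applications["university"].isin(ELITE_SCHOOLS)

# Compare the elite-school share of each manager's applicant pool
# with the elite-school share of the candidates they invited to interview
for manager, group in applications.groupby("hiring_manager"):
    pool_share = group["elite"].mean()
    interview_share = group.loc[group["invited_to_interview"], "elite"].mean()
    print(f"{manager}: elite share of applicants {pool_share:.0%}, "
          f"elite share of interviewed {interview_share:.0%}")
```

A large gap between the two shares for a particular manager would not prove discrimination on its own, but it would flag a filtering pattern worth reviewing.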
Machine Learning and AI, when used responsibly, can help organisations identify hiring managers who are consciously or subconsciously applying bias (and therefore discriminating). In other words, it can increase objectivity. However, when AI is not used responsibly and thoughtfully, it can compound the problem by filtering out applicants using unfair criteria. Indeed, the lawsuit against the HRIS provider asserts that its software trains its AI model to screen for the best applicants using data from the company’s current workforce, which could perpetuate existing discrimination.
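One widely cited yardstick for this kind of objectivity check is the US EEOC “four-fifths” rule of thumb: if one group’s screening pass rate is less than 80% of the highest group’s rate, the process is flagged for adverse impact review. The sketch below illustrates that calculation in Python with invented data and group labels; it is not a description of how any particular vendor’s software works.

```python
from collections import defaultdict

# Invented (demographic_group, passed_screening) outcomes for a batch of applicants
screening_outcomes = [
    ("group_a", True), ("group_a", True), ("group_a", False), ("group_a", True),
    ("group_b", True), ("group_b", False), ("group_b", False), ("group_b", False),
]

totals, passes = defaultdict(int), defaultdict(int)
for group, passed in screening_outcomes:
    totals[group] += 1
    passes[group] += int(passed)

# Pass rate per group, and each group's ratio against the best-performing group
pass_rates = {g: passes[g] / totals[g] for g in totals}
best_rate = max(pass_rates.values())

for group, rate in sorted(pass_rates.items()):
    impact_ratio = rate / best_rate
    status = "review for adverse impact" if impact_ratio < 0.8 else "ok"
    print(f"{group}: pass rate {rate:.0%}, impact ratio {impact_ratio:.2f} -> {status}")
```

Run per hiring manager, per role or per screening model version, the same comparison is one way an organisation could detect whether its filtering, human or automated, is producing skewed outcomes.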
PeopleWeek often receives questions from prospects and existing clients about whether we will incorporate AI into our recruitment module (ATS) to help filter applicants. Our response until now has been “not yet”. Our reluctance has been based on the ethical and legal risks. Although last week’s California ruling has brought this issue into the spotlight, the risks have been apparent for many years. However, some software providers and their clients have decided to run that risk. PeopleWeek will stay out of this space until a clear legal framework allows us to enter it ethically and legally, thereby protecting ourselves and our clients.
Paul