Bias in Hiring: The State of Bias in Hiring (Part Two of Four)
Organizations have been working to eliminate bias from the hiring process for decades. In a 2021 study by Aptitude Research, nine in ten companies said they are concerned about bias in their hiring process. Many companies acknowledge that bias can seep into every activity along the hiring continuum, from recruitment to reviewing resumes to interviewing to the ultimate decision of whom to hire.
Consequently, many businesses have attempted to reduce bias in various ways: training interviewers, for example, or using job simulations and written tests. But while a quarter (24%) of companies use assessment or selection tests, those companies have not done an in-house validation analysis of the tests they use.
Despite ongoing efforts to improve Diversity, Equity, and Inclusion (DEI), women and people of color are still stuck at lower levels in organizations and in lower-paying industries.
And yet the business case for diversity is stronger than ever. Year after year, research shows that companies with greater gender, ethnic and cultural diversity in their corporate leadership are more profitable.
Now for some more positive news. According to the Talent Board’s latest research, companies report that their top three recruiting priorities are candidate experience, diversity and inclusion, and employee referrals. This is reassuring news, since many feared DEI would fall by the wayside under the pressures of the pandemic. But internal recruiters are overwhelmed by the volume and scale of hiring, trying to fill positions quickly in a highly competitive marketplace with low unemployment.
The key is to automate more recruiting activities intelligently and without bias. While many companies are adopting AI-driven recruiting technology, many aren’t. Aptitude Research found that almost two-thirds (63%) of companies rely on training recruiters and hiring managers to reduce bias, and only 27% use AI tools. What’s more, many of these new AI recruiting tools are not backed by science and research. The danger is that bias can become entwined in the code, producing even more bias. A well-known example of this occurred when Amazon discovered that its AI recruiting models were less favorable to women because they were trained on ten years of resumes from primarily white, male employees.
Interestingly, with this new focus on bias in AI hiring tools, there’s been a corresponding push for regulation of AI selection tools. The City of New York passed a law, taking effect in 2023, that requires annual independent audits of selection systems. Illinois and the District of Columbia are also moving to require similar scrutiny.
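To make the idea of such an audit concrete, here is a minimal sketch of one standard disparity check an auditor might run: the "four-fifths rule" adverse-impact ratio long used in US employment-selection guidelines. The counts are hypothetical, invented for illustration; real audits examine actual applicant and hire data across many groups and stages.

```python
# Sketch of an adverse-impact check (the EEOC "four-fifths rule").
# All numbers below are hypothetical, for illustration only.

def selection_rate(selected: int, applicants: int) -> float:
    """Fraction of applicants who were selected."""
    return selected / applicants

def adverse_impact_ratio(group_rate: float, reference_rate: float) -> float:
    """Ratio of a group's selection rate to the highest group's rate.
    A ratio below 0.8 is the conventional flag for adverse impact."""
    return group_rate / reference_rate

# Hypothetical audit data: applicants and hires for two groups.
rate_a = selection_rate(50, 100)   # reference group: 0.50
rate_b = selection_rate(30, 100)   # comparison group: 0.30

ratio = adverse_impact_ratio(rate_b, rate_a)
flagged = ratio < 0.8
print(f"adverse impact ratio: {ratio:.2f}")  # 0.60
print("flagged for review:", flagged)        # True
```

An independent audit would go well beyond this single ratio, but it shows the kind of quantitative scrutiny the New York City law contemplates: comparing outcomes across groups and flagging disparities that exceed an agreed threshold.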
Employers seeking robust and accurate AI recruiting technology need to ask hard questions of their HR tech vendors. The technology should ensure that substantively relevant data drives employment decisions, and employers should know which factors are being scored and how. For example, claims about the results of any AI-based product should be backed up by explanations of how data is collected and analyzed and what it successfully predicts. The vendor should also be able to describe its methodology openly, including a willingness to share and publish its findings whenever possible. And finally, the product should serve both organizations and individuals, always keeping in mind the impact on the candidate.