AI Recruitment: Dehumanising a person-centred process
Michael Banner
Leader | Software Engineer | Mentor | A passion for seeing people and products succeed
Today’s post is very much an exploration of my musings on the whole AI front, with a focus on the use of it within the recruitment space.
My LinkedIn feed is littered with stories of individuals fighting to get their chance at landing an interview, and in most cases we’re simply talking about an initial chat with a recruiter. There is also no denying that the job market (at least within software engineering) is very unstable at the moment, as we hear about big tech making bold moves around ‘return to office’ (RTO) [1] and other businesses downsizing via layoffs [2].
With the above in mind, my question is this - is AI in recruitment creating an even more challenging environment for candidates?
A race for automation
Since the dawn of existence, humans have done their best to improve on their existing processes: creating a repeatable way to generate fire, honing the wheel to aid transportation, using mechanisation to reduce the manual labour required to mill cotton, and introducing computers to speed up computational tasks. Our inquisitive nature, always exploring new ways of making our lives easier, is likely the reason we are the dominant species on our planet.
“With great power comes great responsibility” - Spider-Man
Don’t get me wrong, the innovation and automation of mundane, menial tasks are often a good thing. Sure, some manual tasks are still enjoyable (and largely subjective), such as when I choose to manually grind my coffee beans and use my AeroPress rather than simply using instant powder or getting a machine to do it for me. Others quite like the luxury of an electronic coffee machine that does the whole process for them, but that’s fine - each to their own.
When it comes to recruitment, herein lies the dichotomy: what is fundamentally a human-centred process is being tainted by computational “thinking”. The AI race seems to have resulted in every conceivable industry sticking its fingers in the AI pie, and recruitment is no different. Consequently there is an increasing number of AI-enabled screening processes, and even the introduction of AI hiring managers who take the place of actual people.
Maybe I am old-school and believe that hiring processes should be a manual, human-centred task…
Foreword
Before I get stuck into what I think are the fallacies of AI recruitment, I want to start by stating that I do believe it has a place, but not in the way we’re seeing.
My view is that there is a very large focus on using AI as a productivity shift for the recruiters and talent managers within businesses. I’ve seen fewer benefits as a job hunter, but more “benefits” as a hiring manager.
I’d love to see AI used more as an enabler for candidates to showcase themselves better. LinkedIn have been doing something similar to this for a while now, allowing you to get AI to check your profile against a job description to provide a suitability score.
As with most things AI, I think recruitment tools using the technology should be more supplementary than direct replacements. Tools which assist you in your job search, or improve candidate filtering on the recruiter side are fine - it’s the removal of humans altogether which feels like the dangerous part.
The Fallacies
I’m not a recruiter by trade, but I have run through many rounds of recruitment as a hiring manager across numerous companies. As a result, I have experienced the use of different Applicant Tracking Systems (ATSs) as a hiring manager, but equally as a job hunter on the opposite side of the fence. Therefore it is safe to say that I have enough knowledge and experience to pass some form of judgement on this subject.
With the introduction of AI-enabled ATS platforms, I fundamentally believe that we, as an industry, have created a set of misconceptions about the capabilities of such systems.
1. “We can better filter candidates using AI-enabled screening”
Irrespective of company size, it can often be the case that tens to hundreds of people will apply for a particular role. It is also the case that the vast majority of applicants don’t fit the criteria in simple terms, e.g. they don’t have the correct working eligibility due to residential status or visas.
The purpose of screening is to ensure that we “separate the wheat from the chaff”, focusing on the candidates that are actually relevant to the job in question. Using the example above it is clear that there can be super-specific criteria that mean someone is either entirely suitable for progression in the flow, or entirely unsuitable. In these scenarios I believe that having some form of auto-rejection would make sense - why would we entertain speaking to candidates who have no legal permit to even work in the role? The answer is, we wouldn’t.
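To make the distinction concrete, a truly binary criterion can be encoded as a simple yes/no rule. This is a minimal sketch (the `Candidate` shape and field names are illustrative assumptions, not taken from any real ATS):

```python
# A sketch of a genuinely binary screening rule: work eligibility.
# Candidates who fail it cannot legally do the job, so auto-rejection
# is defensible here - unlike the fuzzier criteria discussed next.
from dataclasses import dataclass


@dataclass
class Candidate:
    name: str
    has_work_eligibility: bool


def passes_hard_screen(candidate: Candidate) -> bool:
    """True only if the candidate clears the binary legal criterion."""
    return candidate.has_work_eligibility


applicants = [
    Candidate("A. Example", has_work_eligibility=True),
    Candidate("B. Example", has_work_eligibility=False),
]

# Only candidates who clear the binary check progress in the flow.
shortlist = [c for c in applicants if passes_hard_screen(c)]
```

The point is that rules like this are safe to automate precisely because there is no judgement involved; the trouble starts when the same mechanism is applied to criteria that are not binary at all.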
However, I believe the problem arises when we consider the rest of the criteria found on job adverts, which by their nature are considerably less binary. Taking a mid-level software engineer job advert as an example, we might expect to see requirements such as specific frameworks (React.js, for instance), years of experience, and familiarity with particular tooling.
In this case, there are systems that use keyword analysis on CVs (resumés) to immediately filter out candidates who don’t hit a pre-defined list of terms. The issue here is that CVs should not just be a wall of keywords - they are carefully crafted documents that highlight the key experiences an individual has. A CV may touch on some of the advertised tech stack, but might equally mention more nuanced technology such as Next.js, a React-based framework. An analysis looking purely for React.js would easily skip over Next.js as valid experience, possibly losing a solid candidate as a result.
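The failure mode above is easy to demonstrate. This is a toy illustration (not any real ATS implementation) of naive keyword screening skipping over closely related experience:

```python
# A toy model of naive keyword screening: the CV must contain every
# required keyword verbatim, or the candidate is filtered out.
REQUIRED_KEYWORDS = {"react.js"}


def naive_keyword_screen(cv_text: str) -> bool:
    """True only if every required keyword appears literally in the CV."""
    text = cv_text.lower()
    return all(keyword in text for keyword in REQUIRED_KEYWORDS)


# A strong candidate whose CV mentions Next.js - a React-based
# framework - but never the literal string "React.js".
cv = "5 years building production apps with Next.js and TypeScript."

print(naive_keyword_screen(cv))  # False: solid React experience, filtered out
```

A more forgiving screen would need a mapping from related technologies to the advertised ones (e.g. treating Next.js experience as evidence of React experience), which is exactly the kind of nuance a human reader applies without thinking.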
The other thing to call out here is the additional mindset that candidates now have to possess - writing for AI-enabled processes. Gone are the days when you could guarantee that your CV and covering letter would be read by a human. This in turn means that we now hear of candidates filling their CVs with keywords in white fonts, so as to be invisible to a human but readable by ATS systems. This is very much like the old-school SEO tactic of cramming keywords into your webpages in the same colour as your website background.
By introducing new mechanisms (barriers) for candidates, we can expect a change in applicant behaviour. One might hope for nicer, leaner CVs/resumés which are more to the point; however, the incentives created by AI-enabled systems are producing a different outcome, whereby candidates keyword-stuff as above, and who knows what else. This response to a change in the system is very much akin to the colonial British government wanting to reduce the number of venomous cobras in Delhi. The government went as far as offering a reward for every dead cobra handed in - a good incentive, one might assume? Instead of simply finding and killing wild cobras, some entrepreneurial locals decided to farm cobras with the intent of killing them and exchanging them for cash. In short, we need to be wary of how dramatic changes to a process affect how people respond - and not always in the way you might think.
Caveats
I’d always recommend that candidates tailor their CV to the job in question. Sometimes a generic CV will suffice, but more often than not there will be a need for some tweaking to get the balance of key information (as per the job spec) and additional context. This could help avoid a flood of keywords purely to get through such systems, but it doesn’t stop candidates potentially trying to game the system.
There is also (bad?) advice floating around the internet actively encouraging candidates to game these systems in exactly this way.
Enabling Human-Centric Software Engineering Teams
1 month ago: I think AI makes bad recruiters worse, and good recruiters better. It's all down to the integrity of the people behind the systems and processes, and the compassion with which they use the tools. It's hard for candidates because they have to second-guess what tools and processes they're up against.