Lessons Learned In Effective Technical Recruiting

Over the past 3.5 years, I've scanned thousands of resumes and interviewed several hundred candidates. As a Director of Engineering at a small startup, I spend about 50% of my time on recruiting. It's tough competing with the deep pockets of the giant Bay Area tech companies, so I've had to evolve and streamline our hiring process: we efficiently determine which candidates are worth hiring and make quick offer decisions, out-pacing larger companies that respond slowly despite the massive resources they dedicate to hiring. An efficient process has also given me the leverage to find the diamond-in-the-rough candidates that larger companies tend to overlook.

The Candidate's Perspective

Assessment should be relaxed and friendly - remember, you are always selling. You want to be the one who decides whether to pass on a candidate, not the one the candidate turns down. I have cut several interviews short in my own career when I was the interviewee. Great candidates can be selective.

Take the time to welcome the candidate, make them comfortable, and at the end thank them for their time. Even those candidates that you do not select should feel like they just had a great interview and would recommend the company to their friends.

The interview experience must be highly streamlined from the candidate's perspective so as not to lose the best candidates. The best candidates have options and do not need to engage in a long drawn-out interview process.

Candidates must not be made to feel nervous or judged. Do not take notes during an onsite interview. Do not let a candidate stall on an answer for too long; if they can't get there themselves, it is better to lead them to the answer and, as the interviewer, make a mental note of it. If you crush a candidate's spirit, they will underperform for the rest of the interview and you may miss the opportunity to recognize and hire someone great. Not everyone needs an alpha personality to be a great technical contributor.

Drive consistency across your interview panel

The worst interviews I've experienced (many times) consisted of a stream of interviewers all asking the same uncoordinated puzzle or soft-skill questions. If you don't invest effort in organizing your interview panel, this is what will happen.

All candidates should experience the same format for phone screens, practical tests, and onsite interviews. This ensures that we:

  • Drive consistency in assessment across candidates
  • Use engineering and candidate time effectively
  • Eliminate redundancy in questioning
  • Ensure an appropriate spectrum of assessment is performed

You must develop a set of interview questions. Include your team and encourage them to contribute. Discuss and agree on a set of criteria, and on questions at the right level to assess each criterion. Here's the list of criteria we use for each onsite interview:

  • Problem Solving
  • Domain Expertise
  • Collaboration
  • Understanding & Adaptive Learning
  • Execution & Dedication
  • Communication
  • Clear Thinking
  • Team Fit

All of these character traits can be assessed within the context of an interview session. Here are the interview sessions we conduct for Software Engineers:

  • Whiteboard Coding
  • Requirements Analysis and Design
  • Domain Knowledge
  • Behavioral Soft-Skills and Team Fit
  • Code Review and Code Optimization / Architecture

Some criteria are obvious to assess within a given interview theme, such as "problem solving" in "whiteboard coding". However, you can assess all the other traits in any session. For example, "execution and dedication" are evident in the candidate's persistence in solving the problem. "Collaboration" is easy to judge from how they interact with you as the interviewer. Even in the behavioral interview, "domain knowledge" can be assessed through their knowledge of best practices and their techniques for resolving conflicts.

We use a separate set of criteria for the phone screen, and another for grading the coding-test submission.

Some general guidelines for determining the right level and order of interview questions are:

  • Only assess "just enough" to make a determination for the role. Interview questions and coding tests should be just hard enough to make that determination, and no harder, so that you don't filter out good candidates.
  • Assess the candidate on the hardest and most important skills at the beginning of a phone screen or an onsite interview so you know what they are capable of before they have expended their mental energy on other tasks such as listening.
  • Prioritize assessment of the most important factors for the job role and recognize must-haves over nice-to-haves. If only 2% of candidates can pass your coding test, then soft-skills assessment is not as important because you no longer have a population to filter from. In this case, behavioral interviews should be used to filter out unacceptable attitudes in the small remaining viable candidate population.
  • Do not filter out acceptable candidates with irrelevant or hyper-specific technical questions. Instead, assess whether the candidate has gained a broad understanding of the domain. In an ever-changing tech landscape, this indicates that they are engaged and intelligent enough to build an abstract understanding of what they've worked on, one that will transfer to any new technologies going forward. Imagine asking a building contractor: "What is the difference between a nail and a screw, and when would you use one vs the other?" Would you hire them to build your house if they couldn't provide a coherent answer? I find that more than 50% of engineers with great resumes do not understand the fundamentals of the technologies on their resumes, or why they are using them.

Consistent and Real-Time Assessment

There are tools out there for tracking candidate assessment, but I've not found anything more flexible than Google Sheets, which gives you a single live source of truth that multiple team members can edit simultaneously.

I have developed a Candidate Assessment Sheet to track each candidate through the stages of their interview. When a candidate enters the process, we create a copy of the sheet from a template and immediately start scoring them against a pre-determined set of criteria. Each candidate sheet has a sub-sheet for each interview phase, including one for the assessment of their resume, initial phone screen, coding test, and onsite interview. Each sub-sheet itemizes several criteria being assessed and the interviewer must provide a score of 0-10 and accompanying notes to justify the score. Here's a link to a template you can use to track candidate assessment:

https://drive.google.com/open?id=1_zmGT5AgtE2R-u8lVZr8cyYGP57FvS5MlESzsVsVCAo

You're welcome to make a copy of this and then tailor it to your needs. I've removed several of the questions I currently use and left placeholders since I don't want to give away our interview questions!

Once you have added your own interview questions, mark your copy as read-only and share it with your hiring team. They should make a new copy for each candidate, put the candidate name and job title in the document title, and share it as editable with the hiring team. When the candidate progresses to the coding test phase, we create a folder for the candidate and move the candidate assessment sheet and any coding artifacts into that folder.

Scores and notes can be added to summarize the candidate's resume. The sheet makes it easy to score and make notes during the phone screen, staying focused on the interview agenda and not being distracted. Similarly, scores and notes are added when assessing a practical exercise or coding test submission to achieve better consistency.

However, for the onsite interview, scores and comments should not be entered in front of the candidate. They should, however, be added immediately after each session, so that a rapid assessment can be made by the end of the onsite interview day. I ask my team to fill out their interview section immediately after leaving their session, which lets me keep tabs on how the interviews are progressing through the day. This tells me whether I need to make a determination or go into "sell mode" for a great candidate. It also lets me know if I should exit a candidate who is not performing well, saving their time and ours.

It has been very useful to have a quantified record of candidate assessments for reference and comparison. I've found this valuable when discussing a candidate's offer with business stakeholders, and again when revisiting past candidates or comparing them with new ones.
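As an illustration of what this quantified record enables, here is a minimal sketch of aggregating per-session 0-10 criterion scores into a comparable summary per candidate. The candidate names, session names, criteria, and scores below are all hypothetical, not our actual rubric or data:

```python
# Hypothetical sketch: aggregate 0-10 criterion scores per candidate
# and rank candidates by their average score. In practice this lives
# in the Google Sheet; this just shows the shape of the comparison.

from statistics import mean

# Each candidate maps interview sessions to {criterion: score}.
candidates = {
    "Candidate A": {
        "whiteboard_coding": {"problem_solving": 8, "communication": 7},
        "behavioral": {"team_fit": 9, "communication": 8},
    },
    "Candidate B": {
        "whiteboard_coding": {"problem_solving": 6, "communication": 9},
        "behavioral": {"team_fit": 7, "communication": 8},
    },
}

def summarize(sessions):
    """Flatten all criterion scores for a candidate and average them."""
    scores = [s for session in sessions.values() for s in session.values()]
    return mean(scores)

# Rank candidates by overall average, highest first.
ranked = sorted(candidates, key=lambda c: summarize(candidates[c]), reverse=True)
for name in ranked:
    print(f"{name}: {summarize(candidates[name]):.1f}")
```

A weighted average (weighting must-have criteria more heavily than nice-to-haves, per the guidelines above) would be a natural refinement.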

If you know you want to make an offer by the end of the interview, and you have the evidence to support it, you can move very fast. Don't let that stellar candidate get away. Make them an offer that evening or the following morning, before they interview with another company!

I hope you've found this post useful. Please let me know if you have any questions.

Comments

Clemens Utschig-Utschig, MBA
Head of IT Technology Strategy / CTO at Boehringer Ingelheim | ex-Oracle
5y

We took a similar approach hiring for #bix. Where possible, less than a week's turnaround between f2f days with us and a decision.

Excellent blueprint! Thanks for sharing.
