Breaking Down Bias: Mitigate The Pitfalls of A.I. In Tech Hiring

Layering A.I. on top of today’s resume screens not only exacerbates the pedigree bias problem, but also creates a black box around the vetting process by obfuscating the bias. Here's how to fix that.

Since the start of the pandemic, 86% of businesses have moved their interview processes online. The extent of digital hiring transformation varies by company. For many, remote interviews are still challenging. But for others, the transformation has opened the door to introducing more technology to help keep up with the surging demand for software engineers. This includes increased use of artificial intelligence (A.I.) and other automated tools to identify the best job candidates.

While some people believe that using A.I. can reduce hiring bias by eliminating the human element, this is often not the case. Most A.I. tools on the market today are more accurately described as machine learning: algorithms trained to find patterns in historical hiring data, and those patterns then drive future decisions.
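To make that concrete, here is a minimal, hypothetical sketch (the features, data, and model choice are invented for illustration and not drawn from any vendor’s product): a classifier trained on past hiring decisions scores new candidates by how closely they resemble past hires, so a history of pedigree-driven hiring becomes a pedigree-driven screen.

```python
# Hypothetical sketch: a screening model trained on past hiring decisions
# reproduces whatever pattern is in that history. Features, data, and
# labels below are invented for illustration only.
from sklearn.linear_model import LogisticRegression

# Toy historical data: [attended_top_university, worked_at_big_tech]
X_history = [
    [1, 1], [1, 0], [0, 1], [1, 1],   # pedigreed applicants
    [0, 0], [0, 0], [0, 1], [0, 0],   # nontraditional applicants
]
# Labels: 1 = hired in the past, 0 = rejected in the past
y_history = [1, 1, 1, 1, 0, 0, 0, 0]

model = LogisticRegression().fit(X_history, y_history)

# A capable candidate without the pedigree proxies gets a low "hire" score,
# not because of ability, but because the history never hired anyone like them.
nontraditional_candidate = [[0, 0]]
print(model.predict_proba(nontraditional_candidate)[0][1])
```

The point is not the specific model; any pattern-matcher trained on that history would behave the same way.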

The problem is that using historical data to predict the future reinforces the status quo. Defining the “best” candidates as those most similar to past employees is essentially digital redlining.

Beware of pedigree bias

If a company has a history of hiring only a certain type of individual for engineering positions (e.g., White or Asian males), the A.I. tool will learn to prioritize candidates whose profiles resemble those of the company’s current employees. This injects pedigree bias into the hiring process.

Automated pedigree bias begins excluding qualified individuals from nontraditional backgrounds at the earliest stage of the recruiting process—the application screen. According to our data from nearly 100,000 technical interviews, on average, fewer than 10% of direct applicants progress to the interview stage. This process filters out candidates without the education or past employment pedigree recruiters are looking for.

We often hear that when it comes to DEI, “sunlight is the best disinfectant.” But layering A.I. on top of today’s resume screens will not only exacerbate the pedigree bias problem, it will also create a black box around the vetting process by obfuscating the bias. The algorithm will keep selecting candidates based on pedigree proxies (e.g., a degree from a top university or experience at a Big-5 tech company), artificially shrinking your candidate pool and your pipeline diversity.

One way to mitigate pedigree bias is to give more interviews to direct applicants. By analyzing direct applicant success rates from our technical interviews, we discovered that direct applicants are being over-screened compared to other talent sources. This over-screening, which is being exacerbated by algorithms, is bad for two reasons.

First, unless your organization has a robust talent pipeline coming from historically Black colleges and universities (HBCUs) and Hispanic-serving institutions (HSIs), direct applications are almost always the most diverse talent pool available. Over-screening this source based on pedigree will make it much harder to achieve meaningful DEI goals.

Second, you’re creating a more expensive pipeline by over-indexing on proactively recruited candidates. Direct applicants close (i.e., accept offers) at a higher rate than their proactively sourced counterparts, are less likely to receive counter-offers, and spend less time in the hiring process. So by over-screening direct applicants with A.I., you not only constrain diversity; you also spend more money and more time on every hire.

If you have a 10% application-to-tech-screen rate, try pushing it to 15% or even 20%. If you’re still seeing success, keep expanding it to drive better overall hiring efficiency and equity.
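As a rough illustration of what widening that funnel means in practice, here is a back-of-the-envelope sketch; the applicant volume is invented for illustration, and the screen rates are the ones mentioned above.

```python
# Hypothetical funnel math: how many direct applicants reach the technical
# screen at different application-to-tech-screen rates. The applicant
# volume is invented for illustration.
DIRECT_APPLICANTS = 5_000

def reach_tech_screen(applicants: int, screen_rate: float) -> int:
    """Number of direct applicants who advance to the technical screen."""
    return int(applicants * screen_rate)

for rate in (0.10, 0.15, 0.20):
    advanced = reach_tech_screen(DIRECT_APPLICANTS, rate)
    print(f"{rate:.0%} screen rate -> {advanced} direct applicants get a tech screen")
```

Doubling the screen rate doubles the number of candidates from your most diverse and least expensive source who ever get the chance to demonstrate their skills.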

Automation without empathy

Another way to break down hiring bias is to make sure there is a human element to your technical assessment.

This is particularly true for automated pass/fail coding tests that require a candidate to produce fully working code. In our data, about 55% of all offers go to candidates whose technical interview solutions were incomplete. Human interviewers can recognize when a solution falls short because of a trivial mistake, like a typo, and still credit the underlying problem solving; a binary automated test cannot.

Furthermore, the binary nature of these automated tests disproportionately impacts women and candidates from underrepresented backgrounds, who receive offers despite incomplete solutions at higher rates than their White and Asian male counterparts. A strict pass/fail filter screens out exactly the candidates a human interviewer would have advanced.

Well-trained interviewers can put candidates at ease, clarify what the team is looking for, and reduce false negatives that weed out qualified engineering candidates. Consider designing interviewer training with an emphasis on providing clear and transparent guidance to candidates. At a minimum, that means telling candidates which competencies are being assessed, what success looks like, and that it’s okay to ask clarifying questions during the interview.

This sets candidates up for success and provides a baseline level of transparency, so well-networked candidates no longer have more insight into the hiring process than everyone else. That helps level the playing field.

The human + technology solution

The key to implementing digital transformation in hiring is to strike the right balance between humans and technology. Use technology to lighten your interviewers’ cognitive load by suggesting questions based on the role and the competencies being evaluated. Use video recordings to review your interviewers and train them to spot mistakes like ambiguity or preferential treatment.

Successfully implementing A.I. in recruiting and hiring is going to take an investment: not just in technology, but in building more inclusive data science teams. This step is critical to ensure we’re not codifying today’s biases into the next generation of tech.

Using a structured interview rubric helps you get the best of both worlds. It gives hiring managers the structural rigor to make data-driven decisions and removes the black box of a purely A.I. solution. It ensures that candidates are assessed on a consistent, level playing field, while leaving room for appropriate human empathy and judgment.
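For teams that want to see what this can look like in practice, here is a minimal, hypothetical sketch of a structured rubric; the competencies, weights, and 1–4 scale are invented for illustration, not a prescribed standard.

```python
# Hypothetical sketch of a structured interview rubric: every candidate is
# scored against the same competencies on the same anchored scale, so the
# hiring decision rests on consistent, reviewable data. Competency names,
# weights, and the 1-4 scale are invented for illustration.
from dataclasses import dataclass

@dataclass
class CompetencyScore:
    competency: str
    score: int       # 1 (poor) to 4 (strong), anchored by written descriptions
    evidence: str    # interviewer's note on what the candidate actually did

def weighted_total(scores: list[CompetencyScore], weights: dict[str, float]) -> float:
    """Combine per-competency scores using the same weights for every candidate."""
    return sum(weights[s.competency] * s.score for s in scores)

WEIGHTS = {"problem solving": 0.4, "coding": 0.4, "communication": 0.2}

candidate = [
    CompetencyScore("problem solving", 4, "Broke the problem into subcases unprompted"),
    CompetencyScore("coding", 3, "One typo, but the logic and structure were sound"),
    CompetencyScore("communication", 4, "Asked clarifying questions before coding"),
]

print(weighted_total(candidate, WEIGHTS))  # a score comparable across candidates
```

The evidence notes are what keep human judgment visible: a reviewer can see why each score was given, which is exactly the transparency a purely automated screen lacks.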

A human + technology approach lets interviewers focus on what’s important: building a rapport with candidates, providing clarity, and setting them up to show their best selves in the interview. Giving candidates an interview that is predictive, fair and ultimately enjoyable will unlock opportunities for employees to thrive and for teams to grow.

