Amazon Abandons AI in Hiring Due to Bias, But Should You Follow?

Earlier this month Amazon abandoned an effort to use AI in hiring because it found that the tool under development discriminated against women. Amazon set a strong example by shutting the project down once they learned of the bias. But as Amazon has shown us through countless examples, a miss doesn’t necessarily mean the end of an opportunity. The question is: what have we learned and how do we change course?

Our response at Koru: AI models must be rigorously vetted, from the data that goes in to the modeling approaches applied to that data.


What does this mean?

The Amazon approach used unstructured data (resumes, in this instance) and allowed the modeling to pick up any correlations it found. This meant any resulting AI model could include biased signals, such as the word "women's." We've seen this possibility in our own work: one of our customers, evaluating against resume signals, found a correlation between better-brand schools and low tenure. (The takeaway: resumes should not be your primary data set; a resume is a low-signal document.)

The most important consideration is what data goes into the algorithm. Simply put, if the data going into the model is inherently biased, the outcome will often be biased.
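One way to catch this early is to test whether your input features can predict a protected attribute at all. Below is a minimal sketch of such a proxy-leakage check, assuming pandas and scikit-learn; the file name and column names are hypothetical, and this is an illustration, not our production pipeline:

```python
# A minimal sketch of a proxy-leakage check: if gender can be predicted
# from the features you feed a hiring model, those features can smuggle
# gender bias into the model even when no explicit gender column goes in.
# The file name and column names are hypothetical.
import pandas as pd
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

candidates = pd.read_csv("candidates.csv")          # hypothetical feature table
X = candidates.drop(columns=["gender", "hired"])    # numeric features assumed
y = (candidates["gender"] == "female").astype(int)

# AUC near 0.5 means the features carry little gender signal;
# AUC well above 0.5 means they leak it.
auc = cross_val_score(
    LogisticRegression(max_iter=1000), X, y, cv=5, scoring="roc_auc"
).mean()
print(f"Gender predictable from model features: AUC = {auc:.2f}")
```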


How we validate against bias at Koru

At Koru, we’ve validated that our measures of the Koru7 Impact Skills (grit, rigor, impact, teamwork, curiosity, ownership, and polish) are not biased by race, gender, or other protected characteristics. These are the inputs we put into our customers’ hiring models.

In fact, we’ve recently received a Badge of Excellence for Justification from the HR Policy Association Review Board. They shared that they “felt Koru exhibited a diligent approach to how they validated outcomes and performed quality assurance checks.”

When a customer asks us to incorporate custom competencies and signals outside of the Koru7 into a model, we specifically test that competency on a broad sample of anonymized job applicants to make sure we find no sign of disparate impact, that is, no adverse effect on one protected group relative to another.
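One widely used test for disparate impact is the EEOC's "four-fifths rule": the selection rate for each group should be at least 80% of the rate for the most-selected group. Here is a minimal sketch of that check, with illustrative data only (this is one common test, not necessarily our exact procedure):

```python
# A minimal sketch of the four-fifths rule: each group's selection rate
# should be at least 80% of the rate for the most-selected group.
# The data below is illustrative only.
import pandas as pd

applicants = pd.DataFrame({
    "group":    ["A"] * 50 + ["B"] * 50,
    "selected": [1] * 30 + [0] * 20 + [1] * 20 + [0] * 30,
})

rates = applicants.groupby("group")["selected"].mean()  # selection rate per group
impact_ratio = rates / rates.max()                      # ratio to the top group

print(impact_ratio)
flagged = impact_ratio[impact_ratio < 0.8]              # below four-fifths
if not flagged.empty:
    print("Possible disparate impact against:", ", ".join(flagged.index))
```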

If you’re thinking about launching your own predictive models for hiring, here are the main things we recommend you think through first:


Checklist: Building predictive hiring models without bias

1. Review your data set for ingoing bias, not just the insights the model produces. Knowing the demographic makeup of your data (gender, age, ethnicity, and so on) will show you where your data set may already be skewing the model (see the first sketch after this checklist).

2. Use a big enough sample size: hundreds are good, thousands are better. If you’re looking for what drives performance in a sample set that is 90% white males, your model may well tell you that being a white male is indicative of high performance.

3. Build your models on positive signals. This means outcome measures tied to the business, such as retention data and performance ratings.

4. Solve for negative biases with validation studies so that you’re not creating unintentionally harmful models. Your data scientist should use sound hypothesis validation with industry-standard benchmarks so as not to harm protected classes (race, gender, etc.). In addition, a factor analysis can help surface variables that may have an outsized impact on your outcome, such as GPA (see the second sketch after this checklist).

5. Build in checks and balances. Never rely on an AI model that hasn’t been tested for adverse impact on protected classes.

6. Work with a data scientist. Do the tips above sound confusing? Don’t do this alone!
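To make items 1 and 2 concrete, here is a quick sketch of a training-sample audit, assuming pandas; the file and column names are hypothetical:

```python
# A quick audit of training-set size and composition (checklist items 1-2).
# The file and column names are hypothetical.
import pandas as pd

train = pd.read_csv("training_sample.csv")
print(f"Sample size: {len(train)}")  # hundreds are good, thousands are better

# A group that dominates the sample can dominate the model's idea of
# "high performance" before any modeling even happens.
for col in ["gender", "ethnicity", "age_band"]:
    print(f"\n{col} distribution:")
    print(train[col].value_counts(normalize=True).round(2))
```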
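And for item 4, here is a sketch of one way to surface variables with outsized influence. The checklist mentions factor analysis; permutation importance, shown here, is a simpler stand-in that answers a similar question. Everything below (file, columns, model choice) is illustrative:

```python
# A sketch for checklist item 4: flag features with an outsized pull
# on the outcome. Permutation importance is used here as a simple
# stand-in for a fuller factor analysis. Names are hypothetical.
import pandas as pd
from sklearn.ensemble import RandomForestClassifier
from sklearn.inspection import permutation_importance
from sklearn.model_selection import train_test_split

data = pd.read_csv("training_sample.csv")
X = data.drop(columns=["high_performer"])   # numeric features assumed
y = data["high_performer"]
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

model = RandomForestClassifier(random_state=0).fit(X_tr, y_tr)
result = permutation_importance(model, X_te, y_te, n_repeats=10, random_state=0)
ranked = pd.Series(result.importances_mean, index=X.columns).sort_values(
    ascending=False
)
print(ranked.head())  # one dominant feature (e.g., GPA) deserves scrutiny
```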


Our work is never done. We will continue to monitor the diversity of our customers’ candidates at each stage of the process, watching for any possible disparate impact. Regardless of the tool, smarter and better recruiting still relies on human implementation. That’s why we’re so focused on working with our partners and educating them to pair data with their own analysis, so they can make better, faster decisions with more confidence.

For more information on our approach and the science behind the Koru7 Impact Skills, check out our whitepaper.

Jori Saeger
Jori runs the blog at Koru. She finds and shares the most valuable and educational content for innovative talent acquisition leaders who want to change the future of hiring.
