
AI might address recruiting bias, but HR pros aren’t leveraging it that way just yet

HR teams aren’t widely adopting AI tools to identify and address bias in human-led hiring.


TOPICS: HR Tech / AI / AI Ethics

“The obstacle is the way,” the Roman emperor and Stoic philosopher Marcus Aurelius purportedly said.

Although Aurelius himself probably wouldn’t have understood artificial intelligence, there’s a growing case that AI in hiring could help recruiters identify and mitigate bias rather than amplify their own. But HR teams aren’t there…yet.

Bias is a perennial concern for many recruiters and HR pros, especially as they explore AI tools and platforms in the people function, and some HR leaders and technologists predict the technology will ultimately reduce bias in hiring.

There’s definitely room to grow when it comes to leveraging these tools to address bias-related risk in recruiting, according to HR professionals surveyed by HR Brew in December.

Only 7% of respondents reported currently using AI “to identify and address biases, disparities, or inequities when hiring, evaluating, promoting, or paying employees,” according to the survey.

HR Brew surveyed nearly 400 HR professionals, and 78% said they did not use AI to help address these risks, while 15% did not know if their organizations used AI in this way.

There’s opportunity for organizations to harness new AI tools and platforms to audit and actively flag potential bias, especially if bias mitigation is baked into the model’s training.

“We have to recognize that a lot of these systems, if ungoverned, can magnify the biases of the data that they’re trained on, of the ways algorithms are built,” Greenhouse cofounder and CEO Daniel Chait told us at HR Brew’s Talent 2030 Collective summit in New York City last month. “But [there’s] also the opportunity to overcome those biases, or to define them completely out of existence by a different process.”


TA leaders, recruiters, and hiring managers can bring their own biases to the hiring process. Chait pointed to well-documented human biases, like the preference toward candidates with stereotypically white-sounding names, or those with similar backgrounds, educations, experiences, or former employers.

“These are the kinds of biases that show up in every interview loop, that are almost baked in so much that people don’t even think about them or realize that they’re there,” he said. “But, everybody knows that if you send in résumés with a different name at the top, you are much more likely to get a callback for an interview than other [names].”

It’s a sentiment that Shabrina Davis, Amazon’s head of manager enablement and inclusive hiring, shares.

“If it’s a strictly human experience, then that bias is there,” Davis told HR Brew at a February From Day One event in Washington, DC. “Certainly any technology that you add to this process, there is room for that [increasing bias], but it also solves it, because it levels the playing field.”

Davis also said that directing the technology to identify potential bias inside a human-led recruiting process can help remove this risk.

About the author

Adam DeRose

Adam DeRose is a senior reporter for HR Brew covering tech and compliance.
