Tech

Only half of HR pros monitor AI tools in hiring

AI and automated tools could change the way companies hire, but HR teams may not be doing enough to make sure they’re effective and unbiased.


Only half of HR teams monitor their AI tools for bias and evaluate their efficacy, according to a new survey from hiring platform Greenhouse.

The survey comes as AI tools rapidly infiltrate HR systems, and platforms race to add AI capabilities to their tools.

“If half of HR professionals don’t feel like they have the tools to evaluate the efficacy and potential bias of AI tools in their hiring process,” said Mona Khalil, Greenhouse’s manager of data science, “it’s a coin flip as to whether or not [candidates] may end up applying for jobs where the company…doesn’t have what they need to understand…whether or not they’re actually qualified for the job.”

The survey of 100 HR professionals found that 50% of respondents reported no monitoring or evaluation of their AI tools. The other half broke down as follows:

  • 20% of HR leaders manually review and check AI’s performance
  • 10% use random spot-checks
  • 8% use another tool to aid in performance monitoring
  • 12% don’t use AI tools in hiring

While AI tools and automation can save HR professionals time by taking over routine tasks performed by people pros, especially in the hiring process, both HR leaders and candidates worry about whether AI can limit bias and promote DE&I, according to the survey.

Flex the muscle. “People teams have to be more responsible when they’re evaluating the tools that they’re using to facilitate the process of bringing people to their organization and even exiting people,” said Greenhouse CPO Donald Knight. “We have not historically, as a profession, been as thoughtful around the tools that we’ve had. In defense of so many HR and people leaders, we haven’t had to…what has happened is there’s a muscle inside of our leadership acumen we have not had to flex.”

Amid state and local attempts to address bias in automated decision-making, now may be the time for HR and TA professionals to develop a framework for how AI will be used in their organizations, and what processes will be implemented to assess its efficacy and ensure bias isn't built into the tools.

At the federal level, the EEOC recently issued guidance indicating that companies are on the hook if the AI tools they use violate Title VII of the Civil Rights Act.

The Greenhouse platform doesn’t deploy AI tools in its software to rank candidates or assist in the decision-making process in hiring. Instead, the platform offers “assistive AI…that reduces cognitive load on boring tasks,” Khalil said.

What to ask. Khalil recommended HR teams ask about the data sets used to train the model and whether they include sufficient representation. She suggested asking, for example, how the system handles candidates who have more than one racial background or who don't identify within the gender binary.

“I would ask detailed questions about what steps they [took] to monitor for and mitigate bias in the process, both in the accuracy of the model as well as the output,” Khalil said.
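For teams that want to pressure-test a vendor's answers, one common starting point is an adverse-impact check on the tool's output: comparing selection rates across demographic groups against the EEOC's four-fifths guideline. The sketch below is illustrative only, not part of Greenhouse's product, and its column names and data are hypothetical.

```python
# A minimal sketch of an output-level bias check: flag any group whose
# selection rate falls below 80% of the highest group's rate (the EEOC's
# four-fifths rule for adverse impact). Data and column names are hypothetical.
import pandas as pd

def adverse_impact_check(df: pd.DataFrame, group_col: str, selected_col: str) -> pd.DataFrame:
    rates = df.groupby(group_col)[selected_col].mean()  # selection rate per group
    ratios = rates / rates.max()                        # impact ratio vs. best-performing group
    return pd.DataFrame({
        "selection_rate": rates,
        "impact_ratio": ratios,
        "flagged": ratios < 0.8,                        # four-fifths threshold
    })

# Hypothetical screening results: 1 = advanced by the AI tool, 0 = rejected
candidates = pd.DataFrame({
    "race": ["A", "A", "A", "B", "B", "B", "B", "A"],
    "advanced": [1, 1, 0, 1, 0, 0, 0, 1],
})
print(adverse_impact_check(candidates, "race", "advanced"))
```

A check like this only covers the model's output; auditing the accuracy of the model itself, as Khalil notes, requires comparing its decisions against ground truth across the same groups.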
