The government is on the lookout for disability bias in employers’ use of AI

The EEOC and DOJ each issued guidance for employers around the use of AI and disability discrimination.

On Thursday, the Equal Employment Opportunity Commission (EEOC) and the Department of Justice (DOJ) Civil Rights Division each put employers on notice: When using AI in employment processes, employers are responsible for inspecting tools for disability bias, and they had better have a plan to provide reasonable accommodations, because the agencies say they have their eyes on how artificial intelligence could lead to discrimination under the Americans with Disabilities Act (ADA).

Why now? Though the use of AI in employment processes has been an EEOC priority since 2021, EEOC Chair Charlotte Burrows told reporters last week that data motivated the agency to focus more specifically on protecting individuals with disabilities.

“According to the US Bureau of Labor Statistics, individuals with disabilities are facing unemployment rates almost twice as high as other workers who do not have disabilities, and so…we thought it was really important to focus on disability issues first,” Burrows said.

What does it say? The EEOC’s new technical assistance document warns employers that AI could unlawfully screen out candidates with disabilities who are qualified to perform the job with “reasonable accommodations” and reminds employers they must provide these reasonable accommodations for all employment processes. Kristen Clarke, the DOJ’s assistant attorney general for civil rights, told reporters, “that includes when an employer chooses to use artificial intelligence tools or tests as part of its hiring process.”

The DOJ’s guidance provides examples of the AI-powered tools companies might use and explains employers’ obligations under the ADA for proper use. It also offers guidance for workers who may suspect they have experienced discrimination as a result of employers’ use of the tech.

Cool, what should HR do? Burrows shared that, as of 2019, over 80% of companies used AI in some form. She advised employers to ask vendors about their approach to accessibility.

  • On “reasonable accommodations,” Burrows said to consider “at what point the vendor has thought about” how someone with a disability could request an accommodation on their platform. If a vendor isn’t “ready to engage” with that subject, Burrows called it a “warning signal.”
  • Burrows advised looking “under the hood” to understand how the tools screen out applicants. (More on that here.) If vendors “don’t want to share that and say, ‘Oh, it’s proprietary,’ that too, should be a warning sign.”

  • Employers should know whether the tech asks questions that will prompt an employee to reveal a disability, such as “did you ever apply for workers comp?” Burrows said. “That’s a prohibited question, and so the employer needs to make sure that that kind of thing is not being folded into the tech.”

Clarke told reporters that by releasing the documents, the agencies are “sounding an alarm” about employers’ “blind reliance” on AI in employment processes, and that the guidance sent a “strong message” about the DOJ’s “commitment to using our federal civil rights laws to hold employers accountable when they take unlawful action that unfairly denies people with disabilities access to job opportunities.”—SV

Do you work in HR or have information about your HR department we should know? Email [email protected] or DM @SusannaVogel1 on Twitter. For completely confidential conversations, ask Susanna for her number on Signal.

HR is challenging. HR news doesn’t have to be.

HR Brew keeps you effective in the fast-changing business environment.