NYC passes new legislation targeting bias in automated hiring software

The legislation would require employers to verify that automated hiring software vendors are conducting regular independent bias audits.

Illustration: Francis Scialabba


What goes into AI algorithms is typically more closely guarded than the recipe to Mr. Krabs’s Krabby Patty. But last week, the New York City Council passed a bill intended to increase transparency when it comes to potential biases arising from automated hiring tools.

The legislation, which passed by a large margin, Protocol reports, requires vendors of AI tech used for hiring to obtain annual third-party “bias audits” and share the results with employers who use, or are interested in using, such systems to hire NYC residents. The bill also tasks employers with displaying the audit results publicly and informing candidates that automated hiring software has been used, giving applicants residing in NYC an opportunity to “request an alternative selection process.”

The goal, according to Donald Tomaskovic-Devey, a professor of sociology at the University of Massachusetts Amherst who spoke in support of the bill, is to allow purchasers of AI hiring tech to “look ‘under the hood’ as needed to understand potential sources of bias with regard to race or gender.” The information gleaned from audits can help companies compare products and gauge potential employment-discrimination risk, while still protecting vendors’ proprietary information.

Why lawmakers acted: A number of reports, including the Hidden Workers: Untapped Talent study, have raised concerns about potential bias arising from the use of AI in hiring.

If the bill is signed into law by Mayor Bill de Blasio, NYC will be the first municipality in the US to require vendors selling automated hiring tools to conduct audits for race and gender biases, Crain’s reports. (Companies that fail to comply could face fines.) Mark MacCarthy, senior fellow in governance studies at the Center for Technology Innovation at the Brookings Institution, supports the legislation.

HR is challenging. HR news doesn’t have to be.

News built to help HR pros grow their impact & improve the future of work.

“This bill isn’t the result of a flood,” MacCarthy told HR Brew. “There’s no natural disaster or something we can point to and say, ‘Wow, very bad, we need this now.’ But the water is rising, if you will. Employers simply can’t hire by hand anymore. There are too many applications to review. They need this tech. So if they’re gonna use it, the idea of the bill is that we get in the practice of auditing [AI] now. We adopt good practices in the industry early on before potentially bad or problematic policies are locked in and become the norm.”

Zoom out: Although the legislation, if signed into law, wouldn’t go into effect until 2023, expect it to make more waves than the Big One that swept SpongeBob away to that hippie surfer island.

“If you want to have New York City–based clients, you’re going to have to have an audit,” Dr. Frida Polli, founder and CEO of ethical-AI recruiting vendor Pymetrics, told HR Brew. “Wherever they’re based, even if they’re headquartered outside of the US, if they want to do business, they’re going to need to provide this public reporting, which I think is super exciting.”—SV

Do you work in HR or have information about your HR department we should know? Contact Susanna Vogel via the encrypted messaging apps Signal and Telegram (@SusannaVogel) or simply email [email protected].
