DE&I

Everything you need to know about LinkedIn’s new DE&I recruiting tool

The tool, called Diversity Nudges, notifies recruiters when less than 45% of candidates are either male or female.

As DE&I has risen to the top of corporate agendas, HR teams have embraced tech to help them achieve their goals. Nearly half (47%) of global HR professionals use technology to mitigate unconscious bias in recruiting and hiring, and one-third (33%) plan to do so in the future, according to a 2021 iCIMS survey.

Last month, LinkedIn unveiled a new AI-powered tool designed to alert users of its LinkedIn Recruiter hiring platform when their candidate searches lack gender diversity. The tool, called Diversity Nudges, notifies recruiters when less than 45% of candidates are either male or female. (It does not currently offer insight into non-binary and third-gender candidates, though LinkedIn said it hopes to have this functionality soon.)
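In plain terms, the nudge fires whenever either gender's share of a candidate pool slips below 45%. The sketch below illustrates that trigger logic as the article describes it; the function and field names are hypothetical, not LinkedIn's actual implementation.

```python
# Illustrative sketch of the nudge trigger described in the article:
# alert the recruiter when either gender's share of the candidate pool
# drops below 45%. Names and structure are assumptions, not LinkedIn code.

def needs_diversity_nudge(candidate_genders: list[str], threshold: float = 0.45) -> bool:
    """Return True if either self-reported gender falls below the threshold."""
    counted = [g for g in candidate_genders if g in ("male", "female")]
    if not counted:
        return False
    female_share = counted.count("female") / len(counted)
    male_share = counted.count("male") / len(counted)
    return min(female_share, male_share) < threshold

# Example: a pool that is 20% women would trigger the nudge.
pool = ["male"] * 8 + ["female"] * 2
print(needs_diversity_nudge(pool))  # True
```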

LinkedIn isn’t the first company to dip a toe in the diversity-analytics pond. In 2019, the HR D&I tech market was estimated to be around $100 million, according to Mercer and Red Thread Research, with 30% of surveyed vendors focused on addressing “inadequately diverse talent pipelines.” Companies in the space, which includes Gem, Fetcher, and Hired, often tout predictive analytics as a way to identify and reduce biases in recruiting, and have received VC funding to pursue their visions. Gem, for example, has raised $148 million to date from investors including Iconiq, Greylock, and Accel, at a reported $1.2 billion valuation.

What’s different about Diversity Nudges, said Daniel Tweed-Kent, LinkedIn’s head of product, is that it’s directly integrated into LinkedIn Recruiter, a tool that 4.6 million recruiters already use. While Gem, for example, aggregates candidate data from LinkedIn and other platforms on its site, LinkedIn Recruiter offers recruiters one-stop shopping.

How it works. Diversity Nudges doesn’t just let recruiters know that their candidate pools lack gender balance; it also suggests ways to expand their search, such as considering candidates from additional locations or with different skills.

During a demo for HR Brew, Tweed-Kent’s first search for a software engineer yielded a candidate pool that was 20% women. After he followed the Diversity Nudges recommendation to add another location, the share of female candidates rose to 35%.

How did the tool know to suggest revisiting candidates’ locations? As Tweed-Kent explained, it reviewed data from 130 million LinkedIn users to identify commonalities among software engineers, then focused on those common to women in particular. Those results, he said, determined the recommendation.

The tool then suggested that Tweed-Kent add skills, including real estate, hospitality, change management, writing, and project planning. He conceded that some of these skills may not apply to every software-engineer role: “We’re continuing to improve the relevance of this specific feature,” he said.
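A rough illustration of the aggregation Tweed-Kent describes might look like the following: tally profile attributes (here, locations) that are common among women holding the target title, and surface the ones missing from the current search. The data, field names, and ranking rule are assumptions made for the example, not LinkedIn’s method.

```python
# Hypothetical sketch: rank locations where women with the target title
# are concentrated but the current search isn't looking. All field names
# and data are illustrative.

from collections import Counter

def suggest_locations(profiles: list[dict], title: str,
                      current_locations: set[str], top_n: int = 3) -> list[str]:
    """Return the top locations, by count of women with `title`, outside the current search."""
    counts = Counter(
        p["location"]
        for p in profiles
        if p["title"] == title
        and p["gender"] == "female"
        and p["location"] not in current_locations
    )
    return [loc for loc, _ in counts.most_common(top_n)]

profiles = [
    {"title": "software engineer", "gender": "female", "location": "Toronto"},
    {"title": "software engineer", "gender": "female", "location": "Toronto"},
    {"title": "software engineer", "gender": "male", "location": "Austin"},
]
print(suggest_locations(profiles, "software engineer", {"Austin"}))  # ['Toronto']
```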


What’s (less) known. The tool relies on how LinkedIn users self-identify their gender in their settings (users can read how the company uses their demographic information, and can update or remove it). If users don’t self-identify, the platform infers gender from their pronouns and from whether they belong to any identity-based groups. Tweed-Kent said the inference is between 90% and 95% accurate, varying “slightly by country,” not counting users who identify as neither male nor female. That’s in line with accuracy rates reported by competitors: Fetcher claims to be around 95% accurate, and Gem claims its results are at least 90% accurate. Hired doesn’t infer gender at all; it relies on users to self-identify.
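The fallback order described above can be read as: explicit profile setting first, then pronouns, then identity-based group membership. The sketch below shows that ordering only; every field name and the group-matching heuristic are made up for illustration, and per the next paragraph LinkedIn reports only aggregate percentages, never per-person labels.

```python
# Illustrative only: the signal-fallback order the article describes.
# Field names and the group heuristic are hypothetical.

PRONOUN_MAP = {"she/her": "female", "he/him": "male"}

def gender_signal(profile: dict) -> str | None:
    """Return a gender signal for aggregate counts, or None if no signal is available."""
    if profile.get("self_identified_gender"):          # 1. explicit setting
        return profile["self_identified_gender"]
    if profile.get("pronouns") in PRONOUN_MAP:          # 2. pronouns on the profile
        return PRONOUN_MAP[profile["pronouns"]]
    for group in profile.get("groups", []):             # 3. identity-based groups
        if "women in" in group.lower():
            return "female"
    return None
```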

Tweed-Kent said LinkedIn does not tag individuals as either male or female to avoid misgendering candidates. Fetcher takes the same approach to avoid “[influencing] individual candidates negatively or in a biased fashion,” said Melissa Roer, the company’s head of marketing. Gem allows clients to see results on an individual and aggregated level.

Joe O’Keefe, an employment attorney at Proskauer Rose, agreed with the aggregated approach, so as not to allow gender to influence an individual’s hiring. “I would not want…those types of predictive analytics to ultimately play any role whatsoever in considering a candidate for hire,” O’Keefe said, adding that the goal is to “have a talent pool that is representative of the population at large.”

Should HR jump aboard? HR’s experience with AI-assisted hiring has been mixed: Though the majority of HR pros who responded to a February SHRM survey reported that the tech saves them time, nearly one-fifth were concerned it improperly screened out candidates, and nearly half would like more transparency from vendors.

To get a fuller picture, O’Keefe recommended that HR ask vendors like LinkedIn how their tools function, vet data, and mitigate bias, and who has audited them.

“The goal here is just to make sure that [a] sufficient number of females are being considered for the position,” he said.—SV

Do you work in HR or have information about your HR department we should know? Email [email protected] or DM @SusannaVogel1 on Twitter. For completely confidential conversations, ask Susanna for her number on Signal.
