HR professionals are likely sick of hearing exaggerated predictions about how AI is going to take over HR jobs faster than Beyoncé can record her next album.
But it’s not the robots stealing jobs that HR leaders should focus on. Instead, maybe it’s time to get curious about how ChatGPT might make parts of the job less tedious.
According to Joshua Merrill, CEO of AI-powered startup Confirm, ChatGPT can draft performance reviews based on feedback from several colleagues. He claims that by asking an employee’s peers a few pointed questions about their work and collaboration, ChatGPT gathers enough information to aggregate an accurate performance review.
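Confirm hasn’t published the details of its pipeline, but the idea Merrill describes can be sketched roughly: collect each peer’s answers to a few standard questions, then assemble them into a single prompt a chat model could draft from. Everything below (the questions, the peer responses, the prompt wording) is invented for illustration:

```python
# Hypothetical sketch of aggregating peer feedback into one drafting
# prompt. All names, questions, and responses are invented; this is
# not Confirm's actual approach.

QUESTIONS = [
    "What does this person do well in their day-to-day work?",
    "How effectively do they collaborate with the team?",
]

peer_feedback = {
    "Peer A": [
        "Ships features on time and documents them clearly.",
        "Quick to unblock teammates during code review.",
    ],
    "Peer B": [
        "Took strong ownership of the billing migration.",
        "Could share status updates more proactively.",
    ],
}


def build_review_prompt(employee, questions, feedback):
    """Combine structured peer answers into a single review-drafting prompt."""
    lines = [
        f"Draft a balanced performance review for {employee}.",
        "Base it only on the peer feedback below, avoid gendered or",
        "demographic language, and cite concrete examples.",
        "",
    ]
    for peer, answers in feedback.items():
        lines.append(f"Feedback from {peer}:")
        for question, answer in zip(questions, answers):
            lines.append(f"- Q: {question}")
            lines.append(f"  A: {answer}")
        lines.append("")
    return "\n".join(lines)


prompt = build_review_prompt("the employee", QUESTIONS, peer_feedback)
print(prompt)
```

The resulting prompt would then be sent to a chat model (for instance via OpenAI’s chat completions API), and whatever the model returns inherits all of the bias caveats discussed below.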
Tech takeover? Some companies have used AI in the performance review process since at least 2018, but research has found that AI can exacerbate or introduce bias. And experts speculate that while ChatGPT could have a role in drafting performance reviews, the technology isn’t necessarily ready for wide use.
System glitches. ChatGPT could introduce new problems to performance reviews, according to an expert who spoke to HR Brew. Performance management and recruitment software company Textio recently ran a ChatGPT experiment and found that, when given generic but traditionally gendered information, such as a certain role or industry, the technology produced gender-biased performance feedback. The experiment also found that ChatGPT “doesn’t handle race very well.”
Vikram Bhargava, assistant professor of strategic management and public policy at George Washington School of Business, told HR Brew that bias can occur when ChatGPT content is used uncritically. For example, if an employee is described as “bubbly” (a term often used almost exclusively for women), ChatGPT could pick up such gendered words and produce a gender-biased review. While biased feedback is not unique to the technology, ChatGPT can amplify the bias already present in the data it’s given.
Finally, the technology is inherently impersonal, and valuable feedback or insight can get lost when software is left to interpret it. Bhargava also said its use could leave employees concerned or upset that an algorithm drafted something as significant as a performance review. He pointed to the recent controversy surrounding Vanderbilt University’s use of ChatGPT to craft a note to students about the shooting at Michigan State University. While the note itself may have been fine, he said, using technology to craft a deeply personal message could cause students (or, in the case of performance reviews, employees) to pull back from the institution that deployed it.
Balancing act. “It would be in the interest of HR managers to exercise caution with respect to this technology, given that we’ve only had it for a few months now,” Bhargava recommended.
That said, using the technology to speed up the performance review process isn’t inherently bad, especially as managers and HR practitioners are under pressure to complete the review cycle. Bhargava said that using some type of aggregator could be beneficial in certain instances. “There are ways in which they can use it for guidance,” he explained, noting that ChatGPT can provide structure and better insight when the underlying data is more evidence-based and specific.
“I wouldn’t be surprised if these technologies ultimately write performance reviews that are just as accurate, if not more accurate, than human managers, in the same way that [the tool behind] this Vanderbilt letter might have written a more compassionate letter than the provost could have written,” Bhargava said.
With ChatGPT, it’s kind of like our Girl Scout cookie habit—all about moderation.—KP
Correction, Mar 21, 2023: This article was updated to reflect that the recent shooting took place at Michigan State University.