
AI Bias Allegations Against Workday: What HR Needs to Know

Lawsuit alleges Workday’s applicant screening tool utilizes biased AI for hiring, discriminating against Black, disabled, and older applicants.

“The Workday issue” came up in conversation during last month’s meeting of i4cp’s Talent Acquisition Board.

If you haven’t heard about it—or if you have heard about it but haven’t had time to dig into the details—we’ve read all the case filings and motions so that i4cp members don’t have to.

Here are the fine points:

What the lawsuit claims

A class action lawsuit claims that Workday’s software is making biased hiring decisions for its customers—specifically, Workday’s AI-driven screening tool disqualifies applicants who are Black, disabled, and perceived to be over the age of 40.  

This case has been in process since 2021 (an initial claim by the same plaintiff that Workday’s selection tools were discriminatory was dismissed), but it has recently survived motions to dismiss, made headlines, and raised concern (and a lot of questions).

The current lawsuit was filed in the U.S. District Court for the Northern District of California in February 2023 and has been mired in multiple motions to dismiss filed by Workday over the past 14 months.

The plaintiff, who is Black and over the age of 40, asserts that he received nearly 100 automated rejections for jobs he’d applied for, all of which were posted by employers using Workday's AI-powered applicant software.

The lawsuit claims that Workday's screening software has algorithmic bias against protected classes and violates Title VII of the Civil Rights Act, the Age Discrimination in Employment Act, and the Americans with Disabilities Act.

The suit also claims that Workday has been acting as an employment agency on behalf of users—in effect, using biased software to procure non-diverse employees for its Fortune 500 customers.

Finally, the lawsuit alleges that Workday’s screening tools interpret an applicant’s qualifications through a biased lens and then make recommendations about whether the applicant should be accepted or rejected, and that Workday’s customers can “manipulate and configure [the tools] in a discriminatory manner to recruit, hire, and onboard employees.”

What Workday says

Workday’s representatives have argued that the suit should be dismissed because the plaintiff failed to exhaust administrative remedies for his discrimination claims with the Equal Employment Opportunity Commission.

Workday further asserts that it is not in any way acting as an employment agent for the companies that use its software and denies that its product is biased. 

What the court says

The court denied Workday’s motion to dismiss the lawsuit on the grounds that the plaintiff had not first exhausted administrative remedies for his discrimination claims with the Equal Employment Opportunity Commission.

The court dismissed the plaintiff’s claim that Workday is acting as an employment agency for its clients, finding it unsupported by facts or evidence. But in January of this year, the plaintiff was given the option of amending his lawsuit to drop this particular claim and refile an amended suit, which he has since done.

What happens next

The case is moving ahead as of last week. If it continues to a final resolution, it could answer the question of who is ultimately responsible for AI tools that return biased results: the vendor or the user?

It will also test the U.S. Equal Employment Opportunity Commission’s current stance that software vendors may qualify as agents “if the employer has given them authority to act on the employer’s behalf,” which “may include situations where an employer relies on the results of a selection procedure that an agent administers on its behalf.”

This suit will likely be considered a landmark case—it’s among the first to challenge the use of AI tools in talent acquisition and has managed to remain in play for more than four years, despite vigorous efforts to quash it.

In the meantime, the responsibility is very much on employers to closely monitor the recruiting and hiring software they use and to continuously assess it for any signs or patterns of potential bias or outright discrimination. For more on this, review this article about the EEOC’s Strategic Enforcement Plan regarding discrimination resulting from the use of AI-assisted employment decision tools.

Lorrie Lykins
Lorrie is i4cp's Vice President of Research. A thought leader, speaker, and researcher on the topic of gender equity, Lorrie has decades of experience in human capital research. Lorrie’s work has been featured in the New York Times, the Wall Street Journal, and other renowned publications.