You felt it might be something like this, didn’t you? It wasn’t your typo-riddled resume or your questionable Zoom background; it was the system. You’re perfectly qualified for the hundreds of roles you’ve applied for, yet nothing more than the occasional auto-reject email ever comes back.
According to a lawsuit, if you're 40 or older and applying through companies that use Workday’s hiring tool, as so many do, you may have been algorithmically ghosted: no human involved, no second look, no mercy. Rejected by the machine because you’re deemed too old.
Meet the Plaintiff
Derek Mobley, a 50-year-old IT professional from North Carolina with a degree from Morehouse College and a resume that includes Hewlett Packard Enterprise, applied to more than 100 jobs between 2017 and 2019 via Workday-powered systems and got zero interviews. Worse still, many rejections came back within minutes, sometimes at 2 a.m., suggesting that nobody ever actually reviewed his application. Mobley believes he was filtered out because of his age, race (he’s Black), and disclosed mental health conditions (x.com, wsj.com).
“There’s a standard bell curve in statistics. It didn’t make sense that my failure rate was 100%,” Mobley said after landing a job at Allstate, where he has since been promoted twice (wsj.com).
Why It’s Getting Serious Now
What began as a personal lawsuit in 2023 has now morphed into a national class action. In May 2025, a California judge certified Mobley’s age-discrimination claim as a collective action, meaning millions of over-40 applicants who’ve been filtered by Workday could now opt in (wsj.com).
Judge Rita Lin ruled that Workday might be legally considered an employer for the purpose of discrimination law, as it effectively performs screening functions that human recruiters normally would do (reuters.com). The judge also noted that:
“The (lawsuit) plausibly alleges that Workday’s customers delegate traditional hiring functions, including rejecting applicants, to the algorithmic decision-making tools provided by Workday” (reuters.com).
The U.S. Equal Employment Opportunity Commission (EEOC) weighed in too, urging the court to let the case proceed, and explicitly warning companies that using AI hiring tools doesn’t shield them from bias liability (reuters.com).
Expert Pushback: “AI Just Mirrors Us”
Experts say this case could lead to transparency and regulation in an industry that’s been a legal black box:
Ifeoma Ajunwa, a professor at Emory University School of Law and author of The Quantified Worker, told The Wall Street Journal:
“Hiring intermediaries have pretty much been excused from regulation… I think this case will change that” (wsj.com).
Computer scientist Kathleen Creel of Northeastern University added that these tools aren't inherently evil, but they can go wrong in mundane ways:
“Mechanical errors such as misclassifying a previous job title, or… penalize members of a single group or people with certain combinations of characteristics” (wsj.com).
Beyond the courtroom, career coach Jennifer DeLorenzo offered insight on LinkedIn:
“AI learns from someone. That data? It comes from us… historical hiring patterns that were never as objective as we pretended they were” (linkedin.com).
Employer Brand, Meet Lawsuit

If your employer branding is heavy on “inclusion” and “equity” but you rely on AI pre-screening that quietly filters out older applicants, you and your brand are vulnerable. Picture those Glassdoor and Indeed reviews:
“Great benefits. Terrible algorithm hates old people.”
“Applied three times. Got rejected faster than my VPN could load the page. #StillQualified”
What Every Employer Should Do Now
| Step | Action | Why It Matters |
| --- | --- | --- |
| Audit your AI | Test every hiring tool for bias (age, race, disability) | Ignorance won’t protect you in court |
| Vet vendors hard | Require bias testing, audit logs, and transparent data every quarter | Vague reassurances won’t cut it |
| Human oversight | Add manual review before rejecting candidates | AI is a tool, not the final say |
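The first step in the table, auditing a hiring tool for bias, can start with a simple adverse-impact screen. A common heuristic is the "four-fifths rule": if the selection rate for a protected group (here, applicants 40 and over) falls below 80% of the rate for the reference group, the tool warrants closer scrutiny. The sketch below is a minimal illustration of that calculation; the applicant counts and field names are hypothetical, and a real audit would need far more rigor (statistical significance tests, intersectional slices, and legal review).

```python
# Minimal adverse-impact screen using the four-fifths rule.
# All numbers below are hypothetical, for illustration only.

def selection_rate(selected: int, total: int) -> float:
    """Fraction of applicants in a group who advanced past screening."""
    return selected / total if total else 0.0

def four_fifths_check(rate_protected: float, rate_reference: float,
                      threshold: float = 0.8) -> tuple[float, bool]:
    """Return (impact_ratio, passes) for a basic disparate-impact screen."""
    if rate_reference == 0:
        return float("inf"), True  # no basis for comparison
    ratio = rate_protected / rate_reference
    return ratio, ratio >= threshold

# Hypothetical example: 12 of 200 applicants aged 40+ advanced,
# versus 45 of 300 applicants under 40.
over_40 = selection_rate(12, 200)    # 0.06
under_40 = selection_rate(45, 300)   # 0.15
ratio, passes = four_fifths_check(over_40, under_40)
print(f"impact ratio: {ratio:.2f}, passes four-fifths rule: {passes}")
# → impact ratio: 0.40, passes four-fifths rule: False
```

A failing ratio like this doesn't prove discrimination by itself, but it is exactly the kind of signal that, left unexamined, turns into the fact pattern alleged in the Mobley case.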
The Bigger Picture
This is about far more than Workday getting sued. It would be the first major class action over automated ageism in hiring, and it could embolden plaintiffs and regulators worldwide. With over 11,000 organizations using Workday globally, it could set a legal precedent impacting HR tech across the board (lifewire.com, wsj.com, arxiv.org).
Workday insists its AI is trustworthy and customizable by clients, and that hiring decisions are ultimately human. But as Professor Ajunwa warns, “hiring intermediaries” won’t stay in the regulatory shadows for long.
If you outsource bias to an algorithm, don’t expect to sidestep accountability. If your brand claims you hire “without barriers,” but your codebase quietly puts them up, let this story be your wake-up call.
Takeaways
1. AI isn't neutral.
Workday’s hiring software is under fire for allegedly filtering out candidates over 40, with a class-action lawsuit now certified to represent potentially millions of job seekers.
2. This is the first of its kind.
If successful, it could establish a legal precedent for holding HR tech vendors and their customers liable for algorithmic bias.
3. Employers are in the firing line too.
The lawsuit argues that companies using AI tools can't dodge responsibility just because the discrimination was outsourced to code.
4. Expert consensus? Time's up.
Legal scholars and computer scientists agree: the Wild West era of unregulated AI in hiring might finally be ending.
5. Employer branding risk is real.
If your systems discriminate but your careers page preaches inclusion, you're both inconsistent and vulnerable to reputational damage.
6. How should your EVP respond to this shift?
Act now. Audit your hiring tools, interrogate your vendors, and make sure humans are still part of the process before regulators or the media beat you to it.
