A new breed of high-tech hiring tools aimed at helping employers sift through growing applicant pools can unfairly weed out women and minorities, putting unwary businesses at risk of being caught up in an anticipated wave of bias litigation.
Experts say these tools, which use algorithms to predict whether an applicant will be successful in a given job, can perpetuate existing sex- or race-based gaps in employers’ workforces or create new ones. The tools haven’t led to any suits yet, but experts say that’s likely to change.
* * *
These tools owe their rise to the internet and the ways it’s changed the hiring process, according to a recent report by digital technology watchdog Upturn called “Help Wanted: An Examination of Hiring Algorithms, Equity, and Bias.” As online job boards like Monster and Indeed replaced classifieds as the job advertising platform of choice, employers’ openings started reaching more job-seekers. This led to more applications — and more work for the HR reps who had to review them.
Enter algorithmic hiring tools. They work by crunching employer-provided “training” data to assess how applicants match up with a given opening. In theory, they can help employers sidestep their human hiring managers’ biases and analyze job candidates’ skills without considering their sex, race or other protected traits. In practice, whittling down candidate pools without bias isn’t so simple.
One class of tool described in the Upturn report analyzes an employer’s existing workforce to find traits that correspond with job success and highlights applicants who share them. But if the raw data is bad — if the employer tends to rate men’s performance more highly than it does women’s, perhaps — the result may be, too. In one example the group cited in its report, a screening service found workers who were named “Jared” or who had played high school lacrosse tended to be successful.
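A minimal sketch of how this class of tool can surface spurious “success” traits. The data and field names here are invented for illustration, not drawn from Upturn’s report or any vendor’s actual system; the point is only that traits which merely co-occur with a “successful” label in skewed training data will rank highly:

```python
# Hypothetical illustration: rank binary traits by how strongly they
# correlate with an employer-supplied "successful" label.
# All names and data below are invented for the example.

employees = [
    # (played_lacrosse, name_is_jared, successful)
    (1, 0, 1), (1, 1, 1), (0, 1, 1), (0, 0, 0),
    (1, 0, 1), (0, 0, 0), (0, 0, 1), (1, 0, 0),
]

def correlation(xs, ys):
    """Pearson correlation of two equal-length 0/1 sequences."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / n
    sx = (sum((x - mx) ** 2 for x in xs) / n) ** 0.5
    sy = (sum((y - my) ** 2 for y in ys) / n) ** 0.5
    return cov / (sx * sy) if sx and sy else 0.0

traits = {
    "played_lacrosse": [e[0] for e in employees],
    "name_is_jared":   [e[1] for e in employees],
}
success = [e[2] for e in employees]

# Traits that happen to co-occur with success in the training data rise
# to the top, whether or not they have anything to do with performance.
ranked = sorted(traits, key=lambda t: correlation(traits[t], success),
                reverse=True)
```

Nothing in the ranking step distinguishes a causal skill from a demographic accident of the existing workforce, which is the gap Rieke’s “clean, causal, defensible” remark points at.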
“On one hand, I completely believe those would be highly correlated traits of success in certain workforces,” said Upturn Managing Director Aaron Rieke, who co-authored the report. “On the other hand, that’s clearly not a clean, causal, defensible way to judge applicants.”
If these tools lead employers to favor applicants of one race, sex or other protected class over another, they can cause what’s known as “disparate impact” discrimination. This term refers to employment bias that comes about when an employer applies a seemingly neutral policy or practice to discriminatory effect.
Title VII of the Civil Rights Act of 1964, the federal law that outlaws workplace discrimination, lets workers challenge employer practices that cause disparate impacts on members of protected classes. Black workers denied jobs because a would-be employer used a tool that highlighted white applicants over them would seem to have a strong case under such a theory. But that’s only if they can bring it in the first place.
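Disparate impact is commonly screened for using the “four-fifths rule” from the EEOC’s Uniform Guidelines on Employee Selection Procedures: a selection rate for any group that is less than 80% of the rate for the highest-selected group is treated as evidence of adverse impact. A minimal sketch of that check, using a hypothetical applicant pool:

```python
# Four-fifths rule check (EEOC Uniform Guidelines, 29 C.F.R. 1607.4(D)):
# a group's selection rate below 80% of the highest group's rate is
# treated as evidence of adverse impact.

def selection_rates(applicants):
    """applicants: list of (group, hired_bool) pairs -> rate per group."""
    totals, hires = {}, {}
    for group, hired in applicants:
        totals[group] = totals.get(group, 0) + 1
        hires[group] = hires.get(group, 0) + (1 if hired else 0)
    return {g: hires[g] / totals[g] for g in totals}

def adverse_impact(applicants, threshold=0.8):
    """Flag each group whose rate falls below threshold * best rate."""
    rates = selection_rates(applicants)
    best = max(rates.values())
    return {g: r / best < threshold for g, r in rates.items()}

# Hypothetical pool: a screening tool passes 60% of group A applicants
# but only 30% of group B applicants.
pool = ([("A", True)] * 6 + [("A", False)] * 4 +
        [("B", True)] * 3 + [("B", False)] * 7)
flags = adverse_impact(pool)  # B's rate is 50% of A's, so B is flagged
```

The rule is a screening heuristic rather than a legal conclusion, but numbers like these are the kind of evidence a disparate-impact plaintiff would need, which is why access to the underlying data matters so much in the cases discussed below.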
Outten & Golden LLP counsel Peter Romer-Friedman, the lead plaintiffs’ attorney in a suit alleging a class of employers violated federal age bias law by excluding workers from viewing their job ads on Facebook, said hiring discrimination cases are among the hardest suits to bring because applicants rarely know why they’ve been denied a job. That’s especially true when employers make decisions based on data kept “within a black box,” as algorithmic hiring data typically is, Romer-Friedman said.
“It’s pretty rare for the public to learn about that kind of system or how it’s being done in a discriminatory way,” Romer-Friedman said. “Really, the only times you would know about it are if a whistleblower comes forward or if there’s litigation [and] it ends up uncovering the information in discovery, or if a regulator asks for it and gets it.”
* * *