AI and Hiring Discrimination: The Impact Artificial Intelligence Hiring Tools Will Have on Companies
If you have recently applied for a job, the chances that an artificial intelligence (AI) platform was used to make decisions regarding your candidacy are high. Recent reports indicate that almost 70% of companies already use AI tools in their hiring process, and almost 99% of Fortune 500 companies use them. Additionally, AI hiring tools are increasingly used in lower-wage job sectors where low-income and minority populations are disproportionately concentrated.
How AI Hiring Tools Work
AI-powered tools have become integral to nearly every step of the hiring process. They are used to target online job advertisements and to match candidates with positions on platforms like LinkedIn and ZipRecruiter. These tools also rank or reject applicants through automated resume screenings and chatbots that apply knockout questions, keyword filters, or specific qualification requirements.
Additionally, AI is used to evaluate more abstract personality traits, either through online multiple-choice tests that pose situational or perspective-based questions or through video-game-like tools that analyze a candidate's gameplay. If you've been asked to submit a video as part of your application, it may not be reviewed by a human; instead, some employers rely on AI to assess personality through voice analysis, measuring tone, pitch, and word choice, and through video analysis of facial expressions and movements.
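To make the screening step concrete, here is a minimal, hypothetical sketch in Python of the kind of knockout-question and keyword-filter logic described above. Every name, question, and keyword in it is invented for illustration; real vendor systems are proprietary and far more complex.

```python
# Hypothetical sketch of an automated resume screen of the kind described
# above: knockout questions and keyword filters that reject candidates
# before a human ever reviews them. All names and thresholds are invented.

REQUIRED_KEYWORDS = {"python", "sql"}   # keyword filter (assumed terms)
KNOCKOUT_QUESTIONS = {
    "authorized_to_work": True,         # any mismatch is an instant rejection
    "willing_to_relocate": True,
}

def screen_applicant(resume_text: str, answers: dict) -> bool:
    """Return True if the applicant advances, False if auto-rejected."""
    # Knockout questions: one "wrong" answer ends the application.
    for question, required in KNOCKOUT_QUESTIONS.items():
        if answers.get(question) != required:
            return False
    # Keyword filter: resumes missing any required term are discarded,
    # even if the candidate has the skill under a different name.
    words = set(resume_text.lower().split())
    return REQUIRED_KEYWORDS.issubset(words)

if __name__ == "__main__":
    resume = "Experienced analyst skilled in Python and SQL reporting"
    answers = {"authorized_to_work": True, "willing_to_relocate": False}
    print(screen_applicant(resume, answers))  # False: knocked out on relocation
```

Note how the rejection in the usage example turns on a single yes/no answer, with no human ever weighing the candidate's actual qualifications.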
How This Exacerbates Hiring Discrimination
While these tools may save employers time reviewing candidates, they create numerous hiring issues. Despite being marketed as objective and less prone to bias, they present a significant risk of amplifying existing discrimination in the workplace, including biases related to race, gender, disability, and other protected characteristics.
AI systems are trained on vast datasets and make predictions by identifying patterns and correlations within the data. However, many of the tools used by employers rely on data derived from their own workforce and previous hiring practices, which often reflect ingrained institutional and systemic biases.
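As a rough illustration of that feedback loop, the following hypothetical Python sketch trains a simple model on simulated past hiring decisions in which recruiters penalized resume gaps. The features, data, and numbers are all invented; the point is only that a model fit to biased labels reproduces the bias.

```python
# Hypothetical sketch of how a screening model trained on past hiring
# decisions reproduces bias. "years_gap" stands in for resume gaps, which
# correlate with caregiving and medical leave, not with job performance.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 1000
skill = rng.normal(size=n)                 # genuine qualification
years_gap = rng.exponential(1.0, size=n)   # resume gap length (proxy feature)

# Simulated historical decisions: past recruiters penalized gaps, so the
# label already encodes the bias the model will learn.
past_hired = (skill - 0.8 * years_gap + rng.normal(0, 0.5, n)) > 0

X = np.column_stack([skill, years_gap])
model = LogisticRegression().fit(X, past_hired)
print(model.coef_)  # negative weight on years_gap: the bias is now automated
```

The model is never told to penalize gaps; it simply learns the pattern baked into the historical data, which is exactly the dynamic the examples below describe.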
Consider a candidate who is required to complete a video interview. If that candidate has a speech impediment, the AI analyzing the interview may assign a low score, automatically screening the candidate out. Another example is an applicant with gaps in their resume between education and jobs. If an AI tool rejects candidates with resume gaps, otherwise qualified applicants who took time off to give birth or to receive medical treatment, for example, are turned down.
AI tools may also screen candidates on traits that have no demonstrated connection to success as an employee. For instance, one resume screening tool found a correlation between being named Jared and playing high school lacrosse and being a successful employee, even though neither is relevant to job performance. Similarly, the vague personality traits that many AI tools aim to assess, such as positivity, stress tolerance, or extroversion, may not be essential for the role, may reflect culturally specific standards, or may inadvertently exclude candidates with disabilities or conditions such as autism, depression, or ADHD.
Additionally, many applicants are unaware that these tools are being used at all, let alone how they work.
What This Means in the Legal Landscape
Lawsuits involving discrimination claims against developers of AI hiring tools are still in their early stages, but signs suggest they will become more frequent. There are also early indications that employers facing hiring-bias lawsuits over their use of AI-based tools may be able to shift some of the liability onto the vendors responsible for designing the technology.
The recent case, Mobley v. Workday, Inc., is the first of its kind. In a class action lawsuit, Derek Mobley (the plaintiff) sued the HR software company Workday, Inc. He claimed the company’s AI tools automatically rejected him and other applicants from 80 to 100 jobs based on age, race, and disability status.
He alleged that the AI tools allowed Workday’s customers (hiring committees) to apply “discriminatory and subjective judgments” when evaluating applicants and even allowed for “preselection” of applicants outside certain protected categories. He further alleged that Workday’s administration and dissemination of the screening tools constituted a “pattern or practice” of discrimination and that this conduct amounted to both intentional and disparate-impact discrimination.
On July 12, 2024, a California federal court permitted certain bias claims from the applicant to move forward, based on the argument that Workday could be held liable as an agent of its employer clients. Judge Rita Lin of the US District Court for the Northern District of California ruled that the lawsuit sufficiently alleged that these employers had delegated traditional employment responsibilities to Workday.
While that was a partial win for the plaintiff, Judge Lin granted Workday’s motion to dismiss the allegations of intentional discrimination. The court also held that the company cannot be classified as an employment agency subject to federal fair-employment laws because it does not procure job opportunities for workers.
While this case is ongoing, the final decision is likely to be used as a roadmap for future plaintiffs to bring discrimination claims against companies that use AI tools. Companies are also likely to be wary of the potential liability that they face when using these hiring tools.
The growing reliance on AI tools in hiring poses a serious risk of discriminatory practices, as these systems can reflect the inherent biases of the people who created them and the data they were trained on. As companies increasingly turn to AI for recruitment decisions, the potential for unfair treatment of candidates rises, making a surge in hiring discrimination lawsuits more likely.
Noor Sandhu
Class of 2026, Staff Member