KHVPF Insight

EEOC Examines Bias in A.I. Hiring Tools

Automation and artificial intelligence are transforming workplaces. It is estimated that nearly every Fortune 500 company, and perhaps three-quarters of all employers, uses some form of automated software as part of the hiring process. But AI and automated software can inherit biases from the data on which they are trained. For example, the Washington Post recently reported that an artificial intelligence program trained on billions of images and associated captions from the Internet began to associate the term "homemaker" with images of women, and the term "janitor" with images of people of color.

Given this risk of automated bias creeping into the workplace, the EEOC has expressed its interest in evaluating the interplay between artificial intelligence and discriminatory hiring practices. In 2021, the EEOC launched an agency-wide initiative to ensure that AI, machine learning, and similar software would not lead to hiring and other employment decisions that violate federal civil rights laws. Last year, the EEOC issued technical guidance explaining that the use of algorithmic software in hiring decisions could screen out disabled candidates in violation of the Americans with Disabilities Act.

But now, the EEOC has signaled its intent to put the weight of its enforcement tools behind its AI initiative. In January, the EEOC released its five-year draft Strategic Enforcement Plan (SEP) for 2023-2027, in which it expressed its desire to focus enforcement actions on decisions, practices, or policies in which technology contributes to discrimination, including automated resume-screening software, hiring software, chatbots used for hiring, and video interviewing tools.

The EEOC has also brought its first automated-software discrimination case. In EEOC v. iTutorGroup, Inc., pending in the Eastern District of New York, the EEOC claims that the employer's hiring algorithm automatically rejected female applicants over the age of 55 and male applicants over the age of 60. The Complaint alleges that a prospective employee was instantly rejected after entering her actual birthdate into the application software, but was offered an interview the next day when she resubmitted the same application with a birthdate that made her appear younger. The EEOC claims that over 200 qualified applicants were rejected because of their age.

While the iTutorGroup lawsuit alleges that the algorithm in question was intentionally designed to discriminate against older applicants, even inadvertent bias could lead to employer liability under a disparate impact theory. The EEOC’s SEP identifies as a subject-matter priority “the use of automated systems, including artificial intelligence or machine learning, to target job advertisements, recruit applicants, or make or assist in hiring decisions where such systems intentionally exclude or adversely impact protected groups.”
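
To illustrate how a disparate-impact analysis might be applied to an automated screening tool, the sketch below uses the "four-fifths" rule of thumb from the EEOC's Uniform Guidelines on Employee Selection Procedures, under which a selection rate for any group that is less than 80% of the rate for the most-selected group may indicate adverse impact. The group labels and applicant counts are hypothetical, and the four-fifths rule is a screening heuristic rather than a legal threshold; this sketch is no substitute for a formal validation study or legal analysis.

```python
# A minimal sketch of an adverse-impact check based on the EEOC's
# "four-fifths" rule of thumb. All group labels and applicant counts
# below are hypothetical.

def selection_rate(selected: int, applicants: int) -> float:
    """Fraction of a group's applicants who advanced past the screen."""
    return selected / applicants

# Hypothetical outcomes from an automated resume-screening tool.
outcomes = {
    "group_a": {"applicants": 400, "selected": 120},  # 30% selection rate
    "group_b": {"applicants": 300, "selected": 60},   # 20% selection rate
}

rates = {group: selection_rate(o["selected"], o["applicants"])
         for group, o in outcomes.items()}
highest_rate = max(rates.values())

for group, rate in rates.items():
    # Compare each group's selection rate against the highest rate;
    # a ratio below 0.8 warrants closer review under the guideline.
    ratio = rate / highest_rate
    status = ("review for potential adverse impact" if ratio < 0.8
              else "within guideline")
    print(f"{group}: selection rate {rate:.0%}, ratio {ratio:.2f} -> {status}")
```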

Employers should remain vigilant and take steps to ensure that AI-driven hiring tools do not perpetuate bias or otherwise discriminate against applicants based on protected characteristics. Additionally, employers are encouraged to work with legal counsel and HR professionals to develop best practices for implementing AI-driven tools in their hiring processes.

Michelle C. Ruggirello