On May 18, 2023, the U.S. Equal Employment Opportunity Commission (“EEOC”) issued guidance on employers’ use of Artificial Intelligence (“AI”) in employment selection decisions and the potential for a disparate or adverse impact under Title VII of the Civil Rights Act of 1964 (“Title VII”). The EEOC acknowledged that employers now have a wide variety of AI decision-making tools to assist them in making employment decisions, including recruitment, hiring, retention, promotion, transfer, performance monitoring, demotion, dismissal, and referral. Employers are increasingly utilizing these tools “in an attempt to save time and effort, increase objectivity, optimize employee performance, or decrease bias.” The EEOC’s guidance cautions employers about the potential for a disparate or adverse impact on the basis of race, color, religion, sex, or national origin when relying on AI tools to make employment selection decisions.
The EEOC notes the potential for disparate impact when employers (or their third-party vendors) use “algorithmic decision-making tools” or software systems such as:
- Resume scanners that prioritize applications that use certain keywords;
- Employee-monitoring software that rates employees based on keystrokes or other factors;
- “Virtual assistants” or “chatbots” that ask candidates about qualifications and reject those who don’t meet pre-defined requirements;
- Video-interviewing software that evaluates candidates based on facial expressions and speech patterns; and
- Testing software that provides “job fit” scores for applicants or employees regarding personalities, aptitudes, cognitive skills or perceived “cultural fit” based on their performance in a game or on a more traditional test.
Employers are advised to monitor AI tools used internally and by their agents to make employment decisions, just as they would monitor traditional non-AI tools. It is important to understand the precise role of AI in the selection process and whether the tools and/or selection procedure may be disparately impacting individuals who belong to one or more protected classes under Title VII. Employers should assess a selection procedure by asking whether using it causes the employer to select individuals in a particular group at a rate that is “substantially” less than the rate at which it selects individuals in another group.
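As a rough illustration of this comparison, agencies have long used the “four-fifths” rule of thumb: if one group’s selection rate is less than 80% of another group’s, the difference is often treated as substantial and worth further review. The sketch below uses invented numbers purely for demonstration; an actual disparate impact analysis involves additional statistical and legal considerations and should be conducted with counsel.

```python
# Hypothetical sketch of the "four-fifths" selection-rate comparison.
# All numbers are invented for illustration only.

def selection_rate(selected: int, applicants: int) -> float:
    """Fraction of applicants in a group who were selected."""
    return selected / applicants

def impact_ratio(rate_a: float, rate_b: float) -> float:
    """Ratio of the lower selection rate to the higher one."""
    low, high = sorted((rate_a, rate_b))
    return low / high

# Example: 48 of 80 applicants selected in group A; 12 of 40 in group B.
rate_a = selection_rate(48, 80)       # 0.60
rate_b = selection_rate(12, 40)       # 0.30
ratio = impact_ratio(rate_a, rate_b)  # 0.50

if ratio < 0.8:
    print(f"Impact ratio {ratio:.2f} is below 0.80; further review may be warranted.")
```

Here the lower rate is only 50% of the higher rate, below the four-fifths threshold, so this hypothetical procedure would merit closer scrutiny.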
If a selection procedure has a disparate impact based on race, color, religion, sex, or national origin, an employer must show that the selection procedure is job-related and consistent with business necessity. An employer can meet this standard by showing that the procedure is necessary to the safe and efficient performance of the job. The selection procedure should therefore be associated with the skills needed to perform the job successfully. If the employer shows that the selection procedure is job-related and consistent with business necessity, the inquiry then turns to whether a less discriminatory alternative is available.
Please contact Naureen Amjad or any member of the Employment, Labor and Benefits group with questions about your company’s use of AI in the workplace and for assistance with conducting a disparate impact analysis.
©2023 Masuda, Funai, Eifert & Mitchell, Ltd. All rights reserved. This publication should not be construed as legal advice or legal opinion on any specific facts or circumstances. The contents are intended solely for informational purposes and you should not act or rely upon information contained herein without consulting a lawyer for advice. This publication may constitute Advertising Material.