May 17, 2023
EEOC Publishes AI Guidance for Employers
On May 17th, the U.S. Equal Employment Opportunity Commission issued guidance titled “…Assessing Adverse Impact in Software, Algorithms, and Artificial Intelligence Used in Employment Selection Procedures…”
Comprising just seven Q&As, it is a pretty quick read (unlike this longer-than-usual article) and a great reminder for employers. In my experience, most employers are not aware of their obligations under the federal Uniform Guidelines on Employee Selection Procedures (UGESP).
Who is covered under the UGESP? Employers covered under Title VII of the Civil Rights Act, generally those with 15 or more employees, must track the results of, and monitor for adverse impact in, any selection procedure used in “making employment decisions including recruitment, hiring, retention, promotion, transfer, performance monitoring, demotion, dismissal, and referral.”
What is a selection procedure? Selection procedures historically have included things like background checks (criminal, credit, etc.), job requirements (physical, education, experience), and tests or assessments of applicants or employees for hiring, promotion, and more. In today’s world, the term also includes “resume scanners that prioritize applications using certain keywords; employee monitoring software that rates employees on the basis of their keystrokes or other factors; ‘virtual assistants’ or ‘chatbots’ that ask job candidates about their qualifications and reject those who do not meet pre-defined requirements; video interviewing software that evaluates candidates based on their facial expressions and speech patterns; and testing software that provides ‘job fit’ scores for applicants or employees regarding their personalities, aptitudes, cognitive skills, or perceived ‘cultural fit’ based on their performance on a game or on a more traditional test.”
How does an employer monitor for adverse impact? The employer must determine whether the selection rate for one group (based on race, color, religion, sex, or national origin) is “substantially” different from the selection rate of another group. Under the long-standing “four-fifths rule,” one selection rate is substantially different from another if the ratio of the two rates is less than four-fifths (or 80%).
Example: An employer uses a test scored by an algorithm as part of its selection procedure. The selection rate for Black applicants was 30% and the selection rate for White applicants was 60%. The ratio of the two rates is thus 30/60 (or 50%). Because 50% is lower than 4/5 (or 80%), the selection rate for Black applicants is substantially different from the rate for White applicants. This could be evidence of discrimination against Black applicants.
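For readers who like to see the arithmetic spelled out, here is a minimal sketch of the four-fifths calculation in Python. The group labels, applicant counts, and the impact_ratios helper are hypothetical illustrations of the rule described above, not an EEOC-provided tool:

    def impact_ratios(selected, applied):
        """For each group, compute its selection rate and the ratio of that
        rate to the highest-rate group (the four-fifths / 80% rule)."""
        rates = {group: selected[group] / applied[group] for group in applied}
        top_rate = max(rates.values())
        return {group: (rate, rate / top_rate) for group, rate in rates.items()}

    # Hypothetical counts mirroring the EEOC's example (30% vs. 60% selection rates):
    applied = {"Black": 100, "White": 100}
    selected = {"Black": 30, "White": 60}

    for group, (rate, ratio) in impact_ratios(selected, applied).items():
        flag = "possible adverse impact" if ratio < 0.8 else "ok"
        print(f"{group}: selection rate {rate:.0%}, ratio to top group {ratio:.0%} ({flag})")

Run on these numbers, the sketch reports a 50% ratio for Black applicants, below the 80% threshold, matching the EEOC’s example above.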
Now, here is the tricky part. How is an employer to know the sex, race, or other protected status of its applicants? Question 88 of the UGESP FAQs reads: “Where applications are made in person, a user [employer] may maintain a log or applicant flow chart based upon visual observation, identifying the number of persons expressing an interest, by sex and by race or national origin; may in some circumstances rely upon personal knowledge of the user; or may rely upon self-identification. Where applications are not made in person and the applicants are not personally known to the employer, self-identification may be appropriate. Wherever a self-identification form is used, the employer should advise the applicant that identification by race, sex and national origin is sought, not for employment decisions, but for record-keeping in compliance with Federal law. Such self-identification forms should be kept separately from the application, and should not be a basis for employment decisions; and the applicants should be so advised.”
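To make the “kept separately” point concrete, here is a hedged sketch of how an applicant-tracking structure might segregate self-identification data from the application itself. The class and field names are hypothetical; the design point, drawn from the FAQ language above, is that demographic data lives apart from anything decision-makers see, linked only by an applicant ID for aggregate adverse-impact reporting:

    from dataclasses import dataclass
    from typing import Optional

    @dataclass
    class Application:
        """The application record visible to recruiters and hiring managers."""
        applicant_id: str
        position: str
        resume_text: str

    @dataclass
    class SelfIdentification:
        """Self-reported and always optional; stored separately and never
        shown to decision-makers, per UGESP FAQ Question 88."""
        applicant_id: str
        race: Optional[str] = None
        sex: Optional[str] = None
        national_origin: Optional[str] = None

    # Two separate stores; join on applicant_id only when compiling
    # aggregate selection-rate statistics for adverse-impact monitoring.
    applications: dict[str, Application] = {}
    self_id_records: dict[str, SelfIdentification] = {}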
If you are not conducting this tracking and periodic monitoring, you are not alone. In the last 20+ years, I have found only a handful of employers who were aware of these requirements, much less actively engaged in the identification and monitoring processes.
Watch state and local trends! From the White House to federal agencies to states and local jurisdictions, everyone seems to have an eye on employers’ (and others’) use of AI. In the last few years, several states and local jurisdictions, including but not limited to Maryland and Illinois, have introduced (and some have passed) bills restricting employers’ use of AI in employment practices. Most recently, New York City enacted a law that requires employers to conduct a “bias audit” of their AI selection procedures.
Ending on a positive note. If you find that one or more of your selection procedures does appear to produce “substantially” different results, or adverse impact, don’t panic. At that point, you are only required to assess whether there is a different selection procedure you could use that would reduce the adverse impact to less than a substantial difference, or whether you could modify the algorithm of the assessment procedure you are using to do so.