AI either already is or soon will be a routine part of recruiting and hiring, and it may eventually help manage performance as well. It is likely to ease the administrative burden of screening applications and even conducting first interviews. But it is also likely to increase the risk of disparate impact claims.

The EEOC has issued guidance on just this point. The new guidance relies on the 1978 Uniform Guidelines on Employee Selection Procedures, which generally require that validity studies be conducted to ensure that the characteristics being measured are reliable indicators of successful performance in the job.

As a quick refresher, disparate impact occurs when a facially neutral employment practice (such as using AI to screen applications to determine who meets minimum standards) results in disproportionately screening out people of a certain protected category (such as people forty or older) or a combination of protected categories (for example, Hispanic women).

A first question, and a related takeaway, is how to determine when an impact is disproportionate. A common and easy approach is the 80% rule (also called the four-fifths rule). Assume your applicant pool has 100 men and 100 women. The AI tool says 50 of the men (50%) and 25 of the women (25%) are qualified. Compare the selection rate of the protected group to that of the unprotected group: 25% divided by 50% yields a ratio of 50%. Because the ratio is under 80%, it indicates a disparate impact.
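For illustration only, here is a minimal sketch of that four-fifths arithmetic using the hypothetical numbers above; the function name and group labels are made up for this example and are not drawn from any real screening tool.

```python
# Minimal sketch of the four-fifths (80%) rule using the hypothetical
# numbers above; the function name and labels are illustrative only.

def four_fifths_ratio(selected_protected, total_protected,
                      selected_comparison, total_comparison):
    """Return both selection rates and their ratio."""
    rate_protected = selected_protected / total_protected      # 25 / 100 = 0.25
    rate_comparison = selected_comparison / total_comparison   # 50 / 100 = 0.50
    ratio = rate_protected / rate_comparison                   # 0.25 / 0.50 = 0.50
    return rate_protected, rate_comparison, ratio

rate_w, rate_m, ratio = four_fifths_ratio(25, 100, 50, 100)
print(f"women: {rate_w:.0%}, men: {rate_m:.0%}, ratio: {ratio:.0%}")
print("possible disparate impact" if ratio < 0.8 else "passes the four-fifths rule")
# women: 25%, men: 50%, ratio: 50% -> possible disparate impact
```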

But this may not be enough to conclude that there actually is a disparate impact. The 80% rule is not appropriate for every scenario, and other types of statistical tests may be better suited to the circumstances. Even if the 80% rule is appropriate, the disparate impact may be legally justified if the company can prove the selection tool is job related and consistent with business necessity. This means that, if your selection outcomes do not satisfy the 80% rule, you need to do more or different testing, or provide further justification for why, despite this outcome, your selection methods are necessary.
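As a rough, hedged illustration of what an "other type of test" might look like, a statistical significance test can be run on the same hypothetical selection counts. Fisher's exact test is used below purely as an example; which test is actually appropriate depends on the circumstances and should be chosen with expert help.

```python
# Hedged illustration only: one alternative to the 80% rule is a statistical
# significance test on the same selection counts. Fisher's exact test is
# used here purely as an example.
from scipy.stats import fisher_exact

# Rows: selected / not selected; columns: men, women (hypothetical numbers above)
table = [[50, 25],
         [50, 75]]
odds_ratio, p_value = fisher_exact(table, alternative="two-sided")
print(f"odds ratio: {odds_ratio:.2f}, p-value: {p_value:.4f}")
# A small p-value suggests the gap in selection rates is unlikely to be chance alone.
```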

Conversely, complying with the 80% rule does not automatically mean there is no legal exposure. If the job description, or another part of the recruiting process, discourages applicants based on a protected characteristic, the screening tool may pass the 80% rule yet still pose a legal problem or be subject to a legal challenge.

In addition, employers cannot simply rely on a vendor's representations about testing to avoid liability in a disparate impact challenge. The legal risk does not simply shift to the vendor, and the employer can still be liable. No matter the scenario, proper use of AI in screening and hiring will involve time and money to justify its use and outcomes.

A related problem may be records retention. A recent case alleging improper use of an AI screening tool developed by HireVue is based on a job application from 2021. In a 2019 Washington Post article, HireVue said a standard thirty-minute assessment may generate up to 500,000 data points, although it is not clear whether that is the same type of tool at issue in the recent lawsuit. That is a lot of data to save for two years or longer, until the case is concluded. All of this resource investment is in addition to making sure the substance of the AI tool is proper. Assuming all of the data about the impact on selection is retained, organizations also need to remember what the AI process looked like two years ago. In an area of such rapid change, that could be a big challenge.

In short, companies need to pay attention and keep paying attention. As the news keeps reminding us, AI is changing rapidly. Weigh the advantages of using AI in screening and hiring against the extra investment that may be involved if the AI tool is challenged.

The content of this article is intended to provide a general guide to the subject matter. Specialist advice should be sought about your specific circumstances.