Employers’ algorithms and AI software may be discriminating against applicants and employees with disabilities


On May 12, 2022, the Equal Employment Opportunity Commission (EEOC) and the U.S. Department of Justice (DOJ) issued guidance to caution employers about using artificial intelligence (“AI”) and software tools to make employment decisions. The guidance, titled “The Americans with Disabilities Act and the Use of Software, Algorithms, and Artificial Intelligence to Assess Job Applicants and Employees,” warns that using these tools without safeguards could result in a lawsuit under the Americans with Disabilities Act (ADA). 

The ADA requires that people with disabilities have full access to public and private services and facilities. The federal law also bans employers from discriminating on the basis of disabilities, and requires that employers provide reasonable accommodations to applicants and employees with disabilities to enable them to apply for a job or do a job.

Employers cautioned on use of certain decision-making tech tools

Companies increasingly rely on online applications and computer-based pre-employment assessments in hiring decisions. These include job application screening software that prioritizes applications containing certain keywords or automatically screens out applications lacking certain qualifications, and testing software that grades applicants on personality traits, aptitudes, or cognitive skills. Many employers also rely on automated software tools to monitor existing employees’ locations, productivity, and performance. Employers sometimes use these tools to make pay, disciplinary, and termination decisions.

The new guidance warns that employers relying on AI and software to make decisions about pay, performance evaluations, discipline, hiring, and terminations may end up discriminating against applicants and employees with disabilities. This is because a) sometimes the software tool used is not accessible to applicants or employees; or b) sometimes the metrics measured by performance or productivity tracking software may not fairly or accurately reflect an applicant’s or employee’s ability to perform the job requirements with a reasonable accommodation.

Common barriers to access present in web or computer-based tools include incompatibility with screen-reading software used by blind users; poor color contrast, which may cause issues for individuals with low vision or color blindness; videos without alternative text or closed captions for individuals with hearing impairments; or the use of timers, which may cause problems for individuals with intellectual disabilities or those with dexterity problems that make use of a mouse or keyboard difficult. Without proper safeguards, applicants or workers with disabilities may be screened out before they even have a chance to apply or take the assessment. 

As the guidance warns, a person’s disability may “prevent[] the algorithmic decision-making tool from measuring what it is intended to measure…. If such an application is rejected because applicant’s [disability] resulted in a low or unacceptable rating, the applicant may have effectively been screened out because of [a disability.]” The guidance goes on to state, “For example, video interviewing software that analyzes applicants’ speech patterns in order to reach conclusions about their ability to solve problems is not likely to score an applicant fairly if the applicant has a speech impediment that causes significant differences in speech patterns.”

With performance-tracking metrics, the EEOC warns that employers should not use algorithms, software, or AI to make employment decisions without giving applicants and employees a chance to request reasonable accommodations. Software and AI are designed to provide information based on preset specifications or on the average or ideal worker. As the technical guidance notes, “People with disabilities do not always work under typical conditions if they are entitled to on-the-job reasonable accommodations.”

The guidance also warns against using software that violates the ADA’s restrictions on disability inquiries. The ADA only allows employers to inquire about an applicant’s or employee’s medical conditions in limited circumstances. An algorithmic decision-making tool that asks questions that are likely to elicit information about medical conditions, or that directly screens out applicants with certain conditions, may violate the ADA. 

Best practices for employers

Employers looking to use algorithms, AI, and other job-screening and performance-measuring software should ensure that applicants and employees are notified of their option to request an accommodation. Staff should be trained to recognize and respond to requests for accommodation (which often do not use the word “accommodation”). As the technical guidance notes, such requests could include asking to take a test in an alternative format or to be assessed in an alternative way. Employers should work to minimize the chances that the tools they use disadvantage individuals with disabilities, including by looking for software that has been tested by users with disabilities, providing clear instructions for requesting accommodations, and avoiding screening for traits that may reveal disabilities. The best practice is to use the tools only to measure qualifications that are truly necessary for the job, and to measure those qualifications directly rather than through correlated characteristics or scores on personality assessments. The vendor from whom the tool is purchased should also be able to verify that the tool does not solicit information regarding an applicant’s medical conditions.

Employers would do well to remember that applicants and employees are humans, and sometimes decisions about humans need to be made by humans, not by computers.