
March 26, 2024 | Client Alert

New Lie-Detecting AI in the Hiring Process: How Will This Case Unfold?

A new class action lawsuit concerning the use of AI in the hiring process is one employers will want to track as it unfolds. A leading health solutions company in the U.S. faces claims that it violated prohibitions on the use of lie detectors by using an AI screening tool in its interview process. The case raises the question of whether using AI screening tools to evaluate job applicants’ integrity and honesty violates laws prohibiting lie detector tests in employment.

For background, federal law (the Employee Polygraph Protection Act of 1988 (EPPA)) and many states (28 in total), including Massachusetts, restrict or prohibit employers from using polygraph tests (i.e., “lie detector” tests) on applicants and employees. Violations of the federal EPPA can result in civil penalties of up to $25,597. The Massachusetts Lie Detector Statute, the subject of this lawsuit, not only prohibits employers from subjecting applicants and employees to lie detector tests, but also requires employers to include a notice in all job applications that it is unlawful in Massachusetts “to require or administer a lie detector test as a condition of employment or continued employment.” Mass. Gen. Laws ch. 149, § 19B(2). Failure to do so can result in stiff penalties.

In Baker v. CVS Health Corporation, the named plaintiff alleges that when he applied for a supply chain position in Massachusetts, the employer subjected him to a lie detector test and failed to provide the required notice of his statutory rights under the Massachusetts Lie Detector Statute. The plaintiff seeks $500 per violation in statutory damages, plus reasonable attorneys’ fees and costs. According to the complaint, the employer screened applicants using video interview technology. During the virtual interview, an applicant answers a series of questions about their integrity and honesty. The recording is then uploaded to a third-party platform, which uses AI to analyze facial expressions, eye contact, and voice intonation to determine whether candidates are a cultural fit. The plaintiff claims that during his application process he was unaware that the virtual interview was a lie detector test, and that had he been aware, he would not have participated.

The employer moved to dismiss the claim that it failed to provide the statutory notice of the lie detector prohibition with the job application. The employer argued that the statute does not create a private right of action to enforce its notice provisions and that the plaintiff lacked standing to bring the notice claim. The court rejected both arguments. Among other things, the court explained that the plaintiff’s alleged “injury” is of the kind the statute is intended to protect against: he alleged that, as a downstream consequence of not receiving notice regarding lie detector tests, he did not have the chance to treat the virtual interview more critically.

This case illustrates how new AI technologies can run afoul of old (and rarely invoked) laws. With the rapid advancement of AI, corresponding scrutiny by courts and government agencies is inevitable (even where an employer uses vendor software), and employers must pay close attention.

While not the subject of this litigation, employers should also remain mindful that AI interviewing and hiring tools, including those that rely on visual cues to assess honesty, may raise discrimination concerns. The EEOC has been examining AI issues in employment closely, launching its Artificial Intelligence and Algorithmic Fairness Initiative in 2021 and continuing to prioritize AI review in its 2024-2028 Strategic Plan.

Accordingly, employers should continue to review any AI technology used in application and employment processes and consult with their Michael Best attorney to help minimize risks.
