Jeremy Farrell, Esq., jfarrell@tuckerlaw.com, (412) 594-3938
Technology-driven workplace surveillance tools could trigger liability under the Fair Credit Reporting Act.
Employers that use tracking technology and artificial intelligence (AI) to monitor workers and make employment decisions may now have one more thing to worry about—the Fair Credit Reporting Act (FCRA).
Many companies are familiar with the FCRA’s requirements for obtaining criminal background reports from third-party consumer reporting agencies. Among other things, the FCRA requires employers to obtain a worker’s permission before procuring a consumer report and to notify the worker before and after taking an adverse action based on the report’s contents.
According to the Consumer Financial Protection Bureau (CFPB), one of the federal agencies responsible for enforcing the FCRA, employers may also have to follow the FCRA’s requirements when they buy AI-generated or other data-driven reports or algorithmic scores from third-party vendors. The CFPB took that position in guidance published in October 2024.
That’s a potentially significant compliance obligation for employers as AI-driven monitoring becomes more prevalent amid emerging technology and an increasingly remote workforce. Operating outside the FCRA carries serious penalties: violators can be sued, and FCRA claims can be particularly susceptible to class action treatment.
The CFPB’s concern is that AI-generated reports can contain sensitive information, unknown to workers, that can then be used against them in hiring decisions, job assignments, and career advancement. Inaccurate reports could harm employees. The FCRA is aimed at increasing transparency around the use of such information.
It does that by regulating “consumer reports,” which means “any written, oral, or other communication of any information by a consumer reporting agency bearing on a consumer’s credit worthiness, credit standing, credit capacity, character, general reputation, personal characteristics, or mode of living which is used or expected to be used or collected in whole or in part for … employment purposes.” 15 U.S.C. § 1681a(d).
That definition raises two key considerations for employers. First, does the employer use a “consumer report” for employment purposes? Second, does the entity providing the report qualify as a “consumer reporting agency”?
The first point is fairly straightforward. Information is used for employment purposes if it is used to evaluate someone for employment, promotion, reassignment, or retention. In a press release announcing the new guidance, the CFPB cited the following examples as ways that employers are using AI or tech-driven data for employment purposes:
The second point is less straightforward, because not all third parties that collect or assess data meet the definition of a “consumer reporting agency” under the FCRA. The CFPB offered the following two examples in its October 2024 circular of how tech vendors could qualify as a “consumer reporting agency” under the FCRA:
If you have any questions about the content of this article, or how you can align your practices with the FCRA, please contact Jeremy Farrell at (412) 594-3938 or jfarrell@tuckerlaw.com.
January 29, 2025