Consumer Financial Protection Bureau Director Rohit Chopra recently announced a rulemaking that would apply the Fair Credit Reporting Act to certain data brokers. The CFPB also published an FAQ with background related to this announcement.
The FCRA applies to consumer reporting agencies (commonly called credit reporting agencies), meaning entities that regularly engage “in whole or in part in the practice of assembling or evaluating consumer credit information or other information on consumers for the purpose of furnishing consumer reports to third parties…,” as well as to users (lenders, for example) and furnishers (creditors, for example) of consumer credit information.
Entities that fall within the FCRA’s scope must comply with a number of requirements, including:
- providing consumers a copy of their credit report;
- notifying consumers when their credit information will be used against them (such as to deny a loan or employment);
- responding to consumer disputes about credit information and requests to correct it;
- removing old credit information in certain circumstances; and
- disclosing credit information only to certain requestors.
Data companies, or data brokers, generally collect (or license) and then sell (or sublicense) consumer data. But data brokers generally do not consider themselves “credit reporting agencies” subject to the FCRA, because the data they collect does not normally meet the FCRA’s definition of a “consumer report.” If the CFPB does apply the FCRA to these data companies, the rule would be a major compliance change for them.
Specifically, the CFPB says in its FAQ that:
- “[F]irms that monetize certain data would be prohibited from selling it for purposes other than those authorized under the FCRA.”
- “[A] company’s sale of data regarding, for example, a consumer’s payment history, income, or criminal records would generally be a consumer report.”
- “[T]he proposals under consideration would clarify the extent to which “credit header data” constitutes a consumer report…. Credit header data is personally identifying information like someone’s name, address, or Social Security number, which is held by traditional credit bureaus. Much of the existing data broker market relies on credit header data purchased from the big three credit bureaus to create their individual dossiers.”
- “Credit header data could be sold for purposes like credit underwriting, employment applications, insurance underwriting, and government benefits applications, but not, for example, for targeted advertising, to train AI, to sharpen chatbots or similar AI services, or to individuals who could be stalkers or perpetrators of domestic violence.”
And here is where we see what likely underpins much of the CFPB’s push to bring data companies within the FCRA’s scope: AI. Director Chopra says:
Today, “artificial intelligence” and other predictive decision-making increasingly relies on ingesting massive amounts of data about our daily lives. This creates financial incentives for even more data surveillance. This also has big implications when it comes to critical decisions, like whether or not we will be interviewed for a job or get approved for a bank account or loan. It’s critical that there’s some accountability when it comes to misuse or abuse of our private information and activities.
The Director’s math might look like this: AI + credit data + data companies = risk to consumers. And, likely further sharpening the CFPB’s focus on this topic, the FAQ notes that:
“[C]redit or consumer reporting has been the most-complained-about product to the CFPB every year since 2017. Complaints about credit or consumer reporting represented roughly 76 percent of consumer complaints submitted to the CFPB during 2022, far more than any other category of consumer product.”
So, we have the most common CFPB complaint combined with a new area of technology and a set of sensitive data. But wait. There’s more. Continuing to build its case, the FAQ says:
The people most at risk from many data broker harms include people who are elderly, have dementia, or are pregnant; low-income families struggling to access housing, food, or medical care; people of color; limited English proficiency individuals; immigrants; LGBTQI+ individuals; military families; survivors of intimate partner violence; and children and teens.
Let’s therefore add vulnerable and sensitive groups to the long list of risks above, and we have a recipe for expanded regulation. The FTC and other US enforcement agencies are known to prioritize enforcement against companies that take advantage of these vulnerable and sensitive groups.
Data companies handling the above types of data should keep a close eye on these developments, as they may be required to comply with at least some FCRA requirements that would normally apply to a credit reporting agency, including disclosing that data only to certain entities and for certain purposes (namely, not for advertising or AI-related purposes). The new rule is expected to be released in 2024.