The financial technology (“Fintech”) industry has boomed over the last decade, from mobile payment apps, robo-advisers, and lending platforms to consumer-friendly brokerages and cryptocurrency trading platforms. By their nature, many Fintech companies handle highly sensitive consumer personal and financial information and face a host of privacy concerns and legal obligations that flow therefrom. In the United States, instead of a single federal regulatory framework governing all personal information, several different laws and regulations impose various privacy and data security requirements on different sectors or jurisdictions. This article discusses the scope and requirements of two such laws of particular prominence for many Fintechs: the Gramm-Leach-Bliley Act (the “GLBA”) and the Fair Credit Reporting Act (the “FCRA”).
The GLBA regulates the collection, use, and disclosure of personal financial information. It applies broadly to all “financial institutions,” meaning businesses that are “significantly engaged” in “financial activities,” as well as businesses whose services facilitate financial operations on behalf of financial institutions. What constitutes a “financial activity” has been construed broadly and potentially encompasses many services provided by Fintechs. The Federal Trade Commission (FTC) is the main agency that enforces the GLBA, and state law can require higher levels of compliance than the GLBA itself. The FTC has implemented regulations to carry out the GLBA’s financial privacy provisions (the “Privacy Rule”) and information security provisions (the “Safeguards Rule”).
Under the Privacy Rule, financial institutions have several obligations to consumers (defined as individuals “who obtain or ha[ve] obtained a financial product or service from the financial institution that is to be used primarily for personal, family, or household purposes, or that person’s legal representative”) and customers (defined as “a subclass of consumers who have a continuing relationship with the financial institution”) whose nonpublic personal information (“NPI”) the institution handles. The Privacy Rule’s obligations do not extend to information collected about individuals who obtain financial products or services for commercial or business purposes, nor to information collected from individuals who have not obtained and are not obtaining financial products or services from the institution.
A company that is subject to the Privacy Rule must provide a “clear and conspicuous” written notice describing the company’s privacy policies and practices, including the categories of information collected, the categories of information disclosed, the categories of affiliated and nonaffiliated parties to whom NPI will be disclosed, and the company’s policies and practices for protecting NPI in accordance with the GLBA’s Safeguards Rule (described in detail below). Further, if the company shares NPI with third parties outside of the exceptions listed in the Privacy Rule, it must provide an opt-out notice explaining the consumer’s right to direct the institution not to share their NPI and a reasonable method for opting out. The Privacy Rule requires covered entities to provide a privacy notice to each customer at the time the customer relationship is established and annually for as long as it continues.
The Safeguards Rule requires financial institutions to develop, implement, and maintain a comprehensive information security program that describes the specific measures taken to protect customer information. In late 2021, the FTC announced significant new security requirements for non-bank financial institutions subject to the GLBA, which have since been incorporated into the Safeguards Rule. The amended Safeguards Rule is far more prescriptive than the original. Its key requirements include, among other things: designation of a qualified individual to implement and supervise the company’s information security program; periodic risk assessments; implementation of safeguards to mitigate identified risks, such as encryption of customer information and multi-factor authentication for anyone accessing customer information on the company’s systems; regular testing of safeguards; staff training; monitoring of service providers; periodic updates to the security program; a written incident response plan; and regular reporting to the institution’s board of directors.
Penalties for noncompliance with the GLBA can include fines of up to $100,000 per violation and restitution of $192 per lost record. Officers and directors may also face fines of up to $10,000 per violation, criminal penalties of up to five years in prison, and revocation of professional licenses.
Fintechs should undertake scoping exercises to carefully assess whether they provide financial products or services to individuals for their personal, family or household purposes, and if so, develop adequate privacy and data security practices and notices for NPI collected from consumers and customers.
The FCRA limits the circumstances under which consumer credit information may be used and grants consumers the right to know what information is being used and when it affects them negatively. The FCRA is very broad and covers a plethora of data types used to make eligibility decisions about consumers. It applies not only to credit bureaus and background check companies, but also to “anyone who: (1) assembles or evaluates consumer data and shares it for purposes of determining eligibility for credit, insurance, employment, housing or other eligibility purposes; (2) buys credit reports, including credit scores; or (3) supplies consumer information to credit bureaus.”
Some Fintechs, such as lead generators, data aggregators and debt collectors, as well as those that use algorithms to make decisions about consumers, may also be subject to the FCRA if their services are used to facilitate decision-making about consumers’ eligibility for credit, housing, employment and other eligibility purposes. As a result, it is worthwhile for Fintechs to analyze their data collection purposes and practices to determine whether they are subject to the requirements of the FCRA. The Consumer Financial Protection Bureau (the “CFPB”) has primary enforcement authority over the FCRA, but the FCRA may be enforced by other federal agencies and applies to companies that are outside the CFPB’s jurisdiction. The CFPB also recently issued an interpretive rule stating that preemption under the FCRA is “narrow and targeted.” It further encouraged states to enact laws that may go further than the FCRA to protect their residents.
The FCRA allows victims to seek actual and statutory damages, including attorney’s fees, court costs and punitive damages. Companies that violate the FCRA can also be fined by the CFPB and the FTC. A willful violation of the FCRA may result in actual or statutory damages ranging from $100 to $1,000 per violation, in addition to punitive damages determined by the courts. While a negligent FCRA violation may be subject to lower fines and restitution amounts, any regulatory action or lawsuit alleging a violation of the FCRA can be extremely costly for a Fintech.
If a Fintech is subject to the requirements of the FCRA, it should: (i) adhere to the data privacy and disclosure requirements of the FCRA, including making sure that credit report information is used only for permissible purposes as defined by the FCRA (see Orrick’s Insight into the CFPB’s recent advisory opinion on permissible purposes under the FCRA here); (ii) have a mechanism to correct erroneous information; and (iii) notify consumers when they are subject to adverse decisions in housing, employment, credit and other purposes based on the credit information collected by the company. For example, a company may not use credit information for targeted marketing purposes, as targeted marketing is not one of the permissible uses laid out in the FCRA.
Fintechs should pay special attention to situations where algorithms are used to make automated decisions about consumers. In such situations, companies must ensure that they have implemented reasonable procedures to maximize accuracy and that they provide consumers access to their own information and an ability to correct errors. When an algorithm is used to make an adverse decision, such as charging higher rent or denying credit, the consumer must be provided with an adverse action notice and a specific explanation of why the decision was made. Further, if not used with caution, artificial intelligence (“AI”) may result in discrimination against a protected class, in violation of the Equal Credit Opportunity Act (the “ECOA”). The FTC has specifically warned that employing algorithms that discriminate against protected classes can be considered an unfair practice subject to enforcement actions and possible fines. Additionally, the CFPB recently redesigned its whistleblower web page to encourage whistleblowers with knowledge of “potential discrimination or other misconduct within the CFPB’s authority to report it.” Thus, to avoid regulatory scrutiny, companies that use AI to make credit-related decisions should rigorously test such algorithms before commercialization and periodically audit and monitor their use thereafter to ensure compliance with the ECOA and other equal opportunity laws.
In addition to federal privacy laws, Fintechs should remain aware of the increasing number of state laws that regulate how businesses must handle personal information. In many cases, such state privacy laws have partial exemptions for personal information subject to the GLBA or the FCRA. For instance, the California Consumer Privacy Act (the “CCPA”) exempts from its privacy requirements personal information “collected, processed, sold, or disclosed pursuant to the federal GLBA, and implementing regulations.” Fintechs, therefore, should carefully assess whether they are subject to the federal requirements discussed above, and, if they are not or certain information they collect falls outside of the scope of the federal requirements, make sure to assess and comply with the applicable requirements of state privacy laws to which they are subject.
The authors wish to give special thanks to summer associate Christina Lee of Harvard Law School ’24, for contributing to this piece.