2 minute read | May 16, 2023
Artificial Intelligence (AI) is transforming financial services, from underwriting and trading securities to customizing financial products and services. These innovative modeling techniques may improve the accuracy of models used to identify prospective customers and assess their risks, expand access to credit for underserved populations, and reduce losses and other associated costs.
However, as highlighted in the Federal Trade Commission’s (FTC) article, The Luring Test: AI and the Engineering of Consumer Trust, AI may also carry significant consumer and commercial risks. As companies contemplate novel uses of AI, here’s what you need to know about the FTC’s focus on three “laws important to developers and users of AI” that may impact your business.
The FTC, among other regulators, will likely continue to use its authority and expertise to exercise jurisdiction under these statutory regimes and to scrutinize companies’ use of AI. Given the growing concerns about new AI tools, the FTC offers several cautionary considerations for businesses operating in this space.
It’s clear that FTC staff and other regulators are focusing on how companies may use AI technology, including new generative AI tools, in ways that can have an actual and substantial impact on consumers. If you use, or are contemplating using, AI tools as part of your product or service offerings, now is the time to examine the three laws the FTC deems important to developers and users of AI.