WASHINGTON, DC – Amid a reported uptick in new financial scams using artificial intelligence (AI) to trick victims, a group of leading U.S. Senators is urging a consumer watchdog to redouble efforts to keep people safe and crack down on offenders.

Today, U.S. Senator Jack Reed (D-RI) joined U.S. Senators Sherrod Brown (D-OH), Bob Menendez (D-NJ), and Tina Smith (D-MN) in sending a letter urging Consumer Financial Protection Bureau (CFPB) Director Rohit Chopra to take action to protect consumers from scams and fraud enabled by artificial intelligence (AI) and machine learning in consumer financial products.

The lawmakers raised concerns over AI voice cloning technology, which can allow scammers to defraud unsuspecting victims of money, steal their identities, and even gain access to consumers’ finances, including their bank accounts.

Law enforcement officials have also warned that scammers are using AI to clone or closely mimic the voice of a victim’s loved one using social media content found online. The scammer then uses the cloned voice to contact a family member or friend, claiming to need money, typically because of an accident, an arrest, or a kidnapping, and can manipulate the audio to carry on a realistic conversation that sounds like the victim’s loved one.

“Voice cloning adds a new, threatening dimension to these scams, allowing fraudsters to generate voice clips to convincingly impersonate friends, family, or potentially even financial advisors and bank employees. Hearing trusted voices amplifies the risks of consumers falling victim to scams,” the four lawmakers wrote. “The risks posed by voice cloning in the realm of financial scams demand immediate attention and action. To effectively address this emerging threat, we respectfully request that the CFPB review the risks posed by this new technology as soon as practicable and take action under the CFPB’s existing authorities to protect consumers.”

As AI becomes more prominent, criminals will increasingly use it to put new twists on old scams. Senator Reed warns that even experts can be fooled by AI.

“As AI evolves and becomes more prevalent and sophisticated, bad actors are trying to take advantage using ‘spoofed speech’ and other techniques.  It is important for people to be educated and on guard against these scams.  The FTC and CFPB need to take action to help thwart this type of fraud,” said Senator Reed.  “I also urge people to take precautions and have a plan to prevent getting fooled by scams using AI.  Be careful about the information you make publicly available on social media.  Also, talk to family members – especially older relatives -- about the potential for these types of scams and come up with a code word that only you and your loved ones know.  Lastly, scammers often try to create a false sense of urgency.  Be wary of any caller who tries to get you to turn over financial information over the phone and don’t be afraid to hang up on the caller and then try to reach your loved one by contacting them through another family member or their friends.”

Scammers often try to get victims to pay or send money in ways that make it hard to get the money back, such as converting funds into cryptocurrency. If a caller directs you to wire money, go to a cryptocurrency ATM, or purchase gift cards and give them the card numbers and PINs, those are likely signs of a scam.

If you or someone you know has been the victim of a scam or an attempted scam, you can report it to the Federal Trade Commission (FTC) at ReportFraud.ftc.gov and to local law enforcement.

According to the latest FTC Consumer Sentinel Network data, consumers reported losing nearly $8.8 billion to fraud in 2022, an increase of more than 30 percent over the previous year.

Full text of the letter to the CFPB follows:

July 6, 2023

Dear Director Chopra:

We write to express our deep concerns regarding the emergence of voice cloning technology and its potential exploitation in financial scams. We urge the Consumer Financial Protection Bureau (CFPB) to take action regarding the governance of artificial intelligence (AI) and machine learning in consumer financial products, especially as it relates to protecting consumers from fraud and scams.

Voice cloning, the process of reproducing an individual's voice with high accuracy using AI and machine learning techniques, has seen remarkable advancements in recent years, and is increasingly being used in malicious ways. One particularly alarming application is its potential use in perpetrating financial scams.

Financial scams already impose significant hardships on unsuspecting consumers, who often have no reimbursement recourse from banks and peer-to-peer payment apps. In the past, this Committee has sent letters to Zelle, Cash App, and Venmo regarding their scam and fraud detection policies. Voice cloning adds a new, threatening dimension to these scams, allowing fraudsters to generate voice clips to convincingly impersonate friends, family, or potentially even financial advisors and bank employees. Hearing trusted voices amplifies the risks of consumers falling victim to scams. As the FTC explained in a recent consumer alert, while a grandparent may be aware of “grandparent scams,” the use of AI-generated voice clips increases uncertainty and makes detection more difficult.

We are also concerned about how financial institutions themselves may be vulnerable to breaches powered by artificially generated voice clips. In May, Chairman Brown sent letters to six of the largest banks that offer voice authentication services, outlining concerns that AI-generated voice clips allow fraudulent actors to break into customers’ accounts. In comparing the responses received, it became clear that financial institutions do not have a uniform and robust approach to detecting and preventing AI-driven threats, leaving consumers vulnerable to harm.

The risks posed by voice cloning in the realm of financial scams demand immediate attention and action. To effectively address this emerging threat, we respectfully request that the CFPB review the risks posed by this new technology as soon as practicable and take action under the CFPB’s existing authorities to protect consumers. Thank you for your prompt attention to this critical matter and for your continued work to safeguard consumers.

Sincerely,