Lujan, Kelly Back FCC Proposal Prohibiting Robocalls Mimicking Human Voices
Senate Communications Subcommittee Chairman Ben Ray Lujan, D-N.M., and Sen. Mark Kelly, D-Ariz., backed the FCC’s proposal to make voice-cloning technology used in robocall scams illegal (see 2401310082), telling Chairwoman Jessica Rosenworcel it’s needed because the use of AI to “impersonate trusted voices threatens to expose ... even greater harm” to consumer confidence in U.S. telecom systems.
The FCC on Tuesday issued a cease-and-desist letter and K4 order against Texas-based Lingo Telecom over robocalls to New Hampshire voters before last month’s primary that used an AI-generated voice impersonating President Joe Biden (see 2402060087).

“We commend you for moving forward with this proposed Declaratory Ruling to clarify that existing statute already prohibits robocalls using AI-generated voices to wireless customers,” Lujan and Kelly said in a letter to Rosenworcel released Wednesday. “This ruling is particularly important given the increasing ease of access to generative-AI tools. Such tools can be used to clone a person’s voice with high levels of accuracy, and there are alarming cases of this technology being used to harm vulnerable families across the country.”

The proposed ruling “is aligned with Congressional intent for the Telephone Consumer Protection Act,” which should empower consumers “to provide consent on whether to receive AI generated calls,” the senators said: The FCC should also “exercise its lawful enforcement authority to stop the use of generative AI for fraudulent misrepresentation, particularly in critical sectors like public safety, election integrity, and consumer protection.”