FCC Declares AI-Voiced Robocalls Illegal: What This Means for Consumers

The FCC has ramped up its fight against robocalls with a pivotal declaration classifying AI-generated voices as "artificial" and therefore illegal when used in automated calling scams. While this ruling may not completely eliminate the surge of fake calls impersonating figures like Joe Biden this election season, it is a significant step in combating the issue.

This new directive, which has been under consideration for months and was hinted at last week, isn't technically a brand-new rule; the FCC cannot create regulations without due process. Instead, it is effectively an update to the Telephone Consumer Protection Act (TCPA), which already protects consumers from unsolicited artificial and pre-recorded messages sent indiscriminately to large numbers of phone numbers, a problem that existed even when the law was written.

The crux of the matter was whether AI-generated voices used in calls fall into the existing categories of prohibited communications. Although it may seem straightforward, the federal government often requires thorough investigation and expert consultation before determining legality. This deliberation was likely prompted by a recent incident involving a fraudulent call mimicking President Biden, which urged New Hampshire voters not to waste their votes in the primary. The dubious operations behind this call are now facing scrutiny from attorneys general, the FCC, and other authorities determined to deter similar misconduct.

Notably, the AI-generated Biden calls originated from a dubious telecom entity linked to 'Life Corporation' in Texas. As previously discussed, that call would likely have been illegal even if it had used a human impersonator or a manipulated recording: it was an illegal robocall and arguably an act of voter suppression (though no charges have materialized so far), so it fits easily within existing prohibited practices.

To prosecute such cases successfully, however, the allegations must be tied to specific violations that a court can adjudicate. Prior to today, using an AI voice clone of a prominent figure like the president may have breached other laws, but it was not explicitly against the rules governing automated calls. For example, a message from an AI voice clone of your doctor reminding you about an upcoming appointment would not be problematic, especially since you would likely have opted in to receive that call. Starting now, however, the use of an AI-generated voice counts against any defendant in such a case, because it is officially an artificial voice under the TCPA.

Here is a key passage from the declaratory ruling:

"Our finding will deter negative uses of AI and ensure that consumers are fully protected by the TCPA when they receive such calls. Additionally, it clarifies that the TCPA does not allow any exceptions for technologies that claim to emulate a live agent, preventing unscrupulous businesses from exploiting perceived gaps in our TCPA regulations. Although AI technologies like voice cloning are still evolving, we've already witnessed their capacity to uniquely harm consumers, particularly those whose voices are cloned. These techniques can mislead recipients into believing that trusted individuals require them to take actions they normally wouldn't. Mandating consent for such calls empowers consumers to choose whether to engage or approach the call with caution."

This situation serves as a compelling reminder of how legal frameworks can adapt and evolve. While the FCC cannot arbitrarily redefine these regulations due to procedural barriers, it does have the authority, as the expert regulatory body in this area, to investigate the question and issue a decision like this when the need is clear, without consulting Congress or the president.

However, this crucial capability is in jeopardy due to an impending Supreme Court ruling that could dismantle long-standing precedent and destabilize U.S. regulatory agencies. That would be great news for anyone who favors robocalls and unregulated industries.

If you receive an AI-powered robocall, document it if you can and report it to your state attorney general's office; it is almost certainly part of the newly formed anti-robocalling coalition taking on these fraudsters.