Listen to this “Biden” appeal sent to voters.  No wonder the FCC is cracking down on AI robocalls.

Some voters in New Hampshire received a call from someone who sounded a lot like President Joe Biden. The call encouraged New Hampshire residents to stay home during last week’s primary elections and “save their votes” for the November general election.

Of course, this doesn’t make any sense. Voters can vote in both elections. Why would Biden say such a thing to them? Well, that’s because he didn’t. They were automated calls generated by an AI voice clone made to sound like Biden. You can listen to one here, courtesy of The Telegraph:

This is just one concrete example of how AI can already be weaponized by bad actors. And that’s likely one of the main reasons why the FCC now wants to take action against AI-generated calls.

FCC Proposal to Ban AI-Based Robocalls

FCC Chairwoman Jessica Rosenworcel released a statement Wednesday announcing a proposal that the FCC would recognize calls generated by artificial intelligence as “artificial” voices under the Telephone Consumer Protection Act (TCPA). By doing so, the FCC would make AI-generated robocalls illegal.

The TCPA is often used by the FCC to limit unwanted calls consumers receive from telemarketers. Under this law, the use of artificial or pre-recorded voice messages and automatic telephone dialing systems is prohibited.

“AI-generated voice and image cloning is already sowing confusion by making consumers believe scams and frauds are legitimate,” Rosenworcel said in a statement. The statement continues:

“No matter which celebrity or politician you favor, or the relationship you have with your loved ones when they call for help, it’s possible that we can all be targets of these fake calls. That’s why the FCC is taking steps to recognize this emerging technology as illegal under current law, giving our partners in state attorneys general offices across the country new tools they can use to crack down on these scams and protect consumers.”

The timing of Rosenworcel’s statement suggests that the fake Biden robocalls have raised concerns about how AI-generated voices can be used not only in telemarketing scams but also in potential election fraud.

Currently, the only real steps to avoid the worst-case scenarios caused by AI-generated voices have been taken by the AI companies themselves. As Bloomberg reported, AI company ElevenLabs last week suspended the user who created the Biden robocalls from its platform.

“We are committed to preventing the misuse of audio AI tools and take any incidents of misuse very seriously,” ElevenLabs said in a statement.

However, as we have seen with the recent non-consensual AI-generated pornographic images of Taylor Swift, there are those in the space who may not feel the same way as ElevenLabs does when it comes to the use of AI products.