The FTC wants your help fighting AI vocal cloning scams

Judges will award $25,000 for the best idea to combat malicious audio deepfakes.
The FTC is soliciting the best ideas for keeping up with tech-savvy con artists. Deposit Photos


The Federal Trade Commission is on the hunt for creative ideas to tackle one of scam artists’ most cutting-edge tools, and will dole out as much as $25,000 for the most promising pitch. First announced last fall, the FTC’s Voice Cloning Challenge is now officially open for submissions. The contest is looking for ideas for “preventing, monitoring, and evaluating malicious” AI vocal cloning abuses.

Artificial intelligence’s ability to analyze and imitate human voices is advancing at a breakneck pace. Deepfaked audio already appears capable of fooling as many as 1-in-4 unsuspecting listeners into mistaking an AI-generated voice for a real human one. And while the technology shows immense promise in scenarios such as providing natural-sounding communication for patients with vocal impairments, scammers can use the very same programs for their own gain. In April 2023, for example, con artists attempted to extort ransom from an Arizona mother by using AI audio deepfakes to fabricate her daughter’s kidnapping. Meanwhile, AI imitations present a host of potential issues for creative professionals like musicians and actors, whose livelihoods could be threatened by comparatively cheap imitations.

[Related: Deepfake audio already fools people nearly 25 percent of the time.]

Remaining educated about the latest AI vocal cloning capabilities is helpful, but education can only do so much as a reactive protective measure. To keep pace with the industry, the FTC first announced its Voice Cloning Challenge in November 2023, seeking to “foster breakthrough ideas on preventing, monitoring, and evaluating malicious voice cloning.” The contest’s submission portal launched on January 2 and will remain open until 8 p.m. ET on January 12.

According to the FTC, judges will evaluate each submission on its feasibility, its focus on reducing consumer burden and liability, and its resilience in the face of a quickly changing technological landscape. Written proposals must include an abstract of less than one page alongside a more detailed description, no longer than 10 pages, explaining the proposed product, policy, or procedure. Contestants may also include a video clip describing or demonstrating how their idea would work.

In order to be considered for the $25,000 grand prize—alongside a $4,000 runner-up award and up to three $2,000 honorable mentions—submitted projects must address at least one of the following three areas of vocal cloning concern, according to the official guidelines:

  • Prevention or authentication methods that would limit unauthorized use of vocal cloning
  • Real-time detection or monitoring capabilities
  • Post-use evaluation options to assess if audio clips contain cloned voices

The Voice Cloning Challenge is the fifth such contest overseen by the FTC thanks to funding through the America COMPETES Act, which allocated money for various government agencies to sponsor competitions focused on technological innovation. Previous challenges focused on reducing illegal robocalls and on bolstering security for users of Internet of Things devices.

[Related: AI voice filters can make you sound like anyone—and anyone sound like you.]

Winners are expected to be announced within 90 days of the contest’s deadline. A word of caution to any aspiring visionaries, however: if your submission includes actual examples of AI vocal cloning… please make sure the person whose voice was cloned consented to its use. Unauthorized voice cloning sort of defeats the purpose of the FTC challenge, after all, and is grounds for immediate disqualification.