
### Fake Biden Robocall Likely Created With Tools From AI Firm ElevenLabs

Two fake-audio experts say the deepfake robocall of President Biden that some New Hampshire voters received last month was most likely created with ElevenLabs' voice-cloning technology.

Some voters in New Hampshire were targeted last month by a deceptive robocall, purportedly from President Biden, advising them not to vote in the state's primary election. The call, suspected to be AI-generated, most likely used voice-cloning technology from ElevenLabs, a company that recently achieved "unicorn" status after an $80 million funding round led by Andreessen Horowitz. ElevenLabs sells AI voice tools for applications such as ebooks and video games, and lets paying users clone voices from audio samples. While the company recommends obtaining permission before cloning someone's voice, it permits cloning without consent for certain non-commercial purposes, such as political speech that contributes to public debate.

Security firm Pindrop analyzed the recording and attributed it to ElevenLabs' technology, based on audio patterns that matched the company's voice-synthesis engines. Despite the difficulty of tracing AI-generated audio to its source, Pindrop's CEO said he was confident the call was made with ElevenLabs tools, and subsequent examinations by Pindrop and UC Berkeley forensics expert Hany Farid pointed to the same conclusion.

ElevenLabs offers an AI speech classifier on its website that it says can identify audio clips produced with its own systems. When Pindrop ran the suspect call through the classifier, it reported an 84 percent likelihood that the audio was generated with ElevenLabs technology. Farid was initially skeptical but confirmed the call's AI origin after conducting his own independent analysis.
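To make that detection step concrete, the short Python sketch below shows what scripting such a check might look like: uploading an audio clip to a speech classifier and reading back a likelihood score. It is purely illustrative; the endpoint URL, authentication header, request fields, and response format are all assumptions, not ElevenLabs' documented API.

```python
# Hypothetical sketch of querying an AI-speech classifier for a likelihood score.
# The URL, auth header, and response fields are placeholders, not a real endpoint.
import requests

CLASSIFIER_URL = "https://api.example.com/v1/ai-speech-classifier"  # placeholder URL


def classify_clip(path: str, api_key: str) -> float:
    """Upload an audio file and return the reported probability (0-1)
    that it was produced by the provider's voice-synthesis models."""
    with open(path, "rb") as audio:
        response = requests.post(
            CLASSIFIER_URL,
            headers={"x-api-key": api_key},  # assumed auth scheme
            files={"audio": ("clip.mp3", audio, "audio/mpeg")},
        )
    response.raise_for_status()
    return response.json()["probability"]  # assumed response field


if __name__ == "__main__":
    score = classify_clip("robocall.mp3", api_key="YOUR_KEY")
    print(f"Likelihood of AI generation: {score:.0%}")  # e.g. 84%
```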

The use of ElevenLabs' technology for political propaganda has raised concerns, with earlier reports of its voices being used to spread misinformation. ElevenLabs leads the market in AI voice cloning and is now valued at more than $1.1 billion, but that prominence has heightened worries about misuse. The company's substantial funding could help it build safeguards against malicious use, a pressing need with the US presidential election approaching.

The incident underscores the challenge AI-generated content poses in the lead-up to the 2024 election and the need for reliable tools that can quickly authenticate audio. The broad accessibility of voice-cloning technology creates both opportunities and risks, prompting calls for stronger oversight. As the threat of AI-generated propaganda grows, stakeholders will need to stay vigilant against misinformation that could sway electoral outcomes.
