
AI Scam Uses Fake Voices to Demand Ransom for Hostage

With the advent of AI, scammers have gotten smarter and are now spoofing voices to demand ransom for people supposedly held hostage.

This is undeniably frightening.

Crisis Scenario

Fraudsters have been leveraging advances in AI to grow more sophisticated and unpredictable, even resorting to mimicking voices to extort ransom for individuals allegedly held “hostage.”

In a disturbing incident recounted in The New Yorker, a Brooklyn couple, given the aliases Steve and Robin to safeguard their identities, described their harrowing experience of receiving a distressing phone call in the dead of night from the husband’s parents.

Robin, the wife, was stirred from her sleep by a call from her mother-in-law, Mona, which she initially dismissed as an accidental dial. When she answered, she was met with cries of desperation: “I can’t do it, I can’t do it.”

Fearing the worst, Robin contemplated various dreadful scenarios involving her in-laws or her own parents, who also resided in Florida during the winter months.

“I thought she was trying to tell me that some horrible tragic thing had happened,” Robin recounted, her mind racing with grim possibilities.

Her father-in-law, Bob, then cut in, urging her to “get Steve, get Steve,” and she hastily woke her husband, a law enforcement professional. When Steve took the call, an unfamiliar male voice delivered a chilling ultimatum.

“You’re not gonna call the police,” the voice asserted. “You’re not gonna tell anybody. I’ve got a gun to your mom’s head, and I’m gonna blow her brains out if you don’t do exactly what I say.”

Demanding $500 via Venmo, an oddly modest sum considering the gravity of the purported hostage crisis, the perpetrator then coerced an additional $250 from Steve through a possible accomplice.

When the couple then confirmed that their parents were safe, peacefully asleep in their beds, they realized they had been the victims of a distressing hoax.

The Technology Behind It

The specific software employed by the scammers in Steve and Robin’s ordeal remains undisclosed, but as The New Yorker outlines, various readily available tools can replicate the voices of individuals like Steve’s parents or other victims of this reprehensible scheme.

One such tool comes from ElevenLabs, a New York-based AI company whose software is used by numerous publications, including The New Yorker, for functions such as article narration and text-to-speech conversion.

Notably, the technology can replicate any individual’s voice after training on only brief snippets of recorded audio, as highlighted by Hany Farid, an expert in generative AI at the University of California, Berkeley.
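To illustrate just how little effort is involved, below is a minimal sketch of what a typical voice-cloning API workflow looks like. The "voiceclone" package, its client, and every method and parameter name here are hypothetical, invented purely for illustration; they do not correspond to ElevenLabs’ actual SDK or any other real product. The point is only that cloning a voice generally takes a few audio samples and a handful of lines of code.

```python
# Hypothetical sketch of a generic voice-cloning API workflow.
# The "voiceclone" package and all names below are invented for
# illustration and do not correspond to any real SDK.

from voiceclone import Client  # hypothetical package

client = Client(api_key="YOUR_API_KEY")

# Train a voice model from a few short recordings of the target speaker.
# Experts note that mere seconds of audio can yield a convincing clone.
voice = client.clone_voice(
    name="cloned-voice",
    samples=["sample1.mp3", "sample2.mp3"],  # minimal training audio
)

# Synthesize arbitrary speech in the cloned voice.
audio = client.text_to_speech(
    voice_id=voice.id,
    text="Hello, this is a synthesized voice.",
)

with open("output.mp3", "wb") as f:
    f.write(audio)
```

The brevity of a workflow like this, under whatever real API a scammer might use, is precisely what makes the scheme so accessible.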

How the scammers obtained recordings of the voices of Steve’s parents may forever remain a mystery; the couple, however, did manage to reclaim their funds from Venmo, according to the article’s account.
