
### Warning from Better Business Bureau: AI Voice-Cloning Scams on the Rise


DAVENPORT, Iowa (KWQC) - Scammers have long used the tactic of claiming a family member is in distress to pressure people into sending money or revealing personal information. With the rise of AI technology, however, these fabrications are becoming more convincing and more dangerous.

A brief clip of someone's voice, pulled from online sources or social media, is all a scammer needs to carry out this deception.

By feeding that clip into an AI system that replicates speech patterns, scammers can make the cloned voice say whatever they want. To make the fabricated message more believable, they may even add laughter, fear, or other emotional cues.

The Better Business Bureau offers guidance on how to protect yourself from this evolving form of deception.

No matter how urgent the story sounds, resist the pressure to act immediately. One proactive safeguard is to establish a secret code known only to household members. Laura Chavez, Vice President of Operations at the Better Business Bureau of Iowa, recommends verifying a caller's identity by asking questions a fraudster would struggle to answer.

Chavez advises setting up a unique passphrase for verification. The passphrase must stay confidential: it should not be derived from publicly available information or shared on social media.

For anyone who is uncertain whether a call is genuine but harbors suspicions, Chavez suggests taking a moment to assess the situation thoroughly. Asking discerning questions can help confirm whether the communication is legitimate and whether the person on the other end is who they claim to be.

Recent surveys indicate that one in four people has either encountered an AI voice scam personally or knows someone who has fallen victim to one. Of those targeted by AI-generated voice calls, 77% reported losing money to the fraud.
