
Alleged Impersonator Falsely Informs Mother of Daughter’s Arrest

Kaveri said it was all a scheme using artificial intelligence to replicate her daughter’s voice.

She quickly realized that she had fallen victim to a scam and promptly activated call recording.

A recent fraudulent scheme involving the replication of voices using artificial intelligence has come to light. Kaveri, a user of the social media platform X, shared her harrowing experience with a hoax involving a supposed ‘arrest.’ In a detailed account on X, she narrated how she initially hesitated to answer a call from an unfamiliar number but eventually decided to pick it up.

Recalling the incident, Kaveri disclosed that the caller, posing as a police officer, attempted to deceive her into believing that her daughter was in trouble, as part of a plan to extort money from her. The scammers knew her daughter’s name and mimicked her voice convincingly, indicating the possible use of AI technology.

In her narrative on X, Kaveri described the unsettling phone call she received just an hour earlier from an unidentified number. Despite her usual practice of ignoring such calls, she felt compelled to answer this one. On the line was an individual claiming to be a police officer, inquiring about the whereabouts of her daughter, K.

The impersonator informed Kaveri that her daughter had been arrested, along with three friends, for allegedly filming an MLA’s son in a compromising situation and attempting to blackmail him.

Realizing the fraudulent nature of the call, Kaveri activated call recording and insisted on speaking with her daughter. “The man’s demeanor was harsh and impolite throughout the conversation. To my shock, a recording was played for me: ‘mumma mujhe bacha lo, mumma mujhe bacha lo’ (Mom, save me, save me). The voice closely resembled my daughter’s, but it was not her usual way of speaking,” she detailed on X.

Subsequently, Kaveri revealed that the imposter demanded a ransom for her daughter’s release and threatened to pursue legal action if the demands were not met. Undeterred, Kaveri requested to converse with her daughter directly. Infuriated by her defiance, the fake officer abruptly terminated the call after a brief exchange.

Kaveri attributed the entire ordeal to a sophisticated ploy leveraging artificial intelligence to replicate her daughter’s voice convincingly.

Following Kaveri’s revelation on social media platform X, numerous users resonated with her experience, sharing similar encounters in the comments section.

One user recounted a similar incident involving a friend in Assam who received a late-night call from a purported “cop” falsely claiming that their acquaintance had been arrested for drug possession.

Another user shared a distressing account involving their mother, who received a threatening call about their brother’s whereabouts from an individual posing as an ASI; the caller instilled fear by citing detailed personal information and using a WhatsApp display picture showing a police uniform.

Additionally, a third user highlighted proactive measures taken by some individuals to safeguard against such scams, such as using exclusive passwords known only to family members to thwart AI-based impersonation attempts.

The alarming implications of these scams were further underscored by a fourth user who expressed concerns about the extent of personal information accessible to scammers, including contact details, family members’ names, and potential whereabouts.

Lastly, a fifth user emphasized the unsettling nature of hearing a recorded voice of one’s loved one, indicating the potential psychological distress caused by scammers utilizing advanced AI tools to obtain voice samples surreptitiously.
