AI Voice Cloning Scam: Mom Duped Into Sending ₹12.5 Lakh

Sharon Brightwell, a Florida resident, lost $15,000 (about ₹12.5 lakh) to a new type of fraud powered by artificial intelligence. Her case is a reminder of how convincingly technology can be misused in scams that seem real, and why people everywhere should exercise caution. Here is what happened.

When technology tries to pass for reality

WFLA reported that Sharon received a call on July 9 from a number nearly identical to her daughter's. The voice on the line sounded exactly like her daughter, strained and distraught. The caller said her daughter had struck a pregnant woman in an accident while texting and driving, and was now being held in custody.

Soon after, a man claiming to be an attorney phoned Sharon and asked for $15,000 (about ₹12.5 lakh) in bail. Anxious and eager to help, Sharon withdrew the money and handed it over as directed.

Not long after, Sharon received another call. The situation had supposedly worsened: the pregnant woman was said to have miscarried, and her family was reportedly demanding an additional ₹25 lakh to avoid legal action. At that point, Sharon's grandson and a close family friend stepped in. Together they contacted Sharon's actual daughter, who was safe at work. Sharon later described her relief and shock at hearing her daughter's real voice and realizing how close she had come to losing even more money.

The mechanics of the con

Using only a short sample of the real voice of Sharon's daughter, April Monroe, the criminals replicated it with AI technology. April said the cloned voice was so convincing that it fooled her mother and other family members. She has since organized a fundraiser to help the family and raise public awareness of these scams.

Police in Hillsborough County, Florida, confirmed that an investigation is ongoing. They also noted that AI voice scams are becoming more sophisticated and harder to detect.

This type of crime shows that fraudsters no longer need to break into systems; audio of a person's voice taken from public videos or calls is enough to stage a realistic scam. These tactics create panic and urgency, pushing victims to react before they can think. Vulnerable people, especially the elderly and those under stress, remain most at risk.

Ways to prevent voice cloning scams

Always verify an emergency call by contacting your relative directly through a different phone number or app.

Be wary of any demand to send money immediately, even if the call seems genuine.

Agree on a secret family code word that can be used to confirm genuine emergency calls.

Be cautious about posting videos or audio of yourself on social media.

Inform elderly relatives about emerging types of online fraud.
