
Not long ago, a woman living in Florida fell victim to AI-based voice fraud and lost around $15,000 (approx. ₹12.5 lakh). Sharon Brightwell, the woman in question, thought she was helping her daughter in an accident case, but it turned out to be a complete hoax.

This goes to show how serious the problem is becoming: scammers are using AI to create voice clones and run phone scams that sound unbelievably real. If you've ever posted videos or voice notes online, you might want to think twice before sharing such content.

When AI Pretends to Be Someone You Love

Sharon got a call on July 9 from a number that was nearly identical to her daughter's. The voice on the call sounded just like her daughter's, worried and frantic. The caller claimed that Sharon's daughter had been texting while driving, had hit a pregnant woman in a car accident, and was now being held by the authorities.

A moment later, she got another call. This time, a man posing as a lawyer demanded a $15,000 bail payment. In a frantic attempt to deal with the situation, Sharon withdrew the money and handed it over as instructed.

The scam escalated. Another call followed, claiming the unborn baby had died and that the victim's family now wanted ₹25 lakh more to drop legal action. At that point, her grandson and a close family friend intervened. They contacted Sharon's real daughter directly, who had been safe at work the whole time.


How the AI Voice Scam Was Pulled Off

The scammers used AI voice cloning to mimic Sharon’s daughter, April Monroe, with stunning accuracy. All it took was a small voice sample—possibly pulled from a social media video or past phone call. The cloned voice fooled not just Sharon, but even others familiar with April.

April has since launched a fundraiser to help recover the losses and raise awareness about this disturbing scam trend.

Authorities in Hillsborough County, Florida, have confirmed an investigation and warned that AI-driven scams are becoming more advanced and harder to detect. These scams don’t need hacking or deep tech—just a few seconds of your voice and a convincing story.

Why These Scams Work

AI phone scams exploit basic human instincts—fear, urgency, and love. Victims act fast, often without verifying details, because the stakes seem so high. These scams often target:

  • Elderly people
  • Emotionally vulnerable individuals
  • Families with young children

The emotional manipulation is so strong that even normally cautious people can fall victim.


How to Protect Yourself From AI Voice Cloning Scams

To avoid falling prey to similar scams, follow these essential steps:

  • Always verify emergency calls using another contact method: don't rely on the same number that called you.
  • Be wary of urgency: scammers push you to act fast, so pause and think.
  • Create a family codeword: a simple, pre-decided word can confirm real emergencies.
  • Limit what you post online: public videos or audio clips can be used to clone voices.
  • Educate older relatives: they are prime targets for emotional scams.

Additionally, treat every request for money over a phone call—no matter how real it sounds—with serious suspicion.

Final Thoughts

It's not just about one household losing money; it's about how swiftly scams are evolving thanks to AI, which is getting more advanced by the minute. As the technology improves, verifying whether a call or message is genuine only gets harder. Being informed, prepared, and vigilant are the new rules for safeguarding your money and your family. When something seems suspicious, the right move is to hit the brakes and confirm the details before acting.