
A 16-year-old boy died by suicide in February this year after falling victim to "sextortion", and his family is now struggling to get justice for the vibrant teen who made people smile. According to a report by CBS News, Elijah Heacock, the victim of the online extortion, was not depressed and had no other issues that could have driven him to end his life in such a tragic way. His father, John Burnett, told CBS Saturday Morning that the family was unaware of what had forced their child to take such a drastic step.

AI-generated nudes 

His father said Heacock had received AI-generated images showing him in a compromising position. The online scammer demanded $3,000 to destroy the AI-generated nude images and warned that if he failed to pay, the pictures would be shared with his family and relatives. He died by suicide shortly after receiving the message, CBS affiliate KFDA reported. Burnett and Elijah's mother, Shannon Heacock, did not know what had happened until they found the messages on his phone.


What is sextortion?

Elijah fell victim to a sextortion scam, in which predators target young people online and threaten to leak explicit images unless demands, such as money or harmful acts, are met. His parents, unfamiliar with the term until his death, were devastated. "These criminals are highly organized, well-funded, and relentless," Burnett said. "They don't need real photos; they can fabricate anything to blackmail kids."

500,000 sextortion cases reported in US

The National Center for Missing and Exploited Children (NCMEC) reported more than 500,000 sextortion cases targeting minors last year. The FBI estimates that at least 20 young people have died by suicide since 2021 as a result of these scams. Teen boys are increasingly targeted, according to NCMEC's 2023 findings, and generative AI has worsened the crisis, with over 100,000 reports this year involving AI-generated content. Dr Rebecca Portnoff, head of data science at Thorn, a nonprofit combating child exploitation, told CBS, "No technical skills are needed to create illegal, harmful material." A simple search yields apps and websites for generating explicit images, she added.
