- By Vikas Yadav
- Tue, 07 Nov 2023 12:05 AM (IST)
- Source: JND
The deepfake video of Rashmika Mandanna continues to make waves across the internet, raising questions about the harm the evolving technology can inflict on people's lives. It serves as a reminder of why strict regulations for the technology need to be in place, a need echoed in a recent Washington Post report.
Amid rising cases of fake depictions of women in compromising positions or sexual acts, the report noted that AI is fuelling the generation of pornographic material. The surge can be attributed to the proliferation of cheap tools that can strip people naked in photos. For convincing realism, these tools analyse body types and swap human faces into videos of people engaged in sex.
According to the report, the top 10 websites in the business of AI-aided sexual imagery have witnessed a rise of over 290 per cent since 2018, as per industry analyst Genevieve Oh. Those affected range from celebrities to politicians. Oh found that over 1,43,000 such videos were added in 2023 to the top 40 fake-video platforms, together garnering 4.2 billion views.
A 2019 Sensity AI study found that 96 per cent of deepfake images involved pornography, and over 99 per cent of those targeted women. As the technology continues to evolve and new risks emerge, there is an urgent need for stringent laws to govern its misuse, including AI-generated deepfake porn.
The report cited the eye-opening case of YouTuber Gabi Belle: hundreds of naked images of her appeared on websites known for hosting AI-generated adult content. In another instance, photos of a 14-year-old girl taken from social networks were altered using an AI "nudifier" app, according to the authorities.
