- By Vikas Yadav
- Sun, 28 Apr 2024 06:41 PM (IST)
- Source: JND
Amid the rise of AI-powered deepfake images, Apple is cracking down on apps that can generate nonconsensual nude images, according to 404 Media. Three apps in this category have been removed from the App Store. The move comes days after the outlet reported that such AI apps were being advertised on Instagram.
Earlier this week, 404 Media published a report highlighting how developers were using Meta's photo-sharing app to promote free apps that could undress people in images. A few of these ads redirected users to the App Store to download an app billed as an "art generator", according to 9To5Mac.
After the report was published, Apple reached out to inquire about the apps it highlighted. Once 404 Media shared the relevant ad and App Store URLs, the tech giant removed the apps. This suggests more such apps could exist in the wild and that Apple may have a tough time locating them, even though they violate its policy terms. The removed apps had been listed on the platform since 2022.
Some of these apps also offered face-swapping services. One ad included a photo of Kim Kardashian alongside text promoting the app's ability to remove a person's clothing. Google has also reportedly removed similar apps from the Play Store. These AI nude generators pose a serious privacy threat to affected individuals and can lead to harassment and blackmail.
Meanwhile, an AI feature in Huawei's latest Pura 70 series smartphones also made news for inadvertently erasing a person's clothes in photos. The 'smart AI retouching' feature lets users remove objects from an image with a few taps. Huawei has reportedly acknowledged the issue, as the feature may expose body parts without consent.
