Several apps in the App Store could generate nude photos from ordinary photos of people. Apple has since removed these apps from the App Store.
This AI generates nude photos – but Apple intervenes
Several apps could 'undress' people in photos at the touch of a button. The striking thing is that these apps were available in the App Store at all. It is not known why Apple has kicked them out only now.
On Instagram you could clearly see what the apps were used for, but that purpose was probably not mentioned in the apps' descriptions. That is likely why the apps were allowed into the App Store in the first place.
We were no longer able to try the apps ourselves, but there are still plenty of websites that use this kind of AI to generate nude photos. It quickly became clear to us that these websites are well visited: we ended up in a six-hour queue when we tried to feed the AI an image of Shrek. For various reasons, we didn't wait for the end result…
These AIs are a lucrative business in any case. According to research by Graphika, 34 websites offering this type of AI model attracted more than 24 million visitors in September alone. Associated Telegram groups now have more than a million members.
The websites use a kind of freemium model: you can 'undress' a few photos for free, but after that you have to pay. Prices range from $2 per photo to $300 for access to an API with "other features."
There have already been several cases this year in which these nude photo AIs were used for other objectionable purposes. In Spain, for example, young people created AI nude photos of their fellow students. And recently, a 41-year-old child psychiatrist in the United States was arrested for using AI to create nude images of children.
Do you always want to stay up to date with the latest Apple news? Then sign up for our newsletter. Also download the free iPhoned app and keep an eye on our website. Then you'll never miss any Apple news again!