
Apple pulls AI image apps from the App Store after learning they could generate nude images

Apple is cracking down on a category of AI image generation apps that “advertised the ability to create nonconsensual nude images.” According to a new report from 404 Media, the company has removed multiple such apps from the App Store.


On Monday, the site published a report exploring how companies were using Instagram advertising to promote apps that could “undress any girl for free.” Some of these Instagram ads took users directly to Apple’s App Store, where one such app was described as an “art generator.”

Today’s report says that Apple did not initially respond to 404 Media’s request for comment on Monday. The company did, however, reach out directly after the initial story was published to ask for more information. When provided with direct links to the specific ads and App Store pages, Apple proceeded to remove those apps from the App Store.

The report explains:

Apple has removed a number of AI image generation apps from the App Store after 404 Media found these apps advertised the ability to create nonconsensual nude images, a sign that app store operators are starting to take more action against these types of apps. 

Overall, Apple removed three apps from the App Store, but only after we provided the company with links to the specific apps and their related ads, indicating the company was not able to find the apps that violated its policy itself. 

404 Media’s Emanuel Maiberg

As the report implies, this will likely be a cat-and-mouse game going forward, considering Apple was unable to find the apps without 404 Media sending it direct links.

Follow Chance: Threads, Twitter, Instagram, and Mastodon

