Apple removes apps that created nudes without consent with AI

Apple has taken action against apps that promoted the creation of nude images without consent. The company removed three apps from the App Store just days after a report from 404 Media revealed misleading advertisements on Instagram. These ads promoted the apps as tools capable of digitally undressing anyone with the help of artificial intelligence (AI).

Google took similar action, removing comparable apps from the Play Store.

It is worth noting that Apple only removed the apps after 404 Media provided links to them and to their related advertisements. This indicates that Apple had failed to identify apps that violated its own policies.

This case raises concerns about the ability of app stores to proactively detect and remove harmful content.

* The text of this article was partially generated by artificial intelligence tools: state-of-the-art language models that assist in drafting, reviewing, translating, and summarizing texts. Text prompts were created by Curto News, and responses from AI tools were used to improve the final content.
It is important to highlight that AI tools are just that, tools, and final responsibility for the published content lies with Curto News. By using these tools responsibly and ethically, our objective is to expand communication possibilities and democratize access to quality information.