Meta sues operator of AI ‘nudify’ apps over policy violations
Meta is taking legal and technical action against a Hong Kong-based company behind AI apps that generate fake nudes, describing the practice as "abuse" of its platforms and users, Kazinform News Agency correspondent reports.

According to Meta, it has filed a lawsuit against Joy Timeline HK Limited, the operator of the CrushAI apps, which allow users to create AI-generated nude images of people without their consent, a form of deepfake. The legal action, filed in Hong Kong, seeks to stop the company from promoting these apps on Meta's platforms after repeated violations of its advertising rules.
“This legal action underscores both the seriousness with which we take this abuse and our commitment to doing all we can to protect our community from it. We’ll continue to take the necessary steps – which could include legal action – against those who abuse our platforms like this,” Meta said in a post.
In the same post, the company also outlined the steps it is taking to curb the broader reach of these apps across the internet, including sharing data such as violating URLs with other tech firms so they can investigate and remove similar content. More than 3,800 URLs have been shared with partner platforms since March.
According to Meta, it has also taken steps to keep pace with the evolving tactics of these actors, who often use misleading imagery or quickly shift to new domains to evade detection. To counter this, Meta has developed new tools that detect such deceptive ads even when they contain no nudity, and it uses matching technology to swiftly identify and block duplicates.
Earlier, it was reported that Meta Platforms is in talks to invest over $10 billion in Scale AI, potentially marking its largest external investment in artificial intelligence to date.