AI “nudifier” apps suggest parents should rethink sharing kids’ photos online

Parents are increasingly avoiding posting photos of their children on social media due to AI-powered “nudifier” apps that can generate fake nude images from any photograph. The shift comes as these deepfake tools have become widely accessible and cheap, with some offering free trials and costing as little as 8 cents per fake image.

The big picture: AI nudifier apps have transformed the landscape of online child safety, making it possible for anyone to create convincing fake nude images with minimal technical skill or cost.

  • These apps generate roughly $36 million annually in revenue and are being widely used in schools, where students create fake nudes of classmates.
  • Unlike traditional photo manipulation, which required advanced skills, AI nudifiers require only that users upload a photo and pay a small fee.
  • One examined site charged $49 monthly for 600 credits, allowing users to create both fake nudes and pornographic animations.

Why this matters: The ease of creating deepfake nudes has made “sharenting” significantly riskier than just a few years ago, prompting parents to reconsider their social media habits.

  • Identity theft involving minors surged 40 percent from 2021 to 2024, with roughly 1.1 million children having their identities stolen annually.
  • Even seemingly innocent posts like birthday parties can expose sensitive information that hackers can use for identity theft.
  • Private social media accounts offer limited protection since perpetrators of child sexual abuse usually know their victims.

Legal landscape: New federal legislation addresses the distribution, but not the creation, of AI-generated fake nudes.

  • President Trump signed the Take It Down Act, making it a federal crime to post nonconsensual nude imagery and AI-generated fakes.
  • The law requires social media sites to remove offending images but doesn’t prohibit businesses from offering nudifier apps.
  • Enforcement remains challenging because many app creators operate overseas.

What companies are doing: Tech giants are taking action against nudifier apps through various measures.

  • Meta filed a lawsuit in Hong Kong against a developer of AI nudifier apps that circumvented the company’s ad detection technologies.
  • Social media companies like Snap, TikTok and Meta prohibit advertising of nudifiers on their platforms.
  • Meta shares information about offending apps with the Tech Coalition’s Lantern Program, which includes Google and Microsoft.

What experts are saying: Child safety advocates emphasize the widespread nature of the problem in educational settings.

  • “The teachers and the school administrators I talk to will say it happens all the time in our schools, where kids create fake nudes,” said Josh Golin, executive director of Fairplay for Kids.
  • “It’s everywhere,” said Alexios Mantzarlis, founder of tech publication Indicator, which investigated 85 nudifier websites. “Any kid with access to the internet can both be a victim or a perpetrator.”

Safer alternatives: Parents who want to share photos have lower-risk options available.

  • Sending photos through encrypted text messages to close friends and relatives.
  • Using private photo-sharing services like Apple’s iCloud and Google Photos with small, trusted groups.

Despite these risks, only a quarter of parents currently avoid sharing photos of their children online due to privacy concerns.