AI-generated images of children on social media platforms like TikTok and Instagram are drawing attention from individuals with a sexual interest in minors. The images depict young girls in suggestive poses and outfits, and they attract disturbing comments from older men. While the content is technically legal because the children it depicts are fake, it raises concerns about the sexualization of minors and the potential dangers that sexualization poses.
Child predators have long targeted social media platforms like TikTok and Instagram, which are popular with teens and younger children. The rise of AI text-to-image generators has made it easier for predators to find or create inappropriate content. While AI-generated child sexual abuse material (CSAM) is illegal, the images in question fall into a gray area: they are sexualized but not explicit, and the minors they depict are not real. Experts warn that such images can serve as gateways to darker, potentially criminal activity.
Tech companies are required by law to report suspected CSAM and child sexual exploitation on their platforms to the National Center for Missing and Exploited Children (NCMEC). They are not required to flag or remove images like those described here, but NCMEC believes social media companies should take them down to prevent the sexualization and exploitation of children. Popular accounts posting this AI-generated content have been removed from TikTok and Instagram after Forbes brought them to the platforms’ attention.
The AI-generated images of children have attracted a troubling audience, many of them older men. Comments on the posts range from suggestive to openly disturbing. Experts warn that these accounts can lead viewers to more severe or illegal content on other platforms and pose a clear safety risk to children. Whether the images are AI-generated or real, they can act as signposts, advertising links to CSAM on other channels.
TikTok’s powerful recommendation algorithm accelerates the spread of these images, making it easier for individuals with a sexual interest in children to find and share similar content. As these AI-generated children rack up viral moments, there is concern that society may become desensitized to the dangers of sexualizing minors. How to handle the proliferation of AI-generated content that depicts minors in sexualized ways remains a pressing question for tech companies, law enforcement, and child safety advocates.