A human rights group accuses TikTok of recommending pornography and sexualised clips to minors. Researchers created fake child accounts, switched on safety settings, and still received explicit search prompts that led to videos showing simulated masturbation and pornographic sex scenes. TikTok insists it acted quickly after being alerted and says it remains committed to keeping young users safe online.
Fake child profiles uncover harmful content
In July and August, Global Witness researchers set up four TikTok accounts, posing as 13-year-olds by entering false birth dates; the platform did not request any further identification. Investigators then enabled TikTok’s “restricted mode”, a feature the company promotes as protection against sexual or mature material. Despite that, the accounts received sexualised search suggestions under “you may like”. These linked to videos of women exposing breasts, flashing underwear and simulating masturbation. At the most extreme, explicit pornography appeared hidden inside harmless-looking clips.
Global Witness calls for action
Ava Lee from Global Witness described the discovery as a “huge shock”. She argued that TikTok not only fails to block harmful content but actively recommends it to children. Global Witness usually examines how technology influences climate change, human rights and democracy; the group first came across TikTok’s explicit material during unrelated research in April.
TikTok defends moderation system
Researchers flagged the findings to TikTok earlier this year. The company said it removed the material and introduced fixes. But when Global Witness repeated the experiment in late July, the sexualised videos had returned. TikTok says it offers more than 50 safety tools for minors and claims nine out of ten violating clips are deleted before anyone views them. After the report, TikTok announced it had upgraded its search suggestions and removed further harmful videos.
Children’s Codes add legal pressure
On 25 July, the Children’s Codes within the Online Safety Act came into effect. The rules demand stronger age checks and stricter protection of minors from pornography, and algorithms must also block content linked to self-harm, suicide and eating disorders. Global Witness carried out its second test after the rules took effect. Ava Lee urged regulators to step in, saying child safety online requires firm enforcement.
Users question search feeds
During the research, investigators also tracked reactions from TikTok users. Some expressed confusion about sexualised recommendations. One asked: “can someone explain to me what is up with my search recs pls?” Another wrote: “what’s wrong with this app?”
