A new investigation claims TikTok’s algorithm promotes pornography and sexualised videos to minors. Researchers created fake child accounts, activated safety settings, and still received sexualised search prompts. These led to clips simulating masturbation and, in the worst cases, explicit pornography. TikTok says it acted immediately once informed and stresses its commitment to age-appropriate use.
Child accounts reveal disturbing results
In July and August, researchers from Global Witness set up four TikTok profiles, posing as 13-year-olds by entering false birth dates; the app asked for no further proof of age. The investigators enabled TikTok’s “restricted mode”, a feature the platform advertises as protection against mature or sexual themes. Despite this, the accounts received sexualised search suggestions under “you may like”, which directed users to videos of underwear flashing, breast exposure and masturbation. At the most extreme, investigators uncovered explicit pornography embedded in innocent-looking clips.
Global Witness raises alarm
Ava Lee from Global Witness called the findings a “huge shock”. She said TikTok not only fails to protect children but actively suggests harmful material. Global Witness usually investigates how large tech platforms affect democracy, climate issues and human rights. The organisation first noticed this problem during separate research in April.
TikTok insists on protections
Researchers reported their findings earlier this year. TikTok said it removed the content and corrected the issue. But when Global Witness repeated the test weeks later, the same problem reappeared. TikTok says it has more than 50 features designed to safeguard teenagers. It claims nine out of ten violating clips are deleted before being viewed. After the latest report, TikTok stated it improved search suggestions and removed further harmful material.
New regulations heighten responsibility
On 25 July, the Children’s Codes of the Online Safety Act came into force. These rules require platforms to introduce robust age checks and to stop minors from accessing pornography. Algorithms must also filter content linked to self-harm, suicide or eating disorders. Global Witness repeated its research after the codes took effect. Ava Lee urged regulators to step in, insisting that children’s online safety must be enforced.
Users react with confusion
During the study, researchers observed responses from TikTok users. Some questioned why their search suggestions became sexual. One wrote: “can someone explain to me what is up with my search recs pls?” Another asked: “what’s wrong with this app?”
