The Popular Video Platform Allegedly Directs Child Accounts to Pornographic Content Within a Few Clicks

According to a new study, the widely used social media app has been observed directing minors' profiles to pornographic content after only a few taps.

Research Methodology

Global Witness set up test accounts using the birthdate of a 13-year-old and enabled the "restricted mode" setting, which is designed to reduce exposure to inappropriate content.

Researchers discovered that TikTok recommended inappropriate and adult-themed search terms to multiple test profiles that were set up on clean phones with no previous activity.

Alarming Recommendation Features

Search phrases suggested under the "recommended for you" feature included "extremely revealing clothing" and "inappropriate female imagery", later progressing to keywords such as "hardcore pawn [sic] clips".

For three of the accounts, the sexualized searches were recommended immediately.

Rapid Access to Explicit Content

After minimal interaction, the study team encountered explicit material, including depictions of penetrative sex.

The research group said the content appeared designed to evade moderation filters, usually by embedding the explicit material within a harmless-looking image or video.

For one profile, the process took just two clicks after logging on: one on the search bar and a second on the suggested query.

Regulatory Context

The research entity, whose remit includes investigating digital platforms' effect on public safety, stated it carried out multiple testing phases.

Initial tests occurred before child protection rules under the UK's Online Safety Act came into force on July 25th, and a second set after the rules took effect.

Alarming Results

Investigators noted that multiple clips featured someone who appeared to be under 16 years old; these were reported to the online safety group that oversees harmful material involving minors.

Global Witness claimed that the video platform was in violation of the digital protection law, which obligates social media firms to block children from viewing inappropriate videos such as explicit content.

Government Position

An official representative for Ofcom, which is charged with overseeing the law, stated: "We appreciate the effort behind this investigation and will analyze its results."

Ofcom's codes for complying with the legislation specify that digital platforms carrying a significant risk of presenting inappropriate videos must "configure their algorithms to remove dangerous material from young users' timelines."

The platform's rules ban explicit material.

Company Reaction

The social media company stated that upon receiving information from the research group, it had removed the offending videos and made changes to its search recommendations.

"Immediately after notification" of these allegations, we took immediate action to look into the matter, remove content that contravened our rules, and introduce upgrades to our search suggestion feature," stated a official speaker.

Kyle Cooper

Tech strategist and writer passionate about AI advancements and digital solutions.