
WSJ study reveals youth exposure to conflict-related content on TikTok

In a shocking experiment carried out by The Wall Street Journal, automated accounts posing as 13-year-old TikTok users were bombarded with controversial and often extremist content about the conflict between Israel and Gaza.


The study reveals the powerful influence of TikTok's algorithm, which builds a highly customized feed based on a user's interactions.

The Wall Street Journal created a number of bot accounts registered as 13-year-olds to test TikTok's content-curation capabilities. The bots did nothing more than pause to watch TikTok videos about the conflict between Israel and Gaza, and they were soon flooded with related content. The algorithm served videos that were frequently polarized toward either Israeli or Palestinian viewpoints, many of which stoked fears and depicted graphic scenes.


Within a few hours, the bots were served deeply divisive content, with many videos endorsing extreme views. They were presented with a flood of alarmist videos, including some forecasting apocalyptic scenarios. The majority of the videos backed the Palestinian viewpoint, with a number showing children suffering, protests, and descriptions of death.

TikTok's response and the company's policies

TikTok said the experiment does not reflect the actual experience of teenagers, since real users interact with the app in different ways, such as sharing, liking, and searching for videos. The company also emphasized its efforts to remove millions of videos containing harmful content.

The study raises serious questions about the effects of TikTok's algorithm on young users, specifically how it can pull them down a rabbit hole of content. Exposure to such intense and polarized content at an early age could shape how they perceive complex global issues and affect their mental health.

TikTok offers family-control features that let parents filter content, but the study suggests they are not enough. The findings may also draw the attention of regulators, given growing concerns about the effects of social media on young minds.
