If they aren’t careful—or potentially even if they are—TikTok’s youngest users could find themselves bombarded with dangerous video content within minutes of joining the platform, a new study claims.
The report, released Wednesday evening by the nonprofit Center for Countering Digital Hate (CCDH), aims to decode TikTok’s algorithm using accounts that posed as 13-year-old girls, among the platform’s most at-risk users. TikTok served harmful content to the “standard” fake accounts, which were created as controls, every 39 seconds. The group also created “vulnerable” fake accounts that indicated a desire to lose weight; CCDH says this group was exposed to harmful content every 27 seconds.
Researchers set up one standard and one vulnerable account in each of four English-speaking countries: the United States, the U.K., Canada, and Australia. They scrolled each account’s For You feed, which serves an inexhaustible supply of videos the moment a user opens the app. According to TikTok, For You is “central to the TikTok experience” because it’s “where most of our users spend their time.” Every time a video relating to body image, mental health, eating disorders, or self-harm played, CCDH’s team liked it, then paused for 10 seconds; everything else was skipped.
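That protocol amounts to a simple interaction loop. As an illustrative sketch only (CCDH’s actual tooling isn’t public, and `feed`, `classify_topic`, `like`, and `skip` here are hypothetical stand-ins for whatever the researchers used), the procedure looks roughly like this:

```python
import time

# Topics the CCDH study treated as potentially harmful, per the report.
HARMFUL_TOPICS = {"body image", "mental health", "eating disorders", "self-harm"}

def run_session(feed, classify_topic, like, skip):
    """Replay the study's interaction protocol on a For You feed.

    `feed` yields videos; `classify_topic`, `like`, and `skip` are
    hypothetical hooks, not real TikTok or CCDH APIs.
    """
    for video in feed:
        if classify_topic(video) in HARMFUL_TOPICS:
            like(video)      # signal engagement to the recommender
            time.sleep(10)   # pause 10 seconds, per the study design
        else:
            skip(video)      # skip everything else immediately
```

The point of liking and lingering only on flagged videos is to let the recommender infer interest from engagement, so the feed’s drift toward harmful content can be timed.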
TikTok’s algorithm has made the app immensely popular and its owner, ByteDance, extremely valuable (currently about $300 billion). But speaking to the press, CCDH chief executive Imran Ahmed called TikTok’s recommendations “the social media equivalent of razor blades in candy—a beautiful package, but absolutely lethal content presented within minutes to users.”
— source fastcompany.com | Clint Rainey | 12-15-22