source link: https://www.theverge.com/2021/12/18/22843606/tiktok-wsj-algorithm-change-eating-disorder

WSJ’s deep dive into eating disorder rabbit holes on TikTok explains a sudden policy change

TikTok’s changes come just days after the WSJ says it asked for a statement about its upcoming story

By Emma Roth Dec 18, 2021, 12:09pm EST

Illustration by Alex Castro / The Verge

A troubling report from the Wall Street Journal digs into the personal experiences of young girls who were sent down rabbit holes of extreme weight loss challenges, purging techniques, and deadly diets on TikTok, contributing to the development of eating disorders or making existing ones worse. The WSJ also ran its own experiment to see how TikTok’s algorithm can promote this kind of harmful content, and its findings may explain TikTok’s sudden decision to alter the way its video recommendation system operates.

As detailed in the report, the WSJ created over 100 accounts “that browsed the app with little human intervention,” 12 of which were bots registered as 13-year-olds that spent time on videos about weight loss, alcohol, and gambling. A chart included in the report shows that as soon as one of the bots abruptly stopped watching gambling-related videos and began spending time on weight loss videos instead, TikTok’s algorithm adjusted accordingly, quickly increasing the number of weight loss videos the bot saw to match the shift in behavior.
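
TikTok hasn’t published how its recommendation system works, but the feedback loop the WSJ observed can be illustrated with a toy model. The sketch below is purely hypothetical: the topic labels, learning rate, and decay constant are assumptions for illustration, not TikTok’s actual algorithm.

```python
import random

# Hypothetical interest-weight model illustrating the feedback loop the WSJ
# observed: lingering on a topic boosts that topic's weight, so the feed
# quickly skews toward whatever the account watches longest.
TOPICS = ["weight_loss", "gambling", "alcohol", "other"]  # assumed labels


class ToyRecommender:
    def __init__(self, learning_rate=0.3, decay=0.05):
        self.weights = {t: 1.0 for t in TOPICS}  # start from a uniform prior
        self.learning_rate = learning_rate  # assumed constants, not TikTok's
        self.decay = decay

    def pick_topic(self):
        # Sample the next video's topic proportionally to current weights.
        return random.choices(TOPICS, weights=[self.weights[t] for t in TOPICS])[0]

    def observe(self, topic, watch_fraction):
        # Reinforce the topic the viewer lingered on; slowly decay the rest.
        self.weights[topic] += self.learning_rate * watch_fraction
        for t in TOPICS:
            if t != topic:
                self.weights[t] = max(0.01, self.weights[t] * (1 - self.decay))


if __name__ == "__main__":
    rec = ToyRecommender()
    # Simulate a bot that abruptly switches from gambling to weight loss videos.
    for step in range(200):
        topic = rec.pick_topic()
        lingering_on = "gambling" if step < 100 else "weight_loss"
        rec.observe(topic, watch_fraction=1.0 if topic == lingering_on else 0.1)
    print(rec.weights)  # weight_loss dominates soon after the switch
```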

By the end of the experiment, the WSJ found that of the 255,000 videos the bots watched in total, 32,700 contained a description or metadata that matched a list of hundreds of keywords pertaining to weight loss, 11,615 had text descriptions that matched keywords relevant to eating disorders, and 4,402 had a combination of keywords that indicated the normalization of eating disorders.
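
The WSJ hasn’t released its keyword lists or analysis code, but the tallies it describes amount to matching each video’s description and metadata against keyword sets. A rough sketch of that kind of count, using invented placeholder keywords and field names, might look like this:

```python
# Rough sketch of the keyword tally described above. The keyword sets and
# video fields are invented placeholders, not the WSJ's actual lists.
WEIGHT_LOSS_KEYWORDS = {"weightloss", "caloriedeficit"}
EATING_DISORDER_KEYWORDS = {"thinspo", "mealpurge"}


def tally(videos):
    weight_loss = eating_disorder = 0
    for video in videos:
        # Fold the description and metadata into one searchable text blob.
        blob = " ".join([video.get("description", ""),
                         video.get("metadata", "")]).lower()
        if any(k in blob for k in WEIGHT_LOSS_KEYWORDS):
            weight_loss += 1
        if any(k in blob for k in EATING_DISORDER_KEYWORDS):
            eating_disorder += 1
    return weight_loss, eating_disorder


videos = [
    {"description": "my calorie deficit routine #caloriedeficit", "metadata": ""},
    {"description": "cooking pasta tonight", "metadata": ""},
]
print(tally(videos))  # -> (1, 0)
```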

A number of these videos reportedly used alternate spellings of eating disorder-related keywords to avoid getting flagged by TikTok. After the WSJ alerted the platform to a sample of 2,960 eating disorder-related videos, 1,778 were removed; the WSJ says it’s unclear whether they were taken down by TikTok or by the creators themselves.
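
The report doesn’t say which alternate spellings it found, but a common countermeasure on the moderation side is to normalize look-alike characters before matching keywords. A minimal sketch, with an assumed substitution table:

```python
# Minimal sketch of normalizing common character substitutions before keyword
# matching; the substitution table is an assumption for illustration only.
LOOKALIKES = str.maketrans({"1": "i", "3": "e", "0": "o", "@": "a", "$": "s"})


def normalize(text: str) -> str:
    # Lowercase, then map look-alike characters back to plain letters.
    return text.lower().translate(LOOKALIKES)


print(normalize("Th1nsp0"))  # -> "thinspo"
```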

Just one day before the WSJ’s report dropped, TikTok announced that it’s working on new ways to stop these dangerous rabbit holes from forming. This change also occurred just days after the WSJ says it contacted TikTok for a statement about its upcoming story, so it’s possible that TikTok preemptively rolled out the update before the report was published.

In its post, TikTok acknowledges that it isn’t always healthy to view certain kinds of content over and over again, including videos related to extreme dieting and fitness. It’s now working on a way to recognize whether its recommendation system is unintentionally serving up videos that may not break TikTok’s policies, but could be harmful if consumed excessively. The platform also says it’s testing a tool that will let users stop videos containing certain words or hashtags from showing up on their For You page.

“While this experiment does not reflect the experience most people have on TikTok, even one person having that experience is one too many,” TikTok spokesperson Jamie Favazza said in a statement to The Verge. “We allow educational or recovery-oriented content because we understand it can help people see there’s hope, but content that promotes, normalizes, or glorifies disordered eating is prohibited.” The spokesperson also noted that TikTok provides access to the National Eating Disorder Association Hotline within the app.

If this situation seems familiar, it’s because Instagram has already been through it (and is still dealing with it). After whistleblower Frances Haugen leaked the Facebook Papers, a collection of revealing internal Facebook documents, Instagram worked quickly to patch the holes in its sinking ship.

The papers showed that Facebook had done its own research on Instagram’s impact on teens and found that the app could wreak havoc on their mental health, making young girls’ body image worse in particular. About a month later, Instagram announced plans to introduce a feature that would “nudge” teens away from viewing potentially harmful content. It also rolled out a “take a break” feature that prompts users to close the app once they’ve spent a set amount of time on the platform: 10, 20, or 30 minutes.

