TikTok delivers an endless stream of content, from makeup tutorials to political reporting to entertaining and wholesome trends. But, like with most things, all the good also inevitably comes with some bad. The app has long been criticized for its tendency to push harmful weight-loss content to its audience, especially younger users. And now, TikTok is cracking down on it.
Last week, the app updated its community guidelines to state that, starting in May, it will pare back content involving restrictive low-calorie diets, medication or supplements for weight loss or muscle gain, exercises designed for rapid weight loss, before-and-after transformations, and more.
And while the guideline updates are certainly a step in the right direction, experts warn that they may not go far enough, given what we know about weight-loss content's knack for slipping through the cracks.
This isn’t the first time TikTok has attempted to control weight-loss content on the platform. Back in 2020, the ByteDance-owned app came out with restrictions that were designed to limit users’ exposure to potentially harmful imagery and language, including a ban on “ads for fasting apps and weight loss supplements, and [increased] restrictions on ads that promote a harmful or negative body image,” Tara Wadhwa, TikTok’s safety policy manager, said in a press release.
But while 2020’s restrictions applied specifically to advertisements, these new guidelines are more expansive, taking aim at organic, creator-generated content that gets shown to users under 18 or winds up on the For You feed.
“We know that being exposed to harmful content online that promotes dangerous dieting behaviors can actually lead vulnerable young people to engage in those behaviors,” says Amanda Raffoul, PhD, instructor at Harvard Medical School and researcher with STRIPED (Strategic Training Initiative for the Prevention of Eating Disorders).
But with millions of videos being uploaded to the platform each day, it’s likely that this content will continue to be created and circulated. (Unless, of course, TikTok itself is banned – something that’s looking more and more like it may become a reality.)
After all, restrictions on certain kinds of content haven’t stopped users from posting or finding them before. “By issuing a blanket ban on all of these types of content, there’s still a likely chance that people are going to get creative and find ways to promote diet products or encourage really low-calorie diets and get through whatever filters TikTok might have for that,” Dr. Raffoul says.
Currently, TikTokers say “corn” instead of “porn” and “unalive” instead of “kill,” or type “d1s0rder” in their captions instead of “disorder.” Loopholes like that have always existed on the platform, and creators are likely to find similar workarounds when making and posting weight-loss content.
Of course, by taking steps to restrict this type of content at all, the app is signaling that it knows weight-loss videos are harmful and wants them off the platform, which is at least the right message to send. “I’m hopeful because something needs to be done about the fact that dangerous weight-loss content is being promoted to young people especially,” Dr. Raffoul says.
“But I’m skeptical because it seems like whenever platforms make these types of promises, what they’re doing is trying to restrict user-generated content and not changing the way that the platform itself gets dangerous content to vulnerable users,” she adds.
Dr. Raffoul is referring, of course, to TikTok’s (in)famous algorithm, which tracks what types of content users interact with, then feeds them more of the same. That means that if someone so much as checks out the comments on one video centered on weight loss, they’ll start seeing more of them. And that can be all it takes to lead them down the road to developing body image issues and even disordered eating. That’s not a stretch: more TikTok use is related to greater body dissatisfaction, according to a 2022 study in the journal Body Image.
Instead of restricting users, then, platforms would do a much better job of culling harmful content if they focused on their algorithms.
“What platforms can do is audit and tailor the ways in which algorithms might be pushing more extreme content to vulnerable users,” Dr. Raffoul says. “Things like algorithm risk audits are being embedded in a lot of different social media legislation now because what they do is require an external organization or auditor to take a look at the ways that a TikTok algorithm, for example, might unfairly be pushing out eating disorder content, and hold them accountable for that.”
Ultimately, users can’t rely solely on the guidelines of TikTok, or any other social media platform, to protect them from triggering or harmful content, which has a weed-like way of surviving. Instead, we all have to consume content mindfully: unfollow accounts posting harmful content, report videos that make their way onto your feed anyway, and, perhaps most importantly, aim to get your health advice from a balanced selection of sources – not just your FYP.
Elizabeth Gulino is a freelance journalist who specializes in topics relating to wellness, sex, relationships, work, money, lifestyle, and more. She spent four and a half years at Refinery29 as a senior writer and has worked for House Beautiful, Complex, and The Hollywood Reporter.