The Surprising Truth About The TikTok Algorithm

Our TikTok For You Page knows us better than we know ourselves; it knows how we take our morning coffee, which fandoms we belong to, and the trauma our therapists have yet to uncover. Depending on your interests and engagement, you may be on Straight Tok — a treasure trove of Addison Rae lookalikes, dance challenges, and unusual couple content. Alt Tok, meanwhile, is home to Vines, premium edits, and clips of super cool people you just want to be, and you're grateful when those videos show up on your FYP. Aesthetic coffee pours and smart pets are interspersed between it all.

Basically, you can tell a lot more about a person from their TikTok algorithm than from meeting them in person. (It's a very real concern for a generation that lives online.) However, the TikTok algorithm goes beyond harmless puppy content: it plays on your political beliefs and your mental health, and it promotes content you may not even want to see. Read on to learn more about the darker side of the algorithm.

TikTok's algorithm can put you in a filter bubble

Last year, TikTok published a blog post titled "How TikTok recommends videos #ForYou." The post broke down the algorithm's signals: how we engage with content, whether that's commenting on or liking a video, or checking out certain hashtags. Surprisingly, one of the strongest indicators of interest is watching a video from start to finish, which means you could watch something out of curiosity rather than genuine interest, and the algorithm will still file it under content you want more of. The blog also said that TikTok diversifies the content it shows you, which it claims "gives you additional opportunities to stumble upon new content categories, discover new creators, and experience new perspectives."
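
To see why a finished video can outweigh a like, here is a minimal sketch in Python of how an engagement-scoring step might work. The signal names and weights are illustrative assumptions, not TikTok's actual values; the only sourced detail is that watching to completion is a stronger indicator than lighter signals.

    # Illustrative sketch only: the signal names and weights below are
    # assumptions, not TikTok's real model. The one sourced detail is that
    # watching a video to the end outweighs lighter signals like a like.
    SIGNAL_WEIGHTS = {
        "watched_to_completion": 5.0,  # strongest indicator, per TikTok's post
        "commented": 2.5,
        "liked": 2.0,
        "followed_hashtag": 1.5,
    }

    def interest_score(events):
        """Sum the weights of every signal the viewer actually fired."""
        return sum(weight for signal, weight in SIGNAL_WEIGHTS.items()
                   if events.get(signal))

    # A curiosity watch, with no like or comment, still outranks an
    # actively liked and commented-on clip that was abandoned halfway:
    print(interest_score({"watched_to_completion": True}))     # 5.0
    print(interest_score({"liked": True, "commented": True}))  # 4.5

Under this kind of scoring, a video you hate-watched to the end can beat one you deliberately liked, which is exactly how a curiosity click becomes a category you supposedly want more of.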

However, an investigation by The Wall Street Journal revealed that TikTok's algorithm mostly feeds your confirmation bias. The publication created bot accounts with assigned interests — interests expressed only by engaging with content on that topic — and TikTok's FYP soon showed them little else. While this may not be terrible for people interested in recipes, the experiment showed how easily a user can fall down a rabbit hole of harmful content, whether that's depressive videos or videos with racist themes. This is similar to Facebook's filter bubble, where people see only what they already agree with. Fortunately, becoming aware of the problem is the first step to understanding these apps.
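
The rabbit hole the Journal observed is, at bottom, a feedback loop: the more you engage with a topic, the more of it you are served, which gives you more chances to engage. The toy Python simulation below (hypothetical topics and numbers, not the Journal's methodology) shows how quickly an engagement-weighted feed can collapse around a single interest.

    import random

    # Toy filter-bubble loop; the topics and numbers are made up and are
    # not The Wall Street Journal's methodology. The bot "finishes" only
    # sad videos, and the feed weights each topic by past engagement.
    topics = ["recipes", "pets", "dance", "sad"]
    engagement = {topic: 1.0 for topic in topics}  # start with a uniform feed

    random.seed(42)
    for _ in range(500):
        # Topics with more past engagement are served more often.
        shown = random.choices(topics, weights=[engagement[t] for t in topics])[0]
        if shown == "sad":            # the bot's single assigned interest
            engagement[shown] += 1.0  # a full watch boosts that topic's weight

    share = engagement["sad"] / sum(engagement.values())
    print(f"Feed weight now concentrated on 'sad' content: {share:.0%}")

After a few hundred videos, nearly all of the feed's weight sits on the one topic the bot ever finished, a rich-get-richer dynamic that the promised diversification would need to actively counter.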