How Does TikTok Amplify Voices and Suppress Others? 

TikTok Algorithm Analysis

Erika Fox

Adapted for online publishing: 5/22/2023, Originally written: 12/8/2020

TikTok has taken the world by storm in a way that no social media app has done before. Downloaded over two billion times, it is the fastest-growing app and has become a critical centerpiece of youth culture globally. The importance of TikTok to culture might be attributed to: one, the trends on the platform that turn into Gen Z inside jokes; two, the app’s capacity for random virality, such that nearly anyone could earn a massive platform for their content; three, the way the app allows and encourages young people to band together and participate in social activism; or all of the above. However, the same recommendation algorithm that has positively impacted so many lives also has the potential to severely harm others.

Here, I will analyze the main function on the TikTok app, the #ForYou page, to seek answers to the overarching question: How does TikTok, the popular app known for its capacity for random virality and hosting a Gen Z cultural scene, also have the potential to make a negative impact on society? More specifically, how does the #ForYou page algorithm amplify some voices, and consequently, suppress others? Is the spread of discriminatory information on the app the fault of the algorithm itself or problematic user input? These questions will serve as a guide as we try to uncover explanations as to how TikTok may or may not introduce more hate into our lives.

Transparency, the extent to which an algorithm’s designer allows its inner workings to be known to the public, is the first aspect of TikTok’s #ForYou page that I want to discuss. While most major tech companies claim that they keep their algorithms tightly under wraps, or “black-boxed”, to protect their products from being mimicked, for some the secrecy may actually hide controversial decisions the company has made in its design. For example, Amazon’s search algorithm has drawn heavy scrutiny from critics who suspect it problematically favors Amazon’s own products in the results. By withholding concrete information about how the search algorithm chooses which products to show first, Amazon makes it very difficult to confirm (or deny) these speculations. TikTok’s parent company, ByteDance, has similarly done nearly nothing to explain how the #ForYou page algorithm actually works. Despite the app’s 2017 release, the company waited until June 2020 to publish a blog post offering some insight into the functionality of the #ForYou page. The publication came in response to a series of accusations the company was facing over its untrustworthy privacy policy. While the post was meant as a gesture of increased transparency, intended to rebuild some of the user trust that seemed to be fleeting, the algorithm explanation was general at best, and no code was offered. In other words, the post served only to provide an illusion of transparency.

ByteDance’s reluctance to be open about its algorithm’s functionality could be a tell that the algorithm makes controversial decisions, whether those were coded in intentionally or emerged as a side effect of a well-intentioned model. Most of what we know for sure about the #ForYou page comes from analyzing its inputs and outputs, with only a small portion of that information confirmed by ByteDance’s “transparency” post. That blog post explains that the algorithm uses each user’s app data to provide a personalized experience, which makes it a recommendation algorithm not unlike the one Netflix uses to suggest content to its subscribers. However, Netflix has complete control over the content that lives on its platform, whereas TikTok’s catalog consists of videos that could be posted at any time by anyone. Additionally, where Netflix relies on a binary thumbs-up/thumbs-down metric to recommend or withhold similar content, TikTok has a multitude of signals to consider. TikTok’s explanation post lists inputs the #ForYou page might take into account, including watch time (did the user watch the video in its entirety?), engagement (did the user like, comment on, or share the video?), followings (did their friends engage with the video?), trends (does the user seem to be following a trend?), and metadata (does the user seem to be keeping up with a hashtag?). However, while this hints at the basics of what the #ForYou algorithm does when it determines which new videos to show users, the list does not fully answer how the algorithm chooses which voices to amplify. Without information such as how much weight each input carries in generating the feed, or any details about what qualifying engagement looks like, we still have reason to believe that TikTok’s algorithm allows hateful content to reach massive audiences.
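To make that gap concrete: the blog post names the signals, but not how they combine. Below is a minimal sketch of how such a score *could* be computed, assuming a simple linear model and purely hypothetical weights (ByteDance has published neither the model form nor any weights, so every number here is an illustrative assumption, not the real algorithm):

```python
from dataclasses import dataclass

@dataclass
class Interaction:
    """One user's response to one video, using the signals TikTok's post names."""
    watch_fraction: float   # 0.0-1.0: how much of the video was watched
    liked: bool
    commented: bool
    shared: bool
    friend_engaged: bool    # followings: did accounts the user follows engage?
    matches_trend: bool     # does the video fit a trend the user follows?
    matches_hashtag: bool   # metadata: hashtag overlap with the user's history

# Hypothetical weights -- invented for illustration only.
WEIGHTS = {
    "watch": 0.40,
    "engagement": 0.25,
    "followings": 0.15,
    "trends": 0.10,
    "metadata": 0.10,
}

def score(ix: Interaction) -> float:
    """Combine the publicly listed signals into a single ranking score."""
    engagement = (ix.liked + ix.commented + ix.shared) / 3
    return (WEIGHTS["watch"] * ix.watch_fraction
            + WEIGHTS["engagement"] * engagement
            + WEIGHTS["followings"] * float(ix.friend_engaged)
            + WEIGHTS["trends"] * float(ix.matches_trend)
            + WEIGHTS["metadata"] * float(ix.matches_hashtag))
```

Even this toy version shows why the weights matter: change the numbers in `WEIGHTS` and the same inputs produce a very different feed, which is exactly the detail ByteDance withholds.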

From what we have discussed about the functionality of TikTok’s #ForYou page algorithm so far, there are a few imaginable ways in which discriminatory content could garner a platform on the app. For example, if watch time is the input that gives content the biggest boost, then it would not matter whether users actually liked a video, as long as it caught their attention for longer than the average post. This plays into an idea many scholars have explored regarding the tendency of social media users to focus their attention on extreme and potentially misleading content, including Ian Bigly and James Leonhardt in their work “Extremity Bias in User-Generated Content Creation and Consumption in Social Media”. Users of social media apps cannot help but shift their gaze to polarizing content, as it is surprising and unfamiliar. If the engineers behind TikTok’s #ForYou page algorithm did not account for this phenomenon in their design, that could be a fatal flaw that lets the app be destructive. On the other hand, a study that also noticed TikTok’s extreme content scene, Serrano et al.’s “Dancing to the Partisan Beat”, conducted an in-depth analysis to try to pinpoint a flaw in the #ForYou page’s design more concretely. Their analysis revealed that hashtags reflecting extreme viewpoints, such as #trump and #bernie2020, attracted view counts comparable to tags for popular stars such as #shawnmendes and #billieeilish. This finding reintroduces the suspicion that TikTok has something to hide in its algorithm design and knowingly favors extreme content. After all, if those kinds of videos hold user attention longer, that benefits the business.
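The watch-time scenario above is easy to demonstrate numerically. Under the assumption (again hypothetical, not a published fact) that watch time carries most of the ranking weight, a polarizing video that is watched to the end but never liked outranks a pleasant video that is liked but abandoned halfway:

```python
# Hypothetical weights in which watch time dominates -- an assumption
# chosen to illustrate the extremity-bias argument, not TikTok's real values.
W_WATCH, W_LIKE = 0.8, 0.2

def rank_score(watch_fraction: float, liked: bool) -> float:
    """Toy two-signal ranking score: mostly watch time, a little liking."""
    return W_WATCH * watch_fraction + W_LIKE * float(liked)

# A polarizing clip watched in full, but never liked:
extreme = rank_score(watch_fraction=1.0, liked=False)   # 0.8
# A likeable clip the user liked, but skipped halfway through:
pleasant = rank_score(watch_fraction=0.5, liked=True)   # 0.6
```

Under these weights the disliked-but-gripping video wins, which is precisely how an attention-maximizing design could amplify extreme content without anyone explicitly coding a preference for it.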

Although there appear to be multiple reasons why extremist content might thrive on TikTok, it is important to consider the impact the #ForYou page’s feed actually has on its users in the first place. As TikTok has skyrocketed in popularity, particularly during a politically charged year such as 2020, it has drawn attention for changing the way young people get involved. Just as dance trends spread on the app faster than wildfire, TikTok’s 800-million-plus active users engage strongly with short-form video content when it carries political messages as well. Take a famous example: Trump’s Tulsa rally, scheduled for late June 2020. Teens took to TikTok to voice their distaste for the rally, both out of opposition to President Trump’s ideology and out of pandemic-related concerns. Like any other big trend on the platform, the message quickly caught the attention of young users. Eventually this energy translated into a concrete plan to sabotage the event: thousands of them purchased tickets to the rally with no intention of attending, preventing actual Trump fans from getting in. Instances like this reflect the true power TikTok has over its users. While the activism it inspired around the Tulsa rally can be considered a positive influence of TikTok, the same tendency the app has for viral messages could have the opposite effect. To our knowledge, there is nothing stopping hateful activism from spreading in a similar way on the platform, short of millions of users reporting the videos to halt their growth. Instead, TikTok sports an ominous combination of hundreds of millions of impressionable users and a tendency for viral extreme content that influences their behavior. While the app has not yet been used to start a movement fueled by hate, events such as the Tulsa rally prove that the potential is there.

At the end of the day, what keeps users on the platform is their desire to stay in the loop on the community’s culture and trends, together with the addictive nature of the app’s endless-scrolling interface. Ninety percent of the app’s users open it every single day. To skip a day on TikTok is to risk missing a movement like the meddling with Trump’s Tulsa rally, or to fall out of the loop on the latest joke or dance trend. What’s more, the endless stream of video content the #ForYou page offers makes it difficult for users to leave once they are on the app. While the majority of the app’s users seem mostly oblivious to the shadiness behind how ByteDance works to keep them on the platform, we cannot say that no efforts have been made to resist the power TikTok has over them. As hinted at earlier in our discussion of transparency, ByteDance has faced a high degree of backlash over its data collection practices. In other words, the app is deeply involved in what is known as surveillance capitalism, a term coined by Shoshana Zuboff to describe the profitable commodification of personal data by technology companies. This involvement sparked the first initiatives by TikTokers to seek alternatives to the app that has been so important to them, such as the rival app “Triller” or Instagram’s copycat feature “Reels”. The app’s role in surveillance capitalism has even led it to be banned in several countries, including India, which in turn led the United States to consider following suit. Although the US still hasn’t made any significant moves in that direction, the prospect of a TikTok ban has inspired significant protest from fans of the app, who argue that TikTok’s ability to band its users together and inspire creativity outweighs any negative associations the platform may have.

The reaction TikTokers had to the idea of losing their hub for creativity, activism, and joy proves just how much the app has become a pillar of youth culture. By encouraging many to share without fear with a limitless audience, TikTok has brought people together in an unprecedented way, during a time, the pandemic, when they needed community more than ever. The societal weight of TikTok only makes it more crucial that its parent company, ByteDance, properly takes on its responsibility to keep the app’s influence positive. Because the #ForYou page’s algorithm remains mostly black-boxed, it is impossible to confirm whether the company takes that responsibility seriously, but as we have seen many times before, its lack of transparency does not bode well. From what our research methods have revealed over the course of this case study, there are several ways the app could impose doom on society. First, the algorithm could have significant gaps in its design that allow the tendency of many users to gaze at extreme content to spread those messages at an accelerated rate. Second, an analysis of popular hashtags on the app suggests that the #ForYou page’s developers are fully aware of their algorithm’s tendency to promote extreme content but allow it for the sake of business. Third, the app inspires high levels of engagement across its audience, which could mean the platform has the capacity to spark energized acts of hate down the road. Despite all of these threats, the reign of TikTok shows no sign of diminishing. However, this research should encourage its users to hold their beloved platform accountable in order to prevent it from shadowing our communities with hate in the future.