
What do we actually know about TikTok’s algorithm?

IMAGE VIA ADDISON RAE/INSTAGRAM
WORDS BY ISABELLE SACKS

A brief history of censorship on the platform.

In case you hadn’t heard, TikTok is a huge deal. The platform crossed 2 billion downloads in May and had about 1.6 million Aussie users as of February, a number which has likely exploded since then due to quarantine-related boredom.

And just like any social media site, a huge part of TikTok’s success has been its algorithm, which seems to be uniquely effective at zeroing in on users’ preferences. Now, in the name of transparency, TikTok is peeling back the curtain in a new blog post breaking down how its ‘For You’ page recommendations work, including some tips on how to make sure the content you’re suggested is more personalised.

Similar to platforms like YouTube, TikTok looks at user interactions, video information (which “might include details like captions, sounds, and hashtags”) and device or account settings, all of which shape the feed. Language preference, country setting and device type also factor in to make sure “the system is optimised for performance”, and each data point is weighted according to how much it tells TikTok about a user’s preferences.
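To make that weighting idea concrete, here’s a rough sketch of how a score like this could be computed. To be clear, the signal names and weights below are our own invention for illustration, not anything TikTok has published:

```python
# Hypothetical weighted scoring. The signal names and weights are
# invented for illustration, not TikTok's published system.

WEIGHTS = {
    "watched_to_end": 5.0,    # strong interest signal
    "liked": 3.0,
    "shared": 2.0,
    "matching_hashtag": 1.5,  # video information: captions, sounds, hashtags
    "same_language": 0.5,     # device/account setting: weak signal
}

def score_video(signals: dict[str, bool]) -> float:
    """Sum the weights of every signal this video triggers for this user."""
    return sum(WEIGHTS[name] for name, on in signals.items() if on)

# A video the user watched to the end outranks one they merely liked.
print(score_video({"watched_to_end": True, "matching_hashtag": True}))  # 6.5
print(score_video({"liked": True, "same_language": True}))              # 3.5
```

The point of the weighting is simply that strong signals of interest, like finishing a video, count for more than weak ones, like happening to share a language setting.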

While TikTok highlighted its commitment to showing users diverse content, even if it may not “appear to be relevant” or have “a huge number of likes”, the platform has been embroiled in scandals over exclusionary practices for much of its existence. With that in mind, here’s a rundown of its record on censorship.

Political speech

Questions over whether TikTok, or its Chinese parent company ByteDance, is censoring political speech intensified after an American teenager’s account was suspended when she made a video criticising the Chinese government’s treatment of Uighur Muslims, a persecuted religious minority in China.

TikTok says it was a human moderation error, but there have been similar reports of TikTok censoring content related to other Chinese political issues, including references to Tiananmen Square, Tibetan independence, the Hong Kong protests, and the treatment of other Chinese minority groups.

Suppressing content from “undesirables”

The Intercept published an explosive piece in March with documents showing that TikTok purposefully suppressed videos by people belonging to groups it deemed “undesirable.” While it’s unclear how widely these exclusionary practices were applied, moderators were specifically instructed to suppress videos by creators deemed fat, poor, unattractive, old, LGBTQ+ or disabled.

When German publication Netzpolitik reported a similar practice in late 2019, TikTok’s justification was that the policy existed to prevent marginalised creators from being subjected to cyberbullying. But even if that had once been the case, the Intercept’s reporting showed that more recent documents explicitly stated the policy was in place to “retain new users and grow the app”.

Racial bias

An experiment by UC Berkeley Artificial Intelligence researcher Marc Faddoul showed how specific TikTok can be when recommending new accounts for users to follow, right down to their race and hair colour, which raised questions about racial bias in its algorithm. TikTok explained that this was actually due to what it calls “collaborative filtering”. It says that “recommendation of accounts to follow is based on user behaviour” and doesn’t have anything to do with a creator’s race or appearance.
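For context, collaborative filtering is a standard recommendation technique: you get suggested accounts that are followed by users whose behaviour looks like yours. Here’s a toy sketch of the idea, purely illustrative since TikTok’s actual implementation isn’t public:

```python
# Toy collaborative filtering: suggest accounts followed by the users
# whose follow lists overlap most with yours. Purely illustrative.

follows = {
    "you":   {"a", "b", "c"},
    "user1": {"a", "b", "d"},  # shares two follows with you
    "user2": {"c", "e"},       # shares one follow with you
    "user3": {"f"},            # shares nothing with you
}

def recommend(target: str) -> list[str]:
    mine = follows[target]
    scores: dict[str, int] = {}
    for user, theirs in follows.items():
        if user == target:
            continue
        overlap = len(mine & theirs)   # similarity = shared follows
        if overlap == 0:
            continue                   # ignore users with nothing in common
        for account in theirs - mine:  # accounts you don't follow yet
            scores[account] = scores.get(account, 0) + overlap
    return sorted(scores, key=scores.get, reverse=True)

print(recommend("you"))  # ['d', 'e'] ('d' first, since user1 is most similar)
```

Notice that race never appears anywhere in the code, which is essentially TikTok’s defence. The bias, as Marc argues below, creeps in through who is already popular.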

But Marc says the algorithm can still be racially biased, regardless of the intent. He told Buzzfeed News, “If the most popular creators on a platform are white, and the app keeps recommending other white creators, it makes it hard for creators of colour to gain followers and popularity – even if that’s not the intention of the algorithm. Then it means it’s easier for a white person to get recommended than someone from an underrepresented minority.” 
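That feedback loop is easy to demonstrate with a hypothetical “rich get richer” simulation, where the algorithm keeps surfacing whoever already has the most followers (all the numbers here are invented):

```python
# Hypothetical "rich get richer" loop: each round the algorithm recommends
# whoever already has the most followers. All numbers are invented.

followers = {"creator_a": 110, "creator_b": 100}  # a small initial gap

for _ in range(10):
    top = max(followers, key=followers.get)  # who the algorithm surfaces
    followers[top] += 80                     # the recommended creator gains most
    for creator in followers:
        followers[creator] += 10             # everyone gains a trickle

print(followers)  # {'creator_a': 1010, 'creator_b': 200}
```

A 10-follower head start turns into an 810-follower gap in ten rounds, without the simulation ever once looking at who the creators are.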

Black Lives Matter

TikTok was also forced to apologise after facing further censorship accusations when videos tagged #BlackLivesMatter and #GeorgeFloyd appeared to have zero views just as the US was erupting into nationwide protests. TikTok said the issue was due to a technical glitch that affected around 200,000 hashtags, but the incident prompted many to speak out about their personal experiences of TikTok censoring BIPOC creators.

Shadowbanning

Shadowbanning is a phenomenon where a creator’s content suddenly, and for no apparent reason, starts receiving dramatically fewer views and less engagement. There are plenty of conspiracy theories out there around shadowbanning, but many have expressed legitimate concerns about TikTok shadowbanning creators from marginalised communities or those discussing controversial topics.

Nevertheless, some have dismissed the idea of a “great conspiracy”, positing instead that creators don’t pay enough attention to the algorithm that runs their lives until it negatively affects them. TikTok says it recommends videos based on people’s interests rather than a creator’s level of celebrity, so if your content isn’t up to scratch, it doesn’t matter how many followers you have.

Perhaps we should also reflect on why our feeds aren’t as diverse as we expect. It may be that what TikTok’s algorithm is showing us is just an honest, if disheartening, reflection of our society’s biases. If we want our “For You” pages to be as diverse as the world we live in, we must make a concerted effort to engage with content made by people who look and think differently to ourselves.

tiktok.com
