TikTok offered details about how its most popular feed works. Experts seem unimpressed.

TikTok recently published a blog post about its popular “For You” page, but detailed information about the company’s algorithms was noticeably absent. | Nasir Kachroo/NurPhoto via Getty Images

When a company “reveals” its algorithm, pay attention to what it doesn’t share.

Faced with longstanding questions and speculation about how its popular “For You” feed works, TikTok released some details in a blog post last week. The move comes amid a broader effort by TikTok to appear more transparent, especially as the company faces accusations of both political and aesthetic censorship and growing pressure from lawmakers in the United States over its relationship with its Beijing-based parent company, ByteDance.

But TikTok’s post told a familiar story about how feed algorithms work. According to the announcement, a variety of factors based on how users engage with the app influence what ends up in their For You feed; some of those factors matter more than others; and all of them feed a secret technical sauce the platform uses to predict what users will want to see. This workflow is not so different from the way other social media platforms, like Facebook, describe their feeds.

Companies like TikTok generally want to maximize engagement and time spent on their platform, which helps them sell advertisements. Exactly how they do that is meant to be a bit of a mystery. Among other things, not revealing everything that happens behind the scenes helps companies dictate users’ understanding of certain feeds and their conception of algorithms in general. So while we now know a little bit more about how TikTok’s algorithm works, what TikTok chooses not to share is perhaps more important.

“The more that they can keep close to the vest, the more ambiguous it is to understand,” said Kelley Cotter, a doctoral student at Michigan State University who studies public awareness of algorithms. “Without the heat on them, they’re kind of able to design and redesign as they see fit without being sort of weighed down by either regulation, lawsuits, [and] users.”

Of course, TikTok is by no means unique in choosing not to reveal the exact formula behind its algorithm. But its not-so-big reveal about the For You algorithm serves as a reminder that when platforms say they’re telling us more about how their algorithms work, they’re often not telling us that much. And as long as they don’t reveal exactly how these algorithms impact users — and different types of users — people take it upon themselves to figure out how social feed algorithms work. Some may draw conclusions those platforms don’t necessarily agree with.

TikTok sounds a lot like other platforms describing how their feeds work

In its recent blog post, TikTok explained that everyone’s For You page is unique, and users provide the app with implicit and explicit information that informs what videos they might see in the future.

Some factors, like whether you watch longer videos to the end, matter more than others, such as whether a viewer and a creator are based in the same country. For users who are just starting on the app, their initial preferences — their stated interests, their response to a general video feed, and so forth — help inform what the For You feed looks like. As TikTok collects more information from that user and users like them, what shows up in the For You feed continues to be adjusted based on what the algorithm thinks they’d be interested in. Predictive models are also involved, similar to many other recommendation systems.
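To make that weighting concrete, here is a minimal sketch in Python of how engagement signals might be combined into a single ranking score. The signal names and weights are entirely hypothetical, chosen only to illustrate TikTok’s stated idea that strong signals (like watching a video to the end) outweigh weak ones (like sharing a country with the creator); TikTok has not published its actual features or their weights.

    # Hypothetical signal weights -- TikTok has not disclosed its real ones.
    WEIGHTS = {
        "watched_to_end": 5.0,           # strong signal of interest
        "liked": 2.0,
        "shared": 3.0,
        "same_country_as_creator": 0.1,  # weak signal, per TikTok's post
    }

    def score(signals: dict[str, float]) -> float:
        """Combine one video's engagement signals into a single ranking score."""
        return sum(WEIGHTS[name] * value for name, value in signals.items())

    # Rank two candidate videos for a user's feed.
    candidates = {
        "video_a": {"watched_to_end": 1.0, "liked": 1.0},
        "video_b": {"shared": 1.0, "same_country_as_creator": 1.0},
    }
    ranked = sorted(candidates, key=lambda v: score(candidates[v]), reverse=True)
    print(ranked)  # ['video_a', 'video_b'] -- watching to the end dominates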

“When you decide to follow new accounts, for example, that action will help refine your recommendations too, as will exploring hashtags, sounds, effects, and trending topics on the Discover tab,” TikTok said in its blog post.

The company says that a user’s feedback, like tapping “not interested,” can also play a role in what appears in the For You feed. TikTok may also prevent certain types of content, such as “graphic medical procedures or legal consumption of regulated goods,” from ending up there. Acknowledging that filter bubbles can be a problem, TikTok says it also tries to diversify users’ feeds. Notably, the company said that the follower count of the user posting a video, as well as the view counts of that user’s previous videos, doesn’t directly influence whether their videos end up in the For You feed. The company did admit that “a video is likely to receive more views if posted by an account that has more followers.”
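As a rough illustration of those two ideas, the sketch below downweights candidate videos whose topic a user has marked “not interested” and re-ranks the feed so the same topic doesn’t appear twice in a row. Every name, value, and mechanism here is an assumption for illustration; TikTok has not described how it actually implements either feedback handling or diversification.

    # Hypothetical mechanisms: negative feedback and feed diversification.
    def apply_not_interested(scores: dict[str, float], topic_of: dict[str, str],
                             muted_topics: set[str],
                             penalty: float = 0.2) -> dict[str, float]:
        """Downweight candidates whose topic the user marked 'not interested'."""
        return {video: s * penalty if topic_of[video] in muted_topics else s
                for video, s in scores.items()}

    def diversify(ranked: list[str], topic_of: dict[str, str]) -> list[str]:
        """Greedily re-rank so the same topic doesn't appear twice in a row."""
        feed, pool = [], list(ranked)
        while pool:
            # Prefer the best-scored video whose topic differs from the last
            # one shown; fall back to the top of the pool if none differs.
            pick = next((v for v in pool
                         if not feed or topic_of[v] != topic_of[feed[-1]]),
                        pool[0])
            pool.remove(pick)
            feed.append(pick)
        return feed

    # Example: dance videos dominate raw scores, but feedback and
    # diversification reshape the final feed.
    scores = {"dance1": 9.0, "dance2": 8.5, "cooking1": 7.0, "pets1": 6.0}
    topic_of = {"dance1": "dance", "dance2": "dance",
                "cooking1": "cooking", "pets1": "pets"}
    adjusted = apply_not_interested(scores, topic_of, muted_topics={"cooking"})
    ranked = sorted(adjusted, key=adjusted.get, reverse=True)
    print(diversify(ranked, topic_of))  # ['dance1', 'pets1', 'dance2', 'cooking1']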

But how revealing is what TikTok shared? Not very, explained some researchers.

Marc Faddoul, a research scientist at the University of California Berkeley, said in an email that what TikTok discussed was, “from a research perspective, useless.” Faddoul added that “everything they say is obvious” and flagged as unsurprising “the fact that clicking on ‘not interested’ will decrease the prominence of similar content [and] listing all the types of user interactions which ‘might’ be used to personalize the recommendations.”

The details in TikTok’s recent post also largely match up with how companies have previously characterized the algorithms that drive their own feeds. Consider how Facebook describes the factors that play into its News Feed: the company acknowledges that there are “thousands of signals that may be considered for News Feed Ranking” and that “some things have a smaller influence over what you see.” This seems similar to what we know about Instagram’s algorithm as well.

Cotter, from Michigan State, also noted that a lot in TikTok’s post “very closely resembles what we’ve seen from companies in the past.” She pointed to her previous research on a series of Facebook blog posts meant to describe the News Feed. Cotter and her colleagues found that the company was more focused on explaining why the feed works as it does, rather than exactly how.

Nicolas Kayser-Bril, a reporter at AlgorithmWatch, a nonprofit that focuses on algorithmic decision-making, came to a similar conclusion.

“The language reminds me of what Facebook and Google write in their ‘Why am I seeing this ad?’ sections: we learn that ‘a number of factors’ are at play,” Kayser-Bril said in an email. “Absent an independent audit of TikTok’s algorithm, there is no reason to believe that we know more about it after reading the statement (we certainly know more about how the company would like its algorithm to be seen, but this is a wholly different thing).”

TikTok would not say how many total factors can influence the feed. The blog post simply says that “recommendations are based on a number of factors” and provides different categories of those factors.

So what kinds of information would be helpful in actually understanding TikTok and its For You algorithm? Christo Wilson, a computer science professor at Northeastern, argues that the nuances of how the software understands content are most revealing.

“You’d have to get a lot more of the nitty-gritty details of what kind of features or variables are going into recommendations,” Wilson said. “They look at hashtags. That’s, like, the most obvious thing. But are they looking at things like sentiment?”

TikTok has an image problem

TikTok’s blog post didn’t come out of nowhere. Users have long been engrossed in figuring out how to get videos into the For You feed. Some TikTok influencers have even turned to making videos sharing their tips on reaching the For You feed, and some have attempted to run their own pseudo-experiments to see what gets people onto the page. Overall, the inner workings of the For You algorithm are a significant source of interest on the app itself: videos with the #TikTokAlgorithm hashtag have garnered more than 130 million views.

All that is great for TikTok, which wins when people spend more time on its app trying to figure out how its algorithm works in an attempt to go viral. As Vox’s Rebecca Jennings wrote earlier this year:

Its algorithm serves trending content to a wide audience, so even accounts with a handful of followers can go hugely viral within the span of a few hours. Followers are racked up far more quickly than on other platforms, so having tens of thousands of them is relatively standard for anyone who’s had even a minor hit.

Investigative reporting revealed other factors that might have previously been at play within the feed — factors that went unaddressed in TikTok’s recent blog post.

Earlier this year, the Intercept obtained internal policy documents that encouraged content moderators to limit videos appearing in the “For You” feed that were deemed “undesirable,” including those featuring people with an “abnormal body shape” and “ugly facial looks.” TikTok also reportedly reached out to some high-profile users of its app to update them about changing rules, and the company censored political speech on its livestreaming feature.

TikTok content moderators based in the US similarly told the Washington Post that they were instructed by TikTok staff based in Beijing to censor political speech as well as content deemed “vulgar.” These reports echoed content moderation documents acquired by the Guardian in 2019 as well as earlier reporting from the German site Netzpolitik that TikTok had discriminated against people with disabilities, as well as LGBTQ and fat people.

“The guidelines referenced were a misguided attempt to reduce cyberbullying, drafted for use in limited countries, and they had long been out of use by the time that article was published,” a TikTok spokesperson told Recode. “Today, we take a nuanced approach to moderation, including building out a global team with deep industry experience and working with an external content advisory council of subject matter experts.”

TikTok announced earlier this month that it would stop using content moderators based in China. This was around the same time that the company faced accusations of political and racist censorship in the midst of anti-police brutality protests. (As of Monday, it had job postings for content moderation staff in cities including São Paulo, Brazil, and Seoul, South Korea.)

In May, TikTok users participated in a campaign intended to highlight the work of black users and raise awareness of censorship on the platform. TikTok released a statement in June, acknowledging the campaign and committing to invest in “moderation strategies to better handle potentially violative content,” while pledging “to make sure [its policies and practices] do not inadvertently limit exposure for creators based on who they are.”

TikTok also has a transparency problem, but it’s not alone

TikTok has also been working to quell criticism that its platform isn’t transparent. The company released its first transparency report in 2019, detailing government requests for data and content removal. Notably, China was not on the list. TikTok also announced this year that it would launch a transparency center in Los Angeles focused on “moderation and data practices.” The company’s recent blog post about the For You algorithm said that experts visiting that center will eventually be able to learn more about how its algorithms operate and review the company’s source code.

But its recent blog post doesn’t add up to complete transparency. AlgorithmWatch’s Kayser-Bril said “such statements are just a declaration of intent” and that the transparency standard ought to be an independent audit, in which researchers are given access to production servers and databases. Kayser-Bril noted that the company’s mention of providing access to the source code was “a step in the right direction” but still not sufficient.

Notably, TikTok would not say who those researchers are or will be, or if they’d be provided data beyond the company’s source code.

Faddoul, the Berkeley researcher, suggested that TikTok should provide estimates of how much impact factors like view count have in the For You algorithm, as well as the explicit guidelines moderators follow, including the weight that moderator decisions carry in the algorithm.

Faddoul added that TikTok itself should publicly disclose details like those in the internal content moderation documents flagged by the Intercept. While TikTok does lay out its community guidelines, the company doesn’t appear to release a content moderation enforcement report with concrete figures on how much content is taken down and for what reasons (Facebook does this in its community standards enforcement reports).

While TikTok has acknowledged how the For You feed generally works, the platform doesn’t discuss the impact the feed has actually had on users. Black creators have raised concerns about censorship, for instance, and questions about the true influence of content moderators, as well as the total number of factors that can actually influence what shows up in the For You feed, remain unanswered.

Ultimately, Cotter said, to really understand how an algorithm works, you need empirical analysis — a study of what the system is doing and how that actually affects users. This type of research is sometimes conducted by platforms themselves and not released to the public. But even if TikTok were doing empirical analyses, it’s not clear whether the company would act on them. Earlier this year, the Wall Street Journal reported that Facebook failed to act on internal research finding that the site was making people more divided and polarized.

Social media companies aren’t necessarily compelled to answer to outside researchers either. A recent audit of the Instagram algorithm reportedly found that pictures that featured more bare skin were more likely to be boosted on the platform. Facebook didn’t answer the researchers’ questions, which were sent about a month before the study was published by AlgorithmWatch. A day after publication, Facebook criticized the study as flawed and explained that the platform surfaces “posts based on interests, timeliness of posts, and other factors to help people discover content.”

So what’s the point of TikTok, Facebook, or any social media giant publicly announcing some vague details about how its algorithms work?

“They’re PR memos,” said Cotter. “They’re trying to present these systems in the way they want them to be presented and establish the rubric for how they should be evaluated.”

Open Sourced is made possible by Omidyar Network. All Open Sourced content is editorially independent and produced by our journalists.



Author: Rebecca Heilweil
