
TikTok May Be Suppressing Videos About The Midterms And Voting, New Research Suggests

Tests conducted by advocacy group Accelerate Change found that including certain election-related words in TikTok videos decreased their distribution by 66%, a finding TikTok denied.


TikTok may be suppressing content that includes neutral political messages focused on getting out the vote, according to new research by advocacy project Accelerate Change.

TikToks in which creators speak certain election keywords, such as “voting,” “midterms,” and “get out the vote,” receive only one-third as many views as otherwise identical videos that do not include the spoken keywords, the research found.

In the study, creators made 18 “paired” videos, identical but for one variable: in one set of videos, the creators spoke aloud (and included in captions) certain election-related keywords, and in the other, they held up signs with the keywords handwritten on them. The videos with written, rather than spoken, terms received an average of three times as many views as their “paired” counterparts, suggesting that TikTok recommends content that includes election-related terms at a lower rate than similar content without such terms.
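The 66% figure follows from that three-to-one ratio. Here is a minimal sketch of the arithmetic, using hypothetical view counts rather than Accelerate Change’s actual data:

```python
# Hypothetical view counts for one "paired" test; illustrative only,
# not Accelerate Change's actual data.
views_written_sign = 30_000  # video with the keyword handwritten on a sign
views_spoken = 10_000        # otherwise identical video with the keyword spoken aloud

# A spoken-keyword video earning one-third the views of its written pair
# corresponds to roughly a 66% drop in distribution.
suppression = 1 - views_spoken / views_written_sign
print(f"Estimated suppression: {suppression:.1%}")  # Estimated suppression: 66.7%
```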

The study comes at a tricky moment for TikTok. Although the company has repeatedly minimized the role it plays in U.S. political discourse, saying the app “isn’t a go-to hub for breaking news” and is “not the go-to place for politics,” numerous investigations have shown that TikTok is a major source of election information — and misinformation — for its users today.

Like other tech companies, TikTok identifies politics-related content on its app and annotates it with links to an “election center” containing educational materials about how to vote. But unlike its competitors, TikTok does not appear to have a voter engagement program that actively encourages people to vote.

TikTok is owned and controlled by a Chinese corporation, ByteDance, and is currently negotiating a contract with the U.S. government based on regulatory concerns that the app could compromise U.S. national security. (Disclosure: I previously held policy positions at Facebook and Spotify.)

After publication of this article, TikTok spokesperson Jamie Favazza questioned the Accelerate Change study, saying “the methodology is unclear and appears to be inconsistent, with creators posting on different days and times, and at least one creator deleting a ‘verbal video’ post after it gained 15K views.”

Favazza said that TikTok does not have a politics or elections classifier that determines which content is about politics or social issues, and that TikTok does not demote or downrank such content. In response to a follow-up question about how the company decides which content should be accompanied by links to its in-app elections center, she said the company determines which content is related to the elections “based on keywords.”

The company also said in an elections blog post that it would not recommend certain “unverified claims, such as a premature declaration of victory before results are confirmed; speculation about a candidate’s health; claims relating to polling stations on election day that have not yet been verified; and more,” but did not say it would downrank nonpartisan get-out-the-vote messages.

Still, Peter Murray, president of Accelerate Change, believes the group’s research shows that downranking is in effect. “Often with an algorithm performance experiment like this, you struggle to see a pattern in the data, but in this case the result was dramatic and clear: TikTok is suppressing more than 65% of voting video views,” he said in a press release.

If TikTok were, in fact, downranking political content, it would not be the first company to have done so. Following the events of January 6, 2021, Facebook announced a test that downranked content about politics and elections in users’ news feeds. The company later said the test was effective and that it would implement the demotion globally.

Still, as influencers and voter education groups turn to TikTok to spread get-out-the-vote messages, TikTok’s alleged suppression of those messages could affect who votes in the midterm elections, Murray said. “TikTok could be a positive force for democracy and voter engagement, but instead they have chosen to intensely suppress nonpartisan voting messages on their platform. Given the scale of their platform, this suppression could significantly depress youth voter turnout throughout the country.”

Correction: This article initially stated that Accelerate Change is supported by the Public Interest Network. The Public Interest Network previously supported Accelerate Change but no longer does so.
