YouTube’s ‘Dislike’ Button Doesn’t Do What You Think

Users try to control the video platform’s algorithm by giving content a thumbs-down, but Mozilla researchers say it’s not so simple.

YouTube creators often implore their viewers to ‘smash that Like button,’ believing its feedback to be vital to their future success on the algorithm-driven platform. But a new study from the Mozilla Foundation suggests that users who hit the Dislike button on videos to weed out content they don’t want to see are wasting their time.

The study drew on data from 22,722 users who had installed Mozilla’s RegretsReporter browser extension and were tracked between December 2021 and June 2022. Researchers analyzed more than half a billion YouTube recommendations made after users clicked one of YouTube’s negative feedback tools, such as the Dislike or Don’t Recommend Channel buttons. “These are the tools YouTube offers for people to control their recommendations, but how does that actually impact your recommended videos?” asks Becca Ricks, senior researcher at Mozilla, pointing to YouTube’s own support site on how to “manage your recommendations and search results.”

Different inputs had different effects on the likelihood that similar content would be recommended in the future. Pressing Don’t Recommend Channel prevented only 43 percent of unwanted video recommendations, according to Mozilla, while the Dislike button stopped just 12 percent. “What we found was that YouTube’s control mechanisms do not really seem to be adequate for preventing unwanted recommendations,” says Ricks.
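
Those percentages are best read as a relative reduction in unwanted recommendations. The sketch below shows one way such a figure could be computed; comparing users who pressed a feedback button against a baseline group is an assumption made for illustration, not a description of Mozilla’s exact methodology, and the numbers are invented.

```python
# Rough sketch of how a "prevented unwanted recommendations" figure could be
# derived: compare how often similar videos keep appearing for users who
# pressed a feedback button against a baseline group that did not.
# Hypothetical function and illustrative numbers only.

def prevention_rate(unwanted_per_1000_baseline: float,
                    unwanted_per_1000_after_feedback: float) -> float:
    """Fraction of unwanted recommendations avoided relative to the baseline."""
    reduction = unwanted_per_1000_baseline - unwanted_per_1000_after_feedback
    return reduction / unwanted_per_1000_baseline

# A drop from 100 to 57 similar recommendations per 1,000 would correspond to
# the roughly 43 percent figure; a drop from 100 to 88 to the 12 percent figure.
print(prevention_rate(100, 57))  # 0.43
print(prevention_rate(100, 88))  # 0.12
```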

Mozilla’s investigation was prompted by YouTube’s increasingly public comments about its recommendation system in recent years. “They’ve been talking a lot about metrics like time well spent or user satisfaction as opposed to watch time,” says Ricks. “We were really curious to what degree some of those signals were being picked up by the algorithm, especially because in the previous YouTube report we worked on, we had heard from people that they didn’t feel like they were in control, or they didn’t really feel like taking actions on unwanted videos really translated well to the recommender system.”

For instance, one user in the Mozilla study responded negatively to a Tucker Carlson clip posted by Fox News on February 13. One month later, they were recommended another clip of Carlson’s TV show, again posted by Fox News’s official YouTube channel. A different user expressed a negative response to a video showing webcams focused on Ukraine’s conflict zones in late February; within a month, they were shown another video, this time from the WarShock YouTube channel, detailing how dead Russian soldiers are removed from Ukraine. Ricks has no qualms about the content of the videos, saying they don’t breach YouTube’s guidelines. “But if you as a user say you don’t want to see it, it’s kind of shocking that it continues to be recommended,” she says.

“I’m not really surprised,” says Guillaume Chaslot, a former YouTube employee and founder of AlgoTransparency, a site that tracks YouTube’s recommendation algorithm. “I feel, big picture, you should be able to choose and specify to the algorithm what you want, and YouTube absolutely doesn’t let you do that,” he adds.

YouTube says its systems are working as they’re meant to. “Mozilla’s report doesn’t take into account how our systems actually work, and therefore it’s difficult for us to glean many insights,” says YouTube spokesperson Elena Hernandez, who added that viewers are given control over their recommendations. This includes “the ability to block a video or channel from being recommended to them in the future.”

Mozilla and YouTube appear to interpret the success of these feedback tools differently, and the disagreement centers on similarity of topics, individuals, or content. YouTube says that asking its algorithm not to recommend a video or a channel stops it from recommending only that particular video or channel; it does not affect a user’s access to a specific topic, opinion, or speaker. “Our controls do not filter out entire topics or viewpoints, as this could have negative effects for viewers, like creating echo chambers,” says Hernandez.

Jesse McCrosky, a data scientist working with Mozilla on the study, says that distinction isn’t entirely clear from YouTube’s public statements and published research about its recommender systems. “We have some small glimpses into the black box,” he says, which show that YouTube broadly considers two types of feedback: on the positive side, engagement, such as how long users watch YouTube and how many videos they watch; and explicit feedback, including dislikes. “They have some balance, the degree to which they’re respecting those two types of feedback,” says McCrosky. “What we’ve seen in this study is that the weight toward engagement is quite exhaustive, and other sorts of feedback are quite minimally respected.”
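
McCrosky’s description of that balance can be pictured as a weighted score. The snippet below is a toy sketch, not YouTube’s actual system: the function, features, and weights are all hypothetical, and it exists only to show how a heavy engagement weight can swamp an explicit-feedback signal such as a dislike.

```python
# Toy illustration (not YouTube's actual ranking code): a candidate-ranking
# score that blends predicted engagement with explicit feedback like dislikes.
# All names and weights are hypothetical, chosen to show how a large
# engagement weight can drown out a small explicit-feedback penalty.

def rank_score(predicted_watch_minutes: float,
               predicted_click_prob: float,
               user_disliked_similar: bool,
               w_engagement: float = 1.0,
               w_explicit: float = 0.05) -> float:
    """Higher scores mean the video is more likely to be recommended."""
    engagement = predicted_click_prob * predicted_watch_minutes
    explicit_penalty = 1.0 if user_disliked_similar else 0.0
    return w_engagement * engagement - w_explicit * explicit_penalty

# A video similar to one the user disliked can still outrank alternatives
# if its predicted engagement is high enough and w_explicit is small.
print(rank_score(12.0, 0.4, user_disliked_similar=True))   # 4.75
print(rank_score(3.0, 0.2, user_disliked_similar=False))   # 0.6
```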

The distinction between what YouTube says about its algorithms and what Mozilla’s researchers report is an important one, says Robyn Caplan, senior researcher at Data & Society, a New York nonprofit that has previously investigated YouTube’s algorithm. “Some of these findings don’t contradict what the platform is saying, but demonstrate that users do not have a good understanding of what features are there so they can control their experiences, versus what features are there to give feedback to content creators,” she says. Caplan welcomes the study and its findings, saying that while Mozilla’s intended slam-dunk revelation may be more muted than the researchers had hoped, it nevertheless highlights an important problem: Users are confused about the control they have over their YouTube recommendations. “This research does speak to the broader need to survey users regularly on features of the site,” Caplan says. “If these feedback mechanisms aren’t working as intended, it may drive folks off.”

Confusion over what these inputs are meant to do is a key theme of the second part of Mozilla’s study: a follow-up qualitative survey of around one-tenth of the users who had installed the RegretsReporter extension and participated in the research. Those Mozilla spoke to said they appreciated that the inputs were aimed at specific videos and channels, but expected them to inform YouTube’s recommendation algorithm more broadly.

“I thought that was an interesting theme because it reveals that this is people saying: ‘This is not just me telling you I blocked this channel. This is me trying to exert more control over the other kinds of recommendations I’m going to get in the future,’” says Ricks. Mozilla recommends in its research that YouTube allow users more options to proactively shape their own experiences by outlining their content preferences—and that the company do a better job of explaining how its recommendation systems work.

For McCrosky, the key issue is the gap between what users believe they are telling YouTube through these feedback controls and what the controls actually do. “There’s a disconnect in the degree to which they’re respecting those signals,” he says.