The Spokesman-Review Newspaper
Spokane, Washington  Est. May 19, 1883

YouTube’s dislike button rarely shifts recommendations, researchers say

A sign is displayed outside YouTube headquarters in San Bruno, Calif., in October 2019.  (New York Times)
By Nico Grant, New York Times

For YouTube viewers dissatisfied with the videos the platform has recommended to them, pressing the “dislike” button may not make a big difference, according to a new research report.

YouTube has said users have numerous ways to indicate that they disapprove of content and do not want to watch similar videos. But all of those controls are relatively ineffective, researchers at the Mozilla Foundation said in a report published Tuesday. The result was that users continued receiving unwanted recommendations on YouTube, the world’s largest video site.

Researchers found that YouTube’s “dislike” button reduced similar, unwanted recommendations by only 12%, according to their report, titled “Does This Button Work?” Pressing “Don’t recommend channel” was 43% effective in reducing unwanted recommendations, pressing “not interested” was 11% effective and removing a video from one’s watch history was 29% effective.

The researchers analyzed more than 567 million YouTube video recommendations with the help of 22,700 participants. They used a tool, RegretReporter, that Mozilla developed to study YouTube’s recommendation algorithm. It collected data on participants’ experiences on the platform. But the participants were not representative of all YouTube users because they voluntarily downloaded the tool.

Jesse McCrosky, one of the researchers who conducted the study, said YouTube should be more transparent and give users more influence over what they see.

“Maybe we should actually respect human autonomy and dignity here, and listen to what people are telling us, instead of just stuffing down their throat whatever we think they’re going to eat,” McCrosky said.

YouTube defended its recommendation system. “Our controls do not filter out entire topics or viewpoints, as this could have negative effects for viewers, like creating echo chambers,” Elena Hernandez, a spokesperson for YouTube, said in a statement. “Mozilla’s report doesn’t take into account how our systems actually work, and therefore it’s difficult for us to glean many insights.”

YouTube also said its own surveys have shown that users were generally satisfied with the recommendations they saw, and that the platform deliberately avoids blocking recommendations of all content related to a topic, opinion or speaker. The company also said it was looking to collaborate with more academic researchers under its researcher program.

One research participant asked YouTube on Jan. 17 not to recommend content like a video about a cow trembling in pain, which included an image of a discolored hoof. On March 15, the user received a recommendation for a video titled “There Was Pressure Building in This Hoof,” which again included a graphic image of the end of a cow’s leg. Other examples of unwanted recommendations included videos of guns, violence from the war in Ukraine and Tucker Carlson’s show on Fox News.

The researchers also detailed an episode of a YouTube user expressing disapproval of a video called “A Grandma Ate Cookie Dough for Lunch Every Week. This Is What Happened to Her Bones.” For the next three months, the user continued seeing recommendations for similar videos about what happened to people’s stomachs, livers and kidneys after they consumed various items.

“Eventually, it always comes back,” one user said.

Ever since it developed a recommendation system, YouTube has shown each user a personalized version of the platform that surfaces videos its algorithms determine viewers want to see based on past viewing behavior and other variables. The site has been scrutinized for sending people down rabbit holes of misinformation and political extremism.

In July 2021, Mozilla published research that found that YouTube had recommended 71% of the videos that participants had said featured misinformation, hate speech and other unsavory content.

YouTube has said its recommendation system relies on numerous “signals” and is constantly evolving, so providing transparency about how it works is not as easy as “listing a formula.”

“A number of signals build on each other to help inform our system about what you find satisfying: clicks, watch time, survey responses, sharing, likes and dislikes,” Cristos Goodrow, a vice president of engineering at YouTube, wrote in a corporate blog post last September.

This article originally appeared in The New York Times.