YouTube algorithm boosts videos that users regret watching – research


Research by Mozilla has found that the YouTube algorithm is recommending and boosting videos that many users regret watching due to misinformation or hateful content.

The research was based on crowdsourced data that was gathered over 10 months. Users donated the data using an open-source browser extension called RegretsReporter.


The extension lets users report videos that they’ve regretted watching on YouTube. It asks users more about their “YouTube Regret” and collects information about how they arrived at the video.
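To make the idea concrete, a donated report of this kind could be modelled roughly as below. This is a minimal, hypothetical sketch in TypeScript: the field names and categories are illustrative assumptions, not RegretsReporter’s actual data format.

```typescript
// Hypothetical sketch of a crowdsourced "regret" report.
// Field names and categories are assumptions for illustration,
// not RegretsReporter's actual schema.
interface RegretReport {
  videoUrl: string;                 // the regretted video
  category:
    | "misinformation"
    | "hate speech"
    | "violent or graphic"
    | "spam or scam"
    | "other";
  userComment?: string;             // optional free-text detail about the regret
  arrivedVia: "recommendation" | "search" | "external link" | "other";
  recentlyWatched: string[];        // how the user arrived at the video
  reportedAt: string;               // ISO 8601 timestamp
}

// Example of a single donated report.
const report: RegretReport = {
  videoUrl: "https://www.youtube.com/watch?v=...",
  category: "misinformation",
  arrivedVia: "recommendation",
  recentlyWatched: ["https://www.youtube.com/watch?v=..."],
  reportedAt: new Date().toISOString(),
};
```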

According to Mozilla, users reported a range of videos for content including COVID fear-mongering, political misinformation, and inappropriate children’s cartoons.

Mozilla also found a strong link between video recommendations and regretted videos.

YouTube algorithm findings

The research noted that 71% of all reported videos were actively recommended by the YouTube algorithm.

Meanwhile, the rate of regrettable videos was higher in countries where English is not a primary language: these countries saw a 60% higher rate of regrets than their English-speaking counterparts.
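As a rough illustration of how such proportions could be derived from donated reports, here is a minimal TypeScript sketch assuming the simplified report shape above; it is not Mozilla’s actual analysis pipeline.

```typescript
// Illustrative only: compute the share of reported videos that the user
// arrived at via a recommendation (the kind of proportion behind the 71%
// figure), from simplified, hypothetical report records.
type ArrivalPath = "recommendation" | "search" | "external link" | "other";

function shareRecommended(arrivals: ArrivalPath[]): number {
  if (arrivals.length === 0) return 0;
  const recommended = arrivals.filter((a) => a === "recommendation").length;
  return (recommended / arrivals.length) * 100;
}

// e.g. shareRecommended(["recommendation", "search", "recommendation", "recommendation"])
// returns 75, i.e. 75% of the reported videos were reached via a recommendation.
```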

“YouTube needs to admit their algorithm is designed in a way that harms and misinforms people,” Brandi Geurkink, Mozilla’s Senior Manager of Advocacy, said in a statement.

“We also now know that people in non-English speaking countries are the most likely to bear the brunt of YouTube’s out-of-control recommendation algorithm.”

Other findings from the report included:

  • A recommended video was 40% more likely to be reported as regrettable than a video the user searched for.
  • Around 9% of the regretted videos that YouTube recommended were later taken down for violating community guidelines.
  • Recommended videos were not necessarily related to what the user had been watching: Mozilla found that 43.6% of the videos volunteers reported were unrelated to the videos they had watched previously.
  • Recommended YouTube Regrets received 70% more views than other videos that volunteers watched.

These findings have led Mozilla to accuse YouTube’s algorithm of not only boosting content that its users don’t actually want to see, but also recommending videos that violate its own guidelines.

The most common categories for reports from users were misinformation, violent or graphic content, hate speech, and spam or scams.

Mozilla has called on Google to fix the platform’s algorithm, which has been criticised for helping radicalise users and spread fake news.

“Mozilla doesn’t want to just diagnose YouTube’s recommendation problem—we want to solve it. Common sense transparency laws, better oversight, and consumer pressure can all help rein in this algorithm,” Geurkink said.

Feature image: NordWood Themes/Unsplash

Read more: Instagram algorithm prioritises partial nudity – study

