Bad Recommendations

Activists say YouTube's algorithm spreads climate disinformation.

[Chart: top 100 related videos for a YouTube search on “global warming”]

YouTube is a great place to learn about new ideas — including some that have been thoroughly discredited.

What’s new: YouTube’s recommendation algorithm is helping spread misinformation about climate change, according to research by Avaaz, a self-funded activist group.

What they found: The researchers aimed to learn which videos YouTube was likely to feature in its “Up next” recommendations for videos returned by three searches: “climate change,” “global warming,” and the more skeptical phrase “climate manipulation.” Working between August and December, they entered the search terms into a YouTube service that lists related videos. Then they used a data visualization tool to find the 100 most likely recommendations.

  • The researchers watched the videos and flagged as “misinformation” those that contradict scientific consensus according to the Intergovernmental Panel on Climate Change, U.S. government agencies, and peer-reviewed research.
  • For videos returned by searches on “climate change” and “global warming,” the shares of recommendations containing misinformation were 8 and 16 percent respectively. For videos returned by a search on “climate manipulation,” the figure was 21 percent.
  • Ads from organizations like the World Wildlife Fund, as well as major advertisers like L’Oréal and Warner Bros., often accompany videos that contradict scientific findings.
  • The report’s proposals include giving advertisers the ability to stop their ads from running alongside misleading videos, limiting algorithmic recommendation of such videos, and making YouTube’s internal data on recommendations available to independent researchers.
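The tallying step described above can be sketched in a few lines. This is a hypothetical illustration, not Avaaz’s code, and the label lists below are synthetic data constructed to mirror the reported rates:

```python
# Hypothetical sketch of the final tallying step: for each search term,
# the top recommended videos were hand-labeled as misinformation (1) or
# not (0); the metric is the share of flagged recommendations.
# The label lists are synthetic, built to mirror the reported figures.

def misinfo_share(labels):
    """Percent of recommendations flagged as misinformation."""
    if not labels:
        return 0.0
    return 100.0 * sum(labels) / len(labels)

# Synthetic top-100 labels per search term (not the real dataset).
flags = {
    "climate change": [1] * 8 + [0] * 92,
    "global warming": [1] * 16 + [0] * 84,
    "climate manipulation": [1] * 21 + [0] * 79,
}

for term, labels in flags.items():
    print(f"{term}: {misinfo_share(labels):.0f}%")
```

With 100 recommendations per term, the raw flag counts and the percentages coincide, which is why the report can quote the shares directly.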

The response: YouTube defended its recommendation software and questioned the study’s methodology. It pointed out that it displays a link to Wikipedia’s “Global Warming” page under many climate-related videos.

Behind the news: In June, YouTube overhauled its algorithms to give users more control over recommendations. Those changes reportedly cut the time viewers spent watching borderline, misleading content by 70 percent. The move followed earlier efforts to block videos espousing miracle cures or conspiracy theories.

Why it matters: YouTube’s recommendations are a potent force for spreading information (and misinformation). They were credited with driving around 70 percent of the site’s viewing time in 2018.

We’re thinking: It’s great to see YouTube and other companies working to reduce misinformation. But the AI community’s work is far from done. We need incentive mechanisms that reward not just view counts but the distribution of factual information and rational perspectives, to the extent those can be determined fairly.
