How Facebook Fills the Feed
Leaked Documents Show How Facebook’s Algorithm Works

Animation showing how the Facebook algorithm awards points to a post

Facebook’s recommendation algorithm is a closely guarded secret. Newly leaked documents shed light on the company’s formula for prioritizing posts in an individual user’s feed.

What happened: The Washington Post analyzed internal documents and interviewed employees to show how the company’s weighting of emojis, videos, subject matter, and other factors has evolved in recent years. The Post’s analysis followed up on an earlier report by The Wall Street Journal.

How it works: Facebook’s recommendation algorithm ranks posts by their likelihood to spur engagement based on more than 10,000 variables. Posts earn points for various attributes, and those with the highest scores float to the top of a user’s news feed. The average post scores a few hundred points, but scores can reach 1 billion or more. Facebook adjusts the algorithm constantly, so the details below, drawn from past documents, may not reflect the current iteration:

  • The algorithm awards points based on several factors: the type of story and its likelihood to spur shares and interactions (health and civic information may count for less as of spring 2020), whether video is included (live videos score higher than prerecorded clips), the number of likes (1 point each), reaction emojis (0 to 2 points each as of September 2020), reshares (5 points each as of January 2018), and text comments and their length (15 to 30 points as of January 2018; single-character comments don’t count). The algorithm also weighs the user’s friend list (comments by strangers count less), groups the user has joined, pages the user has liked, and advertisers that have targeted the user. In addition, it considers the post’s processing burden and the strength of the user’s internet signal. A simplified sketch of how such points might combine appears after this list.
  • To limit the spread of posts the company deems harmful, such as those that include hateful messages or disinformation, the algorithm slashes their scores by 50 to 90 percent. But there’s no upper limit to the number of points a post can accrue, so this penalty has little effect on the ranking of posts with extremely high scores.
  • Until January 6, 2021, Facebook favored posts that include live video over other media types, weighting them up to 600 times more heavily than those with prerecorded videos, photos, or text. The company capped the multiplier at 60 after the attack on the U.S. Capitol.
  • Facebook introduced emoji reactions in 2017, including the angry emoji. The following year, internal research found that posts that elicited high numbers of angry emojis were more likely to include “civic low quality news, civic misinfo, civic toxicity, health misinfo, and health antivax content.” Reducing the angry emoji’s weight limited the spread of such content, and surveys showed that users didn’t like to see it attached to their posts. Recently, the company cut its weight to zero.
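
To make the arithmetic concrete, here is a minimal sketch in Python of how the reported point values might combine into a single score. The function, the handful of signals, and the exact 90 percent demotion are illustrative assumptions, not Facebook’s code; the real system reportedly weighs thousands of variables.

# A minimal, illustrative sketch of weighted feed scoring using the reported figures.
# The signals, weights, and demotion factor below are assumptions for demonstration;
# the real system reportedly uses more than 10,000 variables.

def score_post(likes, reactions, reshares, long_comments,
               is_live_video=False, is_harmful=False):
    points = (
        likes * 1             # 1 point per like
        + reactions * 2       # reaction emojis: up to 2 points each (Sept. 2020)
        + reshares * 5        # 5 points per reshare (Jan. 2018)
        + long_comments * 30  # long comments: 15 to 30 points each (Jan. 2018)
    )
    if is_live_video:
        points *= 60          # live-video multiplier, capped at 60 after Jan. 6, 2021
    if is_harmful:
        points *= 0.1         # demotion of 50 to 90 percent (90 shown here)
    return points

# Posts are ranked by score; the highest scores top the news feed.
posts = {
    "typical post": score_post(likes=200, reactions=40, reshares=10, long_comments=5),
    "viral post, demoted 90%": score_post(likes=400_000, reactions=120_000,
                                          reshares=90_000, long_comments=30_000,
                                          is_live_video=True, is_harmful=True),
}
for name, pts in sorted(posts.items(), key=lambda kv: kv[1], reverse=True):
    print(f"{name}: {pts:,.0f} points")

Because the demotion is a percentage cut applied to an uncapped score, the demoted viral post in this toy example still outranks the typical post by several orders of magnitude, which is the weakness described above.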

Turning points: Early on, Facebook’s recommendation algorithm prioritized updates from friends, such as a new photo or a change in relationship status. In the early 2010s, the company tweaked it to favor likes and clicks. To counteract the resulting flood of clickbait, it adjusted the algorithm to promote posts from professional news media. In 2018, the company made changes to promote interaction between users by favoring reaction emojis, long comments, and reshares. This shift surfaced more posts from friends and family but led to a surge of divisive content, prompting new rounds of changes in recent months.

Why it matters: Facebook’s membership of nearly 3 billion monthly active users famously exceeds the populations of the largest countries. What information it distributes, and to whom, has consequences that span personal, national, and global spheres. Both users and watchdogs need to understand how the company decides what to promote and what to suppress. Revealing all the details would invite people to game the algorithm, but some degree of transparency is necessary to avoid dire impacts including suicides and pogroms.

We’re thinking: Internet companies routinely experiment with new features to understand how they contribute to their business. But Facebook’s own research told the company that what was good for its bottom line was poisonous for society. The company hasn’t been able to strike a healthy balance on its own. As a society, we need to figure out an appropriate way to regulate social media.
