The Facebook Oversight Board’s First Decisions

By Digital Freedom Fund, 5th December 2021

[Illustration: a blindfolded statue holding balance scales and a sword]

In 2021, the Facebook Oversight Board delivered its first decisions, sparking fresh criticism about the board’s effectiveness in holding Facebook to account.

Digital rights activists have long called for better oversight of Facebook and other social media platforms, citing the urgent need to prevent censorship and ensure freedom of expression online while also curbing misinformation and hate speech. 

But when the Facebook Oversight Board was established in 2020, many rejected its legitimacy. To critics, a board funded and appointed by Facebook itself was a woefully inadequate alternative to true accountability and served merely to shield Facebook from legal repercussions for its decisions.

The Oversight Board is mandated to review difficult decisions made by Facebook on content moderation. Since its inception, several high-profile cases of content removal by Facebook have been referred to the board.

In May, the Oversight Board ruled that Donald Trump’s Facebook account should not be reinstated. The former president had been suspended from the platform after he was accused of spreading misinformation and encouraging the violent riots at the US Capitol in January. However, the board avoided making a final decision on the fate of Trump’s account, instead directing Facebook itself to determine an appropriate penalty within six months. Many critics argued that, by shunting responsibility back to Facebook, the board had further exposed its inefficacy.

Facebook also faced backlash amid the outbreak of conflict in Israel-Palestine, with the platform accused of political bias and the suppression of pro-Palestinian voices. In response to the alleged removal of pro-Palestinian content, nearly 200 Facebook employees signed a letter to their employer requesting an investigation into the company’s content moderation system.

In September, the Oversight Board ruled that Facebook had indeed unjustifiably removed news content related to the conflict. But for many, the decision once again underscored the board’s deficiencies, as it failed to tackle the deeply rooted problems at the heart of Facebook’s automated content removal system. Without a thorough overhaul of Facebook’s opaque content moderation policies and tools, it’s unlikely that the Oversight Board will ever live up to the demands of its many critics.

As part of our “The Year in Digital Rights” series, DFF counted down to Human Rights Day 2021 with ten of the year’s biggest digital rights stories. Read the full series here.

Artwork by Cynthia Alonso