
Why Facebook created its own ‘supreme court’ for judging content

By Siri Terjesen

Facebook’s quasi-independent Oversight Board on May 5, 2021, upheld the company’s suspension of former President Donald Trump from both Facebook and Instagram. The decision came four months after Facebook CEO Mark Zuckerberg banned Trump “indefinitely” for his role in inciting the Jan. 6 riot at the U.S. Capitol. The board chastised Facebook for failing to either set an end date for the suspension or permanently ban Trump, and gave the social media company six months to resolve the matter.

What is this Oversight Board that made one of the most politically perilous decisions Facebook has ever faced? Why did the company create it, and is it a good idea? We asked Siri Terjesen, an expert on corporate governance, to answer these and several other questions.

The Oversight Board was set up to give users an independent third party to whom they can appeal Facebook moderation decisions, as well as to help set the policies that govern these decisions. The idea was first proposed by Zuckerberg in 2018 after a discussion with Harvard Law Professor Noah Feldman, and the board began work in October 2020, funded by a US$130 million trust provided by Facebook to cover the initial six years of operating expenses.

According to the board itself, it “was created to help Facebook answer some of the most difficult questions around freedom of expression online: what to take down, what to leave up, and why.” The Oversight Board has final decision-making authority, even above the board of directors, and its decisions are binding on Facebook.

The Oversight Board has 20 members from around the world and a diverse variety of disciplines and backgrounds, such as journalism, human rights and law, as well as different political perspectives. It even includes a former prime minister. The goal is to eventually expand the board to 40 members in total.

So far, the board has reviewed 10 Facebook decisions, including the one involving Trump. The decisions involved a variety of types of content, such as posts that were removed because they were deemed racist, indecent or intended to incite violence. It overturned Facebook’s ruling in six of the cases and upheld it in three of them. In the 10th case, the user deleted the post that Facebook had removed, which ended the board’s review.

In cases where the board overruled Facebook, the posts that had been removed were reinstated. And the board sometimes urged the company to clarify or revise its guidelines.

Given that Facebook is expected to take 20 to 30 billion enforcement actions in 2021 alone, it’s unlikely the Oversight Board will be able to handle more than a handful of the most high-profile cases, like Trump’s. That limited, high-profile docket is one reason the Oversight Board has been dubbed “Facebook’s Supreme Court.”

As a platform company, Facebook is unique.

It’s a social media giant that must monitor a global operation that generates over $86 billion in revenue, employs 58,600 people and serves more than 2.8 billion active monthly users – more than a third of the world’s population – as well as millions of advertisers. Very few companies operate in a space that involves user content moderation, and none at this scale. Other platform companies have considerably less content, and usually only in one language, whereas Facebook is available in 100 languages.

Given that Facebook’s shareholder-elected board of directors includes just 10 people, each of whom has their own demanding day job, it is not surprising to me that Zuckerberg decided to set up an outside panel to make decisions about speech and online safety.

It’s unlikely, however, that other companies will ever have a similar type of board. The Oversight Board has been extremely resource intensive. It took over two years to establish through a series of 22 roundtable meetings with participants in 88 countries, six in-depth workshops, 250 one-on-one discussions and 1,200 submissions – not to mention its high cost of $130 million, which is meant to last six years.

A growing body of research questions whether directors on corporate boards can fulfill their oversight responsibilities on their own, due to the sheer amount of information that must be obtained, processed and shared.

While I think we will see more corporate boards outsource some decisions and processes to external panels – as a small board cannot be expected to have the requisite knowledge and skills on all topics – few corporations are likely to follow Facebook’s lead and grant an outside body the power to make unilateral decisions.

Since only the board of directors is beholden to a company’s shareholders, board directors ultimately need to take the final responsibility for corporate decisions.

While it’s likely that some at Facebook hoped shifting its thorniest decisions to an outside body would insulate the company, its executives and its corporate board members from political or legal problems, the Trump decision shows that it won’t actually do so.

Certainly the decision to utilize an outside oversight body might be interpreted as political, as all 10 Facebook board directors live and work predominantly in the United States and might be hesitant to vote to make decisions like restricting the freedom of expression of a former president who still commands support among many Americans – and won 47% of the popular vote in the last election.

But whether Facebook makes the decision itself or outsources to an independent board, Facebook will still face the consequences if the decision to uphold the Trump ban alienates Americans or people around the world who feel it is an attack on their freedom of expression.

People may leave Facebook for other platforms such as Parler, Gab and Signal, as many have already done since the initial Trump ban in January – and knowing an outside body made the decision won’t stop them.
And a poor “political” decision could drive away some advertisers and make it harder to hire and retain employees, regardless of who made it.

Twitter CEO Jack Dorsey made an internal decision to permanently suspend Trump from his company’s platform on Jan. 8, 2021. While Dorsey acknowledged that the decision set a “dangerous precedent,” Twitter, like other social media companies, doesn’t have an appeals process for that kind of decision.

Some newer companies, such as MeWe and Rumble, offer more lax content moderation in order to allow greater freedom of expression for users.

Gab describes itself as “A social network that champions free speech, individual liberty and the free flow of information online. All are welcome.” Parler’s content guidelines are even more basic, promising to keep content moderation to an “absolute minimum”: “We prefer to leave decisions about what is seen and who is heard to each individual.”

Gab and Parler are presently banned from the app stores of both Apple and Google due to a lack of content moderation.

Siri Terjesen is the Phil Smith Professor of Entrepreneurship and Associate Dean, Research and External Relations, at Florida Atlantic University.
