Facebook's new plans raise questions about motives and impartiality

Facebook has found itself at the centre of many allegations, scandals and legal disputes over the past few years. In an effort to combat some of this criticism, Facebook has announced its plan to build a supreme court-like body to oversee content moderation on its social network.

After months of consultation, the tech giant has released an updated version of the charter that will govern its ‘oversight board’. The nominally independent body will have the power to make binding decisions about what is allowed on Facebook, will have up to 40 members, and will be funded by a trust that sits separately from Facebook. Acting like a Supreme Court for the technology giant, the new panel will ultimately have more power than majority shareholder and chief executive Mark Zuckerberg. In an open letter the Facebook founder explained the reasons behind setting up the oversight board.

“We are responsible for enforcing our policies every day and we make millions of content decisions every week. But ultimately I don’t believe private companies like ours should be making so many important decisions about speech on our own,” he wrote. “That’s why I’ve called for governments to set clearer standards around harmful content. It’s also why we’re now giving people a way to appeal our content decisions by establishing the independent oversight board”.

He added: “The board’s decision will be binding, even if I or anyone at Facebook disagrees with it”.

The board will be made up of a minimum of 11 part-time members when it launches in 2020, with Facebook hoping it will eventually comprise up to 40 people from all over the world.

In order to ensure a level of autonomy from Facebook, new board members will be appointed by the existing board and each member will serve no longer than three years. Facebook said it hopes to ensure the panel’s independence by paying the board through a trust, though that trust will be set up and funded by Facebook.

The charter outlining how the board will operate details three main goals: providing oversight of Facebook’s content decisions, reversing Facebook’s decisions when necessary, and acting as an independent authority outside of Facebook.

The plans for a new Oversight Board come amid intense scrutiny of Facebook and other social media companies over their content moderation policies. Debate has raged as to whether the unprecedented size and influence of social media have given private companies too much power to unilaterally police speech online.

The move has been seen as something of a preemptive strike: Facebook released its oversight board plans and broadened its definition of terrorism shortly after publishing the results of a study on what the general public wants from the oversight body, and just before it is set to appear before the Senate Commerce Committee to discuss its handling of violent and extremist content.

All of the major tech platforms (Facebook, Twitter and Google via YouTube) have come under increased scrutiny for giving a platform to harmful content, including at the House Judiciary Committee’s hearing on tech’s role in fuelling white nationalism and hate speech. Facebook itself has suffered greatly from the increased spotlight on real-world consequences linked to harmful content on its platform, from Russian election meddling in the US to genocide in Myanmar.

Therefore, Facebook’s release of a detailed internal solution for content moderation should theoretically serve the company well as it attempts to defend itself against government scrutiny. A concrete plan to avoid similar failures is likely to help placate regulators and hopefully buy Facebook some goodwill.

However, experts have questioned the board’s independence, as well as the motivation behind the move. Bernie Hogan, senior research fellow at the Oxford Internet Institute, said of the proposals: “Facebook does not have a court. The only vote that really counts is the majority shareholder, Mark Zuckerberg”.

He added: “Facebook’s ‘supreme court’ invokes all the pomp and circumstance of actual judicial practice without any of the responsibility to citizens”.

Regardless of its real-world impact and impartiality, Facebook’s model could both provide a roadmap for other tech giants eager to alleviate content moderation pressure and also inform eventual regulation.

Given the very public nature of its process documentation, other tech companies could borrow from Facebook’s plan. It wouldn’t be the first time a platform has copied a good idea around self-regulation: LinkedIn and YouTube both began publishing content moderation transparency reports in 2011, and now Facebook, Instagram, Google, Twitter, Snap, and Pinterest all release some version of the same. Facebook has said the trust would be opened up for other networks to join, and fund, in the future.

Additionally, if the model proves effective, regulators could simply mandate that open platforms with a certain number of users create a similar mechanism of their own. They could also build on the idea further, or borrow less directly and keep the pieces that work while discarding those that don’t. Zuckerberg has made no secret of his eagerness to collaborate with regulators on content moderation, and Facebook’s Oversight Board could provide a starting point for such collaboration.

While regulators and other tech companies will be interested in Facebook’s Oversight Board, users will likely be more concerned with the company’s policy changes around harmful content. Facebook has a wide variety of stakeholders to please, and its most recent batch of updates could help it improve its standing with them all.

The mechanics of what is effectively a corporate risk management process may not mend Facebook’s relationship with users, but that may not be its main purpose. More concrete changes to policy, like Facebook’s effort to clearly define terrorist organisations according to their on-platform behaviour, could more effectively send a message that Facebook is listening to user concerns. Or as Mr Hogan colourfully puts it: “This panel is seen as an attempt to do something, but it appears to be just short of having enough teeth to make a difference. It is a way to tell critics ‘lay off, we are doing all [we] can’. Such a panel, while admirable, is no match for some well organised trolls or broad systemic issues”.