Facebook's Oversight Board Is Not as Powerful as We Think. But It Can Push the Company's Policies in the Right Direction.
Last month, Facebook's independent Oversight Board announced the first set of cases that it will be reviewing. The entity was first conceived over two years ago, when Facebook CEO Mark Zuckerberg announced that the company would establish an independent body to review a set of content moderation cases every year, floating the idea of a body that would function almost like a "Supreme Court." Ultimately, Facebook created what it calls a Content Oversight Board. These framings, as either a Supreme Court or as an Oversight Board, incorrectly imply that Facebook's new entity has an immense amount of power over the company's operations. As the Board begins reviewing cases, it is important to lower our expectations around the scope and scale of its powers, and instead consider what the Board can actually accomplish under its narrow mandate.
Calling Facebook's new Board a "Supreme Court" depicts the Board as the final arbiter of the company's content moderation rules, or "Community Standards." As the body is currently structured, this is impossible. The Board is tasked with reviewing and offering binding decisions on specific cases that have been surfaced by both users and Facebook itself, and that Facebook has deemed highly important. However, the Board's role will be limited to deciding these particular cases, rather like an intermediate appellate court. Whereas the U.S. Supreme Court may strike down congressional statutes that conflict with the U.S. Constitution, even if Facebook's Board concludes that a Facebook rule is flawed or otherwise fails to protect users' human rights, the Board cannot demand that Facebook remove or amend that rule. Thus, while the first cases announced by the Board raise questions under Facebook's policies on hate speech, violence and incitement, and nudity, the Board is only empowered to assess whether Facebook properly applied these policies as written. In addition, although the Board can offer policy recommendations to Facebook proactively or when Facebook makes a policy advisory request, the company is not obligated to accept and implement this guidance.
The name Content Oversight Board also suggests more power and influence than the new entity possesses. Oversight bodies typically have the authority to identify areas of concern, and then investigate and issue findings. But Facebook's Board members have acknowledged that they do not have the jurisdiction to review overall compliance with rules, as many oversight bodies do. Further, some oversight bodies, such as the one where one of us previously served, have the authority to evaluate whether an entity's governing rules are appropriate from a policy perspective. But Facebook's Board also lacks the power to examine how Facebook develops its content policies and to evaluate whether those policies appropriately balance all relevant considerations. If it were actually empowered to act like an oversight body, the Board could directly address criticisms from lawmakers, civil society groups, and users regarding myriad Facebook policies, ranging from its lack of transparency around how it moderates categories of content such as hate speech, to its troubling practices around collecting and managing user data, to Facebook leadership's decision last fall not to hold politicians accountable when they spread misinformation. The limited scope of the Board's authority means that the leadership of Facebook will continue to operate unchecked and without true oversight.
Additionally, it is important to recognize that the Board will only review a handful of cases every year (indeed, the first group is only six cases), and although Facebook can compel the Board to review certain cases in an expedited manner, most will take the Board weeks, if not months, to review. As representatives from the Content Oversight Board have stated, the Board is a "deliberative body and not a rapid response mechanism." The lack of a rapid response mechanism has sparked frustration among Facebook critics, who in September launched an alternative, unofficial oversight board to publicly analyze and critique Facebook's content moderation policies and practices.
However, as the Board reviews its first set of cases, it is worth considering what impact it may actually have on Facebook's content moderation operations. Since Facebook announced the creation of the Board, the company has shared more information about the body's structure and operating procedures. If the Board operates effectively, it should be able to exert a limited set of positive influences over certain components of the company's content moderation efforts.
Most notably, the Board provides users with an opportunity to appeal decisions to an independent entity. Four of the first six cases are based on user referrals. Although it is not yet clear how independent the Board will be in practice, the Board's governing documents, including its charter and bylaws, include measures designed to create independence, and having an additional layer of appeal can be beneficial. Facebook has also stated that the Board's decisions will establish precedent to be followed in subsequent cases regarding how Facebook's Community Standards should apply to particular fact patterns.
Another area in which the new Board offers some potential for a positive impact on Facebook's content moderation practices is in addressing criticism that the company has failed to account for cultural, linguistic, and regional contexts when relying on both human moderators and automated tools. The first group of cases relates to several different countries, including Azerbaijan, Brazil, China, France, and the United States. While it is unlikely that the Board will be composed of individuals who represent all of the communities and perspectives reflected among Facebook's user base, the company has stated that it invested significant effort in recruiting a broad set of Board members. These Board members will also have access to geographic and subject matter experts when making decisions. If used well, these resources could enable the Board to make more informed decisions on sensitive cases than Facebook's regular moderation staff can.
Finally, if Facebook and the Board prioritize transparency from the outset, the Content Oversight Board could provide unique insight into, and accountability around, Facebook's procedures. According to its Bylaws, the Board will publish an annual report, which will at minimum include data points such as the number and type of cases the Board reviewed, a summary of the Board's decisions and policy advisory statements, and an analysis of how the Board's decisions considered international human rights. The Bylaws also specify that the Board will make each of its decisions publicly available and archive them in a database. Facebook has also committed that, within 30 days of receiving a policy recommendation from the Board, it will publish a statement explaining whether or not it will adopt the recommendation. When done correctly, transparency reports are valuable tools. If implemented consistently, these transparency mechanisms could provide insight into Facebook's internal policymaking process and help hold the company accountable.
As numerous critics and organizations have observed, whether or not the Facebook Oversight Board is successful depends largely on the independence, legitimacy, operational structure, and transparency of both the Board and Facebook. If the Board does have a positive impact, it could offer a new framework for appeals in the content moderation space, and its mandate could potentially be expanded in the future to include other forms of content, such as advertising, and other moderation procedures, such as algorithmic curation and promotion. However, as the Board begins its operations, we must keep in mind that this entity will not radically transform how Facebook operates as a whole, or even how it moderates content. Rather, it has the potential to push the company in the right direction, one limited step at a time.