
Meta's Mandate-Free Governance: How to Think about Facebook and the Oversight Board


Last month, the U.S. Supreme Court heard oral arguments in two cases that could upend the internet as we know it. Both center on instances of online radicalization and whether tech companies should be held liable for content published on their platforms that precipitates real-world harm. At issue is whether Section 230 of the Communications Decency Act shields firms like Google, Meta, and Twitter from liability for user-generated content that appears on their sites.

The core question is one societies worldwide are grappling with: Who is responsible for regulating online content? Legal experts are highly skeptical that the Supreme Court will rule in a way that meaningfully alters Section 230 and imposes stricter requirements on firms. And unlike in the European Union, where the Digital Services Act has created the most robust and far-reaching set of enforceable legal obligations on large platform companies in any jurisdiction to date, U.S. lawmakers are apt to continue allowing tech companies to police themselves when it comes to content moderation.

But in practice, these companies are governing not just themselves. In setting up processes and bodies to moderate content, they are exercising governance over publics all over the world without explicitly delegated authority from national sovereigns to do so. This phenomenon, which I call private polity-making, is an unprecedented type of quasi-democratic governance, one best embodied by the Facebook Oversight Board.

In September 2019, facing ire from shareholders and politicians over content moderation decisions on its platform, Facebook (since renamed Meta) unveiled the charter of an independent Oversight Board. The Oversight Board's stated purpose was "to protect free expression by making principled, independent decisions about important pieces of content and by issuing policy advisory opinions on Meta's content policies." The Board launched in October 2020 with hopes of building a body of private case law, especially on hate speech, that "real courts would eventually cite." Board members included a former Danish prime minister, human rights lawyers, journalists, and a Nobel laureate. But membership notably did not include computer scientists or software engineers, the type of experts who might be best positioned to weigh in on technical aspects of content moderation such as algorithmic design.

The Board's greatest test for managing digital harms came when it was just months old. On January 7, 2021, Facebook banned President Donald Trump after an insurrectionist mob stormed the U.S. Congress in his name. On January 21, after President Biden took office, Facebook referred Trump's account suspension to the Board (per the bylaws, Trump could not appeal the suspension himself).

In May, the Board upheld the suspension but criticized Facebook for not specifying its length: "Facebook cannot make up the rules as it goes, and anyone concerned about its power should be concerned about allowing this." Legal observers questioned "whether Trump's deplatforming represents the start of a new era in how companies police their platforms, or whether it will be merely an aberration."

The decision also raised a more profound question: As an experiment in democratic digital governance, how should we think about the Facebook Oversight Board?

On the surface, the Oversight Board is a classic case of a private authority, one "engaged in authoritative decision-making that was previously the prerogative of sovereign states." But the Board does not just exercise authority over Facebook's activities, playing a role that would typically be the purview of a government regulator. No user votes were cast, and no explicit public mandate or authority was given to the Oversight Board to decide the regulation of content.

In fact, on closer examination, the Oversight Board is emblematic of Facebook's efforts to expand the scope of private governance. This flavor of private governance is unlike traditional corporate self-regulation, which governs corporate conduct through voluntary codes of conduct (although Facebook engages in those too). It is also different from delegated private governance, as in war contracting, when states designate defense or security firms to fulfill tasks and roles that might normally be performed by a conventional military force. Instead, Facebook generally, and the Board specifically, is emblematic of private polity-making, whereby corporations govern publics without explicitly delegated authority.

In this sense, through the Oversight Board, Facebook can act as a gatekeeper typical of a democratic system. Except instead of elected officials who act with the consent of the governed, the gatekeeper here is an unelected corporation governing without legitimate authority. Founder and CEO Mark Zuckerberg has said he does not intend for the Board to become an arbiter of truth: "I don't think most people want to live in a world where you can only post things that tech companies judge to be 100% true." Yet Facebook's private polity-making allows Zuckerberg to admit that no one gave the company authority to define acceptable speech for billions while at the same time persisting in policing the speech of those billions.

Facebook has had a complicated relationship with responsibility. When reports of fake news on Facebook influencing the 2016 U.S. election first came out, Zuckerberg dismissed the notion as a "pretty crazy idea." It was later revealed that the profiles of at least 87 million Americans were compromised through a third-party application made by a voter-profiling firm, Cambridge Analytica, employed by the Trump campaign. After the scandal broke in March 2018, Zuckerberg went on a tour around the American states, before conceding that "We didn't take a broad enough view of our responsibility, and that was a big mistake. … Across the board, we have a responsibility to not just build tools, but to make sure that they're used for good."

Zuckerberg's mea culpa was a sign of Facebook's gradual and reluctant acceptance of its polity-making role. In creating the Oversight Board, Facebook's leadership recognized the importance of courting public legitimacy: "At the end of the day you can build all the things, but you just have to have enough people that believe in order to make it real," said Brent Harris, the consultant who led the effort to create the Board. The Board's global scope appears vast; it supports 18 languages, three times the UN's six. Yet its mandate to counter algorithmic harms is narrow. For the first seven months, users could only appeal content takedowns, not content that remained on the site, making it difficult to combat misinformation. Users still cannot challenge issues related to advertising or algorithms. Moreover, the Board reviews only a tiny fraction of the approximately 200,000 posts eligible for appeal daily from automated and human moderation, having issued 35 case decisions and two policy advisory opinions thus far. The Board does publish an annual report and has made changes to its original charter and bylaws, for instance adopting plans to review more cases.

Facebook's Oversight Board appears better prepared than its counterparts to manage private governance. In October 2022, Twitter's CEO Elon Musk proposed a Content Moderation Council, somewhat modeled after the Oversight Board, with Musk pledging that "no major content decisions or account reinstatements will happen before the Council convenes." Yet two weeks later, Musk overrode the Council when he reinstated Trump's account following a user poll. Musk's action marked a stark difference between Twitter and Facebook in managing digital harms.

Emerging regulations such as the European Union's Digital Services Act (DSA) will further heighten tensions between private and public polity-making, specifically between legal requirements for content moderation and platforms' terms of service. While Facebook's Oversight Board has upheld international human rights law in its decisions, it is not a legal body, and it cannot fill the gap left by the absence of public internet platform governance. As alternate public means of governance such as the DSA are implemented, conflicts are sure to emerge. Bodies such as the Facebook Oversight Board are innovative attempts to govern what is still an unruly space, but they may be a poor substitute for true democratic governance.

Swati Srivastava is Assistant Professor of Political Science at Purdue University, where she researches public/private relations in global governance, including the political power and responsibility of Big Tech. Her book Hybrid Sovereignty in World Politics was recently published by Cambridge University Press.
