
In Short

To Protect the Public Interest Internet, Lawmakers Have Better Tools than Section 230 Reform


On June 20th, New America's Open Technology Institute (OTI) convened a hybrid event to discuss what's at stake for public interest organizations in the Section 230 reform debate. Against the backdrop of the Supreme Court's decision to punt on defining the scope of Section 230 in Gonzalez v. Google LLC, the event demonstrated the need for policymakers to consider alternative, more tailored approaches, such as greater algorithmic accountability and privacy measures, to create healthier spaces online. Congress should consider algorithmic accountability measures, while industry should more widely and substantively adopt practices that provide greater clarity about content policies and moderation efforts.

Senator Ron Wyden, one of the co-authors of Section 230, delivered the opening keynote address. He emphasized how Section 230 empowers users to freely speak and access content online, and its role in building American leadership in the digital realm. The accompanying panel, featuring experts from libraries, digital archives, open data projects, and Wikipedia, demonstrated how Section 230's liability protections are critical to the survival of public interest organizations. These organizations rely on Section 230 to publish, organize, and curate content that communities, educators, and public institutions create and share online.

In the face of opaque platform policies and growing concern over social media's negative impact on users, policymakers are seeking stronger safeguards against online harms, particularly those that big tech companies permit or algorithmically amplify. But panelists noted that legislative reform efforts often target large technology companies with little regard for the effect of their proposals on overlooked players like small businesses and nonprofits working in the public interest. They cautioned policymakers that overly broad reforms could do more harm than good to the beneficial features of the Internet.

"Stopping online conversations won't solve the problems politicians claim they will, but without 230 and the First Amendment, it will be harder for people without power, without clout, without political action committees – the marginalized voices – to call out wrongdoing by the powerful. And it'll certainly be easier for government to set the terms of public debate." – Senator Ron Wyden

Section 230: The "26 words that created the internet"

Section 230 of the Communications Decency Act serves as a liability shield for organizations when they make "good-faith decisions" while moderating objectionable third-party content on their platforms. By giving companies the flexibility and freedom to moderate a variety of content, Section 230 has fostered innovation and competition, fueled the rise of platforms built on user-generated content, and ushered in many essential features of the Internet. Today, the law helps create spaces where vulnerable communities, whistleblowers, dissenters, and activists can openly voice their concerns and speak out against injustices without fear of reprisal. In doing so, Section 230 provides what Senator Wyden calls the "first line of defense" against censorship.

At the same time, the protections afforded by Section 230 have also meant that some algorithms and business models have allowed hate speech, violent content, and radicalism to proliferate in certain online spaces. Some legislators have seized on these harms to propose reforming or completely repealing Section 230 protections. While often well intentioned, these proposals would do more harm than good.

Proposals to reform Section 230 often take two contrasting positions. The first camp wants an even more permissive environment for online speech. They allege that internet platforms are ideologically biased, and that an approach where platforms make far fewer content moderation decisions would counteract this bias. But the predictable effect of such a change would be to empower the spread of harmful online speech. Conversely, the second camp argues for imposing stricter duties to moderate. But even narrow attempts to alter Section 230's liability shield could inadvertently incentivize censorship. This isn't a theoretical concern; recent evidence demonstrates precisely how this can happen. SESTA/FOSTA's carve-out of Section 230, which OTI and other digital rights advocates opposed, failed to work as intended. Instead, the fear of lawsuits led companies to impose broad content moderation measures that removed content protected by the First Amendment, all while driving sex workers to more dangerous, darker parts of the internet.

The impact of either approach could be especially devastating for public interest organizations that allow users to report, curate, preserve, share, and archive materials. Wikipedia, through community-generated and community-moderated content, operates one of the top reference sites in the world. Libraries run digital repositories and collections, and provide internet and network access to the public. Public archival programs create a public copy of the internet, helping save news, research, and content that may otherwise be lost. These public interest services take great pains to moderate responsibly and in good faith, but the removal of Section 230 protections could leave them unable to counter hateful or inaccurate speech. In an environment without the Section 230 liability shield, recent efforts across states to ban certain materials and information could make organizations hesitant to allow or host important, often life-saving, information.

Alternative Avenues for Safer Online Spaces

So where should we turn to make online spaces meaningfully safer?

OTI believes that intermediary liability protections, like Section 230, are critical to effective content moderation that balances the important objectives of free expression with safety online. Rather than creating overly broad amendments or striking down Section 230, other legislative and regulatory avenues offer a more tailored approach to addressing the need for more thoughtful content moderation practices.

Creating a healthier online environment requires increased platform and algorithmic accountability implemented through a wide range of stakeholder action. Companies should adopt best practices vetted by civil society organizations to provide users clarity around content policies and moderation efforts. Congress should increase algorithmic accountability by passing legislation that requires companies to be clearer about how they use artificial intelligence and machine learning tools to shape online content.

To address the broader harms that flow from algorithms powered by data-extractive business models, U.S. policymakers should take other foundational legislative and policy actions, including passing a comprehensive federal privacy law and implementing stronger pro-competitive measures.

Protecting the public interest and free speech online and making the internet a safer place are not mutually exclusive once we recognize that the problem is not rooted in the existence of Section 230. Government and industry can take significant and important action to address foundational concerns, fight online harms, and prioritize internet openness without weakening or eliminating crucial legislation.

More About the Authors

Sarah Forland

Policy Analyst, Open Technology Institute, New America
