

UK’s Draft Online Safety Bill Raises Serious Concerns Around Freedom of Expression


On May 12, the UK government published a draft of its Online Safety Bill, which attempts to tackle illegal and otherwise harmful content online by placing a duty of care on online platforms to protect their users from such content. The move came as no surprise: over the past several years, UK government officials have expressed concern that online services are not doing enough to tackle illegal content, particularly child sexual abuse material (commonly known as CSAM) and terrorist and violent extremist content (TVEC), as well as content the government has deemed lawful but “harmful.” The new Online Safety Bill also builds upon the government’s earlier proposals to establish a duty of care for online providers, laid out in its April 2019 Online Harms White Paper and its December 2020 response to the consultation on that paper.

EFF and OTI submitted comments as part of that consultation on the Online Harms White Paper in July 2019, pushing the government to safeguard free expression as it explored developing new rules for online content. Our views have not changed: while EFF and OTI believe it is critical that companies increase the safety of users on the internet, the recently released draft bill poses serious threats to freedom of expression online, and must be revised. In addition, although the draft features some notable transparency provisions, these could be expanded to promote meaningful accountability around how platforms moderate online content.

Our Views Have Not Changed: Broad and Vague Notion of Harmful Content

The bill is broad in scope, covering not only “user-to-user services” (companies that enable users to generate, upload, and share content with other users), but also search engine providers. The new statutory duty of care will be overseen by the UK Office of Communications (OFCOM), which has the power to issue substantial fines and to block access to sites. Among the core issues that will determine the bill’s impact on freedom of speech is the concept of “harmful content.” The draft bill opts for a broad and vague notion of harmful content: content that could reasonably, from the perspective of the provider, have a “significant adverse physical or psychological impact” on users. The great latitude this standard gives providers poses a risk of overbroad removal of speech and inconsistent content moderation.

In terms of illegal content, “illegal content duties” comprise an obligation on platform operators to minimize the presence of so-called “priority illegal content,” to be defined through future regulation, and a requirement to take down any illegal content upon becoming aware of it. The draft bill thus departs from the EU’s e-Commerce Directive (and the proposed Digital Services Act), which abstained from imposing affirmative removal obligations on platforms. On the question of what constitutes illegal content, platforms are put first in line as arbiters of speech: content is deemed illegal if the service provider has “reasonable grounds” to believe that the content in question constitutes a relevant criminal offence.

The bill also places undue burden on smaller platforms, raising significant concerns that it could erode competition in the online market. Although the bill distinguishes between large platforms (“Category 1”) and smaller platforms (“Category 2”) when apportioning responsibilities, it does not include clear criteria for how a platform would be categorized. Rather, the bill provides that the Secretary of State will decide how a platform is categorized. Without clear criteria, smaller platforms could be miscategorized and required to meet the bill’s more granular transparency and accountability standards. While all platforms should strive to provide adequate and meaningful transparency to their users, it is also important to recognize that certain accountability processes require a significant amount of resources and labor, and platforms that have large user bases do not necessarily also have access to corresponding resources. Platforms that are miscategorized as larger platforms may not have the resources to meet more stringent requirements or pay the corresponding fines, putting them at a significant disadvantage. The UK government should therefore provide greater clarity around how platforms would be categorized for the purposes of the draft bill, to give companies sufficient notice of their responsibilities.

Lastly, the draft bill contains some notable transparency and accountability provisions. For example, it requires providers to issue annual transparency reports using guidance provided by OFCOM. In addition, the bill seeks to respond to previous concerns around freedom of expression online by requiring platforms to conduct risk assessments around their moderation of illegal content, and it requires OFCOM to also issue a transparency report which summarizes insights and best practices garnered from company transparency reports. These are good first steps, especially considering the fact that governments are increasingly using legal channels to request that companies remove harmful and illegal content.

However, it is important for the UK government to recognize that a one-size-fits-all approach to transparency reporting does not work, and often prevents companies from highlighting trends and data points that are most relevant to the subject at hand. In addition, the structure of the OFCOM transparency report suggests that it would mostly summarize insights, rather than provide accountability around how internet platforms and governments work together to moderate content online. Further, the draft bill does not meaningfully incorporate features such as providing users with notice and appeals processes for content decisions, despite robust advocacy by content moderation and freedom of expression experts. Adequate notice and appeals are integral to ensuring that companies are providing transparency and accountability around their content moderation efforts, and are key components of the Santa Clara Principles on Transparency and Accountability in Content Moderation, of which EFF and OTI were among the original drafters and endorsers.

UK Government Should Revise the Draft Bill to Protect Freedom of Speech

As social media platforms continue to play an integral role in information sharing and communications globally, governments around the world are taking steps to push companies to remove illegal and harmful content. The newly released version of the UK Government’s Online Safety Bill is the latest example of this, and it could have a significant impact in the UK and beyond. While well intended, the bill raises some serious concerns around freedom of expression online, and it could do more to promote responsible and meaningful transparency and accountability. We strongly encourage the UK government to revise the current draft of the bill to better protect freedom of speech and more meaningfully promote transparency.

This post was co-written with Christoph Schmon, Electronic Frontier Foundation (EFF).

About the Authors

Spandana Singh

Policy Analyst, Open Technology Institute

Christoph Schmon

International Policy Director, EFF
