Spandana Singh
Policy Analyst, Open Technology Institute
From #ThePushupChallenge to the slew of Renegade dance tutorials, TikTok has been a central form of entertainment and social media during the COVID-19 pandemic. But the platform has also drawn scrutiny from U.S. lawmakers, who worry that because the app is owned by the Chinese company ByteDance, it poses a national security threat to the United States and a risk to American users. Much to the chagrin of vehement TikTok users, President Trump signed two executive orders in August effectively banning TikTok in the United States: one prohibiting any transactions between U.S. entities and ByteDance, and another requiring the parent company to sell or spin off its TikTok U.S. business within 90 days. To avoid this, TikTok is working to finalize a deal with Oracle and Walmart, although the process has been mired in confusion over who would retain majority control of the company.
There are genuine reasons to worry about TikTok’s security, and other platforms pose a variety of security threats as well. But something often overlooked is that before the Trump orders came down, TikTok had made strides toward greater transparency around its content moderation practices, privacy policies, and its use of algorithmic decision-making, some of which went further than current industry standards for accountability.
Activists and researchers have long pressed internet platforms to provide greater transparency around their content moderation, data privacy, and algorithmic decision-making policies and practices, because internet platforms act as gatekeepers of online speech and engage in vast data collection with little accountability to the public. For example, in 2016, Facebook suspended the account of a Norwegian writer who shared the Pulitzer Prize–winning Vietnam War “Napalm Girl” image of a crying and terrified child as part of a post on photographs that “changed the history of warfare.” His account was suspended for violating the company’s policy on nudity and child pornography. The incident sparked widespread criticism and demonstrated the ease with which an internet platform can omit or censor information, even a widely known, award-winning documentation of history. Similarly, YouTube’s content moderation algorithms have been criticized for flagging, and subsequently removing, documentation of human rights atrocities in Syria as terrorist propaganda. Greater transparency around these content moderation and algorithmic curation policies and practices would help researchers and the public understand how these processes and tools are used, where they fall short, and what effects they have on user speech.
Major social media platforms have responded to these calls by publishing detailed outlines of their content and data privacy policies and by sharing limited information online about how they use algorithmic decision-making to curate and moderate content. However, these companies have provided little transparency around how their human moderators are trained and how they operate, and they have asserted that, due to trade secret concerns, they cannot reveal their source code to external audiences.
TikTok, however, was poised to go beyond other platforms. In July, then-CEO Kevin Mayer stated that the entire industry “should be held to an exceptionally high standard” and that companies should proactively disclose information related to their algorithms, content moderation practices, and data privacy practices to regulators. (Mayer resigned in late August.) In the same spirit, in March TikTok announced that it would open two Transparency and Accountability Centers, in Los Angeles and Washington, D.C. According to the company, experts at these centers will be able to examine TikTok’s human content moderation processes, data security practices, algorithmic systems, and the app’s source code, which will be available for testing and evaluation. The opening of the Los Angeles center was delayed by the onset of the COVID-19 pandemic, but the company says it still plans to launch both.
Of course, given that the Transparency and Accountability Centers have not yet fully opened, it remains to be seen whether they will be an effective means of providing transparency and accountability. It is also not clear how valuable the company’s transparency mechanisms truly are. In particular, some researchers have argued that sharing access to company source code is not a valuable method of providing transparency: most observers wouldn’t know what they were looking at, and even technical experts may not be able to meaningfully evaluate the code. It’s understandable that someone might consider this a bit of transparency theater, so to speak, intended to show skeptical policymakers and eager regulators that the company has nothing to hide. Still, the efforts go further than the current industry standard for providing transparency and accountability around sensitive subjects such as content moderation and data privacy, and they should put the onus on other platforms to step up their transparency game as well.
TikTok’s moves build on its first transparency report, which it released in December 2019. The report provided data on legal requests for user information, government requests for content takedowns, and copyright-related content removals. Since then, the company has followed in the footsteps of companies such as Facebook and Google by expanding its transparency report to include data on the scope and scale of its Community Guidelines enforcement efforts.
This is noteworthy given that TikTok is a relatively new company. By comparison, Snap Inc., which has been releasing transparency reports since 2015, reports only on data points related to legal requests for user data and content removal, and does not report on how it enforces its content policies. Other, more prominent companies have only recently expanded their transparency reports to include granular data on their content policy enforcement efforts.
TikTok’s transparency report, however, still has plenty of room for improvement. It includes only a limited set of metrics related to the company’s content moderation operations and lacks granular data on points such as how the platform detects and removes content and how many appeals it has received against its content moderation decisions. TikTok also doesn’t publish an ad transparency database, as Facebook, Google, and Twitter all do. More broadly, transparency reports don’t always tell the full story, as companies can pick and choose which data points to report.
We don’t know how TikTok’s operations may transform in the coming months. But I hope the company continues to follow industry best practices and norms around transparency and accountability, while also setting some of its own, something that is uncommon among nascent and smaller internet platforms. Even if TikTok has no future in the United States, its moves could encourage other platforms to do more to provide transparency and accountability around their operations.