Executive Summary
Advancements in deep learning and natural language processing (NLP) have catapulted financial markets into a cultural and economic hype cycle around AI-based tools. Although AI-based tools are not new, consumers have seen a dramatic increase in access to generative AI tools since 2022.1 Collectively, the AI market will eclipse $298.2 billion by the end of 2024 and is projected to reach roughly $1.8 trillion by 2030.2 With the rise of generative AI tools, anxieties about the future of work, individuals' likenesses and privacy, and election security have grown.3 However, users are often not given the information needed to understand the how and why, or the transparency and explainability, behind the algorithms that power their favorite applications. "Explainability" refers to the ability of human users to understand and trust the results produced by machine learning algorithms. Efforts to explain generative AI tools to the general public are largely confined to algorithmic decision systems in social media feeds and privacy labels, leaving substantial gaps in explaining appropriate uses, potential harms, and available data privacy protections.4 While explainability efforts focused on social media feeds have raised awareness of the field of algorithmic explainability, there is still a deficit of tools available to consumer audiences.
In 1969, the White House Conference on Food, Nutrition, and Health prompted the U.S. Food and Drug Administration (FDA) to refocus its explainability efforts, ultimately resulting in nutrition labels. Since the advent of the FDA nutrition label, iterations of its iconic design have inspired a new generation of labels for broadband plans and datasets.5 Nutrition labels democratize access to information, increase transparency, and expand freedom of choice.
However, one significant drawback of nutrition labeling for software products is that the labels are static. Software products change through updates, deprecation of features, and bug fixes; the labeling system for software must therefore be dynamic and offer information relevant to the consumer. Many existing labeling efforts for software tools take a top-down governance approach, packing in dense details relevant only to subject-matter experts such as lawyers, security professionals, and engineers.
This report seeks to develop a preliminary universal design labeling system for two generative AI tools. The Simplified Algorithms for User Learning (SAUL) label displays three sections of information in plain English: tool functionality, potential harms of use, and data protection policies.6 In addition, this report advocates for further research on the label, enforcement of the voluntary design by the Federal Communications Commission (FCC), and oversight by a participatory public council led by the Department of Homeland Security's Artificial Intelligence Safety and Security Board.
Citations
- John Schulman et al., "Introducing ChatGPT," OpenAI Blog (blog), OpenAI, November 30, 2022; Linyuan Lu et al., "Recommender Systems," Physics Reports, February 7, 2012.
- "Global AI Market Size Worldwide in 2021 With a Forecast until 2030," Statista, 2024.
- Anna Milanez, "The Impact of AI on the Workplace: Evidence from OECD Case Studies of AI Implementation," Organization for Economic Cooperation and Development (OECD) Social, Employment and Migration Working Papers No. 289, March 27, 2023; Natasha Singer, "Teen Girls Confront an Epidemic of Deepfake Nudes in Schools," New York Times, April 8, 2024; Ali Swenson and Will Weissert, "New Hampshire Investigating Fake Biden Robocall Meant to Discourage Voters Ahead of Primary," AP News, January 22, 2024.
- "Introducing 22 System Cards That Explain How AI Powers Experiences on Facebook and Instagram," Meta AI (blog), Meta AI, June 29, 2023; "Privacy Labels," Apple, 2024.
- Cora Lewis, "Internet Providers Must Now Be More Transparent About Fees, Pricing, FCC Says," AP News, April 10, 2024; "The Data Nutrition Project," Data Nutrition Project, 2024.
- Plain English is defined here as text readable at a 7th- to 8th-grade level on the Flesch-Kincaid scale.
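As a point of reference for the readability threshold above, the Flesch-Kincaid grade level is computed from sentence length and syllable density: 0.39 × (words/sentences) + 11.8 × (syllables/words) − 15.59. The sketch below is an illustrative implementation, not part of this report's methodology; the syllable counter is a naive vowel-group approximation, and the function name is our own.

```python
import re

def count_syllables(word: str) -> int:
    # Naive approximation: count runs of consecutive vowels.
    # Real syllable counting needs a pronunciation dictionary.
    groups = re.findall(r"[aeiouy]+", word.lower())
    return max(1, len(groups))

def flesch_kincaid_grade(text: str) -> float:
    # Standard Flesch-Kincaid grade-level formula.
    sentences = max(1, len(re.findall(r"[.!?]+", text)))
    words = re.findall(r"[A-Za-z']+", text)
    syllables = sum(count_syllables(w) for w in words)
    return (0.39 * (len(words) / sentences)
            + 11.8 * (syllables / len(words))
            - 15.59)
```

A label drafted in plain English would target scores in roughly the 7-8 range: short sentences and common, low-syllable words pull the grade down, while long sentences and polysyllabic jargon push it up.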