Executive Summary

Advances in deep learning and natural language processing (NLP) have catapulted financial markets into a cultural and economic hype cycle for AI-based tools. Although AI-based tools are not new, consumers have experienced a dramatic increase in access to generative AI tools since 2022.1 Collectively, the AI market will eclipse $298.2 billion by the end of 2024 and is projected to reach roughly $1.8 trillion by 2030.2 With the rise of generative AI tools, anxieties about the future of work, individual likeness and privacy, and election security have grown.3 However, users are often not given the information needed to understand the how and why—or transparency and explainability—behind the algorithms that power their favorite applications. “Explainability” refers to the ability of human users to understand and trust the results produced by machine learning algorithms. Efforts to explain generative AI tools to the general public are largely confined to algorithmic decision systems in social media feeds and privacy labels, and substantial gaps remain in explaining appropriate uses, potential harms, and available data privacy protections.4 While explainability efforts focused on social media feeds have raised awareness of the field of algorithmic explainability, there is still a deficit of tools designed for consumer audiences.

In 1969, the White House Conference on Food, Nutrition, and Health prompted the U.S. Food and Drug Administration (FDA) to refocus its consumer-information efforts, ultimately resulting in nutrition labels. Since the advent of the FDA nutrition label, iterations of its iconic design have inspired a new generation of labels for broadband plans and datasets.5 Nutrition labels democratize access to information, increase transparency, and expand freedom of choice.

However, one significant drawback of nutrition-style labeling for software products is that it is static. Software products change with updates, deprecation of features, and bug fixes; the labeling system for software must therefore be dynamic and offer information relevant to the consumer. Many labeling efforts for software tools also take a top-down governance approach, including dense details relevant only to subject matter experts such as lawyers, security professionals, and engineers.

This report seeks to develop a preliminary universal design labeling system for two generative AI tools. The Simplified Algorithms for User Learning (SAUL) label displays three sections of information in plain English: tool functionality, potential harms of use, and data protection policies.6 In addition, this report advocates for further research on the label, enforcement of the voluntary design by the Federal Communications Commission (FCC), and oversight by a participatory public council led by the Department of Homeland Security’s Artificial Intelligence Safety and Security Board.

Citations
  1. John Schulman et al., “Introducing ChatGPT,” OpenAI Blog (blog), OpenAI, November 30, 2022, source; Linyuan Lu et al., “Recommender Systems,” Physics Reports, February 7, 2012, source.
  2. “Global AI Market Size Worldwide in 2021 With a Forecast until 2030,” Statista, 2024, source.
  3. Anna Milanez, “The Impact of AI on the Workplace: Evidence from OECD Case Studies of AI Implementation,” Organization for Economic Cooperation and Development (OECD) Social, Employment and Migration Working Papers No. 289, March 27, 2023, source; Natasha Singer, “Teen Girls Confront an Epidemic of Deepfake Nudes in Schools,” New York Times, April 8, 2024, source; Ali Swenson and Will Weissert, “New Hampshire Investigating Fake Biden Robocall Meant to Discourage Voters Ahead of Primary,” AP News, January 22, 2024, source.
  4. “Introducing 22 System Cards That Explain How AI Powers Experiences on Facebook and Instagram,” Meta AI (blog), Meta AI, June 29, 2023, source; “Privacy Labels,” Apple, 2024, source.
  5. Cora Lewis, “Internet Providers Must Now Be More Transparent About Fees, Pricing, FCC Says,” AP News, April 10, 2024, source; “The Data Nutrition Project,” Data Nutrition Project, 2024, source.
  6. “Plain English” here means text written at a 7th–8th grade reading level on the Flesch–Kincaid scale.
