Policy Recommendations
Although the Simplified Algorithms for User Learning (SAUL) label seeks to advance the field of explainability, policy changes concerning consumer education and data transparency are needed to ensure a successful, iterative rollout of nutrition-style labels as generative AI evolves. Below are policy recommendations for the federal government and civil society to advance both the wider field of data nutrition labels and the SAUL label itself.
Supervise the Research Process
The White House Office of Science and Technology Policy (OSTP) should supervise the process of researching the components of a data nutrition label, given its role in implementing the AI Executive Order. The OSTP should also publish a retrospective on pilot initiatives of data labels such as SAUL to further increase transparency. To ensure the viability of data nutrition labels like SAUL, research from government agencies and leading academic organizations in the artificial intelligence space should inform these efforts. Research efforts should be led by agencies such as the Defense Advanced Research Projects Agency and the National Institute of Standards and Technology’s U.S. AI Safety Institute.
Convene a Participatory Public Council on Generative AI
A participatory public council comprising members of the general public, civil society organizations (CSOs), and non-governmental organizations (NGOs) would ensure that consumer interests, needs, and concerns are addressed. The research process should focus on creating standards for the design and continual integration of feedback from the accountability process in three ways: (1) staying current with the latest developments in the space, (2) adhering to updates in the design and research process, and (3) committing to incorporate consumer feedback. To ensure a policy process that evolves alongside the development of generative AI tools, a four-part cyclical process should be considered, flowing sequentially from development to deployment to engagement to feedback.
For example, the development phase of this cycle could include submission to the SAUL development program, research on data policies and algorithms, and a closed-door quality assurance (QA) process with industry and non-industry stakeholders to evaluate SAUL. Stakeholders involved in the development process would include five to seven private companies in the generative AI space. During the research and QA phases, partners from the private sector, academia, CSOs, and nonprofits would be selected to help guide the work. After successful rounds of QA, members of the AI Safety Institute would determine the deployment of subsequent releases of the SAUL label.
Commit to Consumer Education and Consumer Rights
The U.S. Department of Education, in conjunction with the OSTP, should develop educational benchmarks for K–12 students to learn about the wider field of AI. Moreover, advocacy efforts for consumer rights within the AI field should be developed to ensure that the changing needs of consumers are understood and integrated into the field of explainability.