The App Store Accountability Act Continues to Pose Serious Concerns for Our Privacy, Security, and Free Expression
Blog Post
March 5, 2026
Today, the House Energy and Commerce Committee is marking up a wide range of youth online safety bills that touch everything from social media and gaming to the use of chatbot companions.
The markup will include H.R. 3149, the App Store Accountability Act (ASAA), as well as a new amendment that would update and replace ASAA.
When the ASAA was first introduced last year, the Open Technology Institute (OTI) opposed the bill because its sweeping age verification component would require everyone who uses an app store to verify their age, regardless of which app they intend to use. As we explained then, this was “like requiring every person shopping at a grocery store to provide ID upon entering the store, regardless of whether they intend to buy chips, fruit, or alcohol.” Age verification requirements—especially those blocking access to a wide range of speech, as seen in ASAA—undermine everyone’s privacy, online security, and constitutional rights.
These core concerns remain for both the latest version of ASAA and the proposed amendment.
ASAA Creates Privacy and Security Vulnerabilities
A Committee fact sheet circulated among staffers confidently states that “age verification can be done in a privacy-security-protective way.” OTI has written about promising advances on this front, but the digital ecosystem is not yet ready to support it at scale. More importantly, the ASAA’s proposed verification architecture would create major privacy and security vulnerabilities by requiring app stores to collect age information and then share that sensitive information with every app developer, regardless of whether an app actually needs that data.
Requiring people to hand over sensitive personal information, like government IDs, credit cards, or even biometric data, in order to verify their age online puts that information at risk of being exposed, stolen, misused, or subject to surveillance. In just the past year, we’ve seen these risks dramatically materialize. In July 2025, hackers exposed 13,000 selfies and photo IDs used to verify account holders from the Tea Dating Advice app. In October, Discord found that 70,000 users may have had their government-ID photos exposed; they were submitted as part of the platform’s age-gating process.
The Committee fact sheet highlights that ASAA “state[s] that submitting a government ID is not required.” But the bill’s other requirements create incentives that strongly favor exactly that outcome. ASAA provides safe harbors for app developers but not for app stores, which must capture granular age categories and are liable for accuracy failures. These mandates are likely to push platforms toward reliance on government-issued IDs, with serious consequences for people from vulnerable communities and for people seeking content that is often stigmatized.
ASAA Raises Constitutional and Free Expression Concerns
ASAA’s overly broad age verification requirement—applied to accessing every app—could chill speech and prevent people without acceptable forms of ID from accessing online spaces and content that they otherwise have a right to access. The rollout of the UK’s Online Safety Act demonstrated how age verification requirements can fuel censorship. In the wake of its implementation, news, journalistic content, and speech otherwise not intended to be blocked were age-gated.
The Committee fact sheet suggests that the ASAA “follows the approach in Texas’s age verification law for pornography that the Supreme Court upheld.” But the qualifier about the type of content is key. In FSC v. Paxton, the Court’s ruling narrowly permits age verification for explicit content. The Supreme Court has not issued a constitutional blank check for age verification applied to all apps and content types, yet that is precisely how the ASAA would operate.
This critique isn’t theoretical. In CCIA v. Paxton, a federal court has already temporarily blocked the Texas App Store Accountability Act on First Amendment grounds precisely because “the categories of speech it restricts are so exceedingly overbroad.” Judge Pitman’s order explains that requiring users to verify their age to download general-interest apps is “akin to a law that would require every bookstore to verify the age of every customer at the door.” The analogy powerfully captures the overbreadth problem that is also at the heart of the ASAA. (Similar litigation against Utah’s version of an app store bill is making its way through the courts.)
ASAA Creates Major Circumvention Gaps
These constitutional and privacy risks are even more troubling in light of the major circumvention problems that jeopardize the ASAA’s core safety goals. Minors can readily access much of the content that the bill’s drafters and many parents are eager to restrict via web browsers or, under the amended version of ASAA, through third-party app stores that no longer meet the definition of covered app stores.
This is precisely why OTI has argued that age verification requirements should be applied in the least restrictive manner, with privacy-preserving techniques and strong data minimization rules, only to those apps and websites offering legally age-restricted content and only for the users seeking to access it. This approach is not just privacy- and security-protective; it also allows for targeted interventions at the content level without chilling speech across the internet.
What’s Next?
As markup begins, it’s worth remembering that the Committee has less restrictive templates to consider. While not perfect, the Parents Over Platform Act (POPA)—which the Committee chose not to mark up—offers a less restrictive, more privacy-forward approach to implementing youth safety guardrails at the app store level. POPA would require app stores to generate an age signal (which indicates only whether a user is an adult or a minor) based on a user’s self-declared age, rather than strict age verification. Unlike under ASAA, app stores would only be required to make this age signal available to apps that offer a different experience to young people or are labeled for adults only. Importantly, they would share this age signal only when an account holder or the account holder’s parent has agreed to share it.
POPA’s voluntary approach to sharing age signals, its limitation of sharing to a subset of apps, and the bill’s other data security requirements are elements worth closer consideration and improvement. The ASAA, meanwhile, would magnify privacy and security vulnerabilities, chill free expression, and leave open circumvention gaps that imperil its stated safety goals.