Why Courts Can't Actually Judge State AI Laws: The Evidence Problem Reshaping Regulation
Courts are being asked to make cost-benefit judgments about state AI laws without the data to back those judgments up. With more than 1,500 state-level AI bills introduced across the U.S. in 2026 alone, a constitutional doctrine called the dormant Commerce Clause is emerging as a critical battleground. The problem: judges almost never have the empirical evidence they need to determine whether a state's AI regulations unfairly burden out-of-state companies.
What Is the Dormant Commerce Clause and Why Does It Matter for AI?
The dormant Commerce Clause is a constitutional principle that limits states' ability to pass laws that interfere with interstate commerce. When a state AI law is challenged in court, judges must weigh its costs against its benefits to determine if the burden on out-of-state businesses is "clearly excessive" relative to the law's local benefits. This test, known as Pike balancing after a 1970 Supreme Court case, sounds straightforward in theory. In practice, it's nearly impossible to execute fairly.
The doctrine has three main components designed to protect interstate commerce:
- Anti-discrimination principle: States cannot favor in-state interests over out-of-state ones, though outright discrimination of this kind rarely appears in AI legislation
- Anti-extraterritoriality principle: States cannot regulate conduct occurring entirely outside their borders, such as requiring open-source developers in one state to comply with disclosure laws in another
- Anti-excessive burden principle: States cannot impose burdens on interstate commerce that are clearly excessive relative to the law's local benefits, requiring judges to conduct cost-benefit analysis
The third prong presents the hardest analytical challenge. Under Pike balancing, judges are asked to weigh costs against benefits, but the record before them rarely contains systematic data on either side. As two legal scholars noted, "cost-benefit analysis under Pike is much more demanding than current judicial practice contemplates, and it cannot be done by federal courts with any rigor absent a sea change in the way they assess DCC problems".
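To see why the third prong is so data-hungry, it helps to state the Pike test in the form of a decision rule. The sketch below is purely illustrative: the `PikeRecord` fields, the dollar figures, and especially the numeric `excess_ratio` cutoff are invented for illustration, since the doctrine itself supplies no threshold, which is part of why the test is so hard to apply without evidence.

```python
from dataclasses import dataclass

@dataclass
class PikeRecord:
    """Hypothetical evidentiary record for a challenged state AI law.

    Both fields are illustrative placeholders; real litigation records
    rarely contain such clean numbers, which is exactly the evidence
    gap described above.
    """
    interstate_burden: float  # estimated annual compliance cost ($M)
    local_benefit: float      # estimated annual local benefit ($M)

def fails_pike_balancing(record: PikeRecord, excess_ratio: float = 3.0) -> bool:
    """Return True if the burden is 'clearly excessive' relative to the
    local benefit, i.e. the law would fail Pike balancing.

    excess_ratio is an invented parameter: courts have never announced
    a numeric cutoff, so any concrete value is an assumption.
    """
    if record.local_benefit <= 0:
        # Any nonzero burden with no local benefit is excessive.
        return record.interstate_burden > 0
    return record.interstate_burden / record.local_benefit > excess_ratio

# A hypothetical law costing $120M/yr for $10M/yr in local benefit:
print(fails_pike_balancing(PikeRecord(interstate_burden=120.0, local_benefit=10.0)))
```

The point of the sketch is what it exposes: every input is an empirical estimate that someone must put into the record, and the threshold itself is a normative judgment the doctrine leaves unspecified.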
Why Is This Evidence Gap a Real Problem Right Now?
The stakes are becoming concrete. Last week, xAI filed a federal lawsuit challenging Colorado's AI Act partly on dormant Commerce Clause grounds, making this constitutional question no longer hypothetical. With states introducing AI bills at an accelerating pace, more legal challenges are likely to follow.
The evidence gap creates a double bind for startups and smaller AI companies. If judges lack data to properly evaluate whether state laws are unconstitutional, two bad outcomes become possible: unconstitutional laws with high compliance costs may survive, making it harder for smaller firms to compete against deep-pocketed platforms that can navigate complex regulatory landscapes; or laws that serve legitimate state interests and impose minimal costs on startups may be struck down simply because judges have no evidentiary record to support them.
This matters because state-level AI regulation is exploding. In 2025, more than 1,000 state-level AI bills were introduced across all 50 states. The pace accelerated in 2026, with lawmakers in 45 states introducing over 1,500 AI-related bills by March, surpassing the entire total for 2025.
How Can Policymakers and Courts Fix This?
The solution requires both better evidence and better tools for judges to interpret that evidence. Policymakers can take several concrete steps to address this gap:
- Build evidentiary records: Systematically gather data on the burdens, benefits, and alternatives associated with state AI laws before they face legal challenges, giving judges actual numbers to work with
- Develop analytical frameworks: Create methodologies for quantifying burdens and measuring benefits, even when comparing apples and oranges like economic fragmentation versus cybersecurity improvements
- Institutionalize cost-benefit analysis: Both executive and legislative branches can adopt practices that produce the data and analysis the judicial branch needs, similar to how the federal Office of Information and Regulatory Affairs has conducted cost-benefit analysis of federal regulations since the Reagan administration
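The second step above, quantifying unlike burdens and benefits, typically works by monetizing each category as a range rather than a point estimate. The toy sketch below assumes this range-based approach; all category names and dollar figures are invented for illustration and do not describe any actual state law.

```python
# Toy sketch of aggregating heterogeneous burden/benefit estimates
# into comparable dollar ranges. Every number here is invented.

def aggregate(estimates: dict[str, tuple[float, float]]) -> tuple[float, float]:
    """Sum (low, high) dollar-range estimates across categories ($M/yr)."""
    low = sum(lo for lo, _ in estimates.values())
    high = sum(hi for _, hi in estimates.values())
    return low, high

burdens = {                            # illustrative compliance costs
    "multistate compliance": (40.0, 90.0),
    "audit and reporting":   (10.0, 25.0),
}
benefits = {                           # illustrative monetized benefits
    "consumer harm avoided":      (5.0, 60.0),
    "security incidents avoided": (2.0, 20.0),
}

burden_range = aggregate(burdens)      # (50.0, 115.0)
benefit_range = aggregate(benefits)    # (7.0, 80.0)
print(burden_range, benefit_range)
```

Even in this clean hypothetical, the two ranges overlap, so the record could not show the burden is *clearly* excessive. That is the practical lesson: the bottleneck is the quality of the underlying estimates, not the arithmetic of comparing them.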
The White House's "Ensuring a National Policy Framework for Artificial Intelligence" Executive Order has already tasked the Commerce Department with identifying burdensome state laws and created a Justice Department task force to challenge state laws that may violate the dormant Commerce Clause. Executive branch officials will likely develop data and analytical tools to do this work, and some of that information may eventually reach judges' hands.
Critics of cost-benefit analysis argue that it cannot meaningfully quantify values like human life, health, and environmental protection. But decades of government practice demonstrate that cost-benefit analysis can work even in areas that seem to resist quantification. The real question is not whether cost-benefit analysis is possible, but whether courts will have access to the evidence and tools they need to apply it fairly.
What Does This Mean for AI Innovation and Safety?
The evidence gap creates uncertainty that could harm both innovation and public safety. Without clear data on the costs and benefits of state AI laws, companies cannot predict which regulations they'll face, making it harder to plan investments and operations. At the same time, legitimate state interests in AI safety and consumer protection may be undermined if courts strike down well-designed laws simply because the evidentiary record is incomplete.
As states continue to legislate at an extraordinary pace, the constitutional limits on their authority will increasingly shape the regulatory landscape. The question is whether courts will have the information they need to apply those limits fairly, or whether the evidence gap will lead to unpredictable and potentially unjust outcomes for both companies and the public.