Draft Order Would Curb State AI Laws


A draft federal order would direct the US Department of Justice to sue states that pass laws regulating artificial intelligence, according to a document obtained by WIRED. The reported directive signals a sharp shift in federal-state relations over AI policy. It also raises urgent questions about who sets the rules for fast-growing technologies used in schools, workplaces, and elections.

The draft, which has not been officially released, would put Washington at odds with state lawmakers who have moved ahead with their own rules. It would test the limit of federal power in the absence of a sweeping national AI law. It could also reshape how companies develop and deploy AI across the country.

What the Draft Says

“The draft order, obtained by WIRED, instructs the US Justice Department to sue states that pass laws regulating AI.”

The language suggests a coordinated federal push to challenge state statutes in court. It is not clear which state laws would be targeted or on what legal theory the Justice Department would rely. The move would likely hinge on arguments about interstate commerce, preemption, or conflicts with existing federal policy.

Why This Matters Now

States filled a policy vacuum as Congress struggled to pass comprehensive AI legislation. In 2024, Colorado enacted a broad AI accountability law set to take effect in 2026. California lawmakers debated sweeping rules for developers of large models. Other states advanced bills on deepfakes, hiring tools, and consumer protection.

At the federal level, the White House issued an executive order in 2023 that directed agencies to set safety and civil rights guardrails. Regulators such as the Federal Trade Commission signaled they would apply existing consumer protection laws to AI. But no national statute sets uniform standards for data, safety, or liability.


Against that backdrop, a directive to sue states would be a dramatic effort to slow or stop the patchwork of rules. It could clear the way for a single federal approach, if one emerges.

Legal Questions and Possible Strategies

Legal scholars say the government could argue that state AI laws burden interstate commerce by imposing different standards on software used nationwide. Another path could be conflict preemption if state requirements clash with federal policy. Without a clear federal statute, those arguments may face tough scrutiny in court.

States would likely defend their authority to protect residents from discrimination, privacy harms, and safety risks. They could point to long-standing powers over consumer protection, labor, and civil rights. Courts may weigh the specific scope of each state law rather than issue a blanket ruling.

Industry, Civil Rights, and Public Interest Concerns

Major AI developers often warn that a patchwork of state rules raises costs and creates confusion. Some firms have urged Congress to set national standards. Civil rights groups and labor advocates argue that state laws are needed now to curb bias in hiring, housing, and credit decisions, and to address deepfakes ahead of elections.

  • Tech firms seek clear, uniform rules to streamline compliance.
  • States prioritize local protections for consumers and workers.
  • Advocates push for transparency, auditability, and safety checks.

Litigation led by the Justice Department could freeze or overturn state protections before federal rules are in place. That outcome would leave regulators relying on existing laws not tailored to AI.

What Could Happen Next

If the draft becomes official policy, early targets could be broad accountability statutes or specific rules for high-risk systems. Courts would then set early precedents shaping the balance between state authority and federal oversight. Companies might delay compliance plans while cases move forward, creating more uncertainty for users and consumers.


Congress could also respond by advancing national legislation to preempt states in some areas while preserving certain local protections. Federal agencies may expand guidance on safety, discrimination, and data security to fill gaps during litigation.

The reported draft order points to an intensifying fight over who writes the rules for AI. States moved first, and the federal government may now try to rein them in. The outcome will affect how quickly protections arrive, how uniform they are across the country, and who bears responsibility when systems fail. Watch for court filings, agency guidance, and any fresh push in Congress as the next markers of where AI policy is headed.

Deanna Ritchie
Managing Editor at DevX

Deanna Ritchie is a managing editor at DevX. She has a degree in English Literature, has written more than 2,000 articles on getting out of debt and mastering your finances, and has edited over 60,000 articles over her career. She has a passion for helping writers inspire others through their words. Deanna has also been an editor at Entrepreneur Magazine and ReadWrite.
