Compliance brief / US federal

Federal AI policy. Three documents. No federal law.

No broad federal AI law has passed in the United States. What's in motion: a December 2025 Trump executive order proposing federal preemption, a March 2026 White House legislative framework, and a pending AI LEAD Act. Here's what each says, what they don't, and why state laws still apply.

On this page

  01 The federal landscape today
  02 The December 2025 executive order
  03 The March 2026 White House framework
  04 The AI LEAD Act and pending bills
  05 What this means for your compliance program
  06 How Northbeams maps to this
  07 FAQ

01 / The federal landscape today

Sector law plus voluntary frameworks plus active litigation.

The US has no horizontal federal AI law. What it has is a layered set of rules that overlap with AI use without being AI-specific.

This is the "federal floor is a series of patches" scenario. Compliance teams have to read each AI use against multiple authorities, none of which is fully harmonized with the others.

02 / The December 2025 executive order

"Ensuring a National Policy Framework for Artificial Intelligence."

President Trump signed the executive order in December 2025. It directs the US Attorney General to challenge state AI laws on commerce-clause and federal-preemption grounds. The operational consequences are narrower than the title suggests.

The EO is a litigation and rulemaking strategy. It is not a federal AI law. It cannot, on its own, overturn the Colorado AI Act, Texas TRAIGA, California's AI stack, or any other state law. The legal mechanisms that can overturn state law are an act of Congress (rare) or a federal court ruling that a specific state law violates the US Constitution (slow, fact-specific). Both tracks are now in motion.

Until those tracks resolve, organizations must continue to comply with state AI laws as they stand. The Northbeams editorial position: act on the law that exists; track the litigation; revisit your program if and when state laws fall.

03 / The March 2026 White House framework

A legislative wishlist for Congress.

In March 2026, the White House released a comprehensive national legislative framework for AI. Unlike the executive order, it does not bind anyone on its own. It is the administration's stated priorities for AI legislation, intended to push Congress toward a federal bill the administration would sign.

The framework covers four pillars: child safety online, free speech protections, AI scam enforcement, and intellectual property protections for creators.

The framework signals where federal legislation is most likely to appear first. Compliance programs should track child-safety and IP-watermarking provisions especially closely if they touch the relevant verticals.

04 / The AI LEAD Act and pending bills

Product liability for AI.

The Artificial Intelligence Leadership and Economic Advancement and Development (AI LEAD) Act is the leading pending federal AI bill. It would establish a product liability framework for AI systems, defining the conditions under which providers and deployers can be held liable for AI-caused harm.

Other pending federal bills target narrower issues: AI in elections, deepfake disclosure, foundation-model risk reporting, federal AI procurement standards. Most have moved through committee but not to floor votes. Congress remains divided on the right shape of AI legislation.

Two patterns matter for compliance: liability, not prescriptive rules, is becoming the leading federal framing, and narrow bills keep stalling before floor votes.

05 / What this means for your compliance program

Build for state law. Build for the EU AI Act. Document for liability.

Three operational moves cover the federal uncertainty.

1. Continue to comply with state AI laws.

Until courts rule or Congress acts, the Colorado AI Act, Texas TRAIGA, California's AI stack, the New York RAISE Act, and the Illinois AI interview law remain enforceable. Audit, document, report.

2. Adopt a recognized framework.

NIST AI RMF or ISO/IEC 42001 (or both) gives you a portable evidence base. State laws such as the Colorado AI Act recognize these frameworks as safe harbors. Federal liability legislation, if it passes, will likely defer to similar frameworks.

3. Document everything, signed and dated.

If product liability comes, the impact assessment, the algorithmic discrimination audit, the consumer-notice procedure, and the signed audit log are the artifacts that protect you. Build them now under state-law obligations and they stack against federal liability later.

06 / How Northbeams maps to this

One inventory and one signed log that satisfy them all.

Federal preemption fights aside, every state AI law and every voluntary framework requires the same operational primitives: an inventory of AI in use, per-tool risk classification, documented controls, and signed retention. Northbeams produces all four, across browser, desktop, and CLI.
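The four primitives above can be pictured as one record per tool. A minimal sketch, assuming a hypothetical schema; the field names and values here are illustrative, not Northbeams' actual data model:

```python
from dataclasses import dataclass
from datetime import date

@dataclass(frozen=True)
class AIToolRecord:
    """One inventory entry: what the tool is, how risky it is,
    which controls apply, and when the record was attested."""
    name: str          # the AI tool in use
    surface: str       # "browser" | "desktop" | "cli"
    risk_class: str    # per-tool classification, e.g. "high" / "limited"
    controls: tuple    # documented controls applied to this tool
    attested_on: date  # date of the signed attestation

# Hypothetical inventory with a single entry.
inventory = [
    AIToolRecord(
        name="example-chat-assistant",
        surface="browser",
        risk_class="high",
        controls=("impact-assessment", "consumer-notice"),
        attested_on=date(2026, 5, 1),
    ),
]

# Per-tool risk classification rolls up into the quarterly view.
high_risk = [r.name for r in inventory if r.risk_class == "high"]
```

Whatever the storage format, the point is that each tool carries its classification, controls, and attestation date in one place, so the same record answers a state-law audit and a future federal liability question.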

State + federal inventory

Continuous AI inventory across browser, desktop, and CLI.

If federal preemption never lands, you have your state-law evidence. If it does, you already have what any federal liability framework will reasonably ask for.

Liability-grade documentation

Immutable signed event log.

SHA-256 signed CSV exports. Tamper-evident retention. The defense file you will wish you had.
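Tamper-evident hashing of this kind can be sketched in a few lines. This is a generic illustration of SHA-256 content digests over a CSV export, not Northbeams' actual export pipeline; the function names are hypothetical:

```python
import csv
import hashlib
import io

def export_hashed_csv(rows, fieldnames):
    """Serialize rows to CSV and return (csv_text, sha256_hex).

    Storing the digest alongside the export makes later modification
    detectable: any change to the bytes changes the hash."""
    buf = io.StringIO()
    writer = csv.DictWriter(buf, fieldnames=fieldnames)
    writer.writeheader()
    writer.writerows(rows)
    text = buf.getvalue()
    digest = hashlib.sha256(text.encode("utf-8")).hexdigest()
    return text, digest

def verify_export(csv_text, expected_digest):
    """Recompute the hash and compare against the recorded digest."""
    return hashlib.sha256(csv_text.encode("utf-8")).hexdigest() == expected_digest

rows = [{"tool": "example-llm", "risk": "high", "date": "2026-05-01"}]
text, digest = export_hashed_csv(rows, ["tool", "risk", "date"])
assert verify_export(text, digest)            # unmodified export passes
assert not verify_export(text + "x", digest)  # any edit is detected
```

A bare digest proves integrity only if it is retained out of band or in a tamper-evident chain; a true signature would additionally bind the digest to a key (HMAC or asymmetric signing).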

Framework alignment

Pre-mapped to NIST AI RMF and ISO/IEC 42001.

Whatever federal law eventually overlays the state stack, you're already running on a recognized framework that the law will likely defer to.

Quarterly board report

Auto-generated executive risk-audit PDF.

Sprawl trend, incidents, policy-change history. The board sees the same picture every quarter so the federal noise does not knock you off course.

For the buyer view, the audit-prep page has the full evidence pack and tier comparison.

07 / FAQ

Common questions about US federal AI policy.

Is there a federal AI law in the United States?
Not yet. As of May 2026, no broad federal AI law has been enacted. The federal landscape consists of the December 2025 Trump executive order, the March 2026 White House legislative framework, the pending AI LEAD Act, sector-specific guidance from agencies like NIST and the FDA, and ongoing constitutional litigation over state AI laws.
Did the Trump executive order preempt state AI laws?
No. An executive order cannot overturn state law. The December 2025 EO directed the US Attorney General to challenge state AI laws on commerce-clause and federal-preemption grounds, but that is a litigation strategy, not preemption itself. Until courts rule or Congress acts, state AI laws (Colorado, Texas, California, New York, Illinois) remain enforceable and you must comply.
What is the AI LEAD Act?
The Artificial Intelligence Leadership and Economic Advancement and Development (AI LEAD) Act is one of the leading pending federal AI bills in Congress. It would establish a product liability framework for AI systems, defining when and how providers and deployers can be held liable for harms caused by AI. As of May 2026 it has not passed.
What was in the March 2026 White House framework?
The framework set out the administration's legislative priorities for AI: child safety online, free speech protections, AI scam enforcement, and intellectual property protections for creators. It is a policy document, not legislation, and points Congress at what the administration would sign into law.
Should I wait for federal AI law before building a compliance program?
No. State laws are already enforceable, the EU AI Act applies to companies with EU users, and procurement teams require ISO 42001 or NIST AI RMF alignment regardless of federal status. A compliance program built on the canonical numbers and the safe-harbor frameworks travels well; whatever federal law eventually passes will overlay, not replace, what you already operate.

Build for the law that exists. Track the litigation.

Free to discover. Pay to control. Sentinel ships the audit-ready evidence pack with one-click export. Pre-mapped to state law and recognized frameworks.