Senator Marsha Blackburn released a 291-page draft bill on March 18 that would fundamentally reshape how AI and online platforms operate in the United States. The legislation—with the unwieldy title “The Republic Unifying Meritocratic Performance Advancing Machine intelligence by Eliminating Regulatory Interstate Chaos Across American Industry Act”—does something neither party has managed in 30 years: it repeals Section 230.
The White House followed two days later with its own framework, and neither proposal has the votes to pass. But together they reveal where federal AI regulation is heading, and it is not where anyone expected.
What the Bill Actually Does
Section 230 repeal: The bill eliminates the liability shield that has protected online platforms since 1996. Platforms would become legally responsible for user-generated content. This alone would upend business models across the tech industry.
Duty of care: AI developers must “exercise reasonable care” in design and operation to prevent foreseeable harms. Covered entities must conduct regular risk assessments examining how algorithmic systems contribute to psychological, physical, financial, and exploitative harms.
Political bias audits: High-risk AI systems would require annual third-party audits to detect “viewpoint discrimination or discrimination based on political affiliation.” Federal procurement rules would ban AI models that show ideological bias.
Copyright rewrite: The unauthorized reproduction of copyrighted works for AI training does not constitute fair use under the bill. Copyright holders gain subpoena powers to discover how their works were used in training. AI-derivative works cannot receive independent copyright protection.
Voice and likeness rights: Individuals can license their voice and visual likeness for digital replicas. Unauthorized synthetic replicas become illegal—a provision backed by SAG-AFTRA, major studios, and record labels.
Products liability: AI systems are classified as products, exposing developers to claims for defective design, failure to warn, and unreasonably dangerous products. Private rights of action allow individuals to sue developers directly.
Workforce reporting: Publicly traded companies and federal agencies must report AI-related layoffs to the Department of Labor quarterly.
Data center costs: Operators must pay the full cost of energy and water infrastructure needed for their facilities—no impact on residential ratepayers.
The White House Version
The administration’s March 20 framework takes a different approach. It emphasizes limiting developer liability, particularly opposing “open-ended liability” that could trigger “excessive litigation.” It also restricts states’ ability to hold developers accountable for third-party misuse.
The framework addresses child protection, intellectual property, and age verification—overlapping with Blackburn’s bill—but without the aggressive liability expansion. Where Blackburn’s bill treats AI as a product you can sue over, the White House framework treats it as a tool whose makers shouldn’t bear responsibility for how others use it.
Neither document directly endorses the other. The White House statement acknowledged “productive conversations with legislators” without backing Blackburn’s specific proposal.
Republican Division
More than 50 Republican lawmakers signed a letter expressing concern that administration efforts to block state AI legislation represented “an effort to prevent the passage of measures holding the tech industry accountable.”
The tension is visible in how state preemption works—or doesn’t—in Blackburn’s bill. Despite following a December executive order targeting state AI laws, the bill does not preempt “generally applicable law, such as a body of common law.” For child protection provisions, it explicitly allows states to enact stricter laws.
Conservative critics have also called the bill “a kitchen sink of internet and AI regulation that could create more problems than it solves”—not the light-touch approach they expected.
What Industry Thinks
Tech companies have fought for decades to preserve Section 230. But the bill’s coalition is unusual: Google and OpenAI support the voice and likeness protections alongside Hollywood studios and labor unions.
The copyright provisions would directly affect every major AI company. Declaring that AI training isn’t fair use doesn’t just invite lawsuits—it retroactively questions the legal foundation of existing models.
Meanwhile, the administration cut Anthropic off from government contracts for being “woke.” The company is now suing on First Amendment grounds. The political bias audit requirements in Blackburn’s bill suggest Anthropic won’t be the last company to face ideology-based enforcement.
What This Means
This bill is a discussion draft, not a final proposal. It has not been through committee markup, and the Section 230 repeal alone would face coordinated industry opposition.
But the direction matters. Both the bill and the White House framework assume federal preemption of state AI laws—the question is what replaces them. Blackburn’s answer is aggressive federal regulation with extensive liability exposure. The White House prefers limiting liability while blocking states from filling the gap.
Neither approach has clear support. Republicans who want to punish Big Tech like the bias audits and Section 230 repeal. Republicans who want to promote AI development hate the compliance burden. Democrats who want AI accountability support duty-of-care requirements. Democrats who fear censorship worry about government-defined neutrality standards.
The copyright provisions may have the clearest path forward. They align tech companies, Hollywood, and labor unions against a common enemy—unauthorized training on copyrighted works. If anything from this bill becomes law, it might be those sections.
What You Can Do
If you’re an AI developer: The products liability classification is significant. If AI systems become legally equivalent to physical products, insurance and legal exposure change substantially.
If you’re a creator: The voice and likeness provisions have broad industry support. Watch for standalone legislation even if the larger bill stalls.
If you operate platforms: Section 230 repeal remains unlikely, but the conversation has shifted from reform to elimination. Plan for a world with reduced liability protections.
The bill’s 291 pages contain provisions that would please and horrify nearly everyone. That’s usually how legislation dies. But it’s also how negotiations start.