
WASHINGTON, D.C. — In a decisive move to secure the future of human creativity, the Nashville Songwriters Association International (NSAI) has launched an intensified lobbying effort on Capitol Hill this week. Facing what leadership describes as an "existential crisis" posed by unchecked generative artificial intelligence, the organization is rallying bipartisan support for a trio of federal bills: the CLEAR Act, the COPIED Act, and the TRAIN Act.
The delegation, led by NSAI Executive Director Bart Herbison, arrived in Washington with a unified message: the survival of the professional songwriter depends on establishing immediate, enforceable guardrails around AI technology. At the heart of their advocacy is a simple yet comprehensive framework known as the "Four P's"—Permission, Payment, Proof, and Penalties—which outlines the non-negotiable rights creators need in the digital age.
During meetings with the House Judiciary Committee and key Senate leaders, NSAI representatives argued that current copyright laws are insufficient to handle the speed and scale of AI ingestion. They presented the "Four P's" not just as policy requests, but as fundamental ethical standards for the AI industry.
The NSAI's visit coincides with a flurry of legislative activity in February 2026, marking a pivotal moment for IP protection. The songwriters are throwing their weight behind three specific pieces of legislation that, taken together, would create a safety net for the creative industry.
The CLEAR Act, introduced just days ago on February 12 by Senators Adam Schiff (D-CA) and John Curtis (R-UT), addresses the critical "Proof" component by mandating public disclosure of training data. Simultaneously, the TRAIN Act and COPIED Act serve complementary roles in transparency and content integrity.
The following table details the legislative landscape NSAI is navigating:
| Legislation Name | Primary Sponsors | Core Mechanism | Impact on Songwriters |
|---|---|---|---|
| CLEAR Act (Copyright Labeling and Ethical AI Reporting) | Sens. Adam Schiff (D-CA), John Curtis (R-UT) | Requires AI companies to submit a detailed summary of copyrighted works in training datasets to the Copyright Office 30 days pre-release. | Provides the "smoking gun" needed to prove infringement; creates a searchable public database of ingested lyrics and melodies. |
| COPIED Act (Content Origin Protection and Integrity) | Sens. Maria Cantwell (D-WA), Marsha Blackburn (R-TN) | Mandates NIST standards for content provenance and watermarking; prohibits the removal of origin data from digital files. | Prevents AI from stripping credit/metadata from songs; allows creators to "attach" conditions to their work that travel with the file. |
| TRAIN Act (Transparency and Responsibility for AI Networks) | Rep. Madeleine Dean (D-PA), Sen. Peter Welch (D-VT) | Establishes a subpoena process for copyright holders to access AI training records, modeled after internet piracy laws. | Grants legal tools to pierce the "black box" of AI models; enables discovery without needing to file a full infringement lawsuit first. |
"We are not Luddites; we understand technology evolves," stated Bart Herbison in a press briefing following a session with the Senate Judiciary Subcommittee on Intellectual Property. "But we cannot allow a trillion-dollar industry to be built on the unpaid, stolen work of American songwriters. If an AI can churn out a country ballad in the style of a Nashville veteran because it was trained on their entire catalog without a dime changing hands, that isn't innovation—it's theft."
The urgency of this push is underscored by the rapid deployment of multimodal AI models capable of generating high-fidelity audio that mimics specific vocalists and songwriting structures. The COPIED Act is particularly vital here, as it seeks to protect the integrity of a file. By making it illegal to remove digital watermarks or provenance information, the bill ensures that a song's "pedigree" remains intact as it moves across the web, preventing AI scrapers from treating it as orphaned data.
For the broader music industry, the passage of these acts represents a potential turning point. If the CLEAR Act passes, the era of opaque datasets—where companies like OpenAI and Anthropic claim trade secret protection over their training materials—would effectively end. This transparency is the prerequisite for the "Payment" pillar of the NSAI's framework; once usage is proven, licensing negotiations can begin in earnest.
However, resistance remains high from the technology sector, which argues that such stringent reporting requirements could stifle American AI innovation and cede ground to foreign competitors with laxer IP laws. Tech lobbyists add that the sheer volume of data makes itemized reporting burdensome.
The NSAI remains undeterred. With the TRAIN Act providing the legal mechanism to demand answers and the CLEAR Act mandating proactive disclosure, the "Nashville coalition" is betting that 2026 will be the year federal law finally catches up to the reality of generative AI.
As the legislative session heats up, the eyes of the global creative community are fixed on D.C., watching to see if the "Four P's" will become the law of the land or if the "Wild West" of AI data scraping will continue unabated.