New Senate Bill Aims to Safeguard Arts and Journalism Content from AI Exploitation

A bipartisan coalition of senators has introduced a bill intended to protect the rights of artists, songwriters, and journalists by preventing the unauthorized use of their content to train AI models or generate AI content. The legislation, known as the Content Origin Protection and Integrity from Edited and Deepfaked Media Act (COPIED Act), also aims to make AI-generated content easier to identify and to address the growing prevalence of harmful deepfakes.

Key figures in this initiative include Senate Commerce Committee Chair Maria Cantwell (D-WA), Senate AI Working Group member Martin Heinrich (D-NM), and Commerce Committee member Marsha Blackburn (R-TN).

The COPIED Act requires companies that develop AI tools to allow users, within two years, to attach content provenance information to their digital content. This provenance information is machine-readable documentation detailing the origins of digital assets, such as photographs and news articles. Under the bill, any content carrying provenance information could not be used to train AI models or to produce AI-generated content.
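The bill does not prescribe a specific data format, leaving the details to standards that NIST would develop. Purely as an illustration, a minimal sketch of what a machine-readable provenance record might look like appears below; the field names and structure are hypothetical, not drawn from the bill or any existing standard.

```python
import hashlib
import json
from datetime import datetime, timezone


def build_provenance_record(asset_path: str, creator: str, license_terms: str) -> dict:
    """Build a hypothetical machine-readable provenance record for a digital asset.

    Field names are illustrative only; the actual format would be defined
    by the standards the COPIED Act directs NIST to develop.
    """
    with open(asset_path, "rb") as f:
        digest = hashlib.sha256(f.read()).hexdigest()
    return {
        "asset": asset_path,
        "sha256": digest,          # ties the record to the exact file contents
        "creator": creator,
        "license": license_terms,  # e.g. "no AI training without written permission"
        "created_at": datetime.now(timezone.utc).isoformat(),
    }


if __name__ == "__main__":
    record = build_provenance_record("photo.jpg", "Jane Doe", "No AI training or generation")
    print(json.dumps(record, indent=2))
```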

The legislation gives content creators, including journalists, artists, and songwriters, control over their work. It lets them set clear terms of use for their content, including compensation, and grants them the legal right to sue platforms that use their work without permission or tamper with content provenance information.

Additionally, the COPIED Act requires the National Institute of Standards and Technology (NIST) to develop guidelines and standards for content provenance information, watermarking, and synthetic content detection. These standards will help identify whether content has been generated or altered by AI and trace the origins of AI-generated content.

Senator Cantwell highlighted the importance of the COPIED Act, stating in a press release, “This bipartisan initiative will provide much-needed transparency regarding AI-generated content. It will empower creators, including local journalists and artists, by establishing a provenance and watermark framework that is essential for controlling their work.”

The COPIED Act has been endorsed by several artists' and media organizations, including SAG-AFTRA, the National Music Publishers' Association, The Seattle Times, the Songwriters Guild of America, and the Artist Rights Alliance.

The introduction of this bill coincides with a wave of legislative efforts aimed at regulating AI technology. Recently, Senator Ted Cruz introduced the Take It Down Act, which would hold social media platforms such as X and Instagram responsible for removing deepfake pornography. That bill responds to the spread of AI-generated explicit images of celebrities circulating on social media.

In May, Senate Majority Leader Chuck Schumer unveiled a comprehensive roadmap for AI regulation that focuses on boosting funding for innovation, addressing the use of deepfakes in elections, and leveraging AI to enhance national security, among other priorities. Earlier this year, Axios reported that state legislatures were introducing roughly 50 AI-related bills per week, with a total of 407 AI bills proposed across more than 40 states as of February, up from 67 such bills introduced the previous year.

In response to the rapid development and adoption of AI tools, President Joe Biden issued an executive order last October aimed at establishing safety and security standards for AI technologies. The order requires AI developers to share their safety test results and other critical information with the government before releasing their systems to the public. Notably, former President Donald Trump has pledged to repeal the executive order if he is re-elected.
