Is Trump's AI-Generated Taylor Swift Endorsement Illegal?

On Sunday, former President Donald Trump shared a series of memes on Truth Social — the platform owned by his media company — that suggest Taylor Swift and her fanbase are endorsing his candidacy. However, as new legislation takes effect, these images raise significant concerns regarding the use of AI-generated content in political campaigns, especially when they distort a celebrity’s likeness.

Noah Downs, an IP and entertainment lawyer, noted, "I’m currently observing a surge in AI impersonators making false endorsements." These AI-generated endorsements have become so widespread that even “Shark Tank” had to issue a public service announcement warning fans about scams impersonating its investors.

One of Trump’s images features groups of young women wearing “Swifties for Trump” t-shirts. While Swift’s fanbase spans a range of political views, these particular images appear to be AI-fabricated, originating from a satirical post on X. Another meme Trump shared depicts Taylor Swift as Uncle Sam, proclaiming, “Taylor wants you to vote for Donald Trump.”

Although the pop star has yet to publicly address the 2024 U.S. presidential election, she supported the Biden-Harris campaign in 2020 and openly criticized Trump at the time. Recently, some fans read an Instagram post by Swift as a subtle endorsement of Kamala Harris, but that speculation proved unfounded.

As a major figure in pop culture, Swift has already faced significant deepfake problems. Earlier this year, explicit non-consensual AI images of her circulated on X, prompting lawmakers to introduce bills targeting such deepfakes. White House Press Secretary Karine Jean-Pierre also called on Congress to take action.

Fast forward eight months, and the legal landscape surrounding protections against misleading synthetic media is evolving. In Tennessee, where Swift’s corporate office is located, Governor Bill Lee signed the groundbreaking ELVIS Act into law in March, giving artists specific protections against unauthorized AI replicas of their voice and likeness.

“This bipartisan legislation acknowledges the risks AI misuse poses to the public,” Downs stated.

However, because the ELVIS Act is so new, there is no precedent for how it will be applied to protect artists. The law was written primarily with AI-generated audio in mind, such as the viral fake song that replicated Drake’s voice. “While the ELVIS Act is an important step, we need comprehensive national legislation addressing these concerns,” Downs adds. Its relevance here stems largely from Swift’s ties to Tennessee, where she holds business and real estate interests.

Avi D. Kelin, a political law partner at PEM Law, is skeptical that the ELVIS Act applies here, noting that the law focuses primarily on audio rather than imagery. He also raises a separate question: whether federal election rules could come into play.

“The pressing question is whether the Federal Election Commission (FEC), which regulates political communications, will intervene,” Kelin said. However, he anticipates that the FEC is unlikely to release new guidelines for AI-generated political communications before this election cycle concludes.

Meanwhile, the Federal Communications Commission (FCC) is advancing new AI transparency requirements for TV and radio advertising, but those rules do not extend to social media posts from politicians, which remain a crucial channel for campaign communication. Research by the Center for Countering Digital Hate (CCDH) found that AI-generated disinformation on X surged by an average of 130% per month over the past year.

The stakes of these misleading endorsements are heightened by Swift’s immense clout: her backing could meaningfully influence a candidate’s prospects in a close race. According to Morning Consult, over half of U.S. adults consider themselves Taylor Swift fans, and 16% describe themselves as avid fans. Given that only about two-thirds of eligible Americans voted in the 2020 election, a fanbase of that size represents a substantial pool of potential voters.

“The ELVIS Act is new, and its specific implications will need to be defined by the courts,” Kelin concluded. “This could serve as a compelling test case!”
