
Taylor Swift’s deepfake fight with Donald Trump exposes a real legal wrinkle

By Jenn Brice
September 11, 2024, 7:07 PM ET
Taylor Swift performing in concert. Kevin Winter/Getty Images for TAS Rights Management

In her endorsement of Vice President Kamala Harris on Tuesday, Taylor Swift sounded the alarm about using artificial intelligence to spread misinformation in political campaigns. 


Last month, Donald Trump posted images on social media of the pop star dressed as Uncle Sam in patriotic red, white, and blue garb, meme-ified to read “Taylor Swift Wants You to Vote for Donald Trump.” He also shared separate images showing crowds of spray-tanned blondes wearing “Swifties for Trump” T-shirts. 

The episode was just the latest example of how political campaigns could use AI to mislead voters, whether by making it seem as though a candidate has a celebrity’s support or by impersonating an opposing candidate. But even as concerns mount that sophisticated, easy-to-use deepfake technology looms over the presidential election, there are no federal rules yet that specifically dictate how campaigns can or can’t use AI.

Just yesterday, the Federal Election Commission said that it wouldn’t vote on a proposed rule about AI, including deepfakes, ahead of this year’s election. Chairman Sean Cooksey wrote in a memo that deceptive campaign ads are always a violation of the Federal Election Campaign Act, regardless of whether AI or another technology was used to make them.

In the meantime, the Federal Communications Commission is looking to step in with requirements that the use of AI be disclosed in political ads. However, Lisa Gilbert, copresident of nonprofit consumer advocacy group Public Citizen, notes that the telecommunications regulator only oversees broadcast and telephone campaign communications. That means the FCC rules wouldn’t apply to social media posts like the one Trump shared.

Gilbert described Public Citizen’s push for guardrails around AI in elections as “basically throwing everything at the wall to see where things will stick,” with the ultimate goal being meaningful federal legislation. 

Although there are no federal rules governing how campaigns can use AI, more than 20 states already have election-related deepfake laws enacted or pending. California Gov. Gavin Newsom, for example, signaled in July his support for legislation that would outlaw manipulated political ads, in response to Tesla CEO Elon Musk reposting an altered video of Kamala Harris on social media.

And while it’s unclear how many people were fooled by Trump’s Taylor Swift deepfake, the U.S. General Services Administration reports that Swift’s Instagram post on Tuesday endorsing Harris for president sent more than 330,000 people to vote.gov, the federal voter-registration site.

“It really conjured up my fears around AI, and the dangers of spreading misinformation,” Swift wrote in her Instagram post about the deepfake Trump shared. While the singer is no stranger to weighing in on politics and getting out the vote, she made it clear that AI’s role in this election felt particularly spooky to her after Trump manipulated her image.

“It brought me to the conclusion that I need to be very transparent about my actual plans for this election as a voter,” Swift continued. “The simplest way to combat misinformation is with the truth.”

The star went on to endorse Harris and her running mate Tim Walz, referring to their support for causes like LGBTQ+ rights and access to reproductive care and abortion. 

Someone like Swift may be able to quickly upend speculation about her political leanings by posting to hundreds of millions of fans, but the average American doesn’t have the same control over their image, Gilbert notes. 

“Unfortunately, for most, there isn’t that kind of recourse,” she said. That makes deepfakes an even more harmful tool when they’re used to distort the image of a politician running in a local election, or of the average teenager posting on social media.
