Trump using AI images of Taylor Swift highlights a new era of election disinformation

Swift probably won’t like Trump using her reputation to falsely claim she endorsed him, but rules on AI in political campaigns are murky.
Former President Trump shared multiple fake images of young women in “Swifties for Trump” shirts, including an AI-generated image of Taylor Swift in Uncle Sam attire urging people to vote for him. (Courtesy of Arturo Holmes, Brandon Bell; Getty Images)

Credit: Arturo Holmes, Brandon Bell; Getty Images


This story was originally published by The 19th.

On Sunday, former President Donald Trump shared multiple fake images of mostly young, white, blond women wearing “Swifties for Trump” T-shirts and clutching iced coffees. He included an AI-generated image of Taylor Swift dressed in Uncle Sam regalia, imploring the American people to vote for Trump.

One of the images is clearly marked “satire” and was posted by a popular conservative influencer on X the day before. Another is actually a real picture of Jenna Piwowarczyk, a freshman at Liberty University who wore a handmade “Swifties for Trump” T-shirt to his rally in Racine, Wisconsin, on June 18.

Swift had not endorsed Trump, but he declared “I accept!” in his post, implying that maybe she had. The message couldn’t be further from the truth: the pop star made her support for the Biden-Harris campaign clear in 2020 and tweeted at Trump, “We will vote you out in November.” Trump likely would not face legal repercussions under campaign law, though he could for using Swift’s likeness. The danger, policy experts say, is less about whether people will genuinely mistake the images as real — the Taylor Swift post is obviously an illustration — than about the overwhelming quantity of disinformation and how quickly it can spread on social media.

The largest AI-related threats to the election come not from star-spangled images, but rather misinformation about election protocols. Several secretaries of state sent a letter to Elon Musk asking him to correct misinformation about ballot deadlines spouted by Grok, his company X’s chatbot. While a little over 800,000 users had access to the chatbot, the officials claimed the false information circulated to millions of people on the platform.

Only some states have their own guidelines about the use of artificial intelligence in campaigning, and clear directives from the federal government might not be coming. Just last week, in a Wall Street Journal editorial, Federal Election Commission Chair Sean Cooksey proposed dropping any potential rulemaking on the use of artificial intelligence in political ads. Cooksey argues the agency has neither the congressional authority nor technical expertise to regulate how political campaigns use artificial intelligence.

Since nonconsensual AI-generated sexually explicit images of Swift went viral in January, there has been a renewed push to provide federal recourse for victims of computer-generated image-based sexual abuse, often called deepfakes. The White House has been pushing for solutions as abuse has escalated in schools.

The bipartisan DEFIANCE Act would allow victims of computer-generated image-based sexual abuse to sue the creators of the images for damages. It passed the Senate in July and now awaits action in the House.

Sens. Ted Cruz, a Texas Republican, and Amy Klobuchar, a Minnesota Democrat, in June introduced the TAKE IT DOWN Act, which would also require platforms to take down image-based sexual abuse within 48 hours. The bill is currently being evaluated by the Senate Committee on Commerce, Science and Transportation.

States have been trying to tackle the issue for years, most commonly in the context of nonconsensual intimate image sharing, often inaccurately called “revenge porn.”

Artificial intelligence regulation, with an eye toward equity and preventing violence against women, was included in the official Democratic National Committee platform released Sunday ahead of the convention in Chicago.

The platform also promises to ban AI-generated voice impressions. Earlier this year, thousands of New Hampshirites received a robocall with a voice purporting to be that of President Joe Biden discouraging them from voting in the state’s primary the next day. The call’s creator is facing criminal charges and a large fine.

While laws on using generative artificial intelligence in political campaigns remain somewhat piecemeal, Swift could take action against Trump’s use of her likeness. Tennessee, where Swift’s business operations are based, recently passed a bill protecting artists’ property rights in their name, likeness and voice, as reported by 404 Media. Swift could also pursue action under defamation laws.

Swift became more political in 2016 after years of staying on the sidelines. “I need to be on the right side of history,” she said in her 2020 documentary “Miss Americana.” She endorsed the Biden-Harris campaign with a note on X saying she was cheering for Kamala Harris in the vice presidential debate. Also in 2020, she posted a real picture of herself holding a plate of cookies emblazoned with “Biden-Harris.”

Swift has not yet endorsed a candidate in the 2024 presidential race.



MEET OUR PARTNER

Today’s story comes from our partner The 19th. The 19th is an independent, nonprofit newsroom reporting at the intersection of gender, politics and policy. Visit them online at 19thnews.org or on Instagram @19thnews.

If you have any feedback or questions about our partnerships, you can contact Senior Manager of Partnerships Nicole Williams via email at nicole.williams@ajc.com.