Taylor Swift voices concerns over AI deepfakes amid election endorsement

Taylor Swift has officially endorsed US Vice President Kamala Harris for the upcoming presidential election after AI-manipulated images falsely depicted her supporting Donald Trump. Swift said her decision to speak out in support of the candidate was informed by the worrying misuse of artificial intelligence in political communication.

In an Instagram post shortly after the vice-presidential debate, Swift reaffirmed her support for Harris, saying the candidate stands up for issues important to her, including LGBTQ+ rights and access to reproductive health care.


Alongside the endorsement, Swift used her Instagram post to warn about the dangers of AI deepfakes. She pointed to AI-altered photos shared by Trump's campaign that depicted her as supporting the former president: one showed her wearing a "Swifties for Trump" t-shirt, and another falsely portrayed her in an Uncle Sam outfit backing Trump.


Political campaigns increasingly exploit AI-generated content

The use of AI in political campaigns has become widespread, raising concerns about voter deception. Some politicians, including Trump, are said to have used AI-generated content to spread misinformation. In response, several AI platforms have tightened their policies: Midjourney has prohibited the creation of images of political leaders, and ChatGPT and Google Gemini have introduced safeguards against election misinformation. Grok, by contrast, has been accused of giving misleading information about the election.

As previously highlighted by Cryptopolitan, Lingo Telecom, a voice service provider, recently reached a $1 million settlement with the Federal Communications Commission (FCC). The company was accused of transmitting AI-generated robocalls that sought to sway voters ahead of the 2024 New Hampshire primary election.

The settlement followed an incident on January 21, 2024, when several New Hampshire residents received automated phone calls featuring a voice that resembled President Joe Biden's. The calls told recipients that if they voted in the state's presidential primary, they would not be allowed to vote in the November general election.

It is worth noting that some U.S. states have enacted laws governing the use of deepfakes in elections. California recently passed several bills aimed at regulating the AI industry and, more specifically, deepfakes.

This is in line with a national trend: other states, including Hawaii, Louisiana, and New Hampshire, have passed similar deepfake laws. Public Citizen, a consumer advocacy organization, notes that 20 states have enacted laws governing the use of AI in elections, with more legislation in the pipeline.
