With the 2024 election on the horizon, the threat of political deepfakes looms large, but states have been slow to enact regulations to combat this emerging issue.
In 2023, only three states passed laws addressing the use of artificial intelligence and deepfakes in political campaigns, underscoring the pressing need for action at the state level.
With the federal government slow to act, state lawmakers are being urged to take the initiative to protect the integrity of elections.
Limited state action in the face of a growing threat
In a year when the potential risks of artificial intelligence and deepfake technology have become increasingly evident, only three states – Minnesota, Michigan, and Washington – have taken steps to address the issue through legislation.
These states enacted laws with bipartisan support, recognizing the urgency of the situation. However, many more states have yet to respond to the evolving landscape of political disinformation.
Challenges hindering state action
Several challenges have impeded states from swiftly tackling the problem of political deepfakes. First and foremost, any regulations must carefully balance the need to combat disinformation with safeguarding First Amendment rights, which makes the legislative process complex and potentially subject to legal challenges.
Additionally, the rapid advancement of generative AI and deepfake technology poses a significant hurdle for lawmakers who may struggle to keep up with the ever-evolving capabilities of these tools. Many legislators lack a comprehensive understanding of these issues, making it challenging to craft effective policies. Moreover, enforcing regulations would require cooperation from major social media platforms, adding another layer of complexity to the process.
Urgent need for state-level solutions
Experts and advocates in the field emphasize that states cannot afford to wait any longer to address the threat posed by deepfakes. The past year has seen a surge in the use of deepfake technology to create convincing but false videos, raising concerns that the 2024 election could be marred by a flood of political disinformation. States must act now to head off the potential consequences of such manipulation.
Deepfakes: A growing menace
Deepfakes are deceptive videos, generated using artificial intelligence, that can convincingly mimic the appearance and speech of real individuals.
The proliferation of deepfake content on the internet has made it increasingly difficult for the public to discern truth from fiction. As a result, there are growing concerns that deepfakes could be used to spread political disinformation during the upcoming election season, potentially undermining the integrity of the electoral process.
In 2023, three states took the lead in addressing the deepfake issue with innovative legislation. These laws can serve as models for other states to consider:
Washington’s Disclosure Requirement: In May, Washington enacted a law mandating disclosure on “synthetic” media used to influence elections.
The law defines "synthetic" media as any manipulated image, audio, or video produced with digital technology to create a realistic but false representation. The disclosure requirement aims to increase transparency around the use of deepfake content in political campaigns.
Minnesota’s Ban on Election-Related Deepfakes: In August, Minnesota passed a law banning the publication, within 90 days of an election, of deepfake media intended to influence that election.
The law sets out criteria for prosecution, including knowledge that the disseminated media is a deepfake, lack of consent from the depicted individual, and intent to harm a candidate or influence the outcome of an election.
Michigan’s Comprehensive Approach: Michigan adopted a law last month that pairs a ban on the distribution of materially deceptive media within 90 days of an election with a disclosure requirement. The ban is waived, however, if the media includes a disclosure stating that it has been manipulated; manipulation is defined separately for images, videos, audio, and text.
The call for urgent action
As the threat of deepfakes continues to evolve, experts and advocates stress the need for immediate state-level action. Daniel Weiner, director of the elections and government program at the nonpartisan Brennan Center, warns against delay, stating that “the really corrosive possibilities [from deepfakes] have fully burst into consciousness in the last year to two years.”
He underscores that effective policy solutions are already available and urges state lawmakers to roll up their sleeves and begin crafting legislation to protect the democratic process.