Deceptive computer-generated audio and images of politicians are one of the latest targets for Georgia lawmakers, as they aim to combat election misinformation.
The Georgia state House passed Senate Bill 9, a measure to outlaw political deepfakes, Thursday in a 152-12 vote. It now heads back to the Senate for another vote before final passage.
“We are responsible in so many different ways to 10.7 million Georgians to ensure that when they go into the election box, they can trust that the election box is, in fact, filled with 100% integrity,” said state Rep. Todd Jones, R-South Forsyth.
Rep. Brad Thomas, R-Holly Springs, introduced a nearly identical proposal last year over fears that deceptive deepfakes, such as images, videos or robocalls that mirror the likeness of political candidates, could trick voters and jeopardize their faith in elections. But opponents raised concerns that the measure would limit free speech.
“This is silencing dissent, plain and simple,” said Rep. Charlice Byrd, R-Woodstock.
AI experts who feared widespread deepfake misinformation campaigns would confuse voters during the 2024 presidential election say those fears never materialized as expected.
“Certainly, the reality in the 2024 election didn’t match the concern that was expressed nearly across the board that it was this nightmare scenario,” said Scott Brennen, director of the Center on Technology Policy at New York University.
Still, there were a select few instances that caught people’s attention, like in New Hampshire last year where a deceptive robocall mimicking the voice of then-President Joe Biden discouraged voters from voting in the state primary.
The proposal would make it a misdemeanor to broadcast or publish deceptive information within 90 days of an election with the intent to influence a candidate’s chance of being elected, create confusion about election administration or influence the result. For repeat offenders, it would become a felony.
Repeat offenders would be fined up to $50,000 and face two to five years in prison, and the measure would allow judges to stop individuals from distributing deceptive media. But satire, parody and legitimate journalism would be exempt.
At least 21 states have enacted similar laws limiting or prohibiting the deceptive use of AI in political campaigns and elections, Thomas said.
SB 9 began life as a bill to criminalize AI-generated child pornography. That language was stripped from the bill in a House committee and replaced with the current version targeting political deepfakes. Another bill — House Bill 171 — is nearly identical and awaits action in the Senate.
The last-minute change reflects the lack of transparency in the legislative process, especially late in the Legislature’s 40-day session, which concludes next week. Some of SB 9’s cosponsors said they were not made aware of the change.
“No one has ever come to me and said, ‘Well, you know, we need to look into election deepfake AI as well,’” said Sen. Emanuel Jones, D-Decatur, who is a co-sponsor of SB 9.
Thomas said the deepfake elections measure is a companion to criminalizing the distribution and creation of AI-generated child pornography, which has become a growing problem in Georgia and beyond.
In January, a Pepsi vendor servicing Gilmer County High School in Ellijay was arrested for taking photographs of female students and manipulating them into pornography using artificial intelligence. The vendor, Ronald Richardson, is currently in jail, and the arrest is believed to be the first case of AI-generated child pornography in Georgia. Parents of eight victims, who ranged from ages 12 to 17, filed a lawsuit against Richardson and Pepsi in February.