A state bill introduced during this year’s legislative session seeks to punish distribution of AI-generated sexually explicit images of children with up to 15 years in prison, whether or not the child exists.
Senate Bill 9 comes on the heels of similar bills in legislatures across the country aimed at a crime drawing growing attention: AI-created child pornography. Laws specifically targeting AI child deepfakes have sailed through legislatures, most recently in California and Tennessee. In Georgia, the bill has bipartisan support and is expected to pass. It cleared the House Technology and Infrastructure Innovation Committee without dissent Wednesday and advanced to the House Rules Committee.
Sen. Emanuel Jones, D-Decatur, who co-sponsored the bill, said the legislation was created to prevent and deter the kind of AI-enabled crime being seen across the country.
Credit: Jason Getz/AJC
Within weeks, the issue had come home to Georgia with the arrest of Ronald Richardson, a Pepsi vendor who, authorities say, took advantage of the access he was given to Gilmer County High School to refill soda machines, using it to take photos of girls he encountered. He also convinced girls to share photos and videos of themselves, Gilmer County sheriff’s officials said. What the girls did not know is that he would allegedly use AI to convert their photos into child pornography, which he circulated via the internet. Richardson was charged in January with multiple counts of sexual exploitation for possessing child pornography.
Richardson’s arrest is believed to be the first Georgia case of AI-created child pornography. The arrest confirms for Jones that a new law is needed specifically for this type of activity. “There’s already been one case and it’s just a matter of time before others. We need to get ahead of it,” he said.
In December, one of the girls reported to the school’s resource officer that Richardson had asked her for videos.
Parents of eight alleged female victims filed a lawsuit against Richardson and Pepsi on behalf of their children, who ranged from ages 12 to 17 at the time they say they were victimized.
The investigation began after a student told the school resource officer that she would talk regularly to Richardson when he was at the school and had done so for approximately a year. Richardson had often given her free soft drinks, but this was the first time she had been asked to send him pictures, according to the Gilmer County Sheriff’s incident report.
Gilmer County High School demanded that Pepsi investigate and remove Richardson in September when students reported Richardson taking pictures of them as they walked past him loading the vending machines. Though temporarily removed, Richardson was reinstated to his route and serving the high school because Pepsi was “having difficulty finding other drivers in the area,” according to the lawsuit.
After being reported again in December, Richardson was fired from the company and is currently in jail on multiple counts of sexual exploitation. He was not employed by the district but was given an access badge to freely enter the high school and middle school. Gilmer County Charter Schools Superintendent H. Brian Ridley declined to comment beyond the district’s Jan. 29 news release, which pledged to cooperate with police.
Gilmer County Sheriff Stacy Nicholson said many minors were victimized in the case.
“All of the sexually explicit (nude) photographs of the minor children that Richardson is alleged to have possessed are actually normal snapshots which he captured from various social media pages and then altered (or had altered) using AI to make the images appear nude,” said Nicholson.
Richardson was denied bail. The case was referred to the DeKalb County District Attorney’s Office, which agreed to serve as the prosecuting office on Feb. 5.
Sen. John Albers, R-Roswell, chaired the Senate Study Committee on Artificial Intelligence and also co-sponsored SB 9.
“SB 9 is designed to modernize and strengthen the legal provisions concerning obscenity and the use of emerging technologies in criminal activities, thereby offering better protection to the citizens of Georgia from evolving threats and inappropriate materials,” Albers said in a statement to The Atlanta Journal-Constitution.
At the federal level, legislators are also attempting to curb the issue of AI child pornography. The U.S. Senate unanimously passed the Take It Down Act on Feb. 13, which would criminalize “revenge porn,” the distribution of nonconsensual sexually explicit images and videos to humiliate or retaliate against a victim. The act would also require technology companies to take down nonconsensual images within 48 hours of notice from a victim. The bill is co-sponsored by Ted Cruz, R-Texas, and Amy Klobuchar, D-Minn. First lady Melania Trump recently appeared at a Capitol Hill roundtable and urged lawmakers to pass the bill.
Thomas Kadri, an assistant professor at the University of Georgia School of Law, said there are currently very few, if any, legal requirements for platforms to remove sexual deepfakes or ways for users to ask for their removal.
Beyond addressing the issue criminally, Kadri said, there is also potential for states like Georgia to create laws around civil liability such as giving victims the right to sue their perpetrators for AI deepfake creation and distribution.