California school investigates distribution of AI-generated nude photos of students
A California high school is investigating reports that students shared nude or otherwise inappropriate photos created with artificial intelligence, just one month after another Southern California school district expelled five students for distributing AI-generated nude photos of classmates.
Laguna Beach High School, part of the Laguna Beach Unified School District, is the latest school caught up in an incident involving the distribution of inappropriate AI-generated images. The high school has not revealed the names of the students responsible for the photos.
Principal Jason Alleman notified parents in a letter last Monday about the investigation, stating that “these incidents can have far-reaching impact on our campus culture,” according to The Los Angeles Times. The Laguna Beach Police Department is assisting the school with its investigation.
“These actions not only compromise individual dignity but also undermine the positive and supportive environment we aim to foster at LBHS,” Alleman wrote.
Students shared the AI-generated photos through text messages, according to The Los Angeles Times. The father of one of the students suspected of creating the images defended his son, claiming that the 17-year-old suffered a brain injury while snowboarding in 2019.
“He’s really, really sorry,” the father said, explaining that the teenager struggles with impulse control as a result of the brain injury.
The family’s attorney, Jacqueline Goodman, called it “appalling” that news vans were parked outside the school and asked journalists to refrain from publicizing the incident.
“Let the system take care of it as it ought to. We’re all in a sort of fact-gathering situation and it’s just not appropriate to be outing him, staying at his school and essentially convicting him before authorities have any way to consider this,” Goodman told reporters. “It would be unfortunate to treat an adult like this, but particularly appalling to treat a disabled child like this.”
Anakaren Cárdenas Ureño, the director of communications for Laguna Beach Unified School District, told The Christian Post that schools are required by law to “keep student discipline matters confidential to protect the privacy and well-being of our students.”
“The safety and privacy of our students is our top priority and each incident is handled on a case-by-case basis considering the individual circumstances of the situation,” the spokesperson stressed.
In a statement shared with the media, the school district said it recognizes “the profound impact that recent incidents can have on school culture” and is focused on providing “immediate support to our students.” Last week, the school hosted panel discussions for students across grade levels “to cultivate an open, ongoing conversation on these critical issues.”
"The school leadership team received hundreds of questions from students and is developing an FAQ with relevant resources that will be shared with students and families to continue these discussions," the statement reads. "Students also requested ongoing social-emotional support on campus to help process these experiences and continued education regarding the appropriate use of AI, including how to ethically integrate its use into schoolwork and homework assignments."
The incident comes after the Beverly Hills Unified School District voted in March to confirm the expulsion of five students accused of using AI to create and share nude images of their classmates.
The five students attended Beverly Vista Middle School, The Los Angeles Times previously reported. According to Superintendent Michael Bregy, they were the “most egregiously involved” in the incident.
The AI-generated images reportedly featured the faces of real students on simulated nude bodies. The district said that the 16 victims were in the eighth grade and that the images circulated among students at the middle school in February.
California laws concerning the possession of child pornography and the nonconsensual sharing of nude photos do not apply to AI-generated images, an issue that groups like the National Center on Sexual Exploitation have highlighted, the newspaper notes.
In a February blog post, the anti-sexual exploitation group urged Congress to pass legislation to protect people against AI-generated pornography, also known as “deepfake pornography.” According to a December 2023 Home Security Heroes study cited by NCOSE, the availability of deepfake pornography increased by 550% between 2019 and 2023.
Last month, a Christian school teacher in Florida was arrested for allegedly using yearbook photos to create sexually explicit content featuring children.
Samantha Kamman is a reporter for The Christian Post. She can be reached at: samantha.kamman@christianpost.com. Follow her on Twitter: @Samantha_Kamman