A teenage boy has been arrested after allegedly creating and disseminating explicit AI-generated pornographic images of around 50 female students at a grammar school northwest of Melbourne, Australia.
According to Bacchus Marsh Grammar’s principal, Andrew Neal, multiple students from Year 9 to Year 12 were targeted: their facial images were sourced from social media and manipulated with artificial intelligence to produce “obscene pornographic images,” described as “mutilated” and “incredibly graphic.”
The school is providing support to the affected students, with Neal expressing dismay over the situation, particularly its impact on young girls.
“It’s appalling. It strikes at the core of our students, especially girls at this developmental stage,” Neal stated in an interview with ABC. “They deserve to learn and navigate their lives without encountering such unacceptable behaviour.”
Neal suggested that the responsible party, likely a student or group from Bacchus Marsh Grammar, should face severe consequences, emphasising the seriousness of the harm caused by such behaviour.
“These actions are not humorous. They are malicious and must be addressed accordingly,” Neal emphasised. “We need to respond firmly to such behaviour.”
As reported by The Guardian, Victoria police confirmed an ongoing investigation, revealing that several images were transmitted to an individual in the Melton area via an online platform on 7 June 2024.
A woman named Emily, who is both a parent of a student at Bacchus Marsh Grammar and a trauma therapist, recounted seeing the photos when she picked up her 16-year-old daughter from a sleepover.
Emily had a bucket in the car for her daughter, who was “sick to her stomach” on the drive home, she told ABC Radio Melbourne. “She was very upset, and she was throwing up. It was incredibly graphic.”
Emily said her own immediate reaction was much the same. “I mean they are children… The photos were mutilated, and so graphic. I almost threw up when I saw it,” she said, continuing: “50 girls is a lot. It is really disturbing.”
Emily said the victims’ Instagram accounts were private. Though her daughter did not appear in the deepfake images, “there’s just that feeling of ‘will this happen again?’ It’s very traumatising.”
Regrettably, incidents like this are not new. A quiet town in southern Spain was rattled by the discovery of AI-generated nude images circulating online depicting local young girls, with approximately 20 girls, aged between 11 and 17, coming forward as victims of the exploitation. Those images were originally sourced from the victims’ social media profiles without their consent, then altered using an artificial intelligence-powered tool capable of transforming innocent photos into explicit content.
Acting principal Kevin Richardson affirmed that Bacchus Marsh Grammar takes the matter seriously and promptly engaged Victoria police. He further noted that while the teenage boy was initially arrested in connection with the incident, he has since been released pending ongoing inquiries.