Celebrity interviewer Bobbi Althoff has become the latest victim of AI deepfake porn made with her likeness. The 26-year-old podcaster’s name was trending on X, formerly Twitter, after an explicit video purporting to show her masturbating naked on camera went viral. The fabricated clip is sexually graphic and reportedly runs up to four minutes long.
Many of Althoff’s fans quickly spotted that the clip was fake and, given how widely it spread, called for a serious crackdown on deepfake images and videos circulating online.
The Real Good Podcast host herself has also come forward to clarify that the explicit video of her is fake.
The clip featured Althoff alone. Earlier this month, she and her husband Cory Althoff called it quits after nearly half a decade together. The two tied the knot in 2020 but filed for divorce in February 2024. Needless to say, it’s been a rough month for our girl… Cory, an author of self-help literature, had not commented on the clip at the time of writing.
While it has been made clear multiple times that the video is fake, some users still tried to exploit the clip for online attention or even financial gain.
A large number of netizens condemned the video, labelling it a deeply misogynistic grab for online attention at the cost of Althoff’s public image, privacy and dignity. They stressed that the clip was fake and encouraged the influencer to take legal action.
Nevertheless, links to the illicit clip continue to be shared on various social media platforms. Althoff isn’t the only celebrity to have been affected by AI-generated porn or the unauthorised dissemination of nude images. Earlier this month, a nude video that allegedly shows the Canadian rapper Drake masturbating was shared across X, exposing a huge content moderation problem on the popular platform. By the time links to the video were removed, it had already been seen and reshared by millions.
Similarly, deepfake pornographic images of the multi-Grammy-winning American singer Taylor Swift even drew a response from the White House. A spokesperson called the images “very alarming” and urged Congress to take legislative action against the growing phenomenon.
This latest clip highlights the vulnerability not just of celebrities but of people in general, as AI-generated videos become increasingly difficult to distinguish from reality, while governments and social platforms have yet to come up with adequate regulations to address the issue.