Since 2018, women-first dating app Bumble has helped pass legislation in both the US and UK to combat cyberflashing, the act of sending sexually explicit material online without consent. According to previous research carried out by the app, 48 per cent of women aged 18 to 24—out of the 1,793 respondents based in England or Wales—had received an explicit, non-consensual photo in 2019 alone. Fifty-nine per cent of them admitted to losing their trust in other users afterward, while one in four felt violated in the process.
In the same year, Bumble harnessed machine learning to better shield its growing community from unwanted nudes, launching an AI tool dubbed ‘Private Detector’ within the app. The feature essentially screens images sent from matches to determine if they depict lewd content or not.
While it was designed with the intention of catching unsolicited nudes, it also helps flag shirtless selfies and images of firearms—both of which aren’t allowed on the platform. I mean, you really have to reevaluate your presence on dating apps if you think such pictures would pull romantic prospects in the first place.
If the AI detects a positive match, the app blurs the image and notifies you. It’s then up to you to decide whether you want to view, block, or report the individual who sent the picture.
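Bumble hasn’t published the exact client-side flow, but the screen-blur-notify loop described above can be sketched roughly as follows. Everything here—the function names, the score threshold, and the stub classifier—is a hypothetical illustration, not Bumble’s actual implementation:

```python
# Hypothetical sketch of a Private Detector-style moderation flow.
# The classifier, threshold, and field names are illustrative assumptions,
# not Bumble's actual code.
from dataclasses import dataclass

LEWD_THRESHOLD = 0.5  # assumed cut-off above which an image gets blurred


@dataclass
class ScreenedImage:
    image_id: str
    lewd_score: float  # probability returned by the (stubbed) classifier
    blurred: bool = False


def classify(image_bytes: bytes) -> float:
    """Stand-in for the real model: returns a fake lewdness score."""
    # A real deployment would run a trained image classifier here.
    return 0.9 if b"explicit" in image_bytes else 0.1


def screen(image_id: str, image_bytes: bytes) -> ScreenedImage:
    """Score an incoming image and blur it if the score crosses the line."""
    score = classify(image_bytes)
    # Anything flagged is blurred; the recipient then chooses whether
    # to view, block, or report the sender.
    return ScreenedImage(image_id, score, blurred=score >= LEWD_THRESHOLD)
```

The key design point is that the model only flags and blurs—the final decision about viewing or reporting stays with the recipient.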
Fast forward to October 2022: in a recent press release, the app—which is also reportedly launching a speed dating feature that lets users chat before matching or seeing pictures—announced that it is open-sourcing Private Detector on GitHub, making the framework publicly available for commercial use, distribution, and modification.
“It’s our hope that the feature will be adopted by the wider tech community as we work in tandem to make the internet a safer place,” the company wrote.
When Bumble first introduced the AI, it claimed that the tool had 98 per cent accuracy. With that in mind, it’s worth noting that the technology could help smaller companies—which probably don’t have the time or resources to develop similar tools—integrate it into their own offerings, thereby shielding users from cyberflashing.
“There’s a need to address this issue beyond Bumble’s product ecosystem and engage in a larger conversation about how to address the issue of unsolicited lewd photos to make the internet a safer and kinder place for everyone,” Bumble concluded.
A 2018 YouGov poll found that four in 10 women aged between 18 and 36 have been sent a photograph of a penis without having asked for one. Unsolicited dick pics are one of the many drawbacks of the digital era. I’ve received some, you probably have too, and the mere thought of anyone else having to go through the same thing sickens me. Until now, few solutions have been offered to tackle this alternative pandemic. When Tinder recently announced its plans to test a new AI that monitors DMs in order to cool down the creeps, many highlighted how little social media platforms have done to stop those same creeps from sliding into your DMs.
What if I told you that a brand new solution has appeared—one that, although not perfect, comes with an important revenge factor? You’ve heard of NFTs by now, and of the many ways people are jumping on the bandwagon: from Cara Delevingne auctioning off an NFT about her vagina to the most popular memes getting sold one after the other more recently, it’s safe to say that anyone can try their hand at coming up with their own non-fungible token.
Zoe Scaman, creative strategist and founder of Bodacious, might just take the cake with her idea: using NFT technology to stop men from sending unsolicited dick pics. With some help from the duo Very Serious, Scaman turned this idea into a real website. On 24 March 2021, NFT the DP was created with the simple aim of helping even the least tech-savvy among us turn dick pics into cash.
The process is as simple as it gets: if you’ve received an unsolicited nude, you can go on NFT the DP, pay the minting price, upload the dick pic, and start minting. Although the site doesn’t do everything for you, it offers a relatively short list of instructions on how to mint an NFT using two pieces of readily available software, MetaMask and Mintable. This process creates a permanent record of that dick pic on the blockchain ledger, with the name of the sender attached as the artist.
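To see why that record is hard to walk back, it helps to look at what minting actually pins down. The sketch below builds the kind of metadata such a record could contain—the image identified by its cryptographic hash, the sender credited as the artist. The field names are purely illustrative assumptions; Mintable’s real schema will differ:

```python
# Hypothetical sketch of the metadata an NFT mint might record on-chain.
# Field names are illustrative assumptions, not Mintable's actual schema.
import hashlib
import json


def mint_record(image_bytes: bytes, sender_name: str, title: str) -> dict:
    """Build the record for a mint: the image is identified by its
    SHA-256 hash, and the sender is credited as the 'artist'."""
    return {
        "title": title,
        "artist": sender_name,  # the name attached to the unsolicited pic
        "image_sha256": hashlib.sha256(image_bytes).hexdigest(),
    }


record = mint_record(b"...image bytes...", "John Doe", "Exhibit A")
print(json.dumps(record, indent=2))
```

Because the hash is derived from the image itself, anyone holding the original file can verify it matches the on-chain record—which is what gives the “permanent record with the sender’s name attached” its bite.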
The website also includes instructions for what to do if your dick pic has been turned into an NFT by a scorned receiver—if they’ve gotten “NFTDPd.” Those instructions are purposefully less clear, with a generally taunting tone: pay for the NFT, if you can afford it, and send it to a burner wallet. “If you can’t afford it…too bad lol,” the instructions read.
While some of you might find this revengeful punishment too harsh, it is important to note that cyberflashing is a form of harassment—yet there are no laws explicitly banning the practice in the vast majority of the world. In more extreme cases, law enforcement has used anti-harassment laws to cover it, but for the most part, the law hasn’t caught up with the phenomenon. In the UK, cyberflashing can potentially fall within the offences of harassment or public nuisance. The behaviour has been illegal in Scotland since 2010, but England and Wales still don’t have specific laws against it.
The same problem can now be seen with the rise of deepfake porn and deepnudes: the law is yet to catch up with new technologies and the risks they represent. That being said, the legality of NFT the DP itself also stands on shaky ground, given that revenge porn laws vary greatly by locality.
So, next time you receive an unsolicited nude—because, sadly, there probably will be a next time—be sure to explore your options and potentially consider revenge. You might even end up getting some dolla dolla.