Some would argue that Harry Styles’ December Vogue cover was one of the few saving graces of the infamous year 2020. Not only did it break the internet, but it also brought to light the deep and complicated subjects of gender norms, white privilege and appropriation, and subsequently a noticeable trend of queerbaiting across social media. This is not a commentary on Harry Styles himself or his sexuality but an observation of the culture and consequences of rewarding certain people for breaking barriers while demonising others for doing the same.
I had the privilege of sitting down with TikTok star Reece Davey, otherwise known as @HouseofDvey, to hear his thoughts on the matter. With nearly 90,000 TikTok followers under his belt, Davey has become a champion of queer identity, self-expression and fashion freedom. He seems to share similar concerns when it comes to queerbaiting: “As amazing as it is that Harry Styles is on the cover of magazines in a dress, unfortunately, if we are relying on that to end homophobia then we are in big trouble.”
Although this blurring of gender in clothing is a wonderful thing, it comes with complications. When history is forgotten, a movement’s importance is diminished and the movement itself is appropriated. This is what many people refer to as queerbaiting. Davey defines the term as “the use of queer coded clothing or mannerisms by straight people or sexually ambiguous individuals without understanding or supporting the communities and people that created and pioneered them.”
For example, race cannot be ignored when dealing with gender and sexuality; it was rightly pointed out in November 2020 that Harry Styles, a white man, has been celebrated as the champion of a movement he did not start. For Davey, “it’s been happening for a while but even looking back as far as the 90s and 00s there have always been people, especially straight men, who have been praised for doing things that queer people have been championing for years before.”
This disparity is becoming ever clearer in the social media landscape, especially on TikTok. There has been a huge increase in cisgender, heterosexual-presenting individuals donning pearl necklaces and nail polish and otherwise breaking traditional gender norms in fashion. Perhaps the most notable example was the maid dress trend; the same applies to the ‘femboy’ and ‘soft boy’ aesthetics.
Davey has noticed this increase: “It is definitely to do with the ‘soft boy’ aesthetic being so popular on apps like TikTok.” While this sounds incredibly positive, it can also have harmful consequences that invalidate LGBTQIA+ stories. From coming out ‘pranks’ to the sexualising of queer intimacy for views, the toxic rise of queerbaiting online commodifies queerness and does little to help those who actually belong to the community. For Davey, it’s not a short-lived trend but the history of his identity, and he offers “a PSA to non-queer people who are wearing the clothes that so many people fought and died for. Make sure you understand why they did that and the history that those flares, baby blue nails and pearl necklaces have.”
Unfortunately, many don’t. This goes back to Davey’s definition of what queerbaiting means to him. The trending of his real identity has become a way for non-queer people to secure views, likes and marketability. As we near Pride Month, the influencer invites us to take a more critical look at its rainbow-branded capitalism: “look around Oxford Street in June. Everything is a rainbow, being queer is very profitable for companies but after June, say ciao to any of that same pride.”
This diminishes the experiences that so many in the LGBTQIA+ community face throughout their lives. The obstacles and hardships that huge numbers have had to overcome are watered down to merely aesthetic trends. Davey is unfortunately no stranger to these oppressive binaries: “We are constantly fighting for our identities. I think that’s why it’s so hard to see people who haven’t had these experiences or these traumas walk so freely in our shoes.”
What can offer us some solace in these times is queer influencers like Davey continuing to curate an authentic presence online. There is no better way to end this article than with one of the most important statements he made during our conversation: “Please dress as flamboyant as you can but while you’re at it share a GoFundMe for homeless queer youth or sign a petition for trans people’s right to healthcare. Watch a movie about the AIDS epidemic, about the Stonewall riots or the ballroom scene of New York in the 80s. There is so much more to take from the queer community than a cute bag or earring. Just remember where you got it from.”
“I have enough problems going around the world [as a trans person] without literal buildings constantly telling me, ‘Hey, hey, I think you’re a dude’,” Os Keyes, a gender and technology researcher based at the University of Washington, told Screen Shot. Keyes was referring to the growing trend of governments and companies deploying automated recognition of gender and sexual orientation to identify citizens and consumers in a wide variety of spaces, from airport terminals, retail stores and billboards to social media platforms and mobile applications.
This software, which attempts to classify people as either ‘male’ or ‘female’ based on their facial features, the way they sound and the manner in which they move, places those whose gender doesn’t match the sex they were assigned at birth at great risk of further marginalisation, exclusion and discrimination. Harnessing the rising ubiquity of AI systems, automated gender recognition technology also threatens to reinforce outdated social taboos and stereotypes surrounding gender and effectively erase anything existing outside of the crudest binary perception of ‘male’ and ‘female’.
As the EU embarks on a legislative process of regulating the use of AI within the Union, a joint campaign launched by All Out, Access Now, Reclaim Your Face and Os Keyes is calling on the EU to include an explicit ban on automated gender and sexual orientation recognition in the bill.
On 21 April, the EU Commission—the executive branch of the EU—delivered its proposal for a legal framework to regulate AI. While it did highlight the inherent risks of some AI applications, the Commission did not go as far as prohibiting the deployment of automated gender recognition. The joint campaign to ban the technology, which so far has gained over 24,000 signatures, will now place its focus on the EU Parliament and Council, which are slated to continue working on the AI regulation bill.
The campaign originally stemmed from Keyes’ research about gender recognition systems and their impact on trans and nonbinary people. “I was prompted to study these gender recognition algorithms by having to see them used in my own discipline […] seeing people use it for research purposes and as a consequence producing research that cut out people who these systems cannot recognise,” Keyes told Screen Shot. “As I got in further,” they added, “I got to see more examples of it being used and deployed in the real world and a lot of people talking about deploying it further in situations that seem very, very dangerous for trans and gender non-conforming people.”
Keyes’ research was then referenced in the EU’s five-year LGBTI strategy, in a passage pointing out the danger in deploying automated gender recognition.
When Yuri Guaiana, senior campaign manager at All Out—an international LGBTQI advocacy organisation—came across Keyes’ quote in the EU’s LGBTI strategy, he became fascinated with the topic and, upon further research, launched a campaign to pressure the EU to ban automated gender and sexual orientation recognition. To that end, All Out joined forces with Access Now, an NGO advocating for a human rights-based regulation of AI, and Reclaim Your Face, a citizen initiative to ban biometric mass surveillance in the EU. They also secured the endorsement of Keyes, who signed the letter submitted to the EU Commission along with the petition.
Speaking to Screen Shot, Keyes mentioned various existing applications of automated gender and sexual orientation recognition and highlighted some of the risks this technology poses for trans and gender non-conforming people.
One of the examples they referenced was a campaign by the Berlin Metro on International Women’s Day 2019, in which women could pay 21 per cent less than men for a ticket. In order to authenticate a rider’s gender, automated gender recognition software was embedded in ticketing machines; those who failed to be recognised as female by the system were instructed to seek help from a service person at the station.
Keyes pointed out two main issues in this case: “the first is the fact that you are being told ‘no you do not fit’,” they said. “The second is this idea of ‘well you can just go talk to an employee and they’ll work it out for you’,” they added. “Queer and trans people do not have the best experiences going to officials going ‘hey, just to let you know, I don’t fit, and I’m not meant to be here, and can you please fix this’. And when we think about the proposed deployments in places like bathrooms, you can see pretty clearly how that could get a lot more harmful and difficult.”
Keyes also mentioned the growing use of this technology in advertising, including on physical billboards that curate ads based on the perceived gender of the person walking past them: cars for men, dresses for women, and so on. Keyes pointed out that beyond the harm this application of automated gender recognition could cause trans and non-binary people, it also circulates incredibly negative and limiting social messages about gender: “This is what you’re allowed to do with gender, this is who you can be, this is what you can buy,” they said. Yuri Guaiana of All Out seconded this analysis. “How are you assuming that just because of your gender you are interested in certain products?” he said, highlighting that “interests are more important than gender in consumer behaviour.”
But Keyes emphasised the particular trauma this type of advertising can inflict on trans and gender non-conforming people. To them, such tools’ high potential to misgender people who do not ‘fall neatly’ into either gender category, and the implied message that those people simply do not fit, embodies a blatant manifestation of transphobia. “What [transphobia] actually looks like is lots of small interactions […] it’s a death of a thousand cuts,” Keyes said. “And this is something I think anyone who is trans experiences on a day-to-day basis, like the constant small harms.”
Another application of the technology, which Keyes maintains is rarer but certainly existent, is in passport biometrics and various authentication systems. In this type of deployment, automated gender recognition is used to try to reduce the number of face images the given machine has to sort through in order to confirm the person’s identity. “The problem with this is if it gets it wrong, one way or the other, then what you get is the system concluding that this person does not appear in the database even though they do, and […] someone [could be] locked out of the system for being gender non-conforming,” Keyes said, adding that the secrecy with which this technology is shrouded and the lack of transparency regarding where, when and how it is being deployed amplify its risk.
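To make that failure mode concrete, here is a deliberately simplified sketch, in Python, of the kind of gender-based pre-filtering Keyes describes. Everything in it is hypothetical toy code (the names, the classifier, the matching logic); it reflects no real system’s API, only the logic of narrowing a face database by a predicted gender label before any matching runs.

```python
# Purely illustrative toy sketch (not any real system's API) of 1:N
# identification where a predicted gender label shrinks the candidate
# pool before face matching. All data here is made up.

# Enrolment database: (person_id, gender label recorded at enrolment,
# face "template").
ENROLLED = [
    ("alice", "female", "template_alice"),
    ("bo", "male", "template_bo"),
]

# Simulated classifier output: correct for 'bo', wrong for 'alice';
# the kind of error that hits trans and gender non-conforming people
# at far higher rates.
CLASSIFIER_OUTPUT = {"face_of_alice": "male", "face_of_bo": "male"}


def predict_gender(face_image: str) -> str:
    """Toy stand-in for an automated gender recognition model."""
    return CLASSIFIER_OUTPUT[face_image]


def templates_match(face_image: str, template: str) -> bool:
    """Toy stand-in for a face-template comparison."""
    return template == "template_" + face_image.removeprefix("face_of_")


def identify(face_image: str) -> str:
    # The speed optimisation: only compare templates whose enrolment
    # label matches the predicted gender.
    predicted = predict_gender(face_image)
    candidates = [t for _, label, t in ENROLLED if label == predicted]
    for template in candidates:
        if templates_match(face_image, template):
            return "identity confirmed"
    # The failure mode: one wrong prediction filtered the person's own
    # template out before matching ever ran, so the system concludes
    # they are not in the database even though they are.
    return "not found"


print(identify("face_of_bo"))     # identity confirmed
print(identify("face_of_alice"))  # not found, despite being enrolled
```

Run as written, the sketch confirms ‘bo’ but reports ‘alice’ as absent from a database she is enrolled in: a single wrong gender prediction removed her own record from the candidate pool before any face comparison took place.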
“We know that everyone is talking about doing it, and they most certainly are, but we can’t tell where and we can’t tell which discriminatory outcomes are caused by this,” they said, referencing a case where a trans woman’s identity could not be verified by Uber’s algorithm. “That could look a hell of a lot worse if we were talking about places like, again, biometrics, border control, passport security systems; places where you have much fewer rights or abilities to appeal if you can’t even work out what the system is not recognising about you in the first place […] and where the consequences of forced interactions with officials can be much more strenuous.”
Delineating the broader harm automated gender and sexual orientation recognition can inflict, Guaiana of All Out mentioned that the use of this technology could prove life-threatening in countries where being LGBTQI is illegal. “If they are using [automated gender and sexual orientation recognition] in places where being gay is illegal, and they can predict with a huge margin of being wrong that somebody rallying against something or walking in the street is gay—that can have very serious consequences,” Guaiana said. “This technology is used by government agents, but also private companies. It is censorship. Because in certain countries […] they could start surveilling people just because they predicted they are LGBTI.”
After reading over the EU Commission’s proposal last week, Guaiana, along with other members of the campaign, noted that although it lists some applications of AI that should be prohibited, the Commission did not go as far as it should have in banning harmful AI technologies that violate fundamental rights. “There is no explicit—or implicit, for that matter—ban on automatic recognition of gender and sexual orientation. For us, of course, this needs improvement,” Guaiana told Screen Shot.
But All Out and its partners are far from discouraged. “Of course we would have preferred very much for the Commission to put [the ban] in the initial draft,” said Guaiana, “but I think it’s going to be a lengthy legislative process, [and] it’s still a good starting point […] There is still room to grow the campaign, keep the pressure up, and finally win this battle.”
Once more signatures are gathered and the legislative agenda and timeline of the EU Parliament and Council become known, the campaign to ban automated recognition of gender and sexual orientation will direct its resources at the Union’s representatives, recognising that they have the authority to amend the Commission’s recommendation and introduce the ban into the bill.
Guaiana and the other organisers of the campaign all believe that a ban on this particular type of technology in the EU could have a global ripple effect, as did the General Data Protection Regulation (GDPR) back in 2016. Such a prohibition, says Guaiana, could “help forbid the EU not only from implementing this technology within the EU, but also from exporting it […] and therefore that can help slow down the spread of this technology around the world.”
As we tackle the behemoth that is the tech industry, and as we try to regulate the application of various AI technologies and their deployment by both governments and companies, it is easy to feel powerless in the face of their seemingly inexorable force. Keyes, however, offers a slightly more optimistic—though pragmatist, as they define it—take on the issue. “I happen to believe that people thinking they can’t interfere [with technological development] is why interfering hasn’t worked thus far,” they said, “and there are a lot of examples that we don’t necessarily think about of technologies being banned in ways that did seriously derail things. Like, I’m a trans person, do you know how shitty trans healthcare is partly because nobody bothered doing any research because of the social taboos behind it?”
“We think of them as bad examples, but in a weird way they actually demonstrate that we can intervene in technological development; we can slow things down and we can redirect things,” they said, adding that our objective shouldn’t only be to root out the already existing technologies that prove harmful, but challenge the very way we approach, research and develop technology in the first place. “I think it’s possible,” they finally said, “because, well, if changing how people do things isn’t possible then the technology industry isn’t shit, because that’s what they claim they’ve been doing this whole time. Like, you’re telling me that your app can disrupt society beyond recognition, but also your software developers’ workflow is immutable and cannot be changed? One of those two things is false.”