“I have enough problems going around the world [as a trans person] without literal buildings constantly telling me, ‘Hey, hey, I think you’re a dude’,” Os Keyes, a gender and technology researcher based at the University of Washington, told Screen Shot. Keyes was referring to the growing trend of governments and companies deploying automated recognition of gender and sexual orientation in order to identify citizens and consumers in a wide variety of spaces, from airport terminals, retail stores and billboards to social media platforms and mobile applications.
This software, which attempts to classify people as either ‘male’ or ‘female’ based on their facial features, the way they sound and the manner in which they move, places those whose gender doesn’t match the sex they were assigned at birth at great risk of further marginalisation, exclusion and discrimination. Harnessing the rising ubiquity of AI systems, automated gender recognition technology also threatens to reinforce outdated social taboos and stereotypes surrounding gender and effectively erase anything existing outside of the crudest binary perception of ‘male’ and ‘female’.
As the EU embarks on a legislative process of regulating the use of AI within the Union, a joint campaign launched by All Out, Access Now, Reclaim Your Face and Os Keyes is calling on the EU to include an explicit ban on automated gender and sexual orientation recognition in the bill.
On 21 April, the EU Commission—the executive branch of the EU—delivered its proposal for a legal framework to regulate AI. While it did highlight the inherent risks of some AI applications, the Commission did not go as far as prohibiting the deployment of automated gender recognition. The joint campaign to ban the technology, which so far has gained over 24,000 signatures, will now place its focus on the EU Parliament and Council, which are slated to continue working on the AI regulation bill.
The campaign originally stemmed from Keyes’ research about gender recognition systems and their impact on trans and nonbinary people. “I was prompted to study these gender recognition algorithms by having to see them used in my own discipline […] seeing people use it for research purposes and as a consequence producing research that cut out people who these systems cannot recognise,” Keyes told Screen Shot. “As I got in further,” they added, “I got to see more examples of it being used and deployed in the real world and a lot of people talking about deploying it further in situations that seem very, very dangerous for trans and gender non-conforming people.”
Keyes’ research was then referenced in the EU’s five-year LGBTI strategy, in a passage pointing out the danger in deploying automated gender recognition.
When Yuri Guaiana, senior campaign manager at All Out—an international LGBTQI advocacy organisation—came across Keyes’ quote in the EU’s LGBTI strategy, he became fascinated with the topic and, upon further research, launched a campaign to pressure the EU to ban automated gender and sexual orientation recognition. To that end, All Out joined forces with Access Now, an NGO advocating for a human rights-based regulation of AI, and Reclaim Your Face, a citizen initiative to ban biometric mass surveillance in the EU. They also got the endorsement of Keyes, who signed the letter submitted to the EU Commission along with the petition.
Speaking to Screen Shot, Keyes mentioned various existing applications of automated gender and sexual orientation recognition and highlighted some of the risks this technology poses for trans and gender non-conforming people.
One of the examples they referenced was a campaign by the Berlin Metro on International Women’s Day 2019, where women could pay 21 per cent less than men for a ticket. In order to authenticate a rider’s gender, automated gender recognition software was embedded in ticketing machines; those who failed to be recognised as female by the system were instructed to seek help from a service person at the station.
Keyes pointed out two main issues in this case: “the first is the fact that you are being told ‘no you do not fit’,” they said. “The second is this idea of ‘well you can just go talk to an employee and they’ll work it out for you’,” they added. “Queer and trans people do not have the best experiences going to officials going ‘hey, just to let you know, I don’t fit, and I’m not meant to be here, and can you please fix this’. And when we think about the proposed deployments in places like bathrooms, you can see pretty clearly how that could get a lot more harmful and difficult.”
Keyes also mentioned the growing use of this technology in advertising, including on physical billboards that curate ads based on the perceived gender of the person walking past them: cars for men, dresses for women, and so on. Keyes pointed out that beyond the harm this application of automated gender recognition could cause trans and non-binary people, it also circulates incredibly negative and limiting social messages pertaining to gender: “This is what you’re allowed to do with gender, this is who you can be, this is what you can buy,” they said. Yuri Guaiana of All Out seconds this analysis. “How are you assuming that just because of your gender you are interested in certain products?” he said, highlighting that “interests are more important than gender in consumer behaviour.”
But Keyes emphasised the particular trauma this type of advertising can inflict on trans and gender non-conforming people. To them, the high potential of such advertising tools to misgender people who do not ‘fall neatly’ into either gender category, and its implied message that they simply do not fit, embodies a blatant manifestation of transphobia. “What [transphobia] actually looks like is lots of small interactions […] it’s a death of a thousand cuts,” Keyes said. “And this is something I think anyone who is trans experiences on a day-to-day basis, like the constant small harms.”
Another application of the technology, which Keyes maintains is rarer but certainly real, is in passport biometrics and various authentication systems. In this type of deployment, automated gender recognition is used to reduce the number of face images the given machine has to sort through in order to confirm a person’s identity. “The problem with this is if it gets it wrong, one way or the other, then what you get is the system concluding that this person does not appear in the database even though they do, and […] someone [could be] locked out of the system for being gender non-conforming,” Keyes said, adding that the secrecy in which this technology is shrouded, and the lack of transparency regarding where, when and how it is being deployed, amplifies its risk.
“We know that everyone is talking about doing it, and they most certainly are, but we can’t tell where and we can’t tell which discriminatory outcomes are caused by this,” they said, referencing a case where a trans woman’s identity could not be verified by Uber’s algorithm. “That could look a hell of a lot worse if we were talking about places like, again, biometrics, border control, passport security systems; places where you have much fewer rights or abilities to appeal if you can’t even work out what the system is not recognising about you in the first place […] and where the consequences of forced interactions with officials can be much more strenuous.”
Delineating the broader harm automated gender and sexual orientation recognition can inflict, Guaiana of All Out mentioned that the use of this technology could prove life-threatening in countries where being LGBTQI is illegal. “If they are using [automated gender and sexual orientation recognition] in places where being gay is illegal, and they can predict with a huge margin of being wrong that somebody rallying against something or walking in the street is gay—that can have very serious consequences,” Guaiana said. “This technology is used by government agents, but also private companies. It is censorship. Because in certain countries […] they could start surveilling people just because they predicted they are LGBTI.”
After reading over the EU Commission’s proposal last week, Guaiana, as well as other members of the campaign, noted that despite listing some applications of AI that should be prohibited, the Commission did not go as far as it should have in calling for a ban on harmful AI technologies that violate fundamental rights. “There is no explicit—or implicit, for that matter—ban on automatic recognition of gender and sexual orientation. For us, of course, this needs improvement,” Guaiana told Screen Shot.
But All Out and its partners are far from discouraged. “Of course we would have preferred very much for the Commission to put [the ban] in the initial draft,” said Guaiana, “but I think it’s going to be a lengthy legislative process, [and] it’s still a good starting point […] There is still room to grow the campaign, keep the pressure up, and finally win this battle.”
Once more signatures are gathered and the legislative agenda and timeline of the EU Parliament and Council become known, the campaign to ban automated recognition of gender and sexual orientation will direct its resources at the Union’s representatives, recognising that they have the authority to amend the Commission’s recommendation and introduce the ban into the bill.
Guaiana and the other organisers of the campaign all believe that a ban on this particular type of technology in the EU could have a global ripple effect, as the General Data Protection Regulation (GDPR) did back in 2016. Such a prohibition, says Guaiana, could “help forbid the EU not only from implementing this technology within the EU, but also from exporting it […] and therefore that can help slow down the spread of this technology around the world.”
As we tackle the behemoth that is the tech industry, and as we try to regulate the application of various AI technologies and their deployment by both governments and companies, it is easy to feel powerless in the face of their seemingly inexorable force. Keyes, however, offers a slightly more optimistic—though pragmatist, as they define it—take on the issue. “I happen to believe that people thinking they can’t interfere [with technological development] is why interfering hasn’t worked thus far,” they said, “and there are a lot of examples that we don’t necessarily think about of technologies being banned in ways that did seriously derail things. Like, I’m a trans person, do you know how shitty trans healthcare is partly because nobody bothered doing any research because of the social taboos behind it?”
“We think of them as bad examples, but in a weird way they actually demonstrate that we can intervene in technological development; we can slow things down and we can redirect things,” they said, adding that our objective shouldn’t only be to root out the already existing technologies that prove harmful, but challenge the very way we approach, research and develop technology in the first place. “I think it’s possible,” they finally said, “because, well, if changing how people do things isn’t possible then the technology industry isn’t shit, because that’s what they claim they’ve been doing this whole time. Like, you’re telling me that your app can disrupt society beyond recognition, but also your software developers’ workflow is immutable and cannot be changed? One of those two things is false.”
Yulia Tsvetkova, a 27-year-old feminist and queer rights activist from Komsomolsk-on-Amur in East Russia, has been charged with violating the Russian “gay propaganda” law and distribution of “pornography” for sharing drawings of same-sex families and vaginas on social media.
Last month, the prosecutor’s office in charge of her case approved the indictment against Tsvetkova; if convicted, she could face up to six years in prison. Tsvetkova’s persecution by the Russian authorities reflects a broader campaign by the government to crack down on members of the queer community and muzzle anyone advocating for their freedom and rights.
All Out, an international NGO fighting for LGBTQ rights, has teamed up with the Moscow Community Center and launched a petition calling for the elimination of the charges against Tsvetkova and for the abolition of Russia’s “gay propaganda” law.
The authorities’ persecution of Tsvetkova began in 2019, when she was preparing to stage a play titled ‘Blue and Pink’ which dealt with gender stereotypes and criticised the country’s culture of militarism. Following mounting pressure from the authorities, Tsvetkova cancelled the play.
“I don’t know which was worse for the authorities, the play about gender, which they don’t understand and are afraid of, or the other play, which was pretty political, very sharp. I guess it’s the combination of both that got me here,” Tsvetkova told CNN.
Following the play incident, Tsvetkova and her mother were summoned to the police station on a weekly or bi-weekly basis as the authorities searched for any shred of evidence that could help them press criminal charges against her. Finally, the police came across a blog titled ‘The Vagina Monologues’ that Tsvetkova had founded and managed, in which she featured drawings of female body parts created by herself and others.
Through her work, Tsvetkova sought to shatter stereotypes surrounding the vagina and promote body positivity. The text in one of her drawings, for instance, read “Women who are alive have body fat and this is fine!”
It was for posting these drawings that the authorities charged Tsvetkova with promoting pornography. Then, in January 2020, she was charged with violating the notorious “gay propaganda law” after she posted a drawing featuring same-sex families along with the caption “A family is where there is love. Support LGBT+ families!”
After being placed under house arrest, Tsvetkova was released in March 2020, but has since been prohibited from leaving the country or changing her address.
Tsvetkova’s arrest has drawn sharp criticism from human and LGBTQ rights activists and organisations around the world. Last year, Amnesty International, along with several other NGOs, recognised Tsvetkova as a political prisoner and called for the charges against her to be dropped.
“Russian authorities have tried everything to intimidate Yulia: They searched her home, put her under house arrest for over three months, ordered her not to leave the country, fined her twice for violating the Russian ‘gay propaganda’ law, and brought trumped-up charges against her for ‘distributing pornography’,” said Matt Beard, Executive Director of All Out. “Now her trial can happen any time and she could go to jail for up to six years. And all of this just for sharing on social media innocent drawings of same-sex families and motives promoting inclusivity. Nobody should be prosecuted simply for expressing their wish for equality,” he added.
The controversy has also spread throughout Russia, where, despite the public’s deep-rooted conservatism, individuals and groups have nonetheless taken to social media and the streets to protest Tsvetkova’s arrest. On social media, women have been posting pictures of their bodies (often emphasising hair, curves and skin blemishes) along with the phrase “my body is not pornography” in solidarity with Tsvetkova.
Protests against Tsvetkova’s arrest have been taking place throughout Russia, and have even reached her hometown in the country’s far eastern region. Numerous artists and media figures have also come out in support of her, something Tsvetkova says has made her feel less alone in her struggle.
“Anonymity is the scariest thing,” she told DW, “and I know that because I was alone at the beginning. It meant that if I was at the police station, I knew that they could do whatever they want and no one would ever find out.”
“[Tsvetkova] is not the first person to be targeted under the ‘gay propaganda’ law. But with your help, she might be the last,” reads All Out’s petition, which has so far garnered over 165,000 signatures. Her trial could begin any day now.