An in-depth WIRED investigation has revealed that, in 2018, AI was used in Argentina to collect deeply invasive data about young girls in the northern province of Salta. The timing matters: this was happening just as the Argentine Congress was vigorously debating the legalisation of abortion in the country. The bill to legalise reproductive rights was ultimately rejected at the time, with many citing the influence of the Catholic Church as an overriding factor.
“Lawmakers chose today to turn their backs on hundreds of thousands of women and girls who have been fighting for their sexual and reproductive rights,” Mariela Belski, the director of Argentina’s Amnesty International, told The Guardian at the time. “All that this decision does is perpetuate the circle of violence which women, girls and others who can become pregnant are forced into.” Little did we know then that this violence extended into the world of AI.
It is becoming increasingly clear that tech spaces are just as unsafe for women as the real world—take the recent ‘gang rape’ that took place in Facebook’s metaverse—and now, deeply disturbing facts are emerging around Microsoft’s AI software. In 2018, the Ministry of Early Childhood in the province of Salta collaborated with the tech corporation Microsoft to develop an AI-powered algorithmic system that would help ‘predict teenage pregnancy’. The program was titled ‘Technology Platform for Social Intervention’. Its purpose? To forecast which girls, specifically those from low-income areas, were most likely to become pregnant within the next five years.
The system drew on data such as age, country of origin, ethnicity, disability and even whether the person in question had access to hot water in order to predict whether they were ‘predestined’ for motherhood. As you can imagine, none of the individuals analysed gave their consent prior to this little experiment. WIRED noted that it was unclear what would happen to a girl once she was labelled as ‘predestined’ for pregnancy, or how such data would be useful in preventing teen pregnancy at all. In an even more terrifying revelation, the publication reported that those analysed by the creepy AI software were also visited by “territorial agents,” who inspected the girls’ homes, took photos and registered their GPS locations. The girls in question were often poor, migrants or of Indigenous descent.
WIRED’s report also makes note of the country’s colonial history and its impact on Argentina’s many Indigenous communities. Detailing the horrors of the dictatorship of the 1970s and 80s, when women were required to ‘populate the country’, contraception was banned, mothers were murdered following birth and their children adopted into ‘patriotic Catholic families’, the publication highlighted the worrying similarities between the rhetoric of that era and the eugenicist thinking still shaping debates over reproductive rights. Only this time, it comes armed with AI.
Though it is unclear whether the use of these dystopian technologies has come to an end, it is thanks to the tireless work of grassroots feminist activists in the country that such a gross violation of the rights of the women and girls of Salta was exposed. Among them are feminist scholars Paz Peña and Joana Varon. “The idea that algorithms can predict teenage pregnancy before it happens is the perfect excuse for anti-women and anti-sexual and reproductive rights activists to declare abortion laws unnecessary,” they wrote. The ‘Technology Platform for Social Intervention’ seems to be nothing more than yet another bid to control and strip away the reproductive rights of women, girls and anyone with a uterus.
“It is also important to point out that the database used in the platform only has data on females. This specific focus on a particular sex reinforces patriarchal gender roles and ultimately, blames female teenagers for unwanted pregnancies, as if a child could be conceived without a sperm,” they continued. The scholars also cited the statement of conservative politician and governor of Salta, Juan Manuel Urtubey, who declared at the time that “with technology, based on name, surname and address, you can predict five or six years ahead which girl, or future teenager, is 86 per cent predestined to have a teenage pregnancy.” Peña and Varon revealed that the Ministry of Early Childhood joined forces with anti-abortion NGO the CONIN Foundation, along with Microsoft, to develop this algorithm.
“According to their narratives, if they have enough information from poor families, conservative public policies can be deployed to predict and avoid abortions by poor women. Moreover, there is a belief that, ‘If it is recommended by an algorithm, it is mathematics, so it must be true and irrefutable’,” Peña and Varon continued. Ana Pérez Declercq, director of the Observatory of Violence Against Women, shared similar sentiments with WIRED: “It confounds socioeconomic variables to make it seem as if the girl or woman is solely to blame for her situation. It is totally lacking any concern for context. This AI system is one more example of the state’s violation of women’s rights. Imagine how difficult it would be to refuse to participate in this surveillance.”
The algorithm—heralded by Microsoft as “one of the most pioneering cases in the use of AI”—is a worrying display of how technology can be used by those in power to exacerbate already evident inequalities across digital spaces, and of how those disparities can then be leveraged to influence real-world change.