
The Case of the Creepy Algorithm That ‘Predicted’ Teen Pregnancy

source link: https://www.wired.com/story/argentina-algorithms-pregnancy-prediction/
Feb 16, 2022 7:00 AM

A government leader in Argentina hailed the AI, which was fed invasive data about girls. The feminist pushback could inform the future of health tech.
Photo-Illustration: Sam Whitney; Getty Images


In 2018, while the Argentine Congress was hotly debating whether to decriminalize abortion, the Ministry of Early Childhood in the northern province of Salta and the American tech giant Microsoft presented an algorithmic system to predict teenage pregnancy. They called it the Technology Platform for Social Intervention.

ABOUT

Diego Jemio is a journalist, educator, and podcaster. He currently writes for the Clarín newspaper (Buenos Aires), Vértice (Mexico), and other media. He is the creator of the podcast Epistolar.

Alexa Hagerty is an anthropologist researching human rights, technology, and AI resistance. She is an Associate Fellow at the University of Cambridge and a senior consultant in the JUST AI network.

Florencia Aranda is an Argentine feminist activist, poet, and independent researcher. She studies contemporary Latin American literature at the University of Buenos Aires.

“With technology you can foresee five or six years in advance, with first name, last name, and address, which girl—future teenager—is 86 percent predestined to have an adolescent pregnancy,” Juan Manuel Urtubey, then the governor of the province, proudly declared on national television. The stated goal was to use the algorithm to predict which girls from low-income areas would become pregnant in the next five years. It was never made clear what would happen once a girl or young woman was labeled as “predestined” for motherhood or how this information would help prevent adolescent pregnancy. The social theories informing the AI system, like its algorithms, were opaque.

The system was based on data—including age, ethnicity, country of origin, disability, and whether the subject’s home had hot water in the bathroom—from 200,000 residents in the city of Salta, including 12,000 women and girls between the ages of 10 and 19. Though there is no official documentation, media articles and two technical reviews indicate that “territorial agents” visited the houses of the girls and women in question, asked survey questions, took photos, and recorded GPS locations. What did those subjected to this intimate surveillance have in common? They were poor, some were migrants from Bolivia and other countries in South America, and others were from Indigenous Wichí, Qulla, and Guaraní communities.

Although Microsoft spokespersons proudly announced that the technology in Salta was “one of the pioneering cases in the use of AI data” in state programs, it presents little that is new. Instead, it is an extension of a long Argentine tradition: controlling the population through surveillance and force. And the reaction to it shows how grassroots Argentine feminists were able to take on this misuse of artificial intelligence.

In the 19th and early 20th centuries, successive Argentine governments carried out a genocide of Indigenous communities and promoted immigration policies based on ideologies designed to attract European settlement, all in hopes of blanquismo, or “whitening” the country. Over time, a national identity was constructed along social, cultural, and most of all racial lines.


This type of eugenic thinking has a propensity to shapeshift and adapt to new scientific paradigms and political circumstances, according to historian Marisa Miranda, who tracks Argentina’s attempts to control the population through science and technology. Take the case of immigration. Throughout Argentina’s history, opinion has oscillated between celebrating immigration as a means of “improving” the population and considering immigrants to be undesirable and a political threat to be carefully watched and managed.

More recently, the Argentine military dictatorship between 1976 and 1983 controlled the population through systematic political violence. During the dictatorship, women had the “patriotic task” of populating the country, and contraception was prohibited by a 1977 law. The cruelest expression of the dictatorship’s interest in motherhood was the practice of kidnapping pregnant women considered politically subversive. Most women were murdered after giving birth and many of their children were illegally adopted by the military to be raised by “patriotic, Catholic families.”

While Salta’s AI system to “predict pregnancy” was hailed as futuristic, it can only be understood in light of this long history, particularly, in Miranda’s words, the persistent eugenic impulse that always “contains a reference to the future” and assumes that reproduction “should be managed by the powerful.”

Due to the complete lack of national AI regulation, the Technology Platform for Social Intervention was never subject to formal review and no assessment of its impacts on girls and women has been made. There has been no official data published on its accuracy or outcomes. Like most AI systems all over the world, including those used in sensitive contexts, it lacks transparency and accountability.

Though it is unclear whether the technology program was ultimately suspended, everything we know about the system comes from the efforts of feminist activists and journalists who led what amounted to a grassroots audit of a flawed and harmful AI system. By quickly activating a well-oiled machine of community organizing, these activists brought national media attention to how an untested, unregulated technology was being used to violate the rights of girls and women.

“The idea that algorithms can predict teenage pregnancy before it happens is the perfect excuse for anti-women and anti-sexual and reproductive rights activists to declare abortion laws unnecessary,” wrote feminist scholars Paz Peña and Joana Varon at the time. Indeed, it was soon revealed that an Argentine nonprofit called the Conin Foundation, run by doctor Abel Albino, a vocal opponent of abortion rights, was behind the technology, along with Microsoft.


“[The technology program] is a patriarchal contrivance,” said Ana Pérez Declercq, director of the Observatory of Violence Against Women. “It confounds socioeconomic variables to make it seem as if the girl or woman is solely to blame for her situation. It is totally lacking any concern for context. This AI system is one more example of the state's violation of women's rights. Imagine how difficult it would be to refuse to participate in this surveillance.” She added that families depend on the program’s sponsoring agency, the Ministry of Early Childhood, for services like vaccinations and free milk. In a country that ended 2021 with half its population living in poverty, this is crucial support that vulnerable girls and women can’t afford to risk by speaking out.

The Applied Artificial Intelligence Laboratory at the University of Buenos Aires highlighted the platform’s serious technical and design errors and challenged the developers’ claim that the model made “correct predictions 98.2 percent of the time.” The technical reviews were based on incomplete information because the system lacked transparency. Nevertheless, they revealed that the system’s database included ethnic and socioeconomic data but nothing about access to sex education or contraception, which public health efforts worldwide recognize as the most important tools in reducing rates of teenage pregnancy. “Methodological problems such as the unreliability of the data pose the risk of leading policy makers to take misguided actions,” said Diego Fernandez Slezak, director of the lab.
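One reason reviewers treat headline accuracy figures like “98.2 percent” skeptically is class imbalance: when the outcome being predicted is rare, a model that simply never predicts it can score very high on accuracy while having no predictive value at all. A minimal sketch with hypothetical numbers (the actual Salta dataset and outcome rates were never published):

```python
# Illustrative only: the counts below are hypothetical, not from the
# Salta system, whose data was never made public.
# With a rare outcome, raw accuracy rewards a classifier that never
# predicts the positive class at all.

n_total = 12_000     # girls and women surveyed (figure from media reports)
n_positive = 600     # hypothetical: 5% experience the predicted outcome

# A "model" that always predicts the majority class ("no pregnancy")
# is correct on every negative case and wrong on every positive one:
correct = n_total - n_positive
accuracy = correct / n_total
print(f"Trivial baseline accuracy: {accuracy:.1%}")  # 95.0%

# Yet it identifies zero at-risk individuals:
true_positives = 0
recall = true_positives / n_positive
print(f"Recall on the positive class: {recall:.1%}")  # 0.0%
```

This is why evaluations of such systems ask for metrics like recall and precision on the rare class, not accuracy alone, and why an accuracy claim made without a published evaluation methodology tells reviewers almost nothing.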

While Salta’s plan to predict pregnancy was publicly critiqued by academics and journalists, feminist activists used this media attention to enforce a measure of public accountability, even in face of a complete lack of AI regulation by the state. This effective resistance to the AI system was possible because Argentine feminists had already built a powerful social movement.

As far back as the 19th century, immigrant women in Buenos Aires advocated for equality and reproductive rights in the anarchist paper La Voz de la Mujer (Women’s Voice) with the slogan “Ni Dios, ni patrón, ni marido” (neither god, nor boss, nor husband).


After the dictatorship, Argentine women began to organize. The first National Women's Meeting took place in 1986, inspired by the Latin American and Caribbean Feminist Meeting in 1981 and the Third World Women's Conference in Nairobi in 1985. Thousands of women attend the annual gatherings, which have served to develop strategies to fight for reproductive rights. The National Campaign for the Right to Legal, Safe, and Free Abortion was inaugurated at the 2005 meeting with the slogan “Educación sexual para decidir, anticonceptivos para no abortar, aborto legal para no morir,” meaning “sexual education to decide, contraceptives to avoid abortion, legal abortions to survive.”

The feminist campaign against gender-based violence #NiUnaMenos (Not One Woman Less) erupted in Argentina in 2015, ignited by the discovery of the body of Chiara Paez, a 14-year-old pregnant teenager murdered by her boyfriend in the province of Santa Fe. The massive movement attracted women from all walks of life and brought together a wide variety of activist movements including groups advocating for Indigenous rights, LGBTQ rights, and immigrant rights. Protests quickly spread to Uruguay, Chile, and Mexico. Cities such as Washington, DC, Paris, Barcelona, Oslo, Amsterdam, Geneva, and Beijing joined in with mobilizations largely led by Latin American women living in those cities.

#NiUnaMenos caught fire on social networks. The movement exposed the almost total absence of official data and statistics on gender-based violence in Argentina and other countries, leading to grassroots data activism to track and publicize femicide and other forms of violence. (Efforts pioneered by activists were later taken up by the government, including a national system in Argentina to register cases of gender violence.)

Then, in 2018, a new feminist movement emerged when more than a million protesters filled the streets of Buenos Aires demanding the legalization of abortion. This launched the “Marea Verde” or Green Wave, which took its name from the green scarves the activists wore, inspired by the iconic white headscarves of the Madres of the Plaza de Mayo, mothers and grandmothers who courageously defied the dictatorship to search for their murdered children and kidnapped grandchildren. Like #NiUnaMenos, the Green Wave quickly spread to other Latin American countries, and even to Poland where activists rallying against abortion restrictions wore green scarves at protests in 2021. Propelled by the Green Wave, in 2020 Argentina became one of the first countries in Latin America to legalize the right to abortion.

As Zeynep Tufekci and other scholars have documented, data activism does not occur in a vacuum. It may spread through hashtags, but it only works when it is supported by well-organized on-the-ground networks—like #NiUnaMenos and Green Wave—that Argentine feminists have built.

Argentina has declared its intention to be a regional leader in technology development and AI. Buenos Aires has cultivated a thriving startup scene, despite the country’s ongoing economic instability. In Argentina, as in many parts of the world, AI development is booming, but regulation lags dangerously behind. The government of former president Mauricio Macri announced a national AI plan in 2019, but it has been largely abandoned. For a national AI plan to be effective at safeguarding society, particularly historically marginalized populations, policymakers should look to the on-the-ground work feminists have been doing.

The feminist collective Socorro Rosa, or “Pink Rescue,” like the Technology Platform for Social Intervention, seeks to tackle the issue of adolescent pregnancy, and both use the word destino, meaning “fate” or “destiny,” to describe their projects. But they stand in stark contrast. Socorro Rosa highlights data that reveals the high number of teenage pregnancies that are a consequence of sexual violence and asserts the right of adolescents to safety, consent, sex education, and access to safe abortion. The group describes the campaign as a means of “challenging, questioning, and daring to imagine other fates [destinos]” for girls and women. The AI system, in contrast, focuses on individual girls and women living in poverty, with no social context. Taking the fate of these teenagers to be sealed, with some “predestined” for pregnancy, it spits out secret predictions. Meanwhile, Socorro Rosa and other feminist activists operate under the assumption that girls and women are entangled in larger political systems, and that social change is possible.

In tackling teenage pregnancy, as with other social issues, feminist activists in Argentina have worked to make knowledge and data accessible to all and to promote forms of solidarity and empowerment rather than surveillance and control. This offers an important model for how complex social issues should be engaged: with participation, context, and transparency. These are the values that should inform the design of health technologies.

The EU and US are developing concrete plans to audit algorithmic systems. AI impact assessments must take into consideration social context and lived experience. One size doesn’t fit all. A Microsoft platform may cause different harms in Salta than in Redmond, where the company is based. Local communities and activists have crucial insights into the social impacts of AI systems on their lives. They also have strategies for resistance and mobilization that can scale and spread. The feminist activists of Argentina may spark yet another wave of resistance, this time to transform how we live with AI and evaluate its harms—a “pink rescue” from discriminatory and dangerous AI.

The authors wish to thank the Histories of AI: A Genealogy of Power seminar at the University of Cambridge for generous feedback on this research.


