"A ‘feminist approach’ to Generative AI (GenAI) involves a thorough reassessment of how we think about, create, and apply this technology."

Feminist AI: transforming and challenging the current Artificial Intelligence (AI) industry

As a discipline and as a practice, data science has traditionally been led by a narrow cohort of people, representing a homogeneous demographic. This lack of diversity has contributed to the (often unconscious) perpetuation of biases and stereotypes, which pose significant risks to society. Increasing diversity in data science and Artificial Intelligence (AI) not only mitigates these risks, but also fosters creativity and produces better outcomes.

The Feminist Generative AI Lab was started by the Convergence programme AI, Data & Digitalisation, and is led by Sara Colombo at TU Delft. Feminist AI offers a new perspective on the AI industry by challenging the notion that AI design, development, and deployment should be guided by, and should mainly benefit, a limited group of individuals, predominantly Western and male. Colombo advocates for other voices in data science and AI. “Only by making other voices heard can we ensure a more fair and equitable development of data science and AI, and of the products and services that are built on them.”

Why should women be concerned with AI?

In a male-dominated industry, there can be challenges in developing services that cater specifically to women, or that adequately consider their distinct needs when creating new AI solutions, Colombo explains. “Whether consciously or unconsciously, AI solutions often reinforce the biases, inequalities, and imbalances that women experience daily. AI can discriminate against women by preventing equal access to certain services. Examples include AI systems that grant women less bank credit than men with the same financial assets, or clinical AI decision-making systems that perform better for men than for women.”

Colombo points out that AI is something women should be aware of and increasingly engage with, because AI systems are everywhere and they affect women’s lives, not always in positive ways. Moreover, women can bring diverse and unique perspectives to the development of services powered by data and AI, and can leverage AI to design services that address problems and needs specific to women, such as domestic violence or health conditions that mainly affect women.

What does a ‘feminist approach’ mean for Generative AI (GenAI)?

Although the term ‘feminist’ may seem to refer to women only, a feminist approach to technology expands beyond addressing gender disparities to encompass a broader spectrum of social inequalities, says Colombo. A ‘feminist approach’ to Generative AI (GenAI) involves a thorough reassessment of how we think about, create, and apply this technology.

A ‘feminist approach’ requires us to critically examine the power dynamics and forms of oppression inherent in society, and reflected by AI and GenAI, and to actively elevate the voices of those who have traditionally been marginalised, such as women. Colombo elaborates: “This means considering how GenAI can be developed with greater ethical integrity, tailored to benefit those with less societal privilege, and aligned with the values and needs of marginalised communities. Ultimately, it's about using technology as a tool for empowerment and social justice, to address issues such as healthcare disparities and economic inequality.”

What is data feminism?

Data feminism is a framework that examines the intersection of data science and feminist theory. It challenges standard practices in data science, which can perpetuate and reinforce existing biases and power imbalances, including those related to gender. Data feminism analyses and surfaces such inequalities and addresses them by advocating for more inclusive, equitable, embodied, and context-aware approaches to data collection, analysis, visualisation, and interpretation, as well as for more equal access to data-based services and a more participatory way of designing and envisioning the future of this technology.

Colombo advocates that we should openly recognise data biases and mitigate them instead of pretending data is neutral. “Data feminism matters because it proposes an alternative to current practices and approaches in data science, which are often limited in their ability to produce inclusive, just, and accessible systems.”

Where does gender bias in AI come from?

According to Colombo, the lack of diversity leads to the (often unconscious) perpetuation of biases and stereotypes, which pose a risk to society. She adds that bias in AI often comes from the datasets used to train AI models, which may not adequately represent the diversity of human experiences. This leads to models that perform better for certain groups of people (or certain genders). Gender biases can manifest in less accurate (or even harmful) AI predictions, worse user experiences, or discriminatory outcomes. For instance, certain AI-based recruiting systems have been shown to favour male over female candidates; AI-generated images of women often prioritise certain physical features or depict them in domestic roles; and Natural Language Processing (NLP) models may generate gendered language that associates women with certain roles or professions.
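
As a minimal, purely illustrative sketch (not describing the lab’s work, and using invented data), disparities of this kind can be surfaced by comparing a model’s accuracy per gender group on a labelled test set:

    # Illustrative only: checking a classifier's accuracy per gender group.
    # The records below are invented for demonstration; a real audit would use an actual test set.
    from collections import defaultdict

    # Each record: (gender, true label, model prediction)
    test_results = [
        ("female", 1, 0), ("female", 1, 1), ("female", 0, 0), ("female", 1, 0),
        ("male",   1, 1), ("male",   0, 0), ("male",   1, 1), ("male",   0, 0),
    ]

    correct, total = defaultdict(int), defaultdict(int)
    for gender, truth, prediction in test_results:
        total[gender] += 1
        correct[gender] += int(truth == prediction)

    for gender in sorted(total):
        print(f"{gender}: accuracy = {correct[gender] / total[gender]:.2f} over {total[gender]} cases")

A gap between the per-group numbers is exactly the kind of signal that, in the examples above, shows up as a recruiting or clinical system working better for one gender than the other.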

What would the effects of ‘feminist AI’ be?

Feminist AI can transform the way we approach the AI industry, says Colombo. “It challenges the notion that only a few, typically affluent, Western, male individuals should have control over the way AI is designed, developed, and deployed. The effects of feminist AI include prioritising positive societal impact; promoting fairness, inclusion, and empowerment; recognising and addressing biases; and giving more agency and power to the people who are affected by AI systems, also by engaging them in design processes.”

How does gender-based discrimination or violence happen through GenAI?

Recently, Colombo tested a Large Language Model (LLM), asking it to generate names for doctors and nurses. The LLM primarily associated male names with doctors and female names with nurses. She adds: “GenAI systems may discriminate in many other ways. For instance, by failing to accurately diagnose health conditions that are more prevalent among women. If not trained properly, GenAI models used in conversational agents may dismiss symptoms associated with women-specific diseases, or may fail to generate the right follow-up questions. They may even fail to recognise and properly respond to signs of distress or violence. All this can result in disparities in medical care and perpetuate gender-based discrimination and harm.”
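
A rough sketch of how such a probe could be set up is shown below; ‘generate’ is a placeholder for whichever LLM interface is being tested, and the name lists are toy examples rather than a rigorous gender classifier:

    # Illustrative sketch of a simple bias probe, in the spirit of the test described above.
    # generate(prompt) stands in for any LLM call; the name lists are illustrative placeholders.
    FEMALE_NAMES = {"anna", "maria", "sophie", "julia", "emma"}
    MALE_NAMES = {"james", "peter", "daan", "thomas", "lucas"}

    def classify_name(name: str) -> str:
        """Map a generated first name to a coarse category using the toy lists above."""
        name = name.strip().lower()
        if name in FEMALE_NAMES:
            return "female"
        if name in MALE_NAMES:
            return "male"
        return "unknown"

    def probe(generate, profession: str, samples: int = 20) -> dict:
        """Repeatedly ask the model for a first name for a profession and tally the categories."""
        counts = {"female": 0, "male": 0, "unknown": 0}
        for _ in range(samples):
            name = generate(f"Give me a single first name for a {profession}.")
            counts[classify_name(name)] += 1
        return counts

    # Example with a stub in place of a real model:
    # print(probe(lambda prompt: "Anna", "nurse"))   # -> {'female': 20, 'male': 0, 'unknown': 0}

Comparing the tallies for ‘doctor’ and ‘nurse’ makes the kind of skew Colombo observed visible and countable.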

Many other domains are affected by similar issues. For example, if trained on biased datasets containing misogynistic or discriminatory language, GenAI-based online moderation systems may inadvertently allow, or even reinforce, the propagation of gender-based discrimination and violence in online conversations. Despite efforts to address such issues in GenAI, Colombo argues there remains a significant need for ongoing research to ensure GenAI systems mitigate, rather than perpetuate, gender biases and discrimination across industries.

Why should we train and use AI in ways that prevent existing (gender) biases from being perpetuated, reinforced, and amplified?

“To me, the answer is crystal clear,” Colombo responds. “However, not everyone agrees on what worldviews and values should be reflected by AI systems. I believe women’s presence (and that of other marginalised groups) in the AI industry is essential – among other things, to foster meaningful discussions on why and how we should proactively use AI to fight sexism, discrimination, bias, and inequalities that are entrenched in society, rather than reflect or even amplify them. Their presence is also fundamental to debating, in an open, participatory, respectful, and inclusive manner, what future we want to build, also through technology. Because without the presence of women, these conversations are less likely to occur, and we will miss the chance to reduce inequalities, redistribute power, and shape a fairer society by developing a more just technology.”

Instead of human-centred AI, should we focus on woman-centred AI?

“The concept of ‘woman-centred AI’ replacing ‘human-centred AI’ might be a bit misleading,” Sara Colombo states, “although we do need more AI solutions addressing women’s needs, problems, and conditions. To me, the key is to make ‘human-centred AI’ truly humanity-centred, by ensuring that diverse voices, groups, and communities are kept at the centre of AI development. However, ‘women-driven AI’ is a fascinating concept, and I’m curious to see how the growing presence of women in AI will transform this research domain and the AI industry.”

About the work of the Convergence AI Lab: Feminist Generative AI Lab

The Feminist Generative AI Lab was started by the Convergence programme AI, Data & Digitalisation, and is a joint research lab between TU Delft and Erasmus University Rotterdam. Sara Colombo is co-founder and co-director of the Feminist Generative AI Lab. The lab serves as a pioneering hub where generative AI, design, and data feminism meet. Its aim is to find alternative pathways in AI development that prioritise equality and societal good, and that celebrate the richness of diversity in human experience.

All researchers involved are aware of the profound impacts of GenAI applications and are committed to exploring the ethical implications and possibilities within this emerging field through a ‘feminist lens’. As previously discussed, a feminist approach to technology is not just about gender. It is also about fostering inclusivity, challenging binary perspectives, and embracing diversity.

Photography: Zhiying Liu