Blog | Approaching AI as a Site of Feminist Research: A Reflection
Event / Aug 2024


Sasha John

The event, titled 'Research as Resistance: People's Research for Movement Building in Asia Pacific,' included panels on topics like "Contexts We Research In," "Building People Power," and "Decolonising Research." Initially, my panel on "Research, AI, and Digital Rights and Justice" seemed distant from these themes, which aligned more closely with feminist research on labour rights, sexual violence, and economic policy.

But as I reflected on DFL's work, both what we're doing now and what we'll continue to do, it became clear to me that approaching AI as a site of research is, or should be, as much a feminist research prerogative as any other. The AI value chain has far-reaching impacts on the representation of and biases against women, the changing nature of work, labour rights, climate change, and the welfare of marginalised communities in the Global South and beyond.

Feminist Participatory Action Research (FPAR) & Researching Tech

FPAR encompasses research methodologies, practices and objectives that are fundamentally interested in challenging entrenched, unjust power relations through the centring and amplification of women’s voices, knowledge and rights. Research, as we discussed at the convening, is about building evidence to understand real-world issues and their impact on people—something central to DFL's mission. Research can challenge dominant narratives around technology and AI, mobilising people facing rejection, inadequate representation, and discrimination. We were acutely aware of the importance of legitimising women's stories and ground truths as valuable sources of knowledge, challenging the reduction of lived experiences to mere data points.

At DFL, we try to highlight the human side of technology through storytelling, not only because this might be a more effective way to reach people but because it emphasises that at the heart of tech, its innovation and its impact is everyday people. We also discussed the need to make research accessible and understandable for everyone, connecting the dots in ways that stakeholders can easily grasp. FPAR emphasises putting people at the centre of research, focusing on their needs and agency. While this isn’t always possible, it’s worth considering how gig workers, AI users, and data annotators could influence our research process.

AI as a Site for Feminist Research

During our panel, I discussed how feminist researchers should approach AI as a research site, not just a tool. I explained that AI processes large amounts of data to find patterns but doesn't create anything truly original; it is essentially advanced statistics playing at human ingenuity. AI can shine a light on correlations, but these can't always be mapped onto real-life experiences, because people and their lives are not just disaggregated data points. We embody many intersections in constantly changing contexts, and AI cannot fully capture this or always offer anything especially useful.

I also explained the AI value chain quite simply, touching on data collection, curation, annotation, and model building. These processes are resource-intensive and occur within specific socio-political and economic contexts that can be extractive of the communities and resources in the Global South. Understanding AI as a socio-technical phenomenon, not just a technical one, helps us see its broader impact. For example, the construction of data centres and the rise of data annotation work in the Global South highlight the environmental and labour issues intertwined with AI development.

It was important for me to convey that, even if feminist research organisations don't focus on AI or digital rights, they should still care about how these technologies are developed and deployed. AI impacts labour, gender, agriculture, climate action, and civil society advocacy, touching the lives and livelihoods of the people these organisations serve. If we view AI innovation as the sum of many processes shaped by political, social, and economic factors, there will always be a reason to research AI, even if it isn't the organisation's main focus.

I left my co-participants with these action items for working toward the vision of developing a stake in global AI innovation trajectories:

  • Build evidence at each phase of the AI lifecycle: if you work with hospital volunteers and farmers, for example, document their stories with respect to data collection. What is their lived experience of doing this work? How does it materially affect their day-to-day lives? It's a question of "numbers and stories work[ing] together."
  • Challenge the dominant rhetoric: will unfettered innovation actually help these communities? Is digital upskilling productive for the labourer, or only for the tech companies? What margin of error is acceptable, especially in critical sectors and critical public service delivery?
  • Partner and collaborate with tech research and policy organisations and engage in transdisciplinary work: AI, or at least its current trajectory of development, is not inevitable, and more coalition-building is required to shape it.
  • Equip affected communities with critical knowledge of AI, tech and the associated benefits and risks.