
Power Imbalances and Climate Impact in AI Systems

Published: October 9, 2024

On October 2, the Wallenberg AI, Autonomous Systems and Software Program – Humanity and Society (WASP-HS) arranged an online meeting on the topic “Queering AI”. Researchers and industry representatives participated and discussed queer perspectives in AI development, implementation, and discourse.

The WASP-HS event “Queering AI” brought together researchers and industry professionals to discuss the intersection of queer theory and artificial intelligence, focusing on how data-driven technologies reinforce or challenge gender-based normativity and power imbalances. The event, chaired by Matilda Tudor from Uppsala University and Karin Danielsson from Umeå University, aimed to highlight often-overlooked perspectives and foster critical discussion of the potentials and limitations of AI.

Following a welcome from WASP-HS Program Director Christofer Edling, the program opened with a keynote by Daniella Gati, lecturer in Games & Interactive Media at the University of Salford. Gati’s presentation explored the transformative impact AI is having on the way knowledge is understood and produced, highlighting the need for inclusive approaches in technological development.

AI Accelerates Climate Change

One issue raised at the event was the limits of AI, or rather the general lack of awareness of those limits.

“We are facing a lot of problems and dilemmas in society, big and small. Many parties believe that we can solve these problems via AI as long as we keep on adding more data,” says Daniella Gati, lecturer in Games & Interactive Media at the University of Salford, Manchester.

The climate crisis was brought up as an example of such a problem. Although many believe the climate crisis can be solved with AI, these systems require substantial server capacity to run, and those servers in turn consume large amounts of energy. Because much of the energy we use today is generated unsustainably, this limits how AI can responsibly be deployed. Moreover, the energy consumed by AI systems is itself a contributing factor to accelerating climate change.

Stereotypical vs. User-Friendly AI

Another key concern raised at the event was the lack of diversity in AI development and the perpetuation of harmful stereotypes within these systems. Participants highlighted how, when AI models are designed to be “tuneable” or given more options, their default settings often reflect stereotypical assumptions rather than diverse or inclusive perspectives. The discussion touched on how interactions with stereotypical AI systems may feel more user-friendly, but can also be harmful by reinforcing biased interactions.

“Building technologies that truly challenge existing norms may be harder to design and use, but they are essential to avoid perpetuating inequalities and harm in AI,” says Ericka Johnson, Professor of Gender Studies at Linköping University and Co-Director of WASP-HS.

Summarizing Report Available for All

WASP-HS is compiling a report summarizing the discussions from the event. The report will be published on the WASP-HS website in the coming weeks.

See all previously published reports.
