Tiina Leino Lindell
Postdoctoral researcher at the Department of Applied IT, University of Gothenburg


How Can We Ensure Algorithmic Fairness to Protect Children?

Published: May 2, 2023

Tiina Leino Lindell, postdoctoral researcher in the WASP-HS project The Missing Teacher In AI: Involving Teachers in Metadesign of AI to Ensure FAIRness, shares her experiences and insights from the WASP-HS community reference meeting AI, Education and Children, which took place on 19 April 2023.

Today’s children are using digital technologies more than ever, and AI has become ubiquitous across most digital services they use, such as social media, web browsing, and gaming. Yet despite children being active users of digital technology, their rights have not been sufficiently considered in the design of AI systems. Recently, my colleagues Johan Lundin, Professor of Informatics at the University of Gothenburg, and Associate Professor Marisa Ponti held a round table discussion at the WASP-HS event AI, Education and Children to address how we can safeguard children’s rights in the design of AI. Together, we are conducting research in the WASP-HS project The Missing Teacher In AI: Involving Teachers in Metadesign of AI to Ensure FAIRness, where we investigate how AI systems can be adapted for schools to promote local needs for fairness. We are exploring the perspectives of students, teachers, principals, and school developers on how to define fairness and design AI systems that meet their needs.

During the round table discussion, everyone agreed that the rights of children using AI services must be protected to a greater extent. The question of how to achieve this, however, sparked a broader debate. One particularly interesting thread concerned how much responsibility we can give children in the design of AI systems. On one hand, involving children in the design of AI systems gives them greater influence; on the other hand, they are still children, and it is unclear whether they can bear the same responsibility as adults. In most societies, specific obligation bearers are responsible for safeguarding children’s rights, but to what extent can they do so? Such a role requires knowledge of what children’s rights entail in the relevant context. What is considered fair may vary across cultures and contexts, so for adults to protect children’s rights, they need insight into children’s perspectives on what is fair in the specific context at hand. With regard to children’s use of AI systems, this means that obligation bearers also need to understand how young people use AI, and the injustices they experience in these systems, in relation to the local culture where the systems are used.

At the same time, I think it’s worth considering whether questions of fairness are only locally contextualized. Injustices often become visible when we can compare different situations and settings. Therefore, I think it is essential for obligation bearers at all levels of society to understand local needs in the context of the broader societal structures relevant to protecting and promoting children’s rights. This complexity highlights the importance of research in contributing knowledge to our society, but also the need for meeting places for knowledge exchange between people representing all levels of obligation bearers. By exchanging experiences in, for example, round table discussions, we have an opportunity to address these issues from a societal perspective.

While arenas for conversations between adults are undoubtedly important, I couldn’t help but notice the absence of children in the discussions. This led me to wonder whether it’s possible to organize round table discussions that include children’s perspectives and representation. Our research project’s preliminary analysis has revealed that students, teachers, principals, and school developers hold differing viewpoints on the meaning of fairness and the appropriate design of AI systems. Including children in round table conversations could bring alternative perspectives to the discussions and enrich our understanding of the issues. By doing so, we wouldn’t just be discussing children’s rights in relation to their needs, but also talking with the children themselves. But if we want to prioritize children’s perspectives and representation in such discussions, a crucial question arises: how can we involve them meaningfully in conversations at all levels of society that relate to their rights?

Through my participation in the round table discussions, I have gained a deeper understanding of the complexities involved in promoting children’s rights. The conversations have sparked new insights and raised thought-provoking questions about whether there are alternative approaches to promoting children’s rights beyond the current ones. The dialogue has challenged my perspectives and prompted me to reflect on the crucial importance of continuing to discuss and collaborate at all levels of society to advance fairness in AI systems used by children. One possible way forward could be to organize more conversations and events where children can be directly involved in discussions about their rights in relation to AI. By allowing children to take part and express their views, we can gain a better understanding of their needs and concerns, and thus develop more meaningful strategies for protecting their rights.
