Kristina Höök, KTH, is the project leader for the project Ethics as Enacted through Movement – Shaping and Being Shaped by Autonomous Systems. Together with Airi Lampinen, Stockholm University, she wants to explore how ethics arise in the moment – between user and autonomous systems such as drones.
What’s your view of being a part of WASP-HS? What does this mean for you?
That we would be included in WASP-HS was a lovely surprise to me! Once I saw the other researchers involved from different disciplines and parts of Sweden, I was very pleased, as there are so many excellent researchers I really want to exchange ideas with.
What are your expectations of WASP-HS, both for society at large and for research in Sweden?
WASP-HS gives me hope that we will have novel perspectives and thereby open creative possibilities to take AI into use in sensible and accountable ways. To me AI is in some ways just another technology, but one that needs to be ‘tamed’ to properly incorporate ethical and accountable practices.
At the same time as WASP-HS gives me hope, I also believe we need to work hard to integrate WASP-HS with WASP. I truly believe that interdisciplinarity is the only path forward for this field (even if I find it very hard and time-consuming to engage with). I look forward to finding ways we can engage with WASP researchers.
What do you want your own project to lead to?
As interaction designers we are interested in how ethics is enacted and shaped by exactly how we design autonomous systems. In this project we are focusing in particular on aerial drones.
Drones are fascinating as we, in a sense, gain superhuman powers: we become cyborgs or centaurs as we get entangled with them. They move, make a lot of noise, and behave in ways that look intelligent and alive to onlookers. As our ways of understanding the world fundamentally see movement as a sign of intentionality, drones become the ‘other’ to us.
Even more interesting to us is how drones and other autonomous technologies (depending on how they are designed) require that we move in certain ways to interact with them, spurring certain aesthetic experiences, practices, and responses while discouraging others. The argument driving our project is that it is precisely in that interplay – in those movements and adaptations of behavior – that ethics is enacted and enforced.
To an interaction designer attempting to create drone behaviors, ethics is not a bunch of abstract principles residing in committees and institutions; it is not an ‘attribute’ that we ‘give’ to a system, formulated into some sort of ethical risk management checklist; nor is it something that can be described in terms of individual, rational decision-making. Instead, ethics is emergent in the interactions we, as designers (and users), enable. We shape and are shaped by these autonomous systems. Ethics becomes emergent and enacted in the human-drone entanglement.