Emotions and human-robot interactions

Published: December 12, 2019

Ericka Johnson, Professor at Linköping University, is the project leader of the WASP-HS project “The ethics and social consequences of AI & caring robots. Learning trust, empathy and accountability”.

What’s your view on being a part of WASP-HS?

For us, being a part of WASP-HS means being a part of an interdisciplinary group of researchers who are concerned about the implications of new technologies on existing and future social practices.

Our research looks at the production of emotions during human-robot interactions. At one level, we are interested in which affects are produced in the human during this interaction. But we are also interested in which particulars of specific relational interactions produce those affects or emotions. In doing so, we ask questions directed at the contingent and temporal relations between the humans and the robots.

We direct our research as much to the entirety of the interaction, and the context in which it occurs, as to the production and display of emotions in the human component of that relation.

What do you want your own project to lead to?

We assume that robots – of various shapes, sizes, and forms – are going to provide aspects of care work that today are done by humans. Because this will change how we expect care to be provided, to us and to those we love, our research asks how developers program robots to behave in ways that people find trustworthy and approachable. One aspect of this care work is communication, often verbal. Therefore, part of our research will explore how the material-discursive aspects of human-robot interactions shape the creation of rapport in various conversations.

Additionally, we will be studying how designers and users perceive and express emotions by analysing their understanding of sensor data and robotic perception through the theoretical prism of affect.

Our starting point is the belief that we and our children are going to be expected to interact with robots as they perform different kinds of care for and with us at different life stages. Therefore, we are asking what that will do to how we – and how the robots’ designers – think of care. And how we are going to produce accountability, trust and empathy in the relational intra-actions we have, together.

What are your expectations of WASP-HS?

WASP-HS is particularly important for our work because it allows us to be part of the WASP-HS network of projects and of the wider WASP community. As such, we are able to engage in generative collaborations and conversations with other robot research groups, many of whom have similar research interests but very different ways of formulating them.

What we notice is that even when we are all asking what, on the surface, appear to be similar questions, our various theoretical backgrounds produce very different approaches. Our hope is that the organizational aspects of WASP-HS (conferences, meetings, workshops and the graduate school) will allow us to converse with these other groups and approaches in ways that challenge our preformulated assumptions, make us rethink how we study robots, and change what we see when we do.

PhD student positions related to the project:
The ethics and social consequences of AI & caring robots. Learning trust, empathy and accountability