
AI Helps in Understanding What Makes Us Human

Published: September 14, 2022

Mariana Busarello, WASP-HS PhD student at the Swedish University of Agricultural Sciences, shares her experience and reflections on the WASP-HS Summer School 2022.

I am an outgoing person – I love to get to know new people, talk, listen, and learn. So it’s no surprise how much I was looking forward to taking part in the WASP-HS Summer School in August. Five days with colleagues from unique academic backgrounds, working together to ponder aspects of society–robot interaction, under the guidance of professionals doing cutting-edge research in Sweden? Sign me up!

For the project developed during the week, we were introduced to Epi (more specifically, Epi’s head), a humanoid robot developed by the Cognitive Science Robotics Group at Lund University. We were asked to make Epi perform several facial expressions to communicate feelings. We were divided into groups, and Christian Balkenius explained that the division took our skills into account; it was up to us to find out how they would best work together.

Unfortunately, on my very first day I tested positive for COVID-19, so I self-isolated until my symptoms were gone and I tested negative. The programme administration offered their help all the way, always checking how I was feeling and managing. It was proposed that, if I wanted to, I could try to work remotely with my group. I was extra thankful for that; my group was very accommodating and made it possible for me to work and participate through online meetings!

Back to the project: as my fellow WASP-HS colleague Sergio Passero pointed out, Epi is a multimodal agent, which means that every dimension needs to be considered when modelling expressions. We could control head movement, eye colour, pupil dilation, eye movement, and mouth intensity and colour. Some movements, such as nodding and shaking the head, created an obvious association with approval or disapproval, but how could we take that further? For example, research shows that pupil dilation is linked to positive and negative feelings [1], and that eye gaze can affect trust during task execution [2]. With this in mind, we started to think about different applications that we could test, as sketched below.
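
To make this concrete, here is a minimal sketch of how an expression could be encoded as a combination of those channels. The Expression type, field names, and values are hypothetical illustrations for this post, not Epi’s actual control interface.

```python
from dataclasses import dataclass

# Hypothetical encoding of one multimodal expression. The channels mirror
# the dimensions we could control on Epi, but the API itself is invented
# for illustration.
@dataclass
class Expression:
    head_movement: str                  # e.g. "nod", "shake", "tilt"
    eye_colour: tuple[int, int, int]    # RGB
    pupil_dilation: float               # 0.0 (constricted) to 1.0 (dilated)
    gaze: tuple[float, float]           # horizontal/vertical gaze offset
    mouth_intensity: float              # 0.0 (off) to 1.0 (full)
    mouth_colour: tuple[int, int, int]  # RGB

# An enthusiastic agreement might stack several positive cues at once:
# a nod, dilated pupils (linked to positive affect [1]) and a direct
# gaze (which can influence trust [2]).
AGREEMENT = Expression(
    head_movement="nod",
    eye_colour=(0, 200, 80),
    pupil_dilation=0.9,
    gaze=(0.0, 0.0),
    mouth_intensity=1.0,
    mouth_colour=(0, 200, 80),
)
```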

But in what context would our robot perform? Considering the (then) upcoming Swedish election, we decided to develop a “political” robot: Val-E. We discussed how it could be used in society to reach out to potential voters, or to represent a party’s position when posed a political question. Its friendly appearance could also make for a better interaction with a citizen bringing their concerns to the party, recording the exchange and sending it to the respective political affiliation.

Some of the group members giving feedback on Val-E’s movements

Val-E would react to some of the political questions gathered from the Valkompass website, leaving it to the audience to interpret what the reaction meant: agreement or disagreement. People were encouraged to join through an interactive presentation platform and were given a few moments to vote after witnessing each movement. It was quite interesting to see from the audience’s feedback whether a particular expression was conveyed successfully, which helped us realise that some reactions were modelled too ambiguously (a simple way to quantify this is sketched below). At the end of the performance, the public was asked to describe Val-E. This showed us that, depending on how an agent politically positions itself, the way others perceive it might change.
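
As a rough illustration of how such ambiguity could be quantified, here is a small sketch with made-up vote counts: an expression whose agree/disagree votes split close to 50/50 was read ambiguously by the audience.

```python
# Hypothetical audience tallies per expression: (agree, disagree) counts.
votes = {
    "fast_nod": (41, 3),
    "slow_tilt": (18, 16),   # near-even split: read ambiguously
    "head_shake": (4, 39),
}

for name, (agree, disagree) in votes.items():
    total = agree + disagree
    # A margin near 0 means the audience split evenly, i.e. the expression
    # was modelled too ambiguously to convey clear (dis)agreement.
    margin = abs(agree - disagree) / total
    verdict = "clear" if margin >= 0.5 else "ambiguous"
    print(f"{name}: margin {margin:.2f} -> {verdict}")
```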

This project also made me ponder how difficult it can be to portray agreement or disagreement with limited channels, even for a human being. For this reason, the study of cognitive science and artificial intelligence is always fascinating to me: by considering how to make a robot behave in a human-like manner, we also contemplate which aspects of our own behaviour make us human.

Val-E performs an enthusiastic agreement

References

[1] Babiker, Areej, Ibrahima Faye, and Aamir Malik. “Pupillary behavior in positive and negative emotions.” 2013 IEEE International Conference on Signal and Image Processing Applications. IEEE, 2013.

[2] Stanton, Christopher, and Catherine J. Stevens. “Robot pressure: the impact of robot eye gaze and lifelike bodily movements upon decision-making and trust.” International Conference on Social Robotics. Springer, Cham, 2014.

