Heart Wiring

Why we should strive to make robots as empathic as humanly possible

Slowly but surely, robots are moving from industrial settings into more interactive and social environments. To navigate these environments and the interactions within them, robots must be able to interact successfully with humans. Such robots are often modeled with human cognition in mind, both because interaction requires some common ground and mutual understanding, and because for many cognitive capabilities humans are the best example we have. After all, humans generally consider themselves to be the smartest and most social species on earth [6].

Empathy: what is it?

One of these fundamental human capacities is the ability to empathise with one another. Empathy has been thoroughly studied, yet there is no consensus on its definition. Generally, one can speak of three different kinds of empathy: cognitive empathy, affective (or emotional) empathy, and a combination of both [7,12,13]. Cognitive empathy can be roughly defined as the ability to understand other people’s emotions and behaviour through mental perspective taking. Affective empathy, also known as the sharing of vicarious emotions, can be defined as a process that makes a person (or agent) have “feelings that are more congruent with another’s situation than with his own situation” [12]. A. Tapus and M. Mataric formulated guidelines for the capacities a social robot needs in order to emulate empathy [14]. In their paper they acknowledge that machines cannot feel and express empathy; instead, they focus on how a social robot can appear empathic.

Similarly, B. Duffy states that, due to the embodied differences between humans and robots, we cannot ensure robots will have the same empathic emotions. However, he argues that for successful social interactions it is enough for a robot to appear intelligent or socially capable. In human-human social interactions, people do not use standardised metrics to measure intelligence and social capacities; we simply observe them. A robot that appears (socially) intelligent can already facilitate social interactions, as it “speaks our language” [6]. We might not be able to model “the real deal” when building empathic robots, but we can make them appear empathic. Here we would use empathy not so much as a set of feelings, but as a driving force in behaviour and social interactions. But why would we want empathic robots? What makes empathy so crucial in social interactions?

Empathy: fundamental to human interactions

As stated earlier, empathy is fundamental to human cognition and human-to-human social interactions. It has mainly been associated with, and is a known mediator of, prosocial behaviour, which can significantly impact how people treat each other [10]. Examples of prosocial behaviour are donating money or extending a helping hand when someone has had an accident. It has been argued that cognitive empathy evolved in response to the complex social demands of human interactions [13]. Cognitive empathy enables behaviours such as facilitating conversation, building social expertise, and predicting others’ behaviour, but also less savoury behaviours such as lying and deceiving, or recognizing when such things happen to you. Affective empathy, on the other hand, is mainly associated with altruistic behaviour and nurturing relationships, and is related to moral mechanisms. A simple example: you might be less inclined to punch someone in the face because you understand it will hurt and can empathise with the other’s pain. Another example of how empathy modulates social relations is given by Bastian et al., who had students eat spicy chilli peppers in groups and showed that the shared pain facilitated group formation [1].

Empathy and moral decision making

As robots enter social spaces and their autonomy grows, the need for moral agency in robotics increases. Whether and how robots can be moral agents is heavily debated. B. Duffy argues that a humanoid robot specifically designed for social interaction, in particular, needs to have moral rights and duties: it looks and behaves like a human, and thus draws on our frame of reference, which sets high expectations. Even more fiercely discussed is the role of empathy in moral agency. Morality and empathy are often mentioned in the same breath, yet the relationship between them is not very clear [15]. Generally, there is consensus about the importance and influence of perspective taking in moral judgements. Empathy allows agents to understand the effect of their actions on the emotions and mental states of others, which directly influences people’s behaviour towards others; this corresponds to what we earlier described as cognitive empathy. Others, however, have argued for a more fundamental role of emotions in empathy and moral decision-making. L. Damm describes an account of moral responsibility that critically depends on the empathic capacities of the agent. In her paper, she argues that an agent’s moral responsibility depends on its status as a moral agent: one who cannot satisfy the criteria for moral agency is considered not fully responsible [5]. This would indicate that robots also need a form of affective empathy to be considered fully morally responsible.

Whether this is possible is highly contested, but still open for debate, and critically depends on whether we believe a biological body is needed to experience emotions [4]. Given the assumption that affective empathy is indeed needed for full moral responsibility, this would mean that a robot cannot be fully morally responsible, much like a child cannot. Yet even if one believes robots cannot experience emotions, and thus cannot be truly and fully empathic, the role of empathy in robot morality is not over. We can still use the construct of cognitive empathy, or abstract away from biology and view emotions as one of the driving forces behind behaviour. We have now seen how empathy is a foundation for social interactions and morality, and that both will be much needed in the future. But how does this translate into practice? Apart from moral robots, what applications can we dream about?

Empathy from a practical standpoint

Empathic agents are still in their infancy and very limited in their empathic abilities. However, there are a few domains in which the use of such systems is being investigated. An example is empathic tutoring. A current, very ambitious project is the EMOTE project, which aims to develop an empathic tutoring system that facilitates the learning experience of children [3]. Knowledge from social psychology and models of empathy is applied to create emotionally intelligent and responsive tutor agents (robotic and virtual).

A. Tapus and M. Mataric made their recommendations on how to model empathy with the intention of improving therapeutic robots [14]. They found that empathy is a crucial component of therapy, and that a robotic system intended as a therapeutic aid should therefore also have empathic capabilities. Social robots may also be used in therapy, for example to support patients with autism spectrum disorder (ASD) [8]. Another use case for emotional robots is the robot as companion. Leite et al. developed a social robot companion that reacted in an empathic manner to a chess game played between people, by displaying facial expressions and uttering phrases [10]. People to whom the robot reacted empathically rated the robot as friendlier. Empathy can also go the other way around: the robot can evoke empathy in humans. Paiva et al. addressed how an empathy-evoking agent can be used to persuade children to do “the right” action [11]. This demonstrates we can even use robots to enhance people’s moral or social behaviour!

Empathy: a demon we need to tackle

Apart from the debate on whether robots can experience emotions and thus be truly empathic themselves, it is important to note that affective intelligence in cognitive robotics is highly controversial [4]. A critical moral dilemma accompanying social robots that portray empathy and serve as a listening ear (for example, for patients) is that of deception [2]. If a robot merely appears to be empathic and understanding, it might betray our trust. The question is whether the end justifies the means.

However, we have seen that empathy is crucial in social interactions, and that we need to implement empathic capabilities, at least to a certain extent, to make robots capable of navigating social environments. Empathy has practical benefits in tutoring and therapeutic systems, and could also be crucial for companion robots. We have seen that the need for robot ethics is rising and that empathy is a crucial component of human morality. All in all, we should strive to make robots as empathic as humanly possible.

References:

[1] Bastian, B., Jetten, J., & Ferris, L. J. (2014). Pain as Social Glue: Shared Pain Increases Cooperation. Psychological Science, 25(11), 2079–2085. https://doi.org/10.1177/0956797614545886

[2] Bradwell, H. L., Winnington, R., Thill, S., & Jones, R. B. (2020). Ethical perceptions towards real-world use of companion robots with older people and people with dementia: Survey opinions among younger adults. BMC Geriatrics, 20(1), 1–10. https://doi.org/10.1186/s12877-020-01641-5

[3] Castellano, G., Paiva, A., Kappas, A., Aylett, R., Hastie, H., Barendregt, W., Nabais, F., & Bull, S. (2013). Towards empathic virtual and robotic tutors. Lecture Notes in Computer Science (Including Subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics), 7926 LNAI, 733–736. https://doi.org/10.1007/978-3-642-39112-5-100

[4] Cowie, R. (2012). Ethical issues in affective computing: Formal and informal foundations of ethics. 1740.

[5] Damm, L. (2010). Emotions and moral agency. Philosophical Explorations, 13(3), 275–292. https://doi.org/10.1080/13869795.2010.501898

[6] Duffy, B. R. (2006). Fundamental Issues in Social Robotics. International Review of Information Ethics (IRIE), 6(March 2003), 31–36.

[7] Edele, A., Dziobek, I., & Keller, M. (2013). Explaining altruistic sharing in the dictator game: The role of affective empathy, cognitive empathy, and justice sensitivity. Learning and Individual Differences, 24, 96–102. https://doi.org/10.1016/j.lindif.2012.12.020

[8] Esteban, P. G., Baxter, P., Belpaeme, T., Billing, E., Cai, H., Cao, H. L., Coeckelbergh, M., Costescu, C., David, D., De Beir, A., Fang, Y., Ju, Z., Kennedy, J., Liu, H., Mazel, A., Pandey, A., Richardson, K., Senft, E., Thill, S., … Ziemke, T. (2017). How to build a supervised autonomous system for robot-enhanced therapy for children with autism spectrum disorder. Paladyn, 8(1), 18–38. https://doi.org/10.1515/pjbr-2017-0002

[9] Leiberg, S., Eippert, F., Veit, R., & Anders, S. (2012). Intentional social distance regulation alters affective responses towards victims of violence: An FMRI study. Human Brain Mapping, 33(10), 2464–2476. https://doi.org/10.1002/hbm.21376

[10] Leite, I., Pereira, A., Mascarenhas, S., Martinho, C., Prada, R., & Paiva, A. (2013). The influence of empathy in human-robot relations. International Journal of Human Computer Studies, 71(3), 250–260. https://doi.org/10.1016/j.ijhcs.2012.09.005

[11] Paiva, A., Dias, J., Sobral, D., Woods, S., Aylett, R., Sobreperez, P., Zoll, C., & Hall, L. (2004). Caring for agents and agents that care: Building empathic relations with synthetic agents. Proceedings of the Third International Joint Conference on Autonomous Agents and Multiagent Systems, AAMAS 2004, 1, 194–201.

[12] Paiva, A., Leite, I., Boukricha, H., & Wachsmuth, I. (2017). Empathy in virtual agents and robots: A survey. ACM Transactions on Interactive Intelligent Systems, 7(3). https://doi.org/10.1145/2912150

[13] Smith, A. (2006). Cognitive empathy and emotional empathy in human behavior and evolution. Psychological Record, 56(1), 3–21. https://doi.org/10.1007/BF03395534

[14] Tapus, A., & Matarić, M. J. (2007). Emulating empathy in socially assistive robotics. AAAI Spring Symposium – Technical Report, SS-07-07, 93–96.

[15] Ugazio, G., Majdandžić, J., & Lamm, C. (2014). Are empathy and morality linked? Insights from moral psychology, social and decision neuroscience, and philosophy. 1–27.