
Can robots express facial emotions dominantly enough for use in dementia care?

Published online by Cambridge University Press:  21 July 2020

Evgenios Vlachos*
Affiliation:
The Maersk Mc-Kinney Moller Institute, University of Southern Denmark, Campusvej 55, 5230 Odense M, Denmark; The University Library of Southern Denmark, University of Southern Denmark, Campusvej 55, 5230 Odense M, Denmark
Zheng-Hua Tan
Affiliation:
Signal and Information Processing, Department of Electronic Systems, Aalborg University, Fredrik Bajers Vej 7, 9220 Aalborg, Denmark
*
Correspondence should be addressed to: Evgenios Vlachos, The Maersk Mc-Kinney Moller Institute, University of Southern Denmark, Campusvej 55, 5230 Odense M, Denmark. Phone: +45 65 50 94 86. Email: evvl@mmmi.sdu.dk.

Type: Letter to the Editor
Copyright: © International Psychogeriatric Association 2020

Impaired facial emotion recognition could be a contributing factor in the social and cognitive deterioration observed in persons with dementia (PWD), influencing well-being and social competence and increasing the stress levels of caregivers. Even in mild stages of dementia, impaired recognition of emotion is frequent if the emotion is not expressed dominantly enough (Spoletini et al., 2008). This work constitutes a step toward robot-assisted dementia care, and the underlying concept is that robots can express certain emotions quite dominantly.

Our objective is to evaluate the recognition and denomination of the six basic emotional facial expressions as displayed by iSocioBot (Supplementary Figure, DOI: 10.5281/zenodo.3834225) to PWD, and to compare the results with those obtained from static photographs of humans from the Paul Ekman database (Supplementary Method, DOI: 10.5281/zenodo.3834225), in order to investigate differences in recognition rates between the two stimuli. iSocioBot’s facial expressions were previously validated with healthy adults (Tan et al., 2018). The study can be replicated using robots with screens, and its novelty lies in being the first to use a robot for this purpose.

This is a cross-sectional, between-subjects study conducted with PWD (mini-mental state examination score <29) at three care homes in Denmark. Participants, anonymously and voluntarily, answered the forced-choice question “What emotion do you think the face is showing?” by selecting from the six basic emotions presented in random order. A confusion matrix and Fisher’s exact test were used in the analysis. Informed written consent was obtained, and the study was approved by the Research Ethics Committee of the North Denmark Region.

Twenty PWD with a mean age of 82.3 years (range 69–92) were randomized to the photo (n = 9) and robot (n = 11) conditions. PWD dominantly recognized the positive robotic expressions of “happiness” and “surprise,” as well as the negative expression of “sadness,” but “fear,” “disgust,” and “anger” were often confused (Supplementary Table, DOI: 10.5281/zenodo.3834225). Cohen’s kappa indicates fair agreement for the robot condition (K = 0.260) and moderate agreement for the human condition (K = 0.467). No significant difference (p = 0.461) was found in emotion recognition between the two conditions.
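For readers wishing to replicate the analysis, the two statistics reported above can be computed as sketched below. This is a minimal illustration with invented counts and labels, not the study’s data: the 2×2 table (condition × correct/incorrect responses) feeds Fisher’s exact test, and Cohen’s kappa is computed from paired intended-versus-selected emotion labels.

```python
# Illustrative sketch of the reported analysis; all counts and labels
# below are hypothetical, not the study's data.
from scipy.stats import fisher_exact
from sklearn.metrics import cohen_kappa_score

# 2x2 contingency table: rows = condition (robot, photo),
# columns = (correct, incorrect) responses -- invented counts.
table = [[40, 26], [38, 16]]
odds_ratio, p = fisher_exact(table)

# Cohen's kappa: agreement between the emotion the face was intended
# to show and the emotion the participant selected (invented labels).
intended = ["happiness", "sadness", "fear", "anger", "surprise", "disgust"] * 2
selected = ["happiness", "sadness", "anger", "anger", "surprise", "fear"] * 2
kappa = cohen_kappa_score(intended, selected)
```

Kappa is interpreted against the conventional bands (0.21–0.40 fair, 0.41–0.60 moderate), matching the fair/moderate labels reported above.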

Since this was the first study using a robot for facial expression recognition with PWD in real time, we can only discuss our findings in relation to studies using human photographs. In Henry et al. (2008), PWD had difficulty in denominating fearful, angry, and happy expressions. We confirm that the emotions of “fear” and “anger” had low recognition rates for both stimuli, but the recognition rates for the positive emotions (“happiness” and “surprise”) were high, confirming Guaita et al. (2009). In Hargrave et al. (2002), PWD failed to denominate the emotions of “sadness,” “surprise,” and “disgust.” Our results confirm that “disgust” is an emotion with low recognition rates regardless of stimulus, in contrast to “surprise,” one of the most recognizable emotions.

Our findings suggest that robots with the capacity to convey specific dominant facial emotions may be used in dementia care, complementing human presence without reducing or replacing it.

Acknowledgments

The authors would like to thank Postdoc Xiaodong Duan, MD, PhD, Krystian Figlewski, Bent Fuglsbjerg from SOSU Nord, Bent Sørensen from Aalborg Municipality, and all the personnel in Lions Park, Fremtidens Plejehjem, and Skipper Klement for their support and assistance. The research was partly supported by the Danish Council for Independent Research – Technology and Production Sciences (grant number: 1335-00162).

Conflict of interest

None.

Description of authors’ roles

Evgenios Vlachos formulated the research question, designed the study, collected the data, analyzed the data, and wrote the paper. Zheng-Hua Tan obtained the funding and supervised the study, developed and controlled the robotic apparatus, assisted with designing the study, and reviewed the paper.

References

Guaita, A. et al. (2009). Impaired facial emotion recognition and preserved reactivity to facial expressions in people with severe dementia. Archives of Gerontology and Geriatrics, 49, 135–146. doi: 10.1016/j.archger.2009.09.023
Hargrave, R., Maddock, R.J. and Stone, V. (2002). Impaired recognition of facial expressions of emotion in Alzheimer’s disease. The Journal of Neuropsychiatry and Clinical Neurosciences, 14(1), 64–71. doi: 10.1176/jnp.14.1.64
Henry, J.D. et al. (2008). Recognition of disgust is selectively preserved in Alzheimer’s disease. Neuropsychologia, 46(5), 1363–1370. doi: 10.1016/j.neuropsychologia.2007.12.012
Spoletini, I. et al. (2008). Facial emotion recognition deficit in amnestic mild cognitive impairment and Alzheimer disease. The American Journal of Geriatric Psychiatry, 16(5), 389–398. doi: 10.1097/JGP.0b013e318165dbce
Tan, Z.H. et al. (2018). iSocioBot: a multimodal interactive social robot. International Journal of Social Robotics, 10(1), 5–19. doi: 10.1007/s12369-017-0426-7