Equipping Social Robots with Culturally-Sensitive Facial Expressions of Emotion Using Data-Driven Methods

Chen, C., Hensel, L. B., Duan, Y., Ince, R., Garrod, O. G. B., Beskow, J., Jack, R. E. and Schyns, P. G. (2019) Equipping Social Robots with Culturally-Sensitive Facial Expressions of Emotion Using Data-Driven Methods. In: 14th IEEE International Conference on Automatic Face and Gesture Recognition (FG 2019), Lille, France, 14-18 May 2019, ISBN 9781728100890 (doi: 10.1109/FG.2019.8756570)


Abstract

Social robots must be able to generate realistic and recognizable facial expressions to engage their human users. Many social robots are equipped with standardized facial expressions of emotion that are widely considered to be universally recognized across all cultures. However, mounting evidence shows that these facial expressions are not universally recognized - for example, they elicit significantly lower recognition accuracy in East Asian cultures than in Western cultures. Without culturally sensitive facial expressions, therefore, state-of-the-art social robots are restricted in their ability to engage a culturally diverse range of human users, which in turn limits their global marketability. To develop culturally sensitive facial expressions, novel data-driven methods are used to model the dynamic face movement patterns that convey basic emotions (e.g., happiness, sadness, anger) in a given culture, based on the perceptions of members of that culture. Here, we tested whether such dynamic facial expression models, derived in an East Asian culture and transferred to a popular social robot, improved the robot's generation of social signals for East Asian participants. Results showed that, compared to the social robot's existing set of 'universal' facial expressions, the culturally sensitive facial expression models were recognized with generally higher accuracy and judged as more human-like by East Asian participants. We also detail the specific dynamic face movements (Action Units) that are associated with high recognition accuracy and judgments of human-likeness, including those that further boost performance. Our results therefore demonstrate the utility of data-driven methods that use human cultural perception to derive culturally sensitive facial expressions, which improve the social face signal generation capabilities of social robots. We anticipate that these methods will continue to inform the design of social robots and broaden their usability and global marketability.

Item Type:Conference Proceedings
Status:Published
Refereed:Yes
Glasgow Author(s) Enlighten ID:Garrod, Dr Oliver and Duan, Yaocong and Hensel, Laura and Jack, Professor Rachael and Chen, Dr Chaona and Schyns, Professor Philippe and Ince, Dr Robin
Authors: Chen, C., Hensel, L. B., Duan, Y., Ince, R., Garrod, O. G. B., Beskow, J., Jack, R. E., and Schyns, P. G.
College/School:College of Medical, Veterinary and Life Sciences > School of Psychology & Neuroscience
College of Science and Engineering > School of Psychology
ISBN:9781728100890
Copyright Holders:Copyright © 2019 IEEE
Publisher Policy:Reproduced in accordance with the copyright policy of the publisher
