Cultural facial expressions dynamically convey emotion category and intensity information

Chen, C., Messinger, D. S., Chen, C., Yan, H., Duan, Y., Ince, R. A. A., Garrod, O. G. B., Schyns, P. G. and Jack, R. E. (2024) Cultural facial expressions dynamically convey emotion category and intensity information. Current Biology, 34(1), 213-223.e5. (doi: 10.1016/j.cub.2023.12.001) (PMID: 38141619)

310188.pdf - Published Version (4MB)
Available under License Creative Commons Attribution.

Abstract

Communicating emotional intensity plays a vital ecological role because it provides valuable information about the nature and likelihood of the sender’s behavior.1,2,3 For example, attack often follows signals of intense aggression if receivers fail to retreat.4,5 Humans regularly use facial expressions to communicate such information.6,7,8,9,10,11 Yet how this complex signaling task is achieved remains unknown. We addressed this question using a perception-based, data-driven method to mathematically model the specific facial movements that receivers use to classify the six basic emotions—“happy,” “surprise,” “fear,” “disgust,” “anger,” and “sad”—and judge their intensity in two distinct cultures (East Asian, Western European; total n = 120). In both cultures, receivers expected facial expressions to dynamically represent emotion category and intensity information over time, using a multi-component compositional signaling structure. Specifically, emotion intensifiers peaked earlier or later than emotion classifiers and represented intensity using amplitude variations. Emotion intensifiers were also more similar across emotions than classifiers were, suggesting a latent broad-plus-specific signaling structure. Cross-cultural analysis further revealed similarities and differences in expectations that could impact cross-cultural communication. Specifically, East Asian and Western European receivers have similar expectations about which facial movements represent high intensity for threat-related emotions, such as “anger,” “disgust,” and “fear,” but differ on those that represent low-threat emotions, such as happiness and sadness. Together, our results provide new insights into the intricate processes by which facial expressions can achieve complex dynamic signaling tasks by revealing the rich information embedded in facial expressions.

Item Type: Articles
Additional Information: This work was supported by the Leverhulme Trust (Early Career Fellowship, ECF-2020-401), the University of Glasgow (Lord Kelvin/Adam Smith Fellowship, 201277) and the Chinese Scholarship Council (201306270029) awarded to C.C. (Chaona Chen); the National Science Foundation (2150830) and the National Institute for Deafness and Other Communication Disorders (R01DC018542) awarded to D.S.M.; the Natural Science Foundation of China (62276051) awarded to H.M.Y.; the China Scholarship Council (201606070109) awarded to Y.D.; the Wellcome Trust (214120/Z/18/Z) awarded to R.A.A.I.; the Wellcome Trust (Senior Investigator Award, UK; 107802) and the Multidisciplinary University Research Initiative/Engineering and Physical Sciences Research Council (USA, UK; 172046-01) awarded to P.G.S.; and the European Research Council (ERC) under the European Union’s Horizon 2020 research and innovation programme (grant agreement FACESYNTAX no. 759796), the British Academy (SG113332 and SG171783), the Economic and Social Research Council (ES/K001973/1 and ES/K00607X/1) and the University of Glasgow (John Robertson Bequest) awarded to R.E.J.
Status: Published
Refereed: Yes
Glasgow Author(s) Enlighten ID: Jack, Professor Rachael and Yan, Professor Hongmei and Garrod, Dr Oliver and Schyns, Professor Philippe and Chen, Dr Chaona and Duan, Mr Yaocong and Ince, Dr Robin
Authors: Chen, C., Messinger, D. S., Chen, C., Yan, H., Duan, Y., Ince, R. A. A., Garrod, O. G. B., Schyns, P. G., and Jack, R. E.
College/School: College of Medical Veterinary and Life Sciences > School of Psychology & Neuroscience
Journal Name: Current Biology
Publisher: Elsevier (Cell Press)
ISSN: 0960-9822
ISSN (Online): 1879-0445
Published Online: 22 December 2023
Copyright Holders: Copyright © 2023 The Authors
First Published: First published in Current Biology 34(1): 213-223.e5
Publisher Policy: Reproduced under a Creative Commons License
Data DOI: 10.17605/OSF.IO/3M95W


Project Code | Award No | Project Name | Principal Investigator | Funder's Name | Funder Ref | Lead Dept
310397 | | Equipping artificial agents psychologically-derived dynamic facial expressions | Chaona Chen | Leverhulme Trust (LEVERHUL) | ECF-2020-401 | SPN - Centre for Social Cognitive & Affective Neuroscience
304240 | | Beyond Pairwise Connectivity: developing an information theoretic hypergraph methodology for multi-modal resting state neuroimaging analysis | Robin Ince | Wellcome Trust (WELLCOTR) | 214120/Z/18/Z | SPN - Centre for Cognitive Neuroimaging (CCNi)
172413 | | Brain Algorithmics: Reverse Engineering Dynamic Information Processing Networks from MEG time series | Philippe Schyns | Wellcome Trust (WELLCOTR) | 107802/Z/15/Z | SPN - Centre for Cognitive Neuroimaging (CCNi)
172046 | | Visual Commonsense for Scene Understanding | Philippe Schyns | Engineering and Physical Sciences Research Council (EPSRC) | EP/N019261/1 | SPN - Centre for Cognitive Neuroimaging (CCNi)
304001 | | Computing the Face Syntax of Social Communication | Rachael Jack | European Research Council (ERC) | 759796 | SPN - Psychology & Neuroscience Pedagogy Unit
167641 | | Mapping Cultural Differences in Facial Expressions of Emotion | Rachael Jack | British Academy (BRITACAD) | SG113332 | Psychology
300796 | | Equipping Social Robots with Culturally Valid Facial Expressions | Rachael Jack | British Academy (BRITACAD) | SG171783 | Psychology
190558 | | Mapping the Cultural Landscape of Emotions for Social Interaction | Rachael Jack | Economic and Social Research Council (ESRC) | ES/K001973/1 | Psychology
190552 | | DADIOS (Data-driven Analysis of the Dynamics of Information-acquisition Over time during Social judgement) | Philippe Schyns | Economic and Social Research Council (ESRC) | ES/K00607X/1 | SPN - Centre for Cognitive Neuroimaging (CCNi)