Facial expressions elicit multiplexed perceptions of emotion categories and dimensions

Liu, M., Duan, Y., Ince, R. A. A., Chen, C., Garrod, O. G. B., Schyns, P. G. and Jack, R. E. (2022) Facial expressions elicit multiplexed perceptions of emotion categories and dimensions. Current Biology, 32(1), 200-209.e6. (doi: 10.1016/j.cub.2021.10.035) (PMID: 34767768) (PMCID: PMC8751635)

257107.pdf - Published Version (3MB)
Available under a Creative Commons Attribution License.

Abstract

Human facial expressions are complex, multi-component signals that can communicate rich information about emotions [1-5], including specific categories, such as “anger,” and broader dimensions, such as “negative valence, high arousal” [6-8]. An enduring question is how this complex signaling is achieved. Communication theory predicts that multi-component signals could transmit each type of emotion information—i.e., specific categories and broader dimensions—via the same or different facial signal components, with implications for elucidating the system and ontology of facial expression communication [9]. We addressed this question using a communication-systems-based method that agnostically generates facial expressions and uses the receiver’s perceptions to model the specific facial signal components that represent emotion category and dimensional information to them [10-12]. First, we derived the facial expressions that elicit the perception of emotion categories (i.e., the six classic emotions [13] plus 19 complex emotions [3]) and dimensions (i.e., valence and arousal) separately, in 60 individual participants. Comparison of these facial signals showed that they share subsets of components, suggesting that specific latent signals jointly represent—i.e., multiplex—categorical and dimensional information. Further examination revealed these specific latent signals and the joint information they represent. Our results—based on white Western participants, same-ethnicity face stimuli, and commonly used English emotion terms—show that facial expressions can jointly represent specific emotion categories and broad dimensions to perceivers via multiplexed facial signal components. Our results provide insights into the ontology and system of facial expression communication and a new information-theoretic framework that can characterize its complexities.

Item Type: Articles
Status: Published
Refereed: Yes
Glasgow Author(s) Enlighten ID: Garrod, Dr Oliver; Duan, Yaocong; Jack, Professor Rachael; Chen, Dr Chaona; Liu, Meng; Schyns, Professor Philippe; Ince, Dr Robin
Creator Roles:
Liu, M.: Conceptualization, Methodology, Formal analysis, Investigation, Writing – original draft, Writing – review and editing, Visualization
Jack, R. E.: Conceptualization, Methodology, Writing – original draft, Writing – review and editing, Visualization, Supervision, Funding acquisition
Schyns, P. G.: Conceptualization, Methodology, Software, Writing – original draft, Writing – review and editing, Supervision, Funding acquisition
Duan, Y.: Methodology, Formal analysis
Ince, R. A. A.: Methodology, Formal analysis, Writing – review and editing
Garrod, O. G. B.: Methodology, Software, Visualization
Chen, C.: Investigation, Writing – review and editing
Authors: Liu, M., Duan, Y., Ince, R. A. A., Chen, C., Garrod, O. G. B., Schyns, P. G., and Jack, R. E.
College/School: College of Medical, Veterinary and Life Sciences > School of Psychology & Neuroscience
Journal Name: Current Biology
Publisher: Elsevier (Cell Press)
ISSN: 0960-9822
ISSN (Online): 1879-0445
Published Online: 11 November 2021
Copyright Holders: © 2021 Crown Copyright
First Published: First published in Current Biology 32(1): 200-209.e6
Publisher Policy: Reproduced under a Creative Commons License


Project Code | Award No | Project Name | Principal Investigator | Funder's Name | Funder Ref | Lead Dept
310397 | | Equipping artificial agents with psychologically-derived dynamic facial expressions | Chaona Chen | Leverhulme Trust (LEVERHUL) | ECF-2020-401 | Psychology
304240 | | Beyond Pairwise Connectivity: developing an information theoretic hypergraph methodology for multi-modal resting state neuroimaging analysis | Robin Ince | Wellcome Trust (WELLCOTR) | 214120/Z/18/Z | NP - Centre for Cognitive Neuroimaging (CCNi)
304001 | | Computing the Face Syntax of Social Communication | Rachael Jack | European Research Council (ERC) | 759796 | Psychology
190558 | | Mapping the Cultural Landscape of Emotions for Social Interaction | Rachael Jack | Economic and Social Research Council (ESRC) | ES/K001973/1 | Psychology
190552 | | DADIOS (Data-driven Analysis of the Dynamics of Information-acquisition Over time during Social judgement) | Philippe Schyns | Economic and Social Research Council (ESRC) | ES/K00607X/1 | NP - Centre for Cognitive Neuroimaging (CCNi)
172413 | | Brain Algorithmics: Reverse Engineering Dynamic Information Processing Networks from MEG time series | Philippe Schyns | Wellcome Trust (WELLCOTR) | 107802/Z/15/Z | NP - Centre for Cognitive Neuroimaging (CCNi)
172046 | | Visual Commonsense for Scene Understanding | Philippe Schyns | Engineering and Physical Sciences Research Council (EPSRC) | EP/N019261/1 | NP - Centre for Cognitive Neuroimaging (CCNi)