Ge, X., Jose, J. M., Wang, P., Iyer, A., Liu, X. and Han, H. (2023) ALGRNet: Multi-relational adaptive facial action unit modelling for face representation and relevant recognitions. IEEE Transactions on Biometrics, Behavior, and Identity Science, 5(4), pp. 566-578. (doi: 10.1109/TBIOM.2023.3306810)
Abstract
Facial action units (AUs) represent the fundamental activities of groups of muscles, exhibiting subtle changes that are useful for various face analysis tasks. One practical real-life application is the automatic estimation of facial paralysis, which involves analyzing delicate changes in facial muscle regions and skin textures. It is natural to assess the severity of facial paralysis by symmetrically combining well-defined muscle regions (similar to AUs), thus creating a comprehensive facial representation. To this end, we have developed a new model that automatically estimates the severity of facial paralysis, inspired by facial action unit (FAU) recognition, which deals with rich, detailed facial appearance information such as texture and muscle status. Specifically, a novel Adaptive Local-Global Relational Network (ALGRNet) is designed to adaptively mine the context of well-defined facial muscles and enhance the visual details of facial appearance and texture; it can be flexibly adapted to face-based tasks, e.g., FAU recognition and facial paralysis estimation. ALGRNet consists of three key structures: (i) an adaptive region learning module that identifies high-potential muscle response regions, (ii) a skip-BiLSTM that models the latent relationships among local regions, enabling better correlation between multiple regional lesion muscles and texture changes, and (iii) a feature fusion-and-refining module that explores the complementarity between the local and global aspects of the face. We have extensively evaluated ALGRNet on two widely recognized AU benchmarks, BP4D and DISFA, to demonstrate its effectiveness. Furthermore, to assess the efficacy of FAUs in downstream applications, we have investigated their use in the identification of facial paralysis.
Experimental findings obtained from a facial paralysis benchmark, meticulously gathered and annotated by medical experts, underscore the potential of utilizing identified AU attributes to estimate the severity of facial paralysis.
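The symmetric-assessment idea in the abstract, comparing mirrored muscle regions to gauge paralysis severity, can be illustrated with a toy sketch. This is not the paper's method: the region coordinates, the mean-intensity feature, and the scoring rule below are invented purely for illustration, whereas ALGRNet learns region locations and rich texture features end-to-end.

```python
import numpy as np

def region_feature(face, box):
    """Mean intensity of a rectangular muscle region (toy feature;
    ALGRNet learns far richer muscle/texture descriptors)."""
    y0, y1, x0, x1 = box
    return face[y0:y1, x0:x1].mean()

def asymmetry_score(face, left_boxes, right_boxes):
    """Average absolute left/right feature difference over paired regions.
    A higher score means more facial asymmetry, a crude proxy for
    paralysis severity under this toy formulation."""
    diffs = [abs(region_feature(face, l) - region_feature(face, r))
             for l, r in zip(left_boxes, right_boxes)]
    return float(np.mean(diffs))

# A perfectly symmetric "face" yields a score of 0.
face = np.ones((8, 8))
left_regions = [(2, 4, 0, 4)]   # mirrored region pairs (hypothetical)
right_regions = [(2, 4, 4, 8)]
print(asymmetry_score(face, left_regions, right_regions))  # 0.0
```

A face with one side darkened (mimicking reduced muscle response) would score higher, which is the intuition behind combining AU-like regions symmetrically.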
Item Type: Articles
Additional Information: This research was supported in part by the National Key R&D Program of China (grant 2018AAA0102501) and the Natural Science Foundation of China (grant 62176249). Xuri Ge's research was supported in part by the China Scholarship Council (CSC), Ministry of Education of China (No. 202006310028).
Status: Published
Refereed: Yes
Glasgow Author(s) Enlighten ID: Jose, Professor Joemon; Ge, Xuri; Iyer, Mr Arunachalam
Authors: Ge, X., Jose, J. M., Wang, P., Iyer, A., Liu, X. and Han, H.
College/School: College of Medical Veterinary and Life Sciences > School of Medicine, Dentistry & Nursing; College of Science and Engineering > School of Computing Science
Journal Name: IEEE Transactions on Biometrics, Behavior, and Identity Science
Publisher: IEEE
ISSN: 2637-6407
ISSN (Online): 2637-6407
Published Online: 21 August 2023
Copyright Holders: Copyright © 2023 IEEE
First Published: First published in IEEE Transactions on Biometrics, Behavior, and Identity Science 5(4):566-578
Publisher Policy: Reproduced in accordance with the publisher copyright policy