A benchmark of dynamic versus static methods for facial action unit detection

Alharbawee, L. and Pugeault, N. (2021) A benchmark of dynamic versus static methods for facial action unit detection. Journal of Engineering, 2021(5), pp. 252-266. (doi: 10.1049/tje2.12001)

223574.pdf - Published Version (3MB)
Available under a Creative Commons Attribution License.

Abstract

Facial Action Units (AUs) are activations of local facial muscle groups that unfold over time to constitute a natural facial expression event; detecting AU occurrence can therefore be framed as tracking the temporally consecutive, evolving movements of these regions. Automatic AU detection offers clear benefits because it can exploit both static and dynamic facial features. Our work makes three contributions. First, we extract features using Local Binary Patterns (LBP), Local Phase Quantisation (LPQ), and the dynamic texture descriptor LPQ-TOP, together with two network models leveraged from different CNN architectures, for local deep visual learning in AU image analysis. Second, we cascade the LPQ-TOP feature vector with a Long Short-Term Memory (LSTM) network to encode longer-term temporal information; we further show the importance of stacking an LSTM on top of a CNN so that spatial and temporal cues are learned jointly, and we hypothesise that unsupervised Slow Feature Analysis can extract invariant information from dynamic textures. Third, we compare continuous scoring predictions between LPQ-TOP with SVM, LPQ-TOP with LSTM, and AlexNet. A thorough comparative performance evaluation was carried out on the Extended Cohn-Kanade (CK+) dataset. Overall, the results indicate that the CNN approach is very promising and surpassed all other methods.
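To make the static pipeline concrete, the following is a minimal sketch, not the authors' code, of one static baseline of the kind described above: uniform Local Binary Pattern histograms scored by a linear SVM. The synthetic face crops, the binary AU labels, and the LBP parameters (8 neighbours, radius 1) are illustrative assumptions only.

```python
# Minimal sketch of a static LBP + SVM baseline (illustrative assumptions;
# not the authors' settings or data).
import numpy as np
from skimage.feature import local_binary_pattern
from sklearn.svm import LinearSVC

def lbp_histogram(gray_face, points=8, radius=1):
    """Uniform LBP codes pooled into a normalised histogram feature."""
    codes = local_binary_pattern(gray_face, points, radius, method="uniform")
    hist, _ = np.histogram(codes, bins=points + 2, range=(0, points + 2))
    return hist / (hist.sum() + 1e-8)

# Synthetic stand-ins for aligned grayscale face crops and binary AU labels.
rng = np.random.default_rng(0)
face_frames = rng.integers(0, 256, size=(100, 64, 64)).astype(np.uint8)
au_labels = rng.integers(0, 2, size=100)

features = np.stack([lbp_histogram(f) for f in face_frames])
clf = LinearSVC(C=1.0).fit(features, au_labels)
print(clf.decision_function(features[:3]))  # continuous per-frame AU scores
```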
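The dynamic route, stacking an LSTM on top of per-frame CNN features, could look like the PyTorch sketch below. The encoder layout, feature sizes, clip length, and number of AUs are placeholder assumptions rather than the architecture evaluated in the paper; the sigmoid head yields one continuous activation score per AU.

```python
# Sketch of stacking an LSTM on per-frame CNN features for AU scoring
# (sizes and layer choices are assumptions, not the paper's architecture).
import torch
import torch.nn as nn

class CnnLstmAuDetector(nn.Module):
    def __init__(self, num_aus=12, feat_dim=128, hidden=64):
        super().__init__()
        # Per-frame convolutional encoder (stand-in for the CNN features).
        self.encoder = nn.Sequential(
            nn.Conv2d(1, 16, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(16, 32, 3, stride=2, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),
            nn.Linear(32, feat_dim), nn.ReLU(),
        )
        # LSTM stacked on top to encode longer-term temporal information.
        self.lstm = nn.LSTM(feat_dim, hidden, batch_first=True)
        self.head = nn.Linear(hidden, num_aus)

    def forward(self, clips):  # clips: (batch, time, 1, H, W)
        b, t = clips.shape[:2]
        feats = self.encoder(clips.flatten(0, 1)).view(b, t, -1)
        out, _ = self.lstm(feats)
        return torch.sigmoid(self.head(out[:, -1]))  # per-AU activation scores

# Toy forward pass: 2 clips of 16 grayscale 64x64 frames.
scores = CnnLstmAuDetector()(torch.randn(2, 16, 1, 64, 64))
print(scores.shape)  # torch.Size([2, 12])
```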

Item Type: Articles
Status: Published
Refereed: Yes
Glasgow Author(s) Enlighten ID: Pugeault, Dr Nicolas
Authors: Alharbawee, L., and Pugeault, N.
College/School: College of Science and Engineering > School of Computing Science
Journal Name: Journal of Engineering
Publisher: Institution of Engineering and Technology
ISSN (Online): 2051-3305
Published Online: 20 April 2021
Copyright Holders: Copyright © 2021 The Authors
First Published: First published in Journal of Engineering 2021(5): 252-266
Publisher Policy: Reproduced under a Creative Commons License
