Realistic facial animation generation based on facial expression mapping

Yu, H., Garrod, O., Jack, R. and Schyns, P. (2014) Realistic facial animation generation based on facial expression mapping. In: Fifth International Conference on Graphic and Image Processing (ICGIP 2013), Hong Kong, China, 26-27 Oct 2013, pp. 906903-1. (doi: 10.1117/12.2049921)



Facial expressions reflect a character's internal emotional states or arise in response to social communication. Although much effort has been devoted to generating realistic facial expressions, the problem remains challenging because humans are highly sensitive to subtle facial movements. In this paper, we present a method for facial animation generation that reproduces true facial muscle movements with high fidelity. An intermediate model space is introduced to transfer captured static Action Unit (AU) peak frames, defined according to the Facial Action Coding System (FACS), to the conformed target face. Dynamic parameters derived using a psychophysics method are then integrated to generate facial animation, which is assumed to represent the natural correlation of multiple AUs. Finally, the animation sequence in the intermediate model space is mapped to the target face to produce the final animation. © (2014) COPYRIGHT Society of Photo-Optical Instrumentation Engineers (SPIE).
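The pipeline the abstract describes — static AU peak displacements combined with per-AU dynamic activation curves over time — can be sketched as a simple blendshape-style mix. This is a minimal illustration, not the authors' implementation: the function name `animate_face`, the dictionary layout, and the linear blending of AU displacements are all assumptions for the sketch.

```python
import numpy as np

def animate_face(neutral, au_displacements, au_curves):
    """Blend per-AU peak displacements over time (hypothetical sketch).

    neutral          : (V, 3) array of neutral-face vertex positions
    au_displacements : dict mapping AU name -> (V, 3) peak displacement
                       (captured AU peak frame minus the neutral frame)
    au_curves        : dict mapping AU name -> length-T activation weights
                       in [0, 1] per animation frame (the "dynamic parameters")
    Returns a (T, V, 3) array of animated vertex positions.
    """
    T = len(next(iter(au_curves.values())))
    # Start every frame from the neutral face.
    frames = np.tile(neutral, (T, 1, 1))
    # Add each AU's displacement, scaled by its time-varying weight.
    for au, disp in au_displacements.items():
        w = np.asarray(au_curves[au])       # shape (T,)
        frames += w[:, None, None] * disp   # broadcast over vertices
    return frames
```

In this simplified view, the "intermediate model space" would correspond to the common vertex topology in which `neutral` and each AU displacement are expressed; the final mapping to an arbitrary target face (mesh conformation and retargeting) is omitted here.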

Item Type: Conference Proceedings
Glasgow Author(s) Enlighten ID: Garrod, Dr Oliver and Yu, Mr Hui and Jack, Professor Rachael and Schyns, Professor Philippe
Authors: Yu, H., Garrod, O., Jack, R., and Schyns, P.
College/School: College of Medical Veterinary and Life Sciences > School of Psychology & Neuroscience
College of Science and Engineering > School of Psychology
