Towards Automatic Analysis of Gestures and Body Expressions in Depression

Mahmoud, M. and Robinson, P. (2016) Towards Automatic Analysis of Gestures and Body Expressions in Depression. In: 10th EAI International Conference on Pervasive Computing Technologies for Healthcare (PervasiveHealth '16), Cancun, Mexico, 16-19 May 2016, pp. 276-277. ISBN 9781631900518


Depression is a common mental disorder and one of the main causes of disease burden worldwide. Several studies of depression address the relation between non-verbal cues and different levels of depression. Manual coding of non-verbal cues is the common practice in such studies, but it is time-consuming and subjective. Recent research has looked into automatic detection of cues associated with depression; however, most of this work has focused on facial cues such as facial expressions, gaze and head pose. Few studies have examined multimodal features for the analysis of depression, and those mainly combine facial movements, head movements and vocal prosody. Body gestures remain an under-studied modality in this field. We propose to investigate the assessment of depression using automatic detection of non-verbal body-gesture signals. Moreover, we propose multimodal fusion of features to incorporate the body as well as the face and head for better inference of depression level. Automatic analysis of such body cues can serve as a tool for experimental psychologists. It can also assist physicians in diagnosis by providing quantitative measures during or after face-to-face or telemedicine sessions, or even in systems such as a virtual coach.
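The multimodal fusion the abstract proposes could, in its simplest feature-level (early-fusion) form, look like the sketch below: per-modality descriptors for face, head and body are concatenated into one vector before classification. All function names, feature choices and values here are illustrative assumptions, not the authors' implementation.

```python
# Hypothetical sketch of feature-level (early) fusion, assuming each
# modality has already been summarised into a fixed-length descriptor.
# The descriptor contents below are invented examples, not real features.

def fuse_features(face, head, body):
    """Concatenate per-modality feature vectors into one fused vector
    that a downstream classifier or regressor would consume."""
    return list(face) + list(head) + list(body)

# Toy per-modality descriptors (purely illustrative values):
face_feats = [0.4, 0.1]   # e.g. mean facial-expression intensities
head_feats = [0.7]        # e.g. proportion of time with downward head pose
body_feats = [0.2, 0.9]   # e.g. self-touch rate, fidgeting score

fused = fuse_features(face_feats, head_feats, body_feats)
print(len(fused))  # one combined 5-dimensional descriptor
```

An alternative design choice would be late fusion, where each modality gets its own classifier and the per-modality scores are combined; early fusion is shown here only because it is the shortest to sketch.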

Item Type: Conference Proceedings
Glasgow Author(s) Enlighten ID: Mahmoud, Dr Marwa
Authors: Mahmoud, M., and Robinson, P.
College/School:College of Science and Engineering > School of Computing Science
