Modulating the Non-Verbal Social Signals of a Humanoid Robot

Deshmukh, A., Craenen, B., Vinciarelli, A. and Foster, M. E. (2017) Modulating the Non-Verbal Social Signals of a Humanoid Robot. In: 19th ACM International Conference on Multimodal Interaction (ICMI 2017), Glasgow, Scotland, 13-17 Nov 2017, pp. 508-509. ISBN 9781450355438 (doi: 10.1145/3136755.3143028)

149723.pdf - Accepted Version



In this demonstration we present a repertoire of social signals generated by the humanoid robot Pepper in the context of the EU-funded project MuMMER. The aim of this research is to provide the robot with the expressive capabilities required to interact with people in real-world public spaces such as shopping malls; being able to control the non-verbal behaviour of such a robot is key to engaging with humans effectively. We propose an approach to modulating the non-verbal social signals of the robot based on systematically varying the amplitude and speed of the joint motions and gathering user evaluations of the resulting gestures. We anticipate that humans' perception of the robot's behaviour will be influenced by these modulations.
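The core idea of varying gesture amplitude and speed can be sketched as a transformation of a joint-angle trajectory. The sketch below is illustrative only: the function name, parameters, and the choice to scale amplitude about a rest pose are assumptions, not the implementation described in the paper.

```python
def modulate_gesture(angles, times, amplitude=1.0, speed=1.0, rest_angle=0.0):
    """Modulate a single joint's keyframe trajectory.

    angles     : joint-angle keyframes (radians)
    times      : keyframe timestamps (seconds)
    amplitude  : scales how far the joint moves from its rest pose
                 (>1.0 exaggerates the gesture, <1.0 dampens it)
    speed      : scales playback rate (>1.0 is faster, <1.0 slower)
    rest_angle : reference pose about which amplitude is scaled
                 (an illustrative assumption, not from the paper)
    """
    scaled_angles = [rest_angle + amplitude * (a - rest_angle) for a in angles]
    scaled_times = [t / speed for t in times]
    return scaled_angles, scaled_times

# Example: dampen a wave gesture to half amplitude and double its speed.
angles, times = modulate_gesture([0.0, 0.8, -0.8, 0.0],
                                 [0.5, 1.0, 1.5, 2.0],
                                 amplitude=0.5, speed=2.0)
# angles -> [0.0, 0.4, -0.4, 0.0], times -> [0.25, 0.5, 0.75, 1.0]
```

On a real Pepper, the resulting keyframes and timestamps would be played back through the robot's motion API; systematically sweeping the two factors yields the gesture variants that participants then evaluate.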

Item Type: Conference Proceedings
Glasgow Author(s) Enlighten ID: Foster, Dr Mary Ellen and Craenen, Dr Bart and Vinciarelli, Professor Alessandro and Deshmukh, Dr Amol
Authors: Deshmukh, A., Craenen, B., Vinciarelli, A., and Foster, M. E.
College/School: College of Science and Engineering > School of Computing Science
Copyright Holders: Copyright © 2017 Association for Computing Machinery
First Published: First published in Proceedings of the 19th ACM International Conference on Multimodal Interaction (ICMI 2017): 508-509
Publisher Policy: Reproduced in accordance with the publisher copyright policy


Project Code: 701651
Award No:
Project Name: MUMMER
Principal Investigator: Mary Ellen Foster
Funder's Name: European Commission (EC)
Funder Ref: 688147
Lead Dept: School of Computing Science