Face mediated human–robot interaction for remote medical examination

Lalitharatne, T. D., Costi, L., Hashem, R., Nisky, I., Jack, R. E., Nanayakkara, T. and Iida, F. (2022) Face mediated human–robot interaction for remote medical examination. Scientific Reports, 12, 12592. (doi: 10.1038/s41598-022-16643-z) (PMID:35869154) (PMCID:PMC9307637)

275827.pdf - Published Version (2MB)
Available under License Creative Commons Attribution.

Abstract

Real-time visual feedback on the consequences of actions is useful for future safety-critical human–robot interaction applications such as remote physical examination of patients. Among the multiple formats available for presenting visual feedback, using the face as feedback for mediating human–robot interaction in remote examination remains understudied. Here we describe a face mediated human–robot interaction approach for remote palpation. It builds upon a robodoctor–robopatient platform where a user can palpate the robopatient to remotely control the robodoctor to diagnose a patient. A tactile sensor array mounted on the end effector of the robodoctor measures the haptic response of the patient under diagnosis and transfers it to the robopatient, which renders pain facial expressions in response to palpation forces. We compare this approach against a direct presentation of the tactile sensor data as a visual tactile map. As feedback, the former has the advantage of recruiting advanced human capabilities to decode expressions on a human face, whereas the latter has the advantage of being able to present details such as the intensity and spatial distribution of palpation. In a user study, we compare these two approaches in a teleoperated palpation task to find a hard nodule embedded in a remote abdominal phantom. We show that the face mediated human–robot interaction approach leads to statistically significant improvements in localizing the hard nodule without compromising the nodule position estimation time. We highlight the inherent power of facial expressions as communicative signals to enhance the utility and effectiveness of human–robot interaction in remote medical examinations.
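The pipeline the abstract describes (palpation force measured by a tactile sensor array, then mapped to a pain facial expression and a nodule location cue) can be sketched as below. This is a minimal illustrative sketch only: the function names, the `max_force` normalization constant, and the linear force-to-intensity mapping are assumptions for illustration, not the authors' implementation.

```python
# Hypothetical sketch of the force-to-expression mapping described above.
# The thresholds and the linear mapping are illustrative assumptions.

def pain_intensity(tactile_map, max_force=10.0):
    """Map a 2D tactile force map (in newtons) to a pain intensity in [0, 1].

    The robopatient could drive the strength of its rendered pain
    expression from this scalar.
    """
    peak = max(max(row) for row in tactile_map)
    return min(peak / max_force, 1.0)

def nodule_estimate(tactile_map):
    """Return the (row, col) of the peak force, a crude nodule location cue.

    A hard nodule under the palpated surface shows up as a local
    stiffness (force) maximum in the sensor array.
    """
    best = (0, 0)
    for r, row in enumerate(tactile_map):
        for c, force in enumerate(row):
            if force > tactile_map[best[0]][best[1]]:
                best = (r, c)
    return best

# Example: a 3x3 tactile reading with a stiff spot at (1, 2)
reading = [[0.5, 0.6, 0.7],
           [0.5, 0.8, 4.0],
           [0.4, 0.5, 0.6]]
print(pain_intensity(reading))   # 0.4
print(nodule_estimate(reading))  # (1, 2)
```

In the study itself this scalar would drive a rendered pain expression on the robopatient's face, while the direct-feedback condition would instead display the full `tactile_map` as a visual heat map.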

Item Type:Articles
Additional Information:This work was supported by the Robopatient project funded by the EPSRC Grants No EP/T00603X/1, and EP/T00519X/1.
Status:Published
Refereed:Yes
Glasgow Author(s) Enlighten ID:Jack, Professor Rachael
Authors: Lalitharatne, T. D., Costi, L., Hashem, R., Nisky, I., Jack, R. E., Nanayakkara, T., and Iida, F.
College/School:College of Medical Veterinary and Life Sciences > School of Psychology & Neuroscience
Journal Name:Scientific Reports
Publisher:Nature Research
ISSN:2045-2322
ISSN (Online):2045-2322
Copyright Holders:Copyright © 2022 The Authors
First Published:First published in Scientific Reports 12: 12592
Publisher Policy:Reproduced under a Creative Commons License
Data DOI:10.17605/OSF.IO/6M9D7
