Continuous perception for deformable objects understanding

Martínez, L., Ruiz-del-Solar, J., Sun, L., Siebert, J. P. and Aragon-Camarasa, G. (2019) Continuous perception for deformable objects understanding. Robotics and Autonomous Systems, 118, pp. 220-230. (doi: 10.1016/j.robot.2019.05.010)

187631.pdf - Accepted Version
Available under License Creative Commons Attribution Non-commercial No Derivatives.



We present a robot vision approach to deformable object classification, with direct application to autonomous service robots. Our approach is based on the assumption that continuous perception provides robots with greater visual competence for interpreting and classifying deformable objects. Our approach classifies the category of clothing items by continuously perceiving the dynamic interactions of the garment's material and shape as it is being picked up. For this, we continuously extract visual features from an RGB-D video sequence and fuse these features by means of the Locality-Constrained Group Sparse Representation (LGSR) algorithm. To evaluate the performance of our approach, we created a fully annotated database featuring 150 garment videos in random configurations. Experiments demonstrate that by continuously observing an object deform, our approach achieves a classification score of 66.7%, outperforming state-of-the-art approaches by a ~27.3% increase.
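The classification step described in the abstract rests on sparse-representation classification over class-grouped dictionaries: a query feature vector is reconstructed from training features of each class, and the class with the smallest reconstruction residual wins. The sketch below illustrates only this minimal-residual core using per-class least squares; it omits the locality and group-sparsity constraints that define LGSR proper, and all function and variable names are hypothetical, not taken from the paper.

```python
import numpy as np

def classify_by_group_residual(query, dictionary, labels):
    """Assign the class whose training-feature columns reconstruct
    the query with the smallest least-squares residual."""
    best_class, best_residual = None, np.inf
    for cls in np.unique(labels):
        D = dictionary[:, labels == cls]                  # atoms of this class
        coeffs, *_ = np.linalg.lstsq(D, query, rcond=None)
        residual = np.linalg.norm(query - D @ coeffs)     # reconstruction error
        if residual < best_residual:
            best_class, best_residual = cls, residual
    return best_class

# Toy example: two classes with well-separated feature directions
rng = np.random.default_rng(0)
D = np.hstack([rng.normal(loc=5.0, size=(8, 4)),    # class 0 atoms
               rng.normal(loc=-5.0, size=(8, 4))])  # class 1 atoms
labels = np.array([0] * 4 + [1] * 4)
query = D[:, 0] + 0.01 * rng.normal(size=8)         # near a class-0 atom
predicted = classify_by_group_residual(query, D, labels)
```

In the paper's setting, features from successive video frames would be fused before this step, so the decision reflects the garment's dynamics over the whole pick-up sequence rather than a single view.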

Item Type:Articles
Additional Information:LM was funded in this work by CONICYT-PCHA/Doctorado Nacional/2014-21140280. JRS was funded in this work by CONICYT-FONDECYT project 1161500. GAC thanks NVIDIA Corporation for the donation of a Titan Xp GPU.
Glasgow Author(s) Enlighten ID:Siebert, Dr Paul and Sun, Mr Li and Martinez, Luz and Aragon Camarasa, Dr Gerardo
Authors: Martínez, L., Ruiz-del-Solar, J., Sun, L., Siebert, J. P., and Aragon-Camarasa, G.
College/School:College of Science and Engineering > School of Computing Science
Journal Name:Robotics and Autonomous Systems
ISSN (Online):1872-793X
Published Online:30 May 2019
Copyright Holders:Copyright © 2019 Elsevier B.V.
First Published:First published in Robotics and Autonomous Systems 118: 220-230
Publisher Policy:Reproduced in accordance with the publisher copyright policy
