Continuous Perception for Classifying Shapes and Weights of Garments for Robotic Vision Applications

Duan, L. and Aragon-Camarasa, G. (2022) Continuous Perception for Classifying Shapes and Weights of Garments for Robotic Vision Applications. In: 17th International Joint Conference on Computer Vision, Imaging and Computer Graphics Theory and Applications - Volume 4: VISAPP, 6-8 Feb 2022, pp. 348-355. ISBN 9789897585555 (doi: 10.5220/0010804300003124)

We present an approach to continuous perception for robotic laundry tasks. Our assumption is that visually predicting a garment's shape and weight is possible via a neural network that learns the dynamic changes of garments from video sequences. Continuous perception is leveraged during training by inputting consecutive frames, from which the network learns how a garment deforms. To evaluate our hypothesis, we captured a dataset of 40K RGB and depth video sequences while a garment is being manipulated. We also conducted ablation studies to understand whether the neural network learns the physical properties of garments. Our findings suggest that a modified AlexNet-LSTM architecture has the best classification performance for garment shapes and discretised weights. To further provide evidence for continuous perception, we evaluated our network on unseen video sequences and computed a 'Moving Average' over the sequence of predictions. We found that our network has a classification accuracy of 48% for garment shapes and 60% for garment weights.
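The 'Moving Average' evaluation described above can be sketched as averaging per-frame class-probability vectors over a sliding window before taking the argmax, so that a single noisy frame does not flip the prediction. The following is a minimal illustrative sketch; the function name, window size, class count, and probability values are assumptions for illustration, not taken from the paper:

```python
# Hypothetical sketch of moving-average smoothing over per-frame
# class predictions (e.g. garment shape classes). All names and
# numbers here are illustrative assumptions.
from collections import deque

def moving_average_predictions(frame_probs, window=5):
    """Average probability vectors over a sliding window and return
    the smoothed class index for each frame."""
    buf = deque(maxlen=window)
    smoothed = []
    for probs in frame_probs:
        buf.append(probs)
        # Element-wise mean over the frames currently in the window
        avg = [sum(p[i] for p in buf) / len(buf) for i in range(len(probs))]
        smoothed.append(max(range(len(avg)), key=avg.__getitem__))
    return smoothed

# Example: noisy per-frame probabilities over 3 hypothetical classes;
# frame 2 alone favours class 1, but the window outvotes it.
preds = [
    [0.6, 0.3, 0.1],
    [0.4, 0.5, 0.1],  # a single noisy frame
    [0.5, 0.4, 0.1],
    [0.7, 0.2, 0.1],
    [0.6, 0.3, 0.1],
]
print(moving_average_predictions(preds, window=3))  # → [0, 0, 0, 0, 0]
```

With `window=1` the same input yields the raw per-frame argmax `[0, 1, 0, 0, 0]`, showing how the smoothing stabilises the prediction over a sequence.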

Item Type: Conference Proceedings
Glasgow Author(s) Enlighten ID: Duan, Mr Li and Aragon Camarasa, Dr Gerardo
Authors: Duan, L., and Aragon-Camarasa, G.
College/School: College of Science and Engineering > School of Computing Science
Copyright Holders: Copyright © 2022 by SCITEPRESS – Science and Technology Publications
Publisher Policy: Reproduced under a Creative Commons licence
