Murali, P. K., Dutta, A., Gentner, M., Burdet, E., Dahiya, R. and Kaboli, M. (2022) Active visuo-tactile interactive robotic perception for accurate object pose estimation in dense clutter. IEEE Robotics and Automation Letters, 7(2), pp. 4686-4693. (doi: 10.1109/LRA.2022.3150045)
Abstract
This work presents a novel active visuo-tactile framework for robotic systems to accurately estimate the pose of objects in densely cluttered environments. The scene representation is derived using a novel declutter graph (DG), which describes the relationships among objects in the scene for decluttering by leveraging semantic segmentation and grasp affordance networks. The graph formulation allows robots to efficiently declutter the workspace by autonomously selecting the next best object to remove and the optimal action (prehensile or non-prehensile) to perform. Furthermore, we propose a novel translation-invariant quaternion filter (TIQF) for active vision and active tactile based pose estimation. Both active visual and active tactile points are selected by maximising the expected information gain. We evaluate our proposed framework on a system with two robots coordinating on randomised scenes of densely cluttered objects, and perform ablation studies with static vision and active vision based estimation, before and after decluttering, as baselines. Our proposed active visuo-tactile interactive perception framework shows up to 36% improvement in pose accuracy compared to the active vision baseline, and the lowest average error (less than 1 cm) compared to other state-of-the-art approaches.
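The abstract's point-selection criterion — choosing the next visual or tactile measurement that maximises expected information gain — can be illustrated with a minimal sketch. The code below is not the paper's TIQF implementation; it assumes a toy discrete belief over a few pose hypotheses and hypothetical binary contact observations, and scores each candidate point by the expected reduction in belief entropy:

```python
import math

def entropy(p):
    """Shannon entropy (nats) of a discrete distribution."""
    return -sum(x * math.log(x) for x in p if x > 0)

def expected_info_gain(belief, contact_lik):
    """Expected entropy reduction from probing a candidate point.

    belief      -- prior over pose hypotheses
    contact_lik -- contact_lik[h] = P(contact | hypothesis h) at this point
    """
    h_prior = entropy(belief)
    eig = 0.0
    # Two possible observations: contact and no-contact.
    for lik in (contact_lik, [1.0 - l for l in contact_lik]):
        joint = [b * l for b, l in zip(belief, lik)]
        p_obs = sum(joint)
        if p_obs > 0:
            posterior = [j / p_obs for j in joint]
            eig += p_obs * (h_prior - entropy(posterior))
    return eig

# Toy example: uniform belief over three pose hypotheses,
# two hypothetical candidate touch points.
belief = [1 / 3, 1 / 3, 1 / 3]
candidates = {
    "A": [0.5, 0.5, 0.5],  # same likelihood under all hypotheses: uninformative
    "B": [0.9, 0.1, 0.1],  # discriminates hypothesis 0 from the others
}
best = max(candidates, key=lambda c: expected_info_gain(belief, candidates[c]))
print(best)  # "B": the discriminative point has higher expected information gain
```

The same greedy argmax structure applies whether the candidates are camera viewpoints or tactile probe locations; only the observation model changes.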
| Item Type: | Articles |
| --- | --- |
| Status: | Published |
| Refereed: | Yes |
| Glasgow Author(s) Enlighten ID: | Dahiya, Professor Ravinder and Murali, Prajval Kumar |
| Authors: | Murali, P. K., Dutta, A., Gentner, M., Burdet, E., Dahiya, R., and Kaboli, M. |
| College/School: | College of Science and Engineering > School of Engineering > Electronics and Nanoscale Engineering |
| Journal Name: | IEEE Robotics and Automation Letters |
| Publisher: | IEEE |
| ISSN: | 2377-3774 |
| ISSN (Online): | 2377-3766 |
| Published Online: | 10 February 2022 |
| Copyright Holders: | Copyright © 2022 The Authors |
| First Published: | First published in IEEE Robotics and Automation Letters 7(2): 4686-4693 |
| Publisher Policy: | Reproduced under a Creative Commons License |