Coupled real-synthetic domain adaptation for real-world deep depth enhancement

Gu, X., Guo, Y., Deligianni, F. and Yang, G.-Z. (2020) Coupled real-synthetic domain adaptation for real-world deep depth enhancement. IEEE Transactions on Image Processing, 29, pp. 6343-6356. (doi: 10.1109/TIP.2020.2988574)




Advances in depth sensing technologies have allowed simultaneous acquisition of both color and depth data under different environments. However, most depth sensors have lower resolution than the associated color channels, and such a mismatch can affect applications that require accurate depth recovery. Existing depth enhancement methods use simplistic noise models and cannot generalize well under real-world conditions. In this paper, a coupled real-synthetic domain adaptation method is proposed, which enables domain transfer between high-quality depth simulators and real depth camera information for super-resolution depth recovery. The method first learns to apply realistic degradation to synthetic depth images, and then enhances the degraded depth data to high quality with a color-guided sub-network. The key advantage of the work is that it generalizes well to real-world datasets without further training or fine-tuning. Detailed quantitative and qualitative results are presented, demonstrating that the proposed method achieves improved performance compared to previous methods fine-tuned on the specific datasets.
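The "color-guided" enhancement idea in the abstract — using the high-resolution color channel to steer recovery of the low-resolution depth map — has a well-known classical analogue in joint bilateral upsampling. The sketch below illustrates that analogue only; it is not the paper's learned sub-network, and all function and parameter names here are illustrative.

```python
import numpy as np

def joint_bilateral_upsample(depth_lr, color_hr, scale,
                             radius=2, sigma_s=1.5, sigma_r=0.1):
    """Upsample a low-res depth map using a high-res color guide.

    Classical joint bilateral upsampling: a hand-crafted stand-in for
    the paper's learned color-guided sub-network, shown for intuition.
    depth_lr : (h, w) float array, low-resolution depth
    color_hr : (H, W, 3) float array in [0, 1], high-resolution guide
    scale    : integer upsampling factor (H = h * scale, W = w * scale)
    """
    H, W = color_hr.shape[:2]
    # Start from a nearest-neighbour upsample of the coarse depth.
    depth_nn = np.repeat(np.repeat(depth_lr, scale, axis=0),
                         scale, axis=1)[:H, :W]
    pad = radius
    d = np.pad(depth_nn, pad, mode='edge')
    c = np.pad(color_hr, ((pad, pad), (pad, pad), (0, 0)), mode='edge')
    out = np.zeros((H, W))
    for y in range(H):
        for x in range(W):
            c0 = c[y + pad, x + pad]
            acc, wsum = 0.0, 0.0
            for dy in range(-radius, radius + 1):
                for dx in range(-radius, radius + 1):
                    # Spatial weight: prefer nearby samples.
                    ws = np.exp(-(dx * dx + dy * dy) / (2 * sigma_s ** 2))
                    # Range weight: prefer samples with similar color,
                    # so depth is not smoothed across color edges.
                    dc = c[y + pad + dy, x + pad + dx] - c0
                    wr = np.exp(-float(np.dot(dc, dc)) / (2 * sigma_r ** 2))
                    w = ws * wr
                    acc += w * d[y + pad + dy, x + pad + dx]
                    wsum += w
            out[y, x] = acc / wsum
    return out
```

On a scene with a depth discontinuity that coincides with a color edge, the range weight suppresses averaging across the edge, so the upsampled depth stays sharp where the guide image is sharp — the same intuition that motivates guiding a learned enhancement network with the color channel.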

Item Type: Articles
Additional Information: This work was supported by the Engineering and Physical Sciences Research Council (EPSRC) under Grant EP/R026092/1.
Glasgow Author(s) Enlighten ID: Deligianni, Dr Fani
Authors: Gu, X., Guo, Y., Deligianni, F., and Yang, G.-Z.
College/School: College of Science and Engineering > School of Computing Science
Journal Name: IEEE Transactions on Image Processing
ISSN (Online): 1941-0042
Published Online: 23 April 2020
Copyright Holders: Copyright © 2020 IEEE
First Published: First published in IEEE Transactions on Image Processing 29: 6343-6356
Publisher Policy: Reproduced in accordance with the publisher copyright policy
