Federated deep learning with prototype matching for object extraction from very-high-resolution remote sensing images

Zhang, X., Zhang, B., Yu, W. and Kang, X. (2023) Federated deep learning with prototype matching for object extraction from very-high-resolution remote sensing images. IEEE Transactions on Geoscience and Remote Sensing, 61, 5603316. (doi: 10.1109/TGRS.2023.3244136)

Full text not currently available from Enlighten.

Abstract

Deep convolutional neural networks (DCNNs) have become the leading tools for object extraction from very-high-resolution (VHR) remote sensing images. However, the label scarcity of local datasets hinders the prediction performance of DCNNs, and privacy concerns over remote sensing data often arise in traditional deep learning schemes. To cope with these problems, we propose a novel federated learning scheme with prototype matching (FedPM) that collaboratively learns a richer DCNN model by leveraging remote sensing data distributed among multiple clients. The scheme performs federated optimization of DCNNs by aggregating clients' knowledge in the gradient space without compromising data privacy. Specifically, a prototype matching method is developed to regularize local training with prototypical representations while reducing the distribution divergence across heterogeneous image data. Furthermore, the deviations between local and global prototypes are used to quantify each local model's effect on the decision boundary and to optimize the global model update via an attention-weighted aggregation scheme. Finally, the sparse ternary compression (STC) method is used to reduce communication costs. Extensive experimental results on VHR aerial and satellite image datasets verify that FedPM dramatically improves the prediction performance of DCNNs on object extraction at lower communication cost. To the best of our knowledge, this is the first time that federated learning has been applied to remote sensing visual tasks.
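The full text is not available from this record, so the paper's actual implementation cannot be reproduced here. The NumPy sketch below only illustrates the three ingredients the abstract names: per-class prototypes with a prototype-matching regularizer, attention-weighted aggregation driven by local-to-global prototype deviation, and sparse ternary compression. All function names, the softmax weighting, and the L2 distance choice are assumptions for illustration, not the authors' method.

```python
import numpy as np

def class_prototypes(features, labels, num_classes):
    """Mean feature vector (prototype) per class; zeros for absent classes."""
    protos = np.zeros((num_classes, features.shape[1]))
    for c in range(num_classes):
        mask = labels == c
        if mask.any():
            protos[c] = features[mask].mean(axis=0)
    return protos

def prototype_matching_loss(features, labels, global_protos):
    """Illustrative regularizer: mean squared L2 distance between each
    sample's feature and the global prototype of its class, pulling
    heterogeneous clients toward a shared representation."""
    diffs = features - global_protos[labels]
    return float(np.mean(np.sum(diffs ** 2, axis=1)))

def attention_weights(local_protos_list, global_protos):
    """Weight each client by its prototype deviation from the global
    prototypes: larger deviation -> smaller weight (softmax over -distance).
    The softmax form is an assumption, not taken from the paper."""
    devs = np.array([np.linalg.norm(lp - global_protos)
                     for lp in local_protos_list])
    scores = -devs
    exp = np.exp(scores - scores.max())  # shift for numerical stability
    return exp / exp.sum()

def aggregate(models, weights):
    """Attention-weighted averaging of client parameter arrays."""
    return sum(w * m for w, m in zip(weights, models))

def sparse_ternary_compress(grad, frac=0.01):
    """Sparse ternary compression (after Sattler et al.): keep the top-frac
    entries by magnitude and replace each with +/- the mean magnitude of
    the kept entries; everything else becomes zero."""
    flat = grad.ravel()
    k = max(1, int(frac * flat.size))
    idx = np.argpartition(np.abs(flat), -k)[-k:]  # indices of top-k |g|
    mu = np.abs(flat[idx]).mean()
    out = np.zeros_like(flat)
    out[idx] = mu * np.sign(flat[idx])
    return out.reshape(grad.shape)
```

In a round of this sketch, each client would train with its task loss plus `prototype_matching_loss`, send compressed updates and local prototypes to the server, and the server would combine them with `aggregate(updates, attention_weights(...))`.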

Item Type: Articles
Additional Information: This work was supported in part by the National Key Research and Development Program of China under Grant 2021YFA0715203, in part by the Major Program of the National Natural Science Foundation of China under Grant 61890962, and in part by the National Natural Science Foundation of China under Grant 62221002, Grant 61890962, Grant 62201207, and Grant 62101183.
Status: Published
Refereed: Yes
Glasgow Author(s) Enlighten ID: ZHANG, Boning
Authors: Zhang, X., Zhang, B., Yu, W., and Kang, X.
College/School: College of Science and Engineering > School of Computing Science
Journal Name: IEEE Transactions on Geoscience and Remote Sensing
Publisher: IEEE
ISSN: 0196-2892
ISSN (Online): 1558-0644
Published Online: 10 February 2023
