Privacy-Preserving Visual Content Tagging using Graph Transformer Networks

Vu, X.-S., Le, D.-T., Edlund, C., Jiang, L. and Nguyen, H. D. (2020) Privacy-Preserving Visual Content Tagging using Graph Transformer Networks. In: 28th ACM International Conference on Multimedia, Seattle, WA, USA, 12-16 Oct 2020, pp. 2299-2307. ISBN 9781450379885 (doi: 10.1145/3394171.3414047)

Abstract

With the rapid growth of Internet media, content tagging has become an important topic with many multimedia understanding applications, including efficient organisation and search. Nevertheless, existing visual tagging approaches are susceptible to inherent privacy risks in which private information may be exposed unintentionally. The use of anonymisation and privacy-protection methods is desirable, but it comes at the expense of task performance. Therefore, this paper proposes an end-to-end framework (SGTN) using Graph Transformer and Convolutional Networks to significantly improve both classification and privacy preservation of visual data. In particular, we employ several mechanisms, such as differential-privacy-based graph construction and noise-induced graph transformation, to protect the privacy of knowledge graphs. Our approach sets a new state of the art on the MS-COCO dataset in various semi-supervised settings. In addition, we showcase a real-world experiment in the education domain to address the automation of sensitive document tagging. Experimental results show that our approach achieves an excellent balance of model accuracy and privacy preservation on both public and private datasets.
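
The record above gives no implementation details, so the following is only a minimal sketch of one mechanism the abstract names, differential-privacy-based graph construction. It assumes the knowledge graph is a label co-occurrence matrix perturbed with the Laplace mechanism; the choice of graph, the noise distribution, and the names dp_label_cooccurrence, epsilon, and sensitivity are illustrative assumptions, not details taken from the paper.

    import numpy as np

    def dp_label_cooccurrence(labels, epsilon=1.0, sensitivity=1.0, seed=0):
        """Build a label co-occurrence graph and perturb it with Laplace noise.

        `labels` is an (n_samples, n_labels) binary tag matrix. Adding
        Laplace(sensitivity / epsilon) noise to each count is the standard
        Laplace mechanism for epsilon-differential privacy; whether SGTN uses
        exactly this mechanism is an assumption, not stated in the abstract.
        """
        counts = labels.T @ labels                      # exact co-occurrence counts
        rng = np.random.default_rng(seed)
        noisy = counts + rng.laplace(0.0, sensitivity / epsilon, counts.shape)
        noisy = np.clip(noisy, 0.0, None)               # counts cannot be negative
        # Row-normalise into an adjacency matrix usable by a graph network.
        row_sums = noisy.sum(axis=1, keepdims=True)
        return np.divide(noisy, row_sums, out=np.zeros_like(noisy), where=row_sums > 0)

    # Toy usage: 5 items tagged with 3 labels.
    tags = np.array([[1, 0, 1],
                     [1, 1, 0],
                     [0, 1, 1],
                     [1, 0, 0],
                     [0, 1, 1]], dtype=float)
    adj = dp_label_cooccurrence(tags, epsilon=0.5)
    print(adj)  # noisy adjacency released in place of the exact co-occurrence graph

A downstream graph transformer or convolutional network would then consume the noisy adjacency instead of the exact graph, which is how a noise-induced graph transformation can trade a small amount of accuracy for privacy, in the spirit of what the abstract describes.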

Item Type: Conference Proceedings
Status: Published
Refereed: Yes
Glasgow Author(s) Enlighten ID: Nguyen, Dr Hoang D.
Authors: Vu, X.-S., Le, D.-T., Edlund, C., Jiang, L., and Nguyen, H. D.
College/School: College of Science and Engineering > School of Computing Science
Journal Name: Proceedings of the 28th ACM International Conference on Multimedia
ISBN: 9781450379885
