Shirian, A., Somandepalli, K., Sanchez, V. and Guha, T. (2022) Visually-Aware Acoustic Event Detection Using Heterogeneous Graphs. In: INTERSPEECH 2022, Incheon, South Korea, 18-22 Sep 2022, pp. 2428-2432. (doi: 10.21437/Interspeech.2022-10670)
Abstract
Perception of auditory events is inherently multimodal, relying on both audio and visual cues. Many existing multimodal approaches process each modality with a modality-specific model and then fuse the embeddings to encode the joint information. In contrast, we employ heterogeneous graphs to explicitly capture the spatial and temporal relationships between the modalities and represent detailed information about the underlying signal, providing a compact, efficient and scalable way to represent data in the form of graphs. We use heterogeneous graph approaches to address the task of visually-aware acoustic event classification, showing that they efficiently model intra- and inter-modality relationships at both spatial and temporal scales. Our model can easily be adapted to different scales of events through relevant hyperparameters. Experiments on AudioSet, a large benchmark, show that our model achieves state-of-the-art performance.
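The abstract describes a heterogeneous graph with typed nodes per modality, linked by intra-modality (temporal) and inter-modality edges. As a minimal illustrative sketch only — the node types, edge types, and construction below are assumptions for exposition, not the paper's actual implementation — such a graph could be built like this:

```python
# Hypothetical sketch: a heterogeneous audio-visual graph with one audio
# node and one visual node per time segment. The names "audio", "visual",
# "temporal" and "cross" are illustrative, not from the paper.

def build_av_graph(num_steps):
    """Return typed nodes and typed edges for a clip of num_steps segments.

    Edge types:
      ("audio", "temporal", "audio"):   consecutive audio segments
      ("visual", "temporal", "visual"): consecutive visual frames
      ("audio", "cross", "visual"):     audio-visual pairs at the same step
    """
    nodes = {
        "audio": list(range(num_steps)),
        "visual": list(range(num_steps)),
    }
    edges = {
        # intra-modality temporal edges
        ("audio", "temporal", "audio"):
            [(t, t + 1) for t in range(num_steps - 1)],
        ("visual", "temporal", "visual"):
            [(t, t + 1) for t in range(num_steps - 1)],
        # inter-modality edges aligning the two streams in time
        ("audio", "cross", "visual"):
            [(t, t) for t in range(num_steps)],
    }
    return nodes, edges

nodes, edges = build_av_graph(4)
```

A heterogeneous graph network would then apply relation-specific message passing over each edge type; varying the temporal window used for the intra-modality edges is one way such a model could be tuned to different event scales.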
Item Type: | Conference Proceedings |
---|---|
Status: | Published |
Refereed: | Yes |
Glasgow Author(s) Enlighten ID: | Guha, Dr Tanaya |
Authors: | Shirian, A., Somandepalli, K., Sanchez, V., and Guha, T. |
College/School: | College of Science and Engineering > School of Computing Science |
Research Group: | GIST |
Copyright Holders: | Copyright © 2022 ISCA |
First Published: | First published in Proceedings of Interspeech 2022, pp 2428-2432 |
Publisher Policy: | Reproduced in accordance with the copyright policy of the publisher |