
PILOT: Introducing Transformers for Probabilistic Sound Event Localization

About

Sound event localization aims at estimating the positions of sound sources in the environment with respect to an acoustic receiver (e.g., a microphone array). Recent advances in this domain have most prominently focused on deep recurrent neural networks. Inspired by the success of transformer architectures as an alternative to classical recurrent neural networks, this paper introduces a novel transformer-based sound event localization framework in which temporal dependencies in the received multi-channel audio signals are captured via self-attention mechanisms. Additionally, the estimated sound event positions are represented as multivariate Gaussian variables, yielding an additional notion of uncertainty that many previously proposed deep-learning-based systems for this application do not provide. The framework is evaluated on three publicly available multi-source sound event localization datasets and compared against state-of-the-art methods in terms of localization error and event detection accuracy. It outperforms all competing systems on all datasets, with statistically significant differences in performance.
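The two core ingredients described above (self-attention over a sequence of per-frame audio features, and a Gaussian parameterization of the estimated positions) can be illustrated with a minimal NumPy sketch. This is not the authors' implementation: the feature dimension, the single attention head, the random projection matrices, and the softplus variance parameterization are all illustrative assumptions.

```python
import numpy as np

def self_attention(X, Wq, Wk, Wv):
    """Single-head scaled dot-product self-attention over a (T, d) sequence."""
    Q, K, V = X @ Wq, X @ Wk, X @ Wv
    scores = Q @ K.T / np.sqrt(K.shape[1])          # (T, T) frame-to-frame scores
    scores -= scores.max(axis=1, keepdims=True)     # numerical stability
    A = np.exp(scores)
    A /= A.sum(axis=1, keepdims=True)               # softmax: each row sums to 1
    return A @ V                                    # context-aware frame features

rng = np.random.default_rng(0)
T, d = 10, 16                                       # hypothetical: 10 frames, 16-dim features
X = rng.standard_normal((T, d))                     # stand-in for multi-channel audio features
Wq, Wk, Wv = (0.1 * rng.standard_normal((d, d)) for _ in range(3))
H = self_attention(X, Wq, Wk, Wv)

# Per-frame Gaussian position estimate: a mean direction (e.g. azimuth,
# elevation) plus a diagonal variance as an uncertainty measure.
W_mu = 0.1 * rng.standard_normal((d, 2))
W_var = 0.1 * rng.standard_normal((d, 2))
mu = H @ W_mu                                       # (T, 2) predicted means
var = np.log1p(np.exp(H @ W_var))                   # softplus keeps variances positive
```

In a trained system the projection matrices would be learned, and the predicted mean and variance would typically be fit with a Gaussian negative log-likelihood loss so that frames with ambiguous spatial cues receive larger predicted variance.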

Christopher Schymura, Benedikt Bönninghoff, Tsubasa Ochiai, Marc Delcroix, Keisuke Kinoshita, Tomohiro Nakatani, Shoko Araki, Dorothea Kolossa • 2021

Related benchmarks

Task: 3D Sound Localization
Dataset: TUT Sound Events REAL component 2018 (test)
Result: Localization Error (°): 4.2
Rank: 5
