
Riemannian batch normalization for SPD neural networks

About

Covariance matrices have attracted attention for machine learning applications due to their capacity to capture interesting structure in the data. The main challenge is that one needs to take into account the particular geometry of the Riemannian manifold of symmetric positive definite (SPD) matrices they belong to. In the context of deep networks, several architectures for these matrices have recently been proposed. In our article, we introduce a Riemannian batch normalization (batchnorm) algorithm, which generalizes the one used in Euclidean nets. This novel layer makes use of geometric operations on the manifold, notably the Riemannian barycenter, parallel transport and non-linear structured matrix transformations. We derive a new manifold-constrained gradient descent algorithm working in the space of SPD matrices, allowing us to learn the batchnorm layer. We validate the proposed approach with experiments in three different contexts on diverse data types: a drone recognition task from radar observations, and emotion and action recognition tasks from video and motion-capture data. Experiments show that the Riemannian batchnorm systematically gives better classification performance than leading methods, along with remarkable robustness to a lack of data.
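The core of the layer can be sketched in a few lines of NumPy (a simplified illustration under our own naming, not the authors' implementation: the running statistics, the bias parameterization, and the manifold-constrained backward pass are omitted). The batch is centered at the identity by congruence with the inverse square root of its Riemannian barycenter, then re-biased around an SPD bias parameter:

```python
import numpy as np

def sym_fn(M, f):
    # apply a scalar function to a symmetric matrix via eigendecomposition
    w, V = np.linalg.eigh(M)
    return (V * f(w)) @ V.T

def karcher_mean(batch, iters=20):
    # Riemannian barycenter of a batch of SPD matrices under the
    # affine-invariant metric, via a simple fixed-iteration Karcher flow
    G = batch.mean(axis=0)  # arithmetic mean as initialization
    for _ in range(iters):
        G_half = sym_fn(G, np.sqrt)
        G_ihalf = sym_fn(G, lambda w: 1.0 / np.sqrt(w))
        # average the batch in the tangent space at G, then map back
        T = np.mean([sym_fn(G_ihalf @ X @ G_ihalf, np.log) for X in batch],
                    axis=0)
        G = G_half @ sym_fn(T, np.exp) @ G_half
    return G

def spd_batchnorm(batch, bias):
    # center the batch at the identity (congruence by G^{-1/2}),
    # then re-bias it around the SPD parameter `bias` (congruence by B^{1/2})
    G = karcher_mean(batch)
    G_ihalf = sym_fn(G, lambda w: 1.0 / np.sqrt(w))
    B_half = sym_fn(bias, np.sqrt)
    return np.stack([B_half @ G_ihalf @ X @ G_ihalf @ B_half for X in batch])
```

Because congruence is an isometry for the affine-invariant metric, the normalized batch has its barycenter at the identity, and every output stays on the SPD manifold.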

Daniel Brooks, Olivier Schwander, Frederic Barbaresco, Jean-Yves Schneider, Matthieu Cord · 2019

Related benchmarks

Task                  Dataset                    Metric             Result  Rank
BCI classification    Hinss2021 (inter-session)  Balanced Accuracy  53.83   16
BCI classification    Hinss2021 (inter-subject)  Balanced Accuracy  50.65   16
Video Classification  AFEW (val)                 Accuracy           35.39   4
Video Classification  NTU RGB+D (val)            Accuracy           41.92   4
Video Classification  FPHA (val)                 Accuracy           65.03   4
Video Classification  HDM05 (val)                Accuracy           67.25   4
