
Universal approximations of invariant maps by neural networks

About

We describe generalizations of the universal approximation theorem for neural networks to maps invariant or equivariant with respect to linear representations of groups. Our goal is to establish network-like computational models that are both invariant/equivariant and provably complete in the sense of their ability to approximate any continuous invariant/equivariant map. Our contribution is threefold. First, in the general case of compact groups we propose a construction of a complete invariant/equivariant network using an intermediate polynomial layer. We invoke classical theorems of Hilbert and Weyl to justify and simplify this construction; in particular, we describe an explicit complete ansatz for the approximation of permutation-invariant maps. Second, we consider groups of translations and prove several versions of the universal approximation theorem for convolutional networks in the limit of continuous signals on Euclidean spaces. Finally, we consider 2D signal transformations equivariant with respect to the group SE(2) of rigid Euclidean motions. In this case we introduce the "charge-conserving convnet", a convnet-like computational model based on the decomposition of the feature space into isotypic representations of SO(2). We prove this model to be a universal approximator for continuous SE(2)-equivariant signal transformations.
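As a concrete illustration of permutation invariance (not the paper's exact polynomial construction), the sketch below builds a map from an elementwise embedding followed by sum pooling; the pooled features are symmetric in the inputs, so the output is unchanged under any reordering. All weight names and sizes here are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative fixed weights for a tiny two-layer network.
W1 = rng.normal(size=(1, 16))
b1 = rng.normal(size=16)
W2 = rng.normal(size=(16, 1))

def invariant_net(x):
    """Map a set of scalars x (shape (n,)) to one permutation-invariant output."""
    h = np.tanh(x[:, None] @ W1 + b1)  # embed each element independently
    pooled = h.sum(axis=0)             # sum pooling is permutation-invariant
    return float(pooled @ W2)

x = rng.normal(size=5)
perm = rng.permutation(5)
# The output does not depend on the ordering of the input set.
assert np.isclose(invariant_net(x), invariant_net(x[perm]))
```

Sum pooling is the simplest symmetric aggregation; the paper's ansatz works instead with complete systems of invariant polynomials, which guarantee not just invariance but completeness of the approximation.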

Dmitry Yarotsky • 2018

Related benchmarks

| Task | Dataset | Result | Rank |
| --- | --- | --- | --- |
| Node Classification | PATTERN (test) | Test Accuracy: 84.641 | 88 |
| Graph Classification | EXP (test) | Accuracy: 50 | 33 |
| Graph Separation | GRAPH8c (random initialization) | Non-Separated Pairs: 0.00e+0 | 11 |
| Graph Separation | EXP (random initialization) | Non-Separated Graph Pairs: 0.00e+0 | 11 |
| Position Regression | n-body (test) | Position MSE: 0.0041 | 9 |
