
Exponential Moving Average Normalization for Self-supervised and Semi-supervised Learning

About

We present a plug-in replacement for batch normalization (BN) called exponential moving average normalization (EMAN), which improves the performance of existing student-teacher based self- and semi-supervised learning techniques. Unlike the standard BN, where the statistics are computed within each batch, EMAN, used in the teacher, updates its statistics by exponential moving average from the BN statistics of the student. This design reduces the intrinsic cross-sample dependency of BN and enhances the generalization of the teacher. EMAN improves strong baselines for self-supervised learning by 4-6/1-2 points and semi-supervised learning by about 7/2 points, when 1%/10% supervised labels are available on ImageNet. These improvements are consistent across methods, network architectures, training duration, and datasets, demonstrating the general effectiveness of this technique. The code is available at https://github.com/amazon-research/exponential-moving-average-normalization.
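The key mechanic, as the abstract describes it, is that the teacher's BN statistics are not computed from its own batches but tracked by exponential moving average from the student. A minimal PyTorch sketch of such an update (illustrative only; function and variable names are our own, not the authors' released code) could look like:

```python
import copy
import torch
import torch.nn as nn

def eman_update(student: nn.Module, teacher: nn.Module, momentum: float = 0.999):
    """Update the teacher by exponential moving average of the student.

    Unlike a plain parameter EMA, this also moves the BN buffers
    (running_mean / running_var), so the teacher never computes
    batch statistics of its own.
    """
    with torch.no_grad():
        # EMA over learnable parameters (weights, biases, BN affine params)
        for p_s, p_t in zip(student.parameters(), teacher.parameters()):
            p_t.mul_(momentum).add_(p_s, alpha=1.0 - momentum)
        # EMA over BN running statistics; integer buffers are copied directly
        for b_s, b_t in zip(student.buffers(), teacher.buffers()):
            if b_t.dtype.is_floating_point:   # running_mean / running_var
                b_t.mul_(momentum).add_(b_s, alpha=1.0 - momentum)
            else:                             # e.g. num_batches_tracked
                b_t.copy_(b_s)

# usage: the teacher starts as a frozen copy of the student and stays in
# eval mode, so its BN layers always use the EMA statistics
student = nn.Sequential(nn.Conv2d(3, 8, 3), nn.BatchNorm2d(8))
teacher = copy.deepcopy(student)
for p in teacher.parameters():
    p.requires_grad_(False)
teacher.eval()
```

Keeping the teacher in eval mode is what removes the cross-sample dependency of standard BN: at no point does the teacher normalize using statistics of the current batch.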

Zhaowei Cai, Avinash Ravichandran, Subhransu Maji, Charless Fowlkes, Zhuowen Tu, Stefano Soatto • 2021

Related benchmarks

Task | Dataset | Result | Rank
Image Classification | ImageNet 1% labeled | Top-5 Accuracy: 83.4 | 118
Image Classification | ImageNet (10% labels) | Top-1 Acc: 74 | 98
Image Classification | ImageNet 1k (10% labels) | Top-1 Acc: 74 | 92
KNN Classification | ImageNet-1k (val) | Top-1 Accuracy: 64.9 | 53
Image Classification | ImageNet 1k (1%) | Top-1 Acc: 63 | 49
Image Classification | ImageNet 10% label fraction 2012 (val) | Top-1 Acc: 72.8 | 18
Image Classification | ImageNet 10% labels 1K (val) | Top-5 Error: 88.5 | 18
Image Classification | ImageNet 1% labels 1k (val) | Top-1 Accuracy: 57.4 | 16
Category-level retrieval | ImageNet-1k (val) | mAP: 47.9 | 14
Image Classification | ImageNet-1k (val) | Top-1 Acc (1% labels): 63 | 9

(10 of 11 rows shown)

Other info

Code: https://github.com/amazon-research/exponential-moving-average-normalization