
Split-Brain Autoencoders: Unsupervised Learning by Cross-Channel Prediction

About

We propose split-brain autoencoders, a straightforward modification of the traditional autoencoder architecture, for unsupervised representation learning. The method adds a split to the network, resulting in two disjoint sub-networks. Each sub-network is trained to perform a difficult task -- predicting one subset of the data channels from another. Together, the sub-networks extract features from the entire input signal. By forcing the network to solve cross-channel prediction tasks, we induce a representation within the network which transfers well to other, unseen tasks. This method achieves state-of-the-art performance on several large-scale transfer learning benchmarks.
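The core idea can be sketched in a few lines: split the input channels into two disjoint groups, train one sub-network to predict each group from the other, and use the concatenation of both sub-networks' features as the learned representation. The sketch below is a minimal toy version with NumPy, not the paper's implementation: each "sub-network" is a single linear map trained by gradient descent on a squared-error loss, and the 3-vs-1 channel split (e.g. color vs. depth) is an illustrative assumption.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy input: a batch of 4-channel "images", flattened to vectors.
# Hypothetical split: channels 0-2 in one group, channel 3 in the other.
batch, channels, pixels = 8, 4, 16
x = rng.standard_normal((batch, channels * pixels))

split = 3 * pixels
x1, x2 = x[:, :split], x[:, split:]  # the two disjoint channel subsets

# Each disjoint sub-network is a single linear layer here, standing in
# for a deep encoder. F1 predicts x2 from x1; F2 predicts x1 from x2.
W1 = rng.standard_normal((split, channels * pixels - split)) * 0.01
W2 = rng.standard_normal((channels * pixels - split, split)) * 0.01

lr = 1e-2
for _ in range(200):
    p2, p1 = x1 @ W1, x2 @ W2          # cross-channel predictions
    g1 = x1.T @ (p2 - x2) / batch      # gradient of ||x1 W1 - x2||^2 / 2
    g2 = x2.T @ (p1 - x1) / batch      # gradient of ||x2 W2 - x1||^2 / 2
    W1 -= lr * g1
    W2 -= lr * g2

# At transfer time, the representation concatenates both sub-networks'
# outputs, so it covers the entire input signal.
features = np.concatenate([x1 @ W1, x2 @ W2], axis=1)
print(features.shape)  # (8, 64)
```

In the actual method each sub-network is a deep ConvNet half and the prediction targets are quantized color/depth distributions rather than raw values, but the training signal is the same: each half must reconstruct the channels it never sees.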

Richard Zhang, Phillip Isola, Alexei A. Efros • 2016

Related benchmarks

Task                  | Dataset                       | Metric         | Result | Rank
Semantic segmentation | PASCAL VOC 2012 (val)         | Mean IoU       | 36     | 2142
Image classification  | ImageNet-1k (val)             | Top-1 Accuracy | 35.4   | 1469
Semantic segmentation | PASCAL VOC 2012 (test)        | mIoU           | 36     | 1415
Object detection      | PASCAL VOC 2007 (test)        | mAP            | 46.7   | 844
Semantic segmentation | PASCAL VOC 2012               | mIoU           | 36     | 218
Classification        | PASCAL VOC 2007 (test)        | mAP (%)        | 67.1   | 217
Semantic segmentation | Pascal VOC                    | mIoU           | 0.36   | 180
Scene classification  | Places 205 categories (test)  | Top-1 Acc      | 34.1   | 150
Image classification  | STL-10                        | --             | --     | 129
Scene classification  | Places-205 (val)              | Top-1 Acc      | 34.1   | 97

(Showing 10 of 30 rows)
