
Robust Pre-Training by Adversarial Contrastive Learning

About

Recent work has shown that, when integrated with adversarial training, self-supervised pre-training can lead to state-of-the-art robustness. In this work, we improve robustness-aware self-supervised pre-training by learning representations that are consistent under both data augmentations and adversarial perturbations. Our approach leverages a recent contrastive learning framework, which learns representations by maximizing feature consistency under differently augmented views. This fits particularly well with the goal of adversarial robustness, as one cause of adversarial fragility is the lack of feature invariance, i.e., small input perturbations can result in undesirably large changes in features or even predicted labels. We explore various options for formulating the contrastive task, and demonstrate that by injecting adversarial perturbations, contrastive pre-training can lead to models that are both label-efficient and robust. We empirically evaluate the proposed Adversarial Contrastive Learning (ACL) and show it can consistently outperform existing methods. For example, on the CIFAR-10 dataset, ACL outperforms the previous state-of-the-art unsupervised robust pre-training approach by 2.99% in robust accuracy and 2.14% in standard accuracy. We further demonstrate that ACL pre-training can improve semi-supervised adversarial training, even when only a few labeled examples are available. Our code and pre-trained models have been released at: https://github.com/VITA-Group/Adversarial-Contrastive-Learning.
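To make the "feature consistency under differently augmented views" objective concrete, below is a minimal NumPy sketch of an NT-Xent-style contrastive loss of the kind such frameworks maximize consistency with. This is an illustrative reconstruction, not the authors' released code: the function name and shapes are assumptions, and in ACL one of the two views would additionally be an adversarial perturbation crafted to maximize this same loss.

```python
import numpy as np

def nt_xent(z1, z2, tau=0.5):
    """Contrastive (NT-Xent-style) loss between two views of a batch.

    z1, z2: (N, d) arrays of embeddings; row i of z1 and row i of z2
    are two views of the same example (positive pair); all other rows
    in the batch serve as negatives.
    """
    z = np.concatenate([z1, z2], axis=0)                 # (2N, d)
    z = z / np.linalg.norm(z, axis=1, keepdims=True)     # cosine geometry
    sim = (z @ z.T) / tau                                # pairwise similarities
    np.fill_diagonal(sim, -np.inf)                       # exclude self-pairs
    n = len(z1)
    # the positive partner of row i is row i+n, and vice versa
    pos = np.concatenate([np.arange(n, 2 * n), np.arange(n)])
    logsumexp = np.log(np.exp(sim).sum(axis=1))
    loss = -(sim[np.arange(2 * n), pos] - logsumexp)     # cross-entropy form
    return loss.mean()
```

Intuitively, the loss is small when each example's two views are closer to each other than to every other example in the batch; generating one view adversarially (e.g., by ascending this loss under an L-infinity constraint) is what couples the contrastive objective to robustness.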

Ziyu Jiang, Tianlong Chen, Ting Chen, Zhangyang Wang • 2020

Related benchmarks

| Task | Dataset | Result | Rank |
|---|---|---|---|
| Image Classification | STL-10 (test) | – | 357 |
| Robust Image Classification | CIFAR-10 | Clean Accuracy: 79.96 | 68 |
| Image Classification | CIFAR-10-C (test) | – | 61 |
| Image Classification | CIFAR-100-C v1 (test) | – | 60 |
| Image Classification | CIFAR-100 pre-trained on CIFAR-10 (test) | AA: 44.07 | 24 |
| Image Classification | STL-10 pre-trained on CIFAR-10 (test) | – | 22 |
| Image Classification | CIFAR-10 (test) | AA: 49.42 | 12 |
| Image Classification | CIFAR-100 (test) | Average Accuracy (AA): 24.16 | 12 |
| Image Classification | STL-10 pre-trained on CIFAR-100 (test) | Average Accuracy: 28.76 | 12 |
| Image Classification | CIFAR-10-C CS-1 1.0 (test) | Mean Accuracy: 79.15 | 12 |

Showing 10 of 15 rows
