
CSI: Novelty Detection via Contrastive Learning on Distributionally Shifted Instances

About

Novelty detection, i.e., identifying whether a given sample is drawn from outside the training distribution, is essential for reliable machine learning. To this end, there have been many attempts at learning a representation well-suited for novelty detection and designing a score based on such a representation. In this paper, we propose a simple yet effective method named contrasting shifted instances (CSI), inspired by the recent success of contrastive learning of visual representations. Specifically, in addition to contrasting a given sample with other instances as in conventional contrastive learning methods, our training scheme contrasts the sample with distributionally-shifted augmentations of itself. Based on this, we propose a new detection score that is specific to the proposed training scheme. Our experiments demonstrate the superiority of our method under various novelty detection scenarios, including unlabeled one-class, unlabeled multi-class, and labeled multi-class settings, with various image benchmark datasets. Code and pre-trained models are available at https://github.com/alinlab/CSI.
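The core idea in the abstract can be sketched in a few lines: generate distribution-shifting views of a sample (the paper uses rotations as its default shifting transformation) and treat them as negatives rather than positives, then score test samples by similarity to the training features weighted by feature norm. The sketch below is a simplified illustration with NumPy, not the authors' implementation; the function names and the toy score are assumptions for exposition.

```python
import numpy as np

def shifted_views(x):
    """Distribution-shifting transforms of an image array.
    Rotation by 0/90/180/270 degrees, the default shift used in CSI;
    in training these views are contrasted AGAINST the original
    (treated as negatives), unlike standard augmentations."""
    return [np.rot90(x, k) for k in range(4)]

def cosine(a, b):
    """Cosine similarity between two feature vectors."""
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-12))

def detection_score(z, train_feats):
    """Toy stand-in for the paper's detection score: cosine similarity
    to the nearest training feature, scaled by the feature norm.
    Higher means more in-distribution (hedged sketch, not the exact score)."""
    nearest_sim = max(cosine(z, m) for m in train_feats)
    return nearest_sim * np.linalg.norm(z)
```

For example, a test feature aligned with (and as large as) the training features receives a higher score than a short, misaligned one, which is the intuition the score exploits.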

Jihoon Tack, Sangwoo Mo, Jongheon Jeong, Jinwoo Shin• 2020

Related benchmarks

Task                           Dataset                        Metric     Result   Rank
Image Classification           CIFAR-10 (test)                Accuracy   94.8     3381
Image Classification           ImageNet-1K                    Top-1 Acc  74.27    1239
Out-of-Distribution Detection  Textures                       AUROC      0.8647   168
Out-of-Distribution Detection  CIFAR-10                       AUROC      96.87    121
Out-of-Distribution Detection  ImageNet                       FPR95      86.8     108
Out-of-Distribution Detection  CIFAR-100                      AUROC      89.2     107
Anomaly Detection              WBC                            ROCAUC     0.504    104
Out-of-Distribution Detection  CIFAR-10 vs CIFAR-100 (test)   AUROC      92.2     93
OOD Detection                  Places (OOD)                   AUROC      76.27    93
Near-OOD Detection             CIFAR-100 Near-OOD (test)      AUROC      71.45    93

Showing 10 of 115 rows.

Other info

Code: https://github.com/alinlab/CSI