
Active Learning by Acquiring Contrastive Examples

About

Common acquisition functions for active learning use either uncertainty or diversity sampling, aiming to select difficult and diverse data points from the pool of unlabeled data, respectively. In this work, leveraging the best of both worlds, we propose an acquisition function that opts for selecting "contrastive examples", i.e. data points that are similar in the model feature space but for which the model outputs maximally different predictive likelihoods. We compare our approach, CAL (Contrastive Active Learning), with a diverse set of acquisition functions on four natural language understanding tasks and seven datasets. Our experiments show that CAL performs consistently better than or on par with the best performing baseline across all tasks, on both in-domain and out-of-domain data. We also conduct an extensive ablation study of our method, and we further analyze all actively acquired datasets, showing that CAL achieves a better trade-off between uncertainty and diversity than other strategies.
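The idea in the abstract can be sketched in a few lines: for each unlabeled pool example, find its nearest labeled neighbors in the model's feature space, then score it by how much its predictive distribution diverges from theirs. The sketch below is an illustration of this idea, not the paper's exact implementation; the function and parameter names (`cal_scores`, `k`, etc.) are our own, and it assumes you already have feature vectors and class-probability outputs from the model.

```python
import numpy as np

def kl_div(p, q, eps=1e-12):
    """KL(p || q) for two discrete probability distributions."""
    p = np.clip(p, eps, 1.0)
    q = np.clip(q, eps, 1.0)
    return float(np.sum(p * np.log(p / q)))

def cal_scores(pool_feats, pool_probs, labeled_feats, labeled_probs, k=2):
    """Score each unlabeled pool point by the mean KL divergence between
    the predictive distributions of its k nearest labeled neighbors
    (Euclidean distance in feature space) and its own prediction.
    High scores flag 'contrastive' examples: close in feature space,
    far apart in predicted likelihoods."""
    scores = []
    for x, p in zip(pool_feats, pool_probs):
        dists = np.linalg.norm(labeled_feats - x, axis=1)
        neighbors = np.argsort(dists)[:k]
        scores.append(np.mean([kl_div(labeled_probs[j], p) for j in neighbors]))
    return np.array(scores)
```

In an active learning loop, one would rank the pool by these scores each round and send the top-scoring examples for annotation.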

Katerina Margatina, Giorgos Vernikos, Loïc Barrault, Nikolaos Aletras • 2021

Related benchmarks

| Task | Dataset | Metric | Result | Rank |
|---|---|---|---|---|
| Text Classification | AG-News | Accuracy | 83.91 | 248 |
| Sentiment Classification | SST2 (test) | Accuracy | 59.3 | 214 |
| Sentiment Analysis | SST-5 (test) | Accuracy | 25.3 | 173 |
| Traffic Forecasting | PeMS08 | RMSE | 28.48 | 166 |
| Sentiment Classification | MR (test) | Accuracy | 66.2 | 142 |
| Question Classification | TREC (test) | Accuracy | 31.8 | 124 |
| Text Classification | IMDB | Accuracy | 87.62 | 107 |
| Topic Classification | AG News (test) | Accuracy | 42.3 | 98 |
| Spatio-temporal forecasting | PEMS08 (test) | MAPE | 12.4 | 96 |
| Spatio-temporal forecasting | UrbanEV (test) | MAPE | 30.81 | 73 |

Showing 10 of 17 rows.
