Deep Batch Active Learning by Diverse, Uncertain Gradient Lower Bounds
About
We design a new algorithm for batch active learning with deep neural network models. Our algorithm, Batch Active learning by Diverse Gradient Embeddings (BADGE), samples groups of points that are disparate and high-magnitude when represented in a hallucinated gradient space, a strategy designed to incorporate both predictive uncertainty and sample diversity into every selected batch. Crucially, BADGE trades off between diversity and uncertainty without requiring any hand-tuned hyperparameters. We show that while other approaches sometimes succeed for particular batch sizes or architectures, BADGE consistently performs as well or better, making it a versatile option for practical active learning problems.
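The diversity/uncertainty trade-off described above can be sketched as k-means++-style seeding over the gradient embeddings: take the highest-magnitude embedding first, then repeatedly sample points with probability proportional to squared distance from the already-selected set. This is a minimal illustrative sketch, not the authors' reference implementation; the `embeddings` array (hallucinated last-layer gradient embeddings, one row per unlabeled point) is assumed to be precomputed.

```python
import numpy as np

def badge_select(embeddings, batch_size, rng=None):
    """Pick a diverse, high-magnitude batch via k-means++-style seeding.

    embeddings: (n, d) array of gradient embeddings (assumed precomputed).
    Returns a list of `batch_size` distinct row indices.
    """
    rng = np.random.default_rng(rng)
    n = embeddings.shape[0]
    # Start from the highest-magnitude embedding: large gradient norm
    # signals high predictive uncertainty.
    norms2 = np.einsum("ij,ij->i", embeddings, embeddings)
    selected = [int(np.argmax(norms2))]
    # Squared distance of every point to its nearest selected center.
    d2 = np.sum((embeddings - embeddings[selected[0]]) ** 2, axis=1)
    while len(selected) < batch_size:
        # D^2 sampling favours points far from everything chosen so far,
        # which enforces diversity without any tuned trade-off parameter.
        idx = int(rng.choice(n, p=d2 / d2.sum()))
        selected.append(idx)
        d2 = np.minimum(d2, np.sum((embeddings - embeddings[idx]) ** 2, axis=1))
    return selected
```

Already-selected points have zero distance to the chosen set, so they receive zero sampling probability and the batch contains no duplicates.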
Jordan T. Ash, Chicheng Zhang, Akshay Krishnamurthy, John Langford, Alekh Agarwal • 2019
Related benchmarks
| Task | Dataset | Metric | Value | Rank |
|---|---|---|---|---|
| Image Classification | CIFAR-100 (test) | Accuracy | 29.81 | 3518 |
| Image Classification | CIFAR-10 (test) | Accuracy | 86.43 | 3381 |
| Image Classification | Food-101 | Accuracy | 69.11 | 494 |
| Image Classification | DTD | Accuracy | 61.52 | 487 |
| Image Classification | Flowers102 | Accuracy | 96.44 | 478 |
| Image Classification | DTD | Accuracy | 61.7 | 419 |
| Image Classification | UCF101 | Top-1 Acc | 77.9 | 404 |
| Image Classification | TinyImageNet (test) | Accuracy | 26 | 366 |
| Action Recognition | UCF101 | Accuracy | 74.49 | 365 |
| 3D Object Detection | ScanNet V2 (val) | mAP@0.25 | 57.1 | 352 |
Showing 10 of 55 rows