Multi-class Classification without Multi-class Labels
About
This work presents a new strategy for multi-class classification that requires no class-specific labels, instead leveraging pairwise similarity between examples, a weaker form of annotation. The proposed method, meta classification learning, optimizes a binary classifier for pairwise similarity prediction and, through this process, learns a multi-class classifier as a submodule. We formulate this approach, present a probabilistic graphical model for it, and derive a surprisingly simple loss function that can be used to learn neural network-based models. We then demonstrate that the same framework generalizes to the supervised, unsupervised cross-task transfer, and semi-supervised settings. Our method is evaluated against the state of the art in all three learning paradigms and shows superior or comparable accuracy, providing evidence that learning multi-class classification without multi-class labels is a viable option.
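The loss described above can be sketched as follows: the multi-class classifier's softmax outputs are turned into a pairwise similarity prediction via an inner product, and a binary cross-entropy is applied over all pairs. This is a minimal NumPy sketch of that idea; the function name and the exact clipping constant are illustrative, not the paper's reference implementation.

```python
import numpy as np

def meta_classification_loss(probs, sim):
    """Binary cross-entropy over pairwise similarity predictions.

    probs: (n, c) array of softmax outputs from the multi-class classifier.
    sim:   (n, n) binary pairwise similarity labels (1 = same class).
    """
    # Predicted probability that examples i and j belong to the same class:
    # the inner product of their class-probability vectors.
    s_hat = probs @ probs.T
    # Clip for numerical stability before taking logarithms.
    eps = 1e-12
    s_hat = np.clip(s_hat, eps, 1.0 - eps)
    # Average binary cross-entropy over all pairs.
    return -np.mean(sim * np.log(s_hat) + (1.0 - sim) * np.log(1.0 - s_hat))

# Toy usage: four examples, two classes, pairwise labels built from class ids.
probs = np.eye(2)[[0, 0, 1, 1]].astype(float)   # confident, consistent predictions
labels = np.array([0, 0, 1, 1])
sim = (labels[:, None] == labels[None, :]).astype(float)
loss = meta_classification_loss(probs, sim)      # near zero for consistent pairs
```

Note that only pairwise similarity labels `sim` are needed; the multi-class classifier emerges because minimizing this loss pushes same-pair examples toward the same softmax peak.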
Related benchmarks
| Task | Dataset | Metric | Result | Rank |
|---|---|---|---|---|
| Generalized Category Discovery | CIFAR-100 | -- | -- | 133 |
| New Intent Discovery | BANKING | NMI | 80.98 | 56 |
| New Intent Discovery | M-CID | NMI | 64.09 | 56 |
| Open Intent Recognition | StackOverflow | Accuracy | 70.82 | 54 |
| Novel Class Discovery | CIFAR-10 (unlabelled set) | Clustering Accuracy | 70.9 | 21 |
| Novel Class Discovery | CIFAR-100 (unlabelled set) | Clustering Accuracy | 21.5 | 21 |
| Clustering | ImageNet unlabelled (train) | Clustering Accuracy | 74.4 | 14 |
| Clustering | CIFAR10 unlabelled (train) | Clustering Accuracy | 70.9 | 14 |
| Intent Clustering | CLINC full 2019 | NMI | 87.72 | 13 |
| Intent Clustering | BANKING 2020 (full) | NMI | 75.68 | 13 |