
Multi-class Classification without Multi-class Labels

About

This work presents a new strategy for multi-class classification that requires no class-specific labels, instead leveraging pairwise similarity between examples, a weaker form of annotation. The proposed method, meta classification learning, optimizes a binary classifier for pairwise similarity prediction and, through this process, learns a multi-class classifier as a submodule. We formulate this approach, present a probabilistic graphical model for it, and derive a surprisingly simple loss function that can be used to learn neural network-based models. We then demonstrate that this same framework generalizes to the supervised, unsupervised cross-task, and semi-supervised settings. Our method is evaluated against the state of the art in all three learning paradigms and shows superior or comparable accuracy, providing evidence that learning multi-class classification without multi-class labels is a viable option.
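The loss described in the abstract can be sketched as follows: treat the inner product of two examples' class-probability vectors as a predicted pairwise similarity, and apply binary cross-entropy against the pairwise similarity labels. This is a minimal NumPy illustration of that idea; the function name, the clipping constant, and the toy inputs are our own choices, not taken from the paper's released code.

```python
import numpy as np

def softmax(logits):
    """Row-wise softmax with the usual max-subtraction for stability."""
    z = logits - logits.max(axis=1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=1, keepdims=True)

def meta_classification_loss(logits, S, eps=1e-7):
    """Binary cross-entropy over pairwise similarity predictions.

    logits: (n, c) outputs of the multi-class classifier submodule
    S:      (n, n) binary similarity labels (1 = same class, 0 = different)

    The similarity of a pair is predicted as the inner product of the
    two examples' softmax probability vectors, so pushing pairs toward
    similarity 1 pushes both examples toward the same (unnamed) class.
    """
    P = softmax(logits)                      # (n, c) class probabilities
    S_hat = np.clip(P @ P.T, eps, 1 - eps)   # (n, n) predicted similarities
    return -np.mean(S * np.log(S_hat) + (1 - S) * np.log(1 - S_hat))

# Toy check: examples 0 and 1 confidently share a class, example 2 differs.
logits = np.array([[10.0, 0.0],
                   [10.0, 0.0],
                   [0.0, 10.0]])
S = np.array([[1.0, 1.0, 0.0],
              [1.0, 1.0, 0.0],
              [0.0, 0.0, 1.0]])
loss_consistent = meta_classification_loss(logits, S)      # small
loss_contradict = meta_classification_loss(logits, 1 - S)  # large
```

Note that the loss never references class identities, only whether pairs match, which is what lets the same objective serve the supervised, cross-task, and semi-supervised settings described above.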

Yen-Chang Hsu, Zhaoyang Lv, Joel Schlosser, Phillip Odom, Zsolt Kira • 2019

Related benchmarks

| Task                          | Dataset                    | Result                   | Rank |
|-------------------------------|----------------------------|--------------------------|------|
| Generalized Category Discovery| CIFAR-100                  | –                        | 133  |
| New Intent Discovery          | BANKING                    | NMI 80.98                | 56   |
| New Intent Discovery          | M-CID                      | NMI 64.09                | 56   |
| Open intent recognition       | StackOverflow              | Accuracy 70.82           | 54   |
| Novel Class Discovery         | CIFAR-10 (unlabelled set)  | Clustering Accuracy 70.9 | 21   |
| Novel Class Discovery         | CIFAR-100 (unlabelled set) | Clustering Accuracy 21.5 | 21   |
| Clustering                    | ImageNet unlabelled (train)| Clustering Accuracy 74.4 | 14   |
| Clustering                    | CIFAR10 unlabelled (train) | Clustering Accuracy 70.9 | 14   |
| Intent Clustering             | CLINC full 2019            | NMI 87.72                | 13   |
| Intent Clustering             | BANKING 2020 (full)        | NMI 75.68                | 13   |
Showing 10 of 20 rows
