LegendreTron: Uprising Proper Multiclass Loss Learning

About

Loss functions serve as the foundation of supervised learning and are often chosen prior to model development. To avoid potentially ad hoc choices of losses, statistical decision theory describes a desirable property for losses known as \emph{properness}, which asserts that Bayes' rule is optimal. Recent works have sought to \emph{learn losses} and models jointly. Existing methods do this by fitting an inverse canonical link function which monotonically maps $\mathbb{R}$ to $[0,1]$ to estimate probabilities for binary problems. In this paper, we extend monotonicity to maps between $\mathbb{R}^{C-1}$ and the projected probability simplex $\tilde{\Delta}^{C-1}$ by using monotonicity of gradients of convex functions. We present {\sc LegendreTron} as a novel and practical method that jointly learns \emph{proper canonical losses} and probabilities for multiclass problems. Tested on a benchmark of domains with up to 1,000 classes, our experimental results show that our method consistently outperforms the natural multiclass baseline under a $t$-test at 99% significance on all datasets with greater than 10 classes.
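The canonical example of such a monotone gradient map is softmax, which is the gradient of the convex log-sum-exp potential. The sketch below is not the paper's learned parameterization, only an illustration of the underlying principle: the gradient of a convex function on $\mathbb{R}^{C-1}$ gives a monotone map into the projected probability simplex.

```python
import numpy as np

def lse(z):
    """Log-sum-exp over (z_1, ..., z_{C-1}, 0): a convex potential on R^{C-1}."""
    z0 = np.append(z, 0.0)
    m = z0.max()
    return m + np.log(np.exp(z0 - m).sum())

def grad_lse(z):
    """Gradient of the potential: the first C-1 softmax probabilities.

    This is a monotone map from R^{C-1} into the projected simplex
    (probabilities of the first C-1 classes; the last is 1 minus their sum).
    """
    z0 = np.append(z, 0.0)
    e = np.exp(z0 - z0.max())  # stabilized softmax
    p = e / e.sum()
    return p[:-1]

z = np.array([1.0, -0.5, 2.0])          # C = 4 classes
p = grad_lse(z)                          # point in the projected simplex
p_full = np.append(p, 1.0 - p.sum())     # recover the full probability vector
```

Monotonicity here means $(\nabla f(z_1) - \nabla f(z_2))^\top (z_1 - z_2) \ge 0$ for all $z_1, z_2$, which holds for gradients of any convex $f$; the paper generalizes this by learning the convex potential rather than fixing it to log-sum-exp.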

Kevin Lam, Christian Walder, Spiridon Penev, Richard Nock • 2023

Related benchmarks

Task                       Dataset        Metric                Result   Rank
Multiclass Classification  cleveland      L1 calibration error  0.863    26
Multiclass Classification  Balance Scale  Accuracy              91       6
Multiclass Classification  Glass          Accuracy              68.4     4
