
Credal Learning Theory

About

Statistical learning theory is the foundation of machine learning, providing theoretical bounds for the risk of models learned from a (single) training set, assumed to be drawn from an unknown probability distribution. In actual deployment, however, the data distribution may (and often does) vary, causing domain adaptation/generalization issues. In this paper we lay the foundations for a 'credal' theory of learning, using convex sets of probabilities (credal sets) to model the variability in the data-generating distribution. Such credal sets, we argue, may be inferred from a finite sample of training sets. Bounds are derived for the case of finite hypothesis spaces (both with and without the realizability assumption), as well as infinite model spaces, directly generalizing classical results.
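For context on the classical results the abstract says are generalized, the standard agnostic bound for a finite hypothesis space (a textbook result via Hoeffding's inequality plus a union bound, not taken from this paper) can be stated as follows; the symbols R, R̂_n, H, n, and δ are the usual textbook ones, not the paper's notation:

```latex
% Classical agnostic bound for a finite hypothesis space \mathcal{H},
% obtained from Hoeffding's inequality and a union bound over \mathcal{H}.
% With probability at least 1 - \delta over an i.i.d. sample of size n
% from a single fixed distribution, for every h \in \mathcal{H}:
\[
  R(h) \;\le\; \widehat{R}_n(h)
    + \sqrt{\frac{\ln|\mathcal{H}| + \ln(2/\delta)}{2n}},
\]
% where R(h) is the true risk and \widehat{R}_n(h) the empirical risk.
```

The credal setting described in the abstract replaces the single fixed data-generating distribution assumed here with a convex set of distributions inferred from a finite sample of training sets.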

Michele Caprio, Maryam Sultana, Eleni Elia, Fabio Cuzzolin • 2024

Related benchmarks

Task                        Dataset      Result                      Rank
Emotion Classification      GoEmotions   PCC (Uepi vs Uale): 0.6     12
Hate speech classification  HateXplain   Accuracy: 0.72              7
Question Answering          MAQA         Accuracy: 0.63              7
Question Answering          AmbigQA      Accuracy: 59                7
Sentiment Analysis          CEBaB        Accuracy: 82                7
