
Unsupervised Ensemble Learning Through Deep Energy-based Models

About

Unsupervised ensemble learning emerged to address the challenge of combining the predictions of multiple learners without access to ground-truth labels or additional data. This paradigm is crucial when limited information makes it hard to evaluate individual classifiers or understand their relative strengths. We propose a novel deep energy-based method for constructing an accurate meta-learner from the individual learners' predictions alone, potentially capable of capturing complex dependence structures between them. Our approach requires no labeled data, learner features, or problem-specific information, and comes with theoretical guarantees when the learners are conditionally independent. We demonstrate superior performance across diverse ensemble scenarios, including challenging mixture-of-experts settings. Our experiments span standard ensemble datasets and curated datasets designed to test how the model fuses expertise from multiple sources. These results highlight the potential of unsupervised ensemble learning to harness collective intelligence, especially in data-scarce or privacy-sensitive environments.
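The abstract does not specify the energy-based model itself, but the conditionally independent setting its guarantees refer to can be illustrated with a classic spectral-style sketch (not the authors' algorithm). The toy setup below is entirely hypothetical: under conditional independence, the off-diagonal of the covariance of the learners' ±1 predictions is approximately rank one, so the leading eigenvector recovers each learner's (shifted) accuracy without any labels, enabling an unsupervised weighted vote.

```python
import numpy as np

# Hypothetical toy setup: m conditionally independent binary learners
# with unknown accuracies, n instances, labels in {-1, +1}.
rng = np.random.default_rng(0)
n, m = 2000, 5
y = rng.choice([-1, 1], size=n)                 # hidden ground truth
acc = np.array([0.90, 0.80, 0.75, 0.70, 0.65])  # unknown to the method
correct = rng.random((n, m)) < acc              # learner j correct w.p. acc[j]
Z = np.where(correct, y[:, None], -y[:, None])  # observed predictions (n, m)

# Under conditional independence, cov(Z)_ij ≈ v_i v_j (i != j) with
# v_j = 2*acc_j - 1 for balanced classes, so v is estimated (up to
# diagonal bias and a global sign) from the leading eigenvector.
C = np.cov(Z, rowvar=False)
w, V = np.linalg.eigh(C)                 # eigenvalues ascending
v = V[:, -1] * np.sqrt(max(w[-1], 0.0))  # scaled leading eigenvector
v *= np.sign(v.sum())                    # resolve global sign ambiguity

# Unsupervised weighted vote with log-odds weights from estimated accuracies.
acc_hat = np.clip((1 + v) / 2, 1e-3, 1 - 1e-3)
weights = np.log(acc_hat / (1 - acc_hat))
y_hat = np.sign(Z @ weights)
```

In this sketch the weighted vote typically beats every individual learner; the paper's deep energy-based meta-learner additionally aims to capture dependence structures that this rank-one model ignores.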

Ariel Maymon, Yanir Buznah, Uri Shaham • 2026

Related benchmarks

Task                             Dataset                    Accuracy (%)   Rank
Image Classification             ImageNet 1.0 (val)         57.47          48
Unsupervised Ensemble Learning   MnistE                     94.95          13
Unsupervised Ensemble Learning   MicroAgg2                  63.06          13
Unsupervised Ensemble Learning   EyeMovem                   73.73          13
Unsupervised Ensemble Learning   GesturePhsm                67             13
Unsupervised Ensemble Learning   Tree3k                     95.52          13
Unsupervised Ensemble Learning   PetFinder                  79.84          13
Unsupervised Ensemble Learning   CSGO                       88.16          13
Unsupervised Ensemble Learning   ArtiChars                  82.21          13
Classification                   MnistE-4/7 Expert subset   95.27          8

Showing 10 of 15 rows.
