
Optimal Classification Trees for Continuous Feature Data Using Dynamic Programming with Branch-and-Bound

About

Computing an optimal classification tree that provably maximizes training performance within a given size limit is NP-hard, and in practice most state-of-the-art methods do not scale beyond optimal trees of depth three. Most methods therefore rely on a coarse binarization of continuous features to maintain scalability. We propose a novel algorithm that optimizes trees directly on the continuous feature data using dynamic programming with branch-and-bound. We develop new pruning techniques that eliminate many sub-optimal splits from the search when they are similar to previously computed splits, and we provide an efficient subroutine for computing optimal depth-two trees. Our experiments demonstrate that these techniques improve runtime by one or more orders of magnitude over state-of-the-art optimal methods and improve test accuracy by 5% over greedy heuristics.
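The abstract mentions an efficient subroutine for computing optimal depth-two trees. As a rough illustration only (not the paper's optimized subroutine), a brute-force version scans midpoint thresholds on each continuous feature and solves each child of the root as an independent optimal stump, minimizing the number of misclassifications. All names here (`leaf`, `best_stump`, `best_depth2`) are hypothetical.

```python
from collections import Counter


def leaf(y):
    """Majority-class leaf: returns (misclassification count, predicted label)."""
    if len(y) == 0:
        return 0, None
    label, count = Counter(y).most_common(1)[0]
    return len(y) - count, label


def best_stump(X, y):
    """Optimal depth-1 tree: exhaustively try every feature and every
    midpoint between consecutive distinct sorted feature values."""
    best = (leaf(y)[0], None)  # fall back to a single leaf (no split)
    n, d = len(X), len(X[0])
    for f in range(d):
        vals = sorted(set(row[f] for row in X))
        for lo, hi in zip(vals, vals[1:]):
            t = (lo + hi) / 2.0
            left = [y[i] for i in range(n) if X[i][f] <= t]
            right = [y[i] for i in range(n) if X[i][f] > t]
            le, ll = leaf(left)
            re, rl = leaf(right)
            if le + re < best[0]:
                best = (le + re, (f, t, ll, rl))
    return best


def best_depth2(X, y):
    """Optimal depth-2 tree: choose the root split, then solve each side
    independently as an optimal stump."""
    best = (leaf(y)[0], None)  # fall back to a single leaf
    n, d = len(X), len(X[0])
    for f in range(d):
        vals = sorted(set(row[f] for row in X))
        for lo, hi in zip(vals, vals[1:]):
            t = (lo + hi) / 2.0
            li = [i for i in range(n) if X[i][f] <= t]
            ri = [i for i in range(n) if X[i][f] > t]
            le, lt = best_stump([X[i] for i in li], [y[i] for i in li])
            re, rt = best_stump([X[i] for i in ri], [y[i] for i in ri])
            if le + re < best[0]:
                best = (le + re, (f, t, lt, rt))
    return best
```

On XOR-like data no stump can beat a majority leaf, but a depth-two tree classifies perfectly, which is why the depth-two subroutine is the workhorse of the search; the paper's contribution is making this step (and the enclosing branch-and-bound) fast enough for continuous features without binarization.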

Catalin E. Brita, Jacobus G. M. van der Linden, Emir Demirović • 2025

Related benchmarks

Task                     Dataset                                               Metric                   Result  Rank
Classification           dry-bean (test)                                       Accuracy                 58.2    39
Decision Tree Induction  UCI Machine Learning Repository, 16 datasets (avg.)   Average Primal Integral  93.8    28
Decision Tree Learning   UCI Machine Learning Repository, 16 datasets (avg.)   Average Primal Integral  93.8    28
Classification           Raisin UCI (test)                                     Accuracy                 82.9    13
Classification           segment UCI (test)                                    Accuracy                 96.4    12
Classification           skin UCI (test)                                       Accuracy                 99.4    11
Classification           UCI Htru2 (test)                                      Accuracy                 90.6    9
Classification           occupancy UCI (test)                                  Accuracy                 99.2    6
Classification           bank UCI (test)                                       Accuracy                 98.8    6
Classification           rice UCI (test)                                       Accuracy                 91.7    6

(Showing 10 of 20 rows)
