
Decision Tree Embedding by Leaf-Means

About

Decision trees and random forests remain highly competitive for classification on medium-sized, standard datasets due to their robustness, minimal preprocessing requirements, and interpretability. However, a single tree suffers from high estimation variance, while large ensembles reduce this variance at the cost of substantial computational overhead and diminished interpretability. In this paper, we propose Decision Tree Embedding (DTE), a fast and effective method that leverages the leaf partitions of a trained classification tree to construct an interpretable feature representation. By using the sample means within each leaf region as anchor points, DTE maps inputs into an embedding space defined by the tree's partition structure, effectively circumventing the high variance inherent in decision-tree splitting rules. We further introduce an ensemble extension based on additional bootstrap trees, and pair the resulting embedding with linear discriminant analysis for classification. We establish several population-level theoretical properties of DTE, including its preservation of conditional density under mild conditions and a characterization of the resulting classification error. Empirical studies on synthetic and real datasets demonstrate that DTE strikes a strong balance between accuracy and computational efficiency, outperforming or matching random forests and shallow neural networks while requiring only a fraction of their training time in most cases. Overall, the proposed DTE method can be viewed either as a scalable decision tree classifier that improves upon standard split rules, or as a neural network model whose weights are learned from tree-derived anchor points, achieving an intriguing integration of both paradigms.
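The pipeline described in the abstract — fit a tree, take the per-leaf sample means as anchor points, embed each input relative to those anchors, and classify the embedding with linear discriminant analysis — can be sketched roughly as follows. This is an illustrative sketch only: the choice of Euclidean distances to anchors as the embedding map is an assumption made here for concreteness, and the paper's exact construction (and its bootstrap-ensemble extension) may differ.

```python
import numpy as np
from sklearn.datasets import load_breast_cancer
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

# Toy data standing in for a medium-sized tabular dataset.
X, y = load_breast_cancer(return_X_y=True)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

# Step 1: fit a single classification tree.
tree = DecisionTreeClassifier(random_state=0).fit(X_tr, y_tr)

# Step 2: anchor points = mean of the training samples in each leaf.
leaf_ids = tree.apply(X_tr)
anchors = np.stack([X_tr[leaf_ids == leaf].mean(axis=0)
                    for leaf in np.unique(leaf_ids)])

def embed(X):
    # Step 3 (assumed embedding map): Euclidean distance from each
    # sample to every leaf-mean anchor, giving one feature per leaf.
    return np.linalg.norm(X[:, None, :] - anchors[None, :, :], axis=2)

# Step 4: classify the embedded samples with LDA.
clf = LinearDiscriminantAnalysis().fit(embed(X_tr), y_tr)
acc = clf.score(embed(X_te), y_te)
print(f"test accuracy: {acc:.3f}")
```

Note how the tree is used only to define the partition (and hence the anchors); the final decision boundary comes from LDA in the embedding space, which is the mechanism the abstract credits with sidestepping the variance of the tree's own split rules.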

Cencheng Shen, Yuexiao Dong, Carey E. Priebe • 2025

Related benchmarks

Task                   | Dataset                    | Result                           | Rank
Node Classification    | Cora (test)                | --                               | 687
Classification         | WINE (test)                | --                               | 29
Classification         | German Credit (test)       | --                               | 16
Tabular Classification | wholesale-customers (test) | Test Error: 0.284                | 10
Classification         | FaceYale 32*32 (test)      | Error Rate: 18.3                 | 5
Classification         | FaceYale 64*64 (test)      | Error Rate: 10.7                 | 5
Classification         | Colon (test)               | Error Rate: 12                   | 5
Classification         | isolet (test)              | Classification Error Rate: 5.5   | 5
Classification         | Wisc Cancer (test)         | Classification Error Rate: 0.042 | 5
Classification         | Golub (test)               | Classification Error Rate: 6.6   | 5

(Showing 10 of 18 rows)
