
Deep Forest

About

Current deep learning models are mostly built upon neural networks, i.e., multiple layers of parameterized differentiable nonlinear modules that can be trained by backpropagation. In this paper, we explore the possibility of building deep models based on non-differentiable modules. We conjecture that the mystery behind the success of deep neural networks owes much to three characteristics: layer-by-layer processing, in-model feature transformation, and sufficient model complexity. We propose the gcForest approach, which generates a deep forest holding these characteristics. It is a decision tree ensemble approach with far fewer hyper-parameters than deep neural networks, and its model complexity can be determined automatically in a data-dependent way. Experiments show that its performance is quite robust to hyper-parameter settings: in most cases, even across data from different domains, it achieves excellent performance with the same default setting. This study opens the door to deep learning based on non-differentiable modules and exhibits the possibility of constructing deep models without backpropagation.
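The cascade idea described above can be sketched in a few lines: each layer holds several forests, and each forest's class-probability vector is concatenated with the original features before being passed to the next layer. This is a minimal illustration, not the authors' implementation — it omits gcForest's multi-grained scanning and uses a fixed number of layers instead of the paper's automatic, validation-driven depth; the function names and parameter choices here are assumptions for the sketch.

```python
import numpy as np
from sklearn.datasets import load_digits
from sklearn.ensemble import RandomForestClassifier, ExtraTreesClassifier
from sklearn.model_selection import cross_val_predict, train_test_split

def fit_cascade(X, y, n_layers=2, random_state=0):
    """Fit a simplified cascade forest: each layer's out-of-fold class
    probabilities are appended to the raw features for the next layer."""
    layers, features = [], X
    for _ in range(n_layers):
        forests = [
            RandomForestClassifier(n_estimators=50, random_state=random_state),
            ExtraTreesClassifier(n_estimators=50, random_state=random_state),
        ]
        probas = []
        for forest in forests:
            # Out-of-fold probabilities, mimicking the k-fold class-vector
            # generation the paper uses to reduce overfitting risk.
            probas.append(cross_val_predict(forest, features, y, cv=3,
                                            method="predict_proba"))
            forest.fit(features, y)
        layers.append(forests)
        features = np.hstack([X] + probas)
    return layers

def predict_cascade(layers, X):
    features = X
    for forests in layers:
        probas = [forest.predict_proba(features) for forest in forests]
        features = np.hstack([X] + probas)
    # Final prediction: average the last layer's class probabilities.
    return np.mean(probas, axis=0).argmax(axis=1)

X, y = load_digits(return_X_y=True)
Xtr, Xte, ytr, yte = train_test_split(X, y, random_state=0)
layers = fit_cascade(Xtr, ytr)
acc = (predict_cascade(layers, Xte) == yte).mean()
print(f"cascade accuracy: {acc:.3f}")
```

Averaging the last layer's probability vectors mirrors how ensemble layers are aggregated in the paper; in the full method, the number of layers would grow until validation performance stops improving.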

Zhi-Hua Zhou, Ji Feng • 2017

Related benchmarks

| Task | Dataset | Result | Rank |
|---|---|---|---|
| Image Classification | CIFAR-10 (test) | – | 3381 |
| Image Classification | MNIST (test) | Error Rate: 0.74 | 31 |
| Instance Segmentation | SELVAMASK 1.0 (test) | mAP: 5.3 | 15 |
| Tree crown segmentation | Detectree2 Tropical | mAP (All): 11.4 | 5 |
| Tree crown segmentation | QuebecTrees Temperate | mAP (All): 8.3 | 5 |
| Tree crown segmentation | OAM-TCD Urban | mAP (All): 0.065 | 5 |
| Tree crown segmentation | BCI50ha Tropical | mAP (All): 20.3 | 5 |
| Tree crown segmentation | BAMForest Temperate | mAP (All): 12 | 5 |
| Spam Classification | SMS Spam Dataset (test) | Precision: 100 | 4 |
| Binary Classification | 33 tabular datasets (Binary) | Accuracy: 88.39 | 4 |
Showing 10 of 11 rows
