Improving Deep Regression with Ordinal Entropy

About

In computer vision, it is often observed that formulating a regression problem as a classification task yields better performance. We investigate this curious phenomenon and provide a derivation showing that classification, with the cross-entropy loss, outperforms regression with a mean squared error loss in its ability to learn high-entropy feature representations. Based on this analysis, we propose an ordinal entropy loss that encourages higher-entropy feature spaces while maintaining ordinal relationships, thereby improving the performance of regression tasks. Experiments on synthetic and real-world regression tasks demonstrate the importance and benefits of increasing entropy for regression.
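For concreteness, the sketch below shows one way such an entropy-style regulariser could be implemented in PyTorch: a diversity term pushes apart features of samples with different targets, weighted by how far apart the targets are (so ordinal structure is preserved), and a tightness term pulls together features that share the same target. The function name `ordinal_entropy_loss`, the pairwise-distance formulation, and the `tightness_weight` parameter are illustrative assumptions, not the authors' released implementation (see the Code link below for that).

```python
import torch
import torch.nn.functional as F


def ordinal_entropy_loss(features, targets, tightness_weight=0.1):
    """Illustrative entropy-style regulariser for deep regression.

    features: (N, D) feature vectors from the regression backbone.
    targets:  (N,) continuous regression targets.
    """
    features = F.normalize(features, dim=1)
    targets = targets.view(-1, 1).float()

    # Pairwise distances between features and between regression targets.
    feat_dist = torch.cdist(features, features, p=2)    # (N, N)
    label_dist = torch.cdist(targets, targets, p=2)     # (N, N)

    off_diag = ~torch.eye(len(targets), dtype=torch.bool, device=features.device)

    # Diversity term: reward large feature distances, scaled by label distance
    # so that samples with distant targets are pushed further apart.
    weights = label_dist / (label_dist.max() + 1e-8)
    diversity = -(weights[off_diag] * feat_dist[off_diag]).mean()

    # Tightness term: pull together features whose targets are (nearly) identical.
    same = (label_dist < 1e-6) & off_diag
    tightness = feat_dist[same].mean() if same.any() else features.new_zeros(())

    return diversity + tightness_weight * tightness


# Example usage inside a training step (illustrative):
# feats = backbone(images)            # (batch, D) penultimate features
# preds = head(feats).squeeze(-1)     # (batch,) regression predictions
# loss = F.mse_loss(preds, targets) + 1e-3 * ordinal_entropy_loss(feats, targets)
```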

Shihao Zhang, Linlin Yang, Michael Bi Mi, Xiaoxu Zheng, Angela Yao • 2023

Related benchmarks

Task              Dataset                      Metric                             Result   Rank
Depth Estimation  NYU v2 (test)                Threshold Accuracy (delta < 1.25)  93.2     423
Crowd Counting    ShanghaiTech Part A (test)   MAE                                65.6     227
Crowd Counting    ShanghaiTech Part B (test)   MAE                                9.1      191
Cell Counting     VGG (test)                   MAE                                5.7      14
Cell Counting     MBM (test)                   MAE                                2.9      14
Age Estimation    AgeDB (val)                  Age MAE                            6.47     13
Regression        UCI-DIR Real Estate (test)   MAE (All)                          0.339    12
Regression        UCI-DIR Abalone (test)       MAE (All)                          6.77     12
Regression        SkyFinder                    MAE                                2.94     11
Regression        TUAB                         MAE                                7.28     10

Showing 10 of 21 rows.

Other info

Code
