
Clipped Hyperbolic Classifiers Are Super-Hyperbolic Classifiers

About

Hyperbolic space can naturally embed hierarchies, unlike Euclidean space. Hyperbolic Neural Networks (HNNs) exploit this representational power by lifting Euclidean features into hyperbolic space for classification, outperforming Euclidean neural networks (ENNs) on datasets with known semantic hierarchies. However, HNNs underperform ENNs on standard benchmarks without clear hierarchies, greatly restricting their applicability in practice. Our key insight is that HNNs' poorer general classification performance results from vanishing gradients during backpropagation, caused by their hybrid architecture connecting Euclidean features to a hyperbolic classifier. We propose an effective solution: simply clip the Euclidean feature magnitude while training HNNs. Our experiments demonstrate that clipped HNNs become super-hyperbolic classifiers: they are not only consistently better than standard HNNs, which already outperform ENNs on hierarchical data, but also on par with ENNs on the MNIST, CIFAR10, CIFAR100, and ImageNet benchmarks, with better adversarial robustness and out-of-distribution detection.
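The fix described above can be sketched in a few lines: cap the norm of each Euclidean feature vector at a threshold before lifting it into the Poincaré ball via the exponential map at the origin. The threshold `r`, the helper names, and the NumPy formulation below are illustrative assumptions, not the authors' code:

```python
import numpy as np

def clip_features(x, r=1.0):
    """Rescale each row of x so its Euclidean norm is at most r.

    x: array of shape (batch, dim). The clipping radius r is a
    hyperparameter; r=1.0 here is an illustrative choice.
    """
    norm = np.linalg.norm(x, axis=-1, keepdims=True)
    scale = np.minimum(1.0, r / np.maximum(norm, 1e-12))
    return x * scale

def expmap0(x, c=1.0):
    """Exponential map at the origin of the Poincare ball (curvature -c),
    lifting Euclidean features into hyperbolic space."""
    sqrt_c = np.sqrt(c)
    norm = np.maximum(np.linalg.norm(x, axis=-1, keepdims=True), 1e-12)
    return np.tanh(sqrt_c * norm) * x / (sqrt_c * norm)

feats = np.array([[3.0, 4.0], [0.1, 0.2]])   # norms 5.0 and ~0.22
clipped = clip_features(feats, r=1.0)        # large norm capped at 1.0
hyp = expmap0(clipped)                       # lift into the Poincare ball
```

Because `tanh` saturates for large inputs, unclipped high-magnitude features land near the ball's boundary, where gradients through the hyperbolic classifier vanish; bounding the norm before the map keeps embeddings away from the boundary during training.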

Yunhui Guo, Xudong Wang, Yubei Chen, Stella X. Yu • 2021

Related benchmarks

| Task | Dataset | Metric | Result | Rank |
| --- | --- | --- | --- | --- |
| Image Classification | MNIST (test) | Accuracy | 93.34 | 882 |
| Image Classification | MNIST | Accuracy | 99.08 | 395 |
| Image Classification | ImageNet | Top-1 Accuracy | 68.45 | 324 |
| Image Classification | CIFAR10 | Accuracy | 94.76 | 240 |
| Few-shot classification | Mini-ImageNet | -- | -- | 175 |
| Out-of-Distribution Detection | Texture | AUROC | 89.91 | 109 |
| Out-of-Distribution Detection | CIFAR-10 vs SVHN (test) | AUROC | 0.9157 | 101 |
| Few-shot classification | CUB | Accuracy | 81.76 | 96 |
| Out-of-Distribution Detection | CIFAR-100 (in-distribution) vs SVHN (out-of-distribution, test) | AUROC | 89.53 | 90 |
| Out-of-Distribution Detection | CIFAR-10 (in-distribution) vs LSUN (out-of-distribution, test) | AUROC | 92.97 | 73 |

Showing 10 of 31 rows.
