
Large-Margin Softmax Loss for Convolutional Neural Networks

About

Cross-entropy loss together with softmax is arguably one of the most commonly used supervision components in convolutional neural networks (CNNs). Despite its simplicity, popularity and excellent performance, the component does not explicitly encourage discriminative learning of features. In this paper, we propose a generalized large-margin softmax (L-Softmax) loss which explicitly encourages intra-class compactness and inter-class separability between learned features. Moreover, L-Softmax not only can adjust the desired margin but also can avoid overfitting. We also show that the L-Softmax loss can be optimized by typical stochastic gradient descent. Extensive experiments on four benchmark datasets demonstrate that the deeply learned features with the L-Softmax loss become more discriminative, hence significantly boosting the performance on a variety of visual classification and verification tasks.
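As a rough illustration of the idea, the loss replaces the target-class logit ‖W_y‖‖x‖cos(θ_y) with ‖W_y‖‖x‖ψ(θ_y), where ψ(θ) = (−1)^k cos(mθ) − 2k for θ ∈ [kπ/m, (k+1)π/m], so the margin parameter m makes the correct class harder to satisfy. The following is a minimal per-sample NumPy sketch, not the paper's implementation; the function names and the bias-free linear layer are assumptions for illustration.

```python
import numpy as np

def l_softmax_logits(x, W, target, m=2):
    """Logits for one sample x with weight matrix W (features x classes).

    Non-target classes keep the plain inner product W_j^T x; the target
    class logit uses psi(theta) in place of cos(theta).
    """
    logits = x @ W  # shape: (num_classes,)
    w_t = W[:, target]
    norm_prod = np.linalg.norm(x) * np.linalg.norm(w_t)
    cos_t = logits[target] / norm_prod
    theta = np.arccos(np.clip(cos_t, -1.0, 1.0))
    # k selects the branch of psi so it stays monotonically decreasing on [0, pi]
    k = int(np.floor(theta * m / np.pi))
    psi = ((-1) ** k) * np.cos(m * theta) - 2 * k
    out = logits.copy()
    out[target] = norm_prod * psi
    return out

def l_softmax_loss(x, W, target, m=2):
    """Cross-entropy over the margin-adjusted logits (numerically stable)."""
    logits = l_softmax_logits(x, W, target, m)
    z = logits - logits.max()
    return -z[target] + np.log(np.sum(np.exp(z)))
```

With m = 1, ψ(θ) = cos(θ) and the loss reduces to ordinary softmax cross-entropy; larger m shrinks the target logit (ψ(θ) ≤ cos(θ)), which is what enforces the angular margin during training.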

Weiyang Liu, Yandong Wen, Zhiding Yu, Meng Yang • 2016

Related benchmarks

| Task | Dataset | Result | Rank |
|------|---------|--------|------|
| Image Classification | CIFAR-10 (test) | – | 906 |
| Image Classification | CIFAR-100 | – | 622 |
| Image Classification | CIFAR-10 | Accuracy: 96.3 | 507 |
| Binary Classification | CIFAR10 Binary imb. 200 (test) | FPR @ 98% TPR: 0.75 | 41 |
| Binary Classification | Binary CIFAR100 (imbalance ratio 1:100) (test) | FPR @ 98% TPR: 89 | 41 |
| Binary Classification | Binary CIFAR100 imbalance ratio 1:200 (test) | FPR @ 98% TPR: 95 | 41 |
| Face Identification | MegaFace 1M distractors 1.0 (test) | Rank-1 Accuracy: 67.128 | 40 |
| Face Verification | MegaFace 1.0 (test) | Verification Rate: 80.423 | 19 |
| Binary Classification | Binary CIFAR10 imb. 100 (test) | FPR @ 98% TPR: 59 | 17 |
| Binary Classification | in-house MRI dataset (test) | FPR @ 0 FN: 81 | 17 |

Showing 10 of 12 rows.
