
Zero-Shot Learning via Semantic Similarity Embedding

About

In this paper we consider a version of the zero-shot learning problem where seen-class source- and target-domain data are provided. The goal at test time is to accurately predict the class label of an unseen target-domain instance based on revealed source-domain side information (e.g., attributes) for unseen classes. Our method is based on viewing each source or target datum as a mixture of seen-class proportions, and we postulate that the mixture patterns must be similar if two instances belong to the same unseen class. This perspective leads us to learn source/target embedding functions that map arbitrary source/target-domain data into the same semantic space, where similarity can be readily measured. We develop a max-margin framework to learn these similarity functions and jointly optimize parameters by means of cross-validation. Our test results are compelling, showing significant accuracy improvements on most benchmark datasets for zero-shot recognition.
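The core idea in the abstract can be illustrated with a minimal sketch. This is not the authors' max-margin method; it is a hedged toy in NumPy under assumed data shapes: each source-domain attribute vector and each target-domain feature is turned into a "mixture of seen-class proportions" (here, a softmax over distances to seen-class references), and a test instance is assigned to the unseen class whose mixture pattern is most similar under cosine similarity.

```python
# Toy sketch (NOT the paper's implementation) of zero-shot prediction by
# matching seen-class mixture patterns across source and target domains.
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical toy data: 5 seen classes, attribute dim 10, feature dim 20.
seen_attrs = rng.random((5, 10))    # seen-class attribute vectors (source)
seen_protos = rng.random((5, 20))   # seen-class feature prototypes (target)

def mixture(x, refs):
    """Softmax of negative distances to the seen-class references:
    a simple stand-in for the 'mixture of seen class proportions'."""
    d = np.linalg.norm(refs - x, axis=1)
    w = np.exp(-d)
    return w / w.sum()

def predict(feat, unseen_attrs):
    """Assign feat to the unseen class whose seen-class mixture pattern
    (computed from its attributes) best matches, by cosine similarity,
    the mixture pattern of the test feature."""
    p = mixture(feat, seen_protos)
    scores = []
    for a in unseen_attrs:
        q = mixture(a, seen_attrs)
        scores.append(p @ q / (np.linalg.norm(p) * np.linalg.norm(q)))
    return int(np.argmax(scores))

# Two hypothetical unseen classes; classify one random test feature.
unseen_attrs = rng.random((2, 10))
label = predict(rng.random(20), unseen_attrs)
```

The paper instead learns the source/target embedding functions jointly in a max-margin framework; the softmax-over-distances mixture above is only a placeholder for those learned embeddings.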

Ziming Zhang, Venkatesh Saligrama · 2015

Related benchmarks

| Task | Dataset | Metric | Result | Rank |
| --- | --- | --- | --- | --- |
| Zero-shot Learning | CUB | Top-1 Accuracy | 43.9 | 144 |
| Image Classification | CUB | Unseen Top-1 Acc | 8.5 | 89 |
| Image Classification | SUN | Harmonic Mean Top-1 Accuracy | 4 | 86 |
| Classification | CUB | -- | -- | 85 |
| Zero-shot Learning | SUN (unseen) | Top-1 Accuracy (%) | 51.5 | 50 |
| Zero-shot Image Classification | AWA2 (test) | Metric U | 8.1 | 46 |
| Zero-shot recognition | AWA (test) | Avg Top-1 Acc | 45.6 | 34 |
| Image Classification | AWA1 | Test Set Score (ts) | 7 | 30 |
| Zero-shot recognition | AwA1 (test) | Top-1 Accuracy | 60.1 | 25 |
| Zero-shot recognition | Animals with Attributes (AwA) | Accuracy | 76.33 | 24 |

Showing 10 of 36 rows.
