
Generalized Zero-Shot Learning Via Over-Complete Distribution

About

A well-trained, generalizable deep neural network (DNN) should be robust to both seen and unseen classes. However, the performance of most existing supervised DNN algorithms degrades on classes unseen during training. To learn a discriminative classifier that performs well in Zero-Shot Learning (ZSL) settings, we propose generating an Over-Complete Distribution (OCD) for both seen and unseen classes using a Conditional Variational Autoencoder (CVAE). To enforce separability between classes and reduce within-class scatter, we apply an Online Batch Triplet Loss (OBTL) and a Center Loss (CL) to the generated OCD. The effectiveness of the framework is evaluated under both Zero-Shot Learning and Generalized Zero-Shot Learning protocols on three publicly available benchmark databases: SUN, CUB, and AWA2. The results show that generating over-complete distributions and forcing the classifier to learn a transformation from overlapping to non-overlapping distributions improves performance on both seen and unseen classes.
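The core idea above can be sketched in a few lines: draw "over-complete" samples around each class prototype with inflated variance so that neighbouring class distributions overlap (producing hard examples near decision boundaries), then apply a batch-hard triplet loss to push the classes apart. This is a minimal, illustrative sketch, not the paper's implementation: the Gaussian sampling stands in for the CVAE generator, and the function names (`sample_ocd`, `batch_triplet_loss`, the `scale` parameter) are our own assumptions.

```python
import numpy as np

def sample_ocd(class_means, class_stds, n_per_class, scale=2.0, rng=None):
    # Over-complete sampling (illustrative stand-in for CVAE generation):
    # inflating the per-class standard deviation by `scale` > 1 makes the
    # class-conditional distributions overlap, yielding hard samples.
    rng = rng or np.random.default_rng(0)
    feats, labels = [], []
    for c, (mu, sd) in enumerate(zip(class_means, class_stds)):
        feats.append(rng.normal(mu, scale * sd, size=(n_per_class, len(mu))))
        labels.append(np.full(n_per_class, c))
    return np.vstack(feats), np.concatenate(labels)

def batch_triplet_loss(feats, labels, margin=1.0):
    # Batch-hard triplet loss: for each anchor, take the farthest positive
    # (same class) and the closest negative (different class) in the batch.
    d = np.linalg.norm(feats[:, None] - feats[None, :], axis=-1)
    same = labels[:, None] == labels[None, :]
    n = len(feats)
    losses = []
    for i in range(n):
        pos = d[i][same[i] & (np.arange(n) != i)]
        neg = d[i][~same[i]]
        if len(pos) and len(neg):
            losses.append(max(0.0, pos.max() - neg.min() + margin))
    return float(np.mean(losses))

# Toy usage: two synthetic classes in a 4-D feature space.
means = [np.zeros(4), np.full(4, 5.0)]
stds = [np.ones(4), np.ones(4)]
feats, labels = sample_ocd(means, stds, n_per_class=10)
loss = batch_triplet_loss(feats, labels)
```

In the paper's framework the sampled points would come from the CVAE decoder conditioned on class attributes, and the triplet loss (together with the center loss) would be backpropagated through the classifier; here the sampling and loss are shown in isolation.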

Rohit Keshari, Richa Singh, Mayank Vatsa · 2020

Related benchmarks

Task                Dataset   Result                  Rank
Zero-shot Learning  AWA2      Top-1 Accuracy: 0.713   95
