
Improving Zero-Shot Models with Label Distribution Priors

About

Labeling large image datasets with attributes such as facial age or object type is tedious and sometimes infeasible. Supervised machine learning methods provide a highly accurate solution, but require manual labels which are often unavailable. Zero-shot models (e.g., CLIP) do not require manual labels but are not as accurate as supervised ones, particularly when the attribute is numeric. We propose a new approach, CLIPPR (CLIP with Priors), which adapts zero-shot models for regression and classification on unlabelled datasets. Our method does not use any annotated images. Instead, we assume a prior over the label distribution in the dataset. We then train an adapter network on top of CLIP under two competing objectives: (i) minimal change of predictions from the original CLIP model, and (ii) minimal distance between the predicted and prior distributions of labels. Additionally, we present a novel approach for selecting prompts for Vision & Language models using a distributional prior. Our method is effective and presents a significant improvement over the original model. We demonstrate an improvement of 28% in mean absolute error on the UTK age regression task. We also present promising results for classification benchmarks, improving the classification accuracy on the ImageNet dataset by 2.83%, without using any labels.
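The two competing objectives above can be sketched as a single training loss: a fidelity term that keeps the adapter's per-image predictions close to the frozen CLIP model, plus a term that pulls the batch-level average prediction toward the assumed prior over labels. The sketch below is illustrative, not the paper's exact formulation: it uses KL divergence as a stand-in for both distance terms, and the function names and the `lam` weight are hypothetical.

```python
import numpy as np

def softmax(z, axis=-1):
    # Numerically stable softmax over class logits.
    z = z - z.max(axis=axis, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=axis, keepdims=True)

def kl(p, q, eps=1e-12):
    # KL(p || q) over the last axis; eps guards against log(0).
    return np.sum(p * np.log((p + eps) / (q + eps)), axis=-1)

def clippr_style_loss(adapter_logits, clip_probs, prior, lam=1.0):
    """Hedged sketch of a two-objective adapter loss.

    adapter_logits : (batch, classes) adapter outputs
    clip_probs     : (batch, classes) frozen CLIP zero-shot predictions
    prior          : (classes,) assumed label-distribution prior
    lam            : illustrative weight balancing the two objectives
    """
    pred = softmax(adapter_logits)
    # (i) stay close to the original CLIP predictions, per image
    fidelity = kl(clip_probs, pred).mean()
    # (ii) match the batch-average prediction to the prior distribution
    batch_dist = pred.mean(axis=0)
    prior_term = kl(prior, batch_dist)
    return fidelity + lam * prior_term
```

When the adapter reproduces CLIP's predictions exactly and their batch average already equals the prior, both terms vanish; any deviation in either direction increases the loss, which is the tension the method trades off.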

Jonathan Kahana, Niv Cohen, Yedid Hoshen• 2022

Related benchmarks

| Task                 | Dataset                  | Result (Top-1 Acc., %) | Rank |
|----------------------|--------------------------|------------------------|------|
| Image Classification | CIFAR-100                | --                     | 691  |
| Image Classification | EuroSAT                  | --                     | 569  |
| Image Classification | DTD                      | --                     | 542  |
| Image Classification | UCF101                   | 57.9                   | 455  |
| Image Classification | SUN397                   | --                     | 425  |
| Image Classification | Flowers-102              | 57.7                   | 198  |
| Image Classification | ImageNet-A (test)        | 11.6                   | 175  |
| Image Classification | Caltech-101              | 84.8                   | 152  |
| Image Classification | ImageNet-R (test)        | --                     | 118  |
| Image Classification | ImageNet original (val)  | 60.4                   | 65   |

Showing 10 of 12 rows.
