
Data curation via joint example selection further accelerates multimodal learning

About

Data curation is an essential component of large-scale pretraining. In this work, we demonstrate that jointly selecting batches of data is more effective for learning than selecting examples independently. Multimodal contrastive objectives expose the dependencies between data and thus naturally yield criteria for measuring the joint learnability of a batch. We derive a simple and tractable algorithm for selecting such batches, which significantly accelerates training beyond individually prioritized data points. As performance improves by selecting from larger super-batches, we also leverage recent advances in model approximation to reduce the associated computational overhead. As a result, our approach, multimodal contrastive learning with joint example selection (JEST), surpasses state-of-the-art models with up to 13× fewer iterations and 10× less computation. Essential to the performance of JEST is the ability to steer the data selection process towards the distribution of smaller, well-curated datasets via pretrained reference models, exposing the level of data curation as a new dimension for neural scaling laws.
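To make the selection criterion concrete, here is a minimal NumPy sketch of one way joint example selection could work. It assumes the general setup described above: a learner and a pretrained reference model each embed the images and texts of a super-batch, and a candidate sub-batch B is scored by its joint learnability, i.e. the learner's contrastive loss on B minus the reference model's loss on B. The greedy chunk-wise sampler and all names (`contrastive_loss`, `joint_learnability`, `select_batch`) are illustrative stand-ins, not the paper's exact procedure.

```python
import numpy as np

def contrastive_loss(img_emb, txt_emb, temperature=0.07):
    """CLIP-style contrastive loss over a (sub-)batch.

    img_emb, txt_emb: L2-normalised embeddings of shape (n, d).
    Returns the image-to-text and text-to-image losses averaged per
    direction and summed over the batch, so that the in-batch
    negatives interact jointly.
    """
    logits = img_emb @ txt_emb.T / temperature
    logits -= logits.max()  # numerical stability before exponentiation
    diag = np.arange(len(logits))
    log_p_i2t = logits - np.log(np.exp(logits).sum(axis=1, keepdims=True))
    log_p_t2i = logits - np.log(np.exp(logits).sum(axis=0, keepdims=True))
    return -(log_p_i2t[diag, diag] + log_p_t2i[diag, diag]).sum() / 2

def joint_learnability(idx, learner, reference):
    """Score a candidate sub-batch: learner loss minus reference loss.

    learner / reference are (img_emb, txt_emb) tuples over the full
    super-batch; the loss is recomputed on the sub-batch, so the score
    depends on the batch's joint composition, not on independent
    per-example scores.
    """
    li, lt = learner
    ri, rt = reference
    return contrastive_loss(li[idx], lt[idx]) - contrastive_loss(ri[idx], rt[idx])

def select_batch(learner, reference, batch_size, n_chunks=4, seed=0):
    """Grow a jointly learnable sub-batch chunk by chunk (greedy sketch)."""
    rng = np.random.default_rng(seed)
    n_super = len(learner[0])
    chunk = batch_size // n_chunks
    # Seed with one random example: a singleton sub-batch has no
    # in-batch negatives, so its joint score alone is degenerate.
    selected = [int(rng.integers(n_super))]
    remaining = [j for j in range(n_super) if j != selected[0]]
    for _ in range(n_chunks):
        # Score each remaining candidate jointly with what is already selected.
        scores = [joint_learnability(np.array(selected + [j]), learner, reference)
                  for j in remaining]
        best = np.argsort(scores)[::-1][:chunk]
        picked = [remaining[k] for k in best]
        selected.extend(picked)
        remaining = [j for j in remaining if j not in picked]
    return np.array(selected[:batch_size])
```

Note that scoring every remaining candidate against the running selection costs on the order of (super-batch size × batch size) loss evaluations, which is why the model-approximation techniques mentioned in the abstract matter: computing these scores with cheaper approximate models keeps the selection overhead small relative to training.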

Talfan Evans, Nikhil Parthasarathy, Hamza Merzic, Olivier J. Hénaff • 2024

Related benchmarks

| Task | Dataset | Result | Rank |
| --- | --- | --- | --- |
| Image-Text Retrieval | COCO (test) | - | 37 |
| Image-to-Text Retrieval | COCO 2017 (test) | I2T Score: 71.1 | 13 |
| Text-to-Image Retrieval | COCO 2017 (test) | T2I Score: 54.8 | 13 |
| Image Classification | ImageNet-1K 1.0 (val) | Zero-Shot Accuracy: 80.5 | 13 |
| Generalist Multi-task Evaluation | Multiple (ImageNet-1K, COCO) | Mean Delta: 0.9 | 13 |