
Differentiable Quality Diversity

About

Quality diversity (QD) is a growing branch of stochastic optimization research that studies the problem of generating an archive of solutions that maximize a given objective function but are also diverse with respect to a set of specified measure functions. However, even when these functions are differentiable, QD algorithms treat them as "black boxes", ignoring gradient information. We present the differentiable quality diversity (DQD) problem, a special case of QD, where both the objective and measure functions are first order differentiable. We then present MAP-Elites via a Gradient Arborescence (MEGA), a DQD algorithm that leverages gradient information to efficiently explore the joint range of the objective and measure functions. Results in two QD benchmark domains and in searching the latent space of a StyleGAN show that MEGA significantly outperforms state-of-the-art QD algorithms, highlighting DQD's promise for efficient quality diversity optimization when gradient information is available. Source code is available at https://github.com/icaros-usc/dqd.
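The core move in MEGA can be illustrated with a toy example: step in parameter space along a random linear combination of the objective gradient and the measure gradients, then insert the result into a MAP-Elites grid archive. The sketch below is my simplified reading of that idea, not the authors' implementation (see the linked repository for that); the toy objective, identity-like measures, archive resolution, and step sizes are all assumptions for illustration.

```python
# Minimal sketch of a MEGA-style "gradient arborescence" step on a toy DQD
# problem: maximize f(x) = -|x|^2 while diversifying over m(x) = (x[0], x[1]).
# All constants and functions here are illustrative assumptions.
import random

GRID = 10          # assumed archive resolution per measure dimension
BOUND = 2.0        # assumed measure range [-BOUND, BOUND]

def objective(x):                     # differentiable toy objective
    return -sum(v * v for v in x)

def obj_grad(x):                      # gradient of the objective
    return [-2.0 * v for v in x]

def measures(x):                      # two differentiable measure functions
    return (x[0], x[1])

def measure_grads(x):                 # gradients of each measure w.r.t. x
    g0 = [0.0] * len(x); g0[0] = 1.0
    g1 = [0.0] * len(x); g1[1] = 1.0
    return [g0, g1]

def cell(m):                          # discretize measures into a grid cell
    def idx(v):
        t = (min(max(v, -BOUND), BOUND) + BOUND) / (2 * BOUND)
        return min(int(t * GRID), GRID - 1)
    return (idx(m[0]), idx(m[1]))

def mega_step(archive, x, sigma=0.5, step=0.1):
    """Branch in objective-measure gradient space with random coefficients,
    then attempt a MAP-Elites style insertion of the offspring."""
    grads = [obj_grad(x)] + measure_grads(x)
    # A non-negative objective coefficient keeps the branch ascending in f.
    coeffs = [abs(random.gauss(0, sigma))] + [
        random.gauss(0, sigma) for _ in grads[1:]]
    child = [xi + step * sum(c * g[i] for c, g in zip(coeffs, grads))
             for i, xi in enumerate(x)]
    key = cell(measures(child))
    if key not in archive or objective(child) > objective(archive[key]):
        archive[key] = child          # keep the better elite per cell
    return child

random.seed(0)
archive = {}
x = [0.0] * 4
for _ in range(2000):
    x = mega_step(archive, x)
    if random.random() < 0.1:         # occasionally restart from an elite
        x = random.choice(list(archive.values()))
print(len(archive), "archive cells filled")
```

The key difference from standard MAP-Elites is that offspring are generated by gradient steps in the joint objective-measure space rather than by black-box mutation, which is what lets DQD methods exploit first-order information.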

Matthew C. Fontaine, Stefanos Nikolaidis • 2021

Related benchmarks

| Task | Dataset | Result | Rank |
|---|---|---|---|
| Quality-Diversity Optimization | LSI | QD-score 21.82 | 12 |
| Quality-Diversity Optimization | LP sphere | QD-score 75.3 | 11 |
| Quality-Diversity Optimization | LP Rastrigin | QD-score 62.58 | 11 |
| Quality-Diversity Optimization | Arm Repertoire | QD-score 74.18 | 11 |
| Quality Diversity | Linear Projection sphere n=1000 | QD-score 75.3 | 8 |
| Quality Diversity | Linear Projection (Rastrigin) n=1000 | QD-score 62.58 | 8 |
| Quality Diversity | Arm Repertoire 1000-DOF | QD-score 74.18 | 8 |
| Quality Diversity | LSI StyleGAN+CLIP | QD-score 21.82 | 5 |
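The QD-score metric reported in these benchmarks is conventionally computed as the sum of the objective values of all elites occupying the archive, so it rewards both quality and coverage at once. A minimal sketch, assuming an archive stored as a dict from grid cell to elite solution (the archive contents and objective below are toy data for illustration):

```python
# QD-score as conventionally defined: sum of objective values over all
# occupied archive cells. Archive layout and objective are assumptions.
def qd_score(archive, objective):
    """Sum the objective value of every elite in the archive."""
    return sum(objective(sol) for sol in archive.values())

toy_archive = {(0, 0): [0.1], (1, 2): [0.5]}   # cell -> elite solution
f = lambda x: 1.0 - x[0] ** 2                  # toy objective in [0, 1]
print(qd_score(toy_archive, f))                # ≈ 0.99 + 0.75 = 1.74
```

Because every occupied cell contributes, an algorithm that fills more cells can score higher even if its best single solution is no better, which is why QD-score is the standard summary statistic for quality diversity results.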
