
NeMF: Neural Motion Fields for Kinematic Animation

About

We present an implicit neural representation to learn the spatio-temporal space of kinematic motions. Unlike previous work that represents motion as discrete sequential samples, we propose to express the vast motion space as a continuous function over time, hence the name Neural Motion Fields (NeMF). Specifically, we use a neural network to learn this function for miscellaneous sets of motions, designed as a generative model conditioned on a temporal coordinate $t$ and a random vector $z$ that controls the style. The model is then trained as a Variational Autoencoder (VAE) with motion encoders to sample the latent space. We train our model on a diverse human motion dataset and a quadruped dataset to demonstrate its versatility, and finally deploy it as a generic motion prior to solve task-agnostic problems, showing its superiority in various motion generation and editing applications such as motion interpolation, in-betweening, and re-navigating. More details can be found on our project page: https://cs.yale.edu/homes/che/projects/nemf/.
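To make the core idea concrete, here is a minimal sketch of a "motion field": a small MLP that maps a temporal coordinate $t$ and a style latent $z$ to a pose vector, so poses can be queried at arbitrary continuous timestamps. The layer sizes, pose dimension, and function names are illustrative assumptions, not the architecture or training setup from the paper (which additionally trains this network as a VAE with motion encoders).

```python
import numpy as np

rng = np.random.default_rng(0)

LATENT_DIM = 8    # dimension of the style vector z (assumed for illustration)
POSE_DIM = 24     # dimension of the output pose vector (assumed)
HIDDEN = 64       # hidden width of the toy MLP (assumed)

# Randomly initialized weights stand in for a trained network.
W1 = rng.standard_normal((1 + LATENT_DIM, HIDDEN)) * 0.1
b1 = np.zeros(HIDDEN)
W2 = rng.standard_normal((HIDDEN, POSE_DIM)) * 0.1
b2 = np.zeros(POSE_DIM)

def motion_field(t, z):
    """Evaluate the continuous motion function at time t with style code z."""
    x = np.concatenate(([t], z))   # condition the network on (t, z)
    h = np.tanh(x @ W1 + b1)       # hidden nonlinearity
    return h @ W2 + b2             # predicted pose at time t

# Sample one style code, then query the same motion at arbitrary
# (non-integer) timestamps -- the representation is continuous in t.
z = rng.standard_normal(LATENT_DIM)
poses = np.stack([motion_field(t, z) for t in np.linspace(0.0, 1.0, 5)])
print(poses.shape)  # (5, 24)
```

Because the sequence is a function of $t$ rather than a fixed-rate array of frames, resampling, interpolation, and in-betweening reduce to evaluating the field at the desired timestamps.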

Chengan He, Jun Saito, James Zachary, Holly Rushmeier, Yi Zhou • 2022

Related benchmarks

| Task | Dataset | Metric | Result | Rank |
|---|---|---|---|---|
| Motion In-betweening | LaFAN1 (test) | L2Q | 0.18 | 77 |
| Motion clips in-betweening | Motion in-betweening dataset (test) | FID | 0.024 | 15 |
| Sparse keyframe in-betweening | AIST++ | FID | 0.085 | 12 |
| Motion Reconstruction | AMASS (test) | MRE | 5.988 | 3 |
| Motion Synthesis | AMASS (test) | FID | 6.508 | 3 |

Other info

Code
