
Near-Universal Multiplicative Updates for Nonnegative Einsum Factorization

About

Despite the ubiquity of multiway data across scientific domains, there are few user-friendly tools that fit tailored nonnegative tensor factorizations. Researchers may use gradient-based automatic differentiation (which often struggles in nonnegative settings), choose from a limited set of methods with mature implementations, or implement their own model from scratch. As an alternative, we introduce NNEinFact, an einsum-based multiplicative update algorithm that fits any nonnegative tensor factorization expressible as a tensor contraction by minimizing one of many user-specified loss functions (including the $(\alpha,\beta)$-divergence). To use NNEinFact, the researcher simply specifies their model with a string. NNEinFact converges to a stationary point of the loss, supports missing data, and fits tensors with hundreds of millions of entries in seconds. Empirically, NNEinFact fits custom models that outperform standard ones in heldout prediction tasks on real-world tensor data by over $37\%$, and it attains less than half the test loss of gradient-based methods while converging up to 90 times faster.
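To make the idea concrete, here is a minimal sketch of einsum-based multiplicative updates, not NNEinFact's actual API. It fits a rank-R CP factorization specified by the einsum string `"ir,jr,kr->ijk"` using Lee-Seung-style updates that minimize squared error; all names (`fit_cp`, etc.) are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

def fit_cp(X, R, n_iter=200, eps=1e-12):
    """Fit X ~ einsum("ir,jr,kr->ijk", A, B, C) with nonnegative factors.

    Each factor is updated multiplicatively: the numerator contracts the
    data tensor X against the other factors, the denominator contracts the
    current reconstruction the same way, so factors stay nonnegative.
    (Illustrative sketch only; minimizes squared error, not a general
    (alpha, beta)-divergence.)
    """
    I, J, K = X.shape
    A = rng.random((I, R))
    B = rng.random((J, R))
    C = rng.random((K, R))
    for _ in range(n_iter):
        Xhat = np.einsum("ir,jr,kr->ijk", A, B, C)
        A *= np.einsum("ijk,jr,kr->ir", X, B, C) / (
            np.einsum("ijk,jr,kr->ir", Xhat, B, C) + eps)
        Xhat = np.einsum("ir,jr,kr->ijk", A, B, C)
        B *= np.einsum("ijk,ir,kr->jr", X, A, C) / (
            np.einsum("ijk,ir,kr->jr", Xhat, A, C) + eps)
        Xhat = np.einsum("ir,jr,kr->ijk", A, B, C)
        C *= np.einsum("ijk,ir,jr->kr", X, A, B) / (
            np.einsum("ijk,ir,jr->kr", Xhat, A, B) + eps)
    return A, B, C
```

Because each update is expressed as a pair of einsum contractions, swapping in a different factorization amounts to changing the contraction strings, which is the flexibility the abstract describes.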

John Hood, Aaron Schein • 2026

Related benchmarks

| Task | Dataset | Result | Rank |
| --- | --- | --- | --- |
| Tensor Factorization | Uber (heldout) | Mean heldout (α,β)-divergence: 0.008 | 21 |
| Tensor Factorization | WITS (heldout) | Mean heldout (α,β)-divergence: 0.0134 | 21 |
| Tensor Factorization | ICEWS (heldout) | Mean heldout (α,β)-divergence: 0.0203 | 21 |
