
Tensor Methods: A Unified and Interpretable Approach for Material Design

About

When designing new materials, it is often necessary to tailor the design parameters so that the material has desired properties (e.g., Young's modulus). As the set of design parameters grows, the search space grows exponentially, making the synthesis and evaluation of all material combinations virtually impossible. Even traditional computational methods such as Finite Element Analysis become too computationally heavy to search the design space. Recent methods use machine learning (ML) surrogate models to determine optimal material designs more efficiently; unfortunately, these methods often (i) are notoriously difficult to interpret and (ii) underperform when the training data comes from a non-uniform sampling of the design space. We suggest the use of tensor completion methods as an all-in-one approach for interpretability and prediction. We observe that classical tensor methods are able to compete with traditional ML in predictive accuracy, with the added benefit of interpretable tensor factors (which come for free as a by-product of prediction). In our experiments, we are able to rediscover physical phenomena via the tensor factors, indicating that our predictions are aligned with the true underlying physics of the problem. Since we are able to rediscover existing patterns, these tensor factors could also be used by experimentalists to identify potentially novel ones. We also study the behavior of both types of surrogate models when the training data comes from a non-uniform sampling of the design space, and observe that more specialized tensor methods can give better generalization in these non-uniform sampling scenarios. We find the best generalization comes from a tensor model, which improves upon the baseline ML methods by up to 5% on aggregate $R^2$ and halves the error in some out-of-distribution regions.

Shaan Pakala, Aldair E. Gongora, Brian Giera, Evangelos E. Papalexakis • 2026

Related benchmarks

| Task | Dataset | Metric | Result | Rank |
| --- | --- | --- | --- | --- |
| Surrogate Modeling | Lattice Dataset (biased sampling) | R^2 | 0.84 | 10 |
| Surrogate Modeling | Cogni-e-Spin Dataset (biased sampling) | R^2 | 0.43 | 10 |
| Regression | Lattice Dataset (80% uniform sampling) | R^2 | 0.99 | 10 |
| Surrogate Modeling | Crossed Barrel Dataset (biased sampling) | R^2 | 0.56 | 10 |
| Regression | Crossed Barrel Dataset (80% uniform sampling) | R^2 | 0.72 | 10 |
| Regression | Cogni-e-Spin Dataset (80% uniform sampling) | R^2 | 0.45 | 10 |
