
SAMa: Material-aware 3D Selection and Segmentation

About

Decomposing 3D assets into material parts is a common task for artists, yet it remains a highly manual process. In this work, we introduce Select Any Material (SAMa), a material selection approach for in-the-wild objects in arbitrary 3D representations. Building on SAM2's video prior, we construct a material-centric video dataset to extend this prior to the material domain. We propose an efficient way to lift the model's 2D predictions to 3D by projecting each view into an intermediary 3D point cloud using depth. Nearest-neighbor lookups between any 3D representation and this similarity point cloud allow us to efficiently reconstruct accurate selection masks over objects' surfaces that can be inspected from any view. Our method is multiview-consistent by design, alleviating the need for costly per-asset optimization, and performs optimization-free selection in seconds. SAMa outperforms several strong baselines in selection accuracy and multiview consistency, and enables various compelling applications, such as replacing the diffuse-textured materials of a text-to-3D output with PBR materials, or selecting and editing materials on NeRF and 3DGS captures.
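The 2D-to-3D lifting step described above can be sketched as follows. This is a minimal illustration, not the paper's implementation: function names (`unproject`, `lift_selections`, `query_surface`) and the simple k-NN score averaging are assumptions; the idea shown is only that per-view 2D selection scores are unprojected with depth into a shared world-space point cloud, and any 3D representation's surface points are then labeled by nearest-neighbor lookup into that cloud.

```python
import numpy as np
from scipy.spatial import cKDTree

def unproject(depth, K, cam_to_world):
    """Lift a depth map (H, W) into world-space 3D points using intrinsics K."""
    H, W = depth.shape
    u, v = np.meshgrid(np.arange(W), np.arange(H))
    pix = np.stack([u, v, np.ones_like(u)], axis=-1).reshape(-1, 3).astype(np.float64)
    rays = pix @ np.linalg.inv(K).T                  # camera-space directions at z = 1
    pts_cam = rays * depth.reshape(-1, 1)            # scale each ray by its depth
    pts_h = np.concatenate([pts_cam, np.ones((len(pts_cam), 1))], axis=1)
    return (pts_h @ cam_to_world.T)[:, :3]           # apply camera-to-world transform

def lift_selections(depths, Ks, poses, scores_2d):
    """Fuse per-view 2D selection scores into one scored world-space point cloud."""
    pts = [unproject(d, K, P) for d, K, P in zip(depths, Ks, poses)]
    scores = [s.reshape(-1) for s in scores_2d]
    return np.concatenate(pts), np.concatenate(scores)

def query_surface(cloud_pts, cloud_scores, surface_pts, k=4):
    """Label arbitrary surface points by averaging the scores of their
    k nearest neighbors in the similarity point cloud."""
    tree = cKDTree(cloud_pts)
    _, idx = tree.query(surface_pts, k=k)
    idx = idx.reshape(len(surface_pts), -1)          # handle the k=1 squeeze
    return cloud_scores[idx].mean(axis=1)
```

Because the lookup is a plain spatial query, the same scored cloud can serve meshes, NeRFs, or 3DGS captures: only the source of `surface_pts` changes, which is consistent with the representation-agnostic selection the abstract describes.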

Michael Fischer, Iliyan Georgiev, Thibault Groueix, Vladimir G. Kim, Tobias Ritschel, Valentin Deschaintre • 2024

Related benchmarks

| Task | Dataset | Metric | Result | Rank |
|---|---|---|---|---|
| Material Selection | NeRF | mIoU | 48 | 4 |
| Material Selection | MIPNeRF-360 | mIoU | 60 | 4 |
| Material Selection | Our Dataset (test) | mIoU | 69 | 4 |
| Multiview Consistency | NeRF unseen views 49 (test) | Hamming Distance | 2.2 | 4 |
| Multiview Consistency | Object-centric (test) | Hamming Distance (x100) | 1.7 | 4 |
| Multiview Consistency | MIPNeRF-360 unseen views 2 (test) | Hamming Distance | 0.014 | 4 |
| Robustness | NeRF unseen views 49 (test) | Hamming Distance (x100) | 1.1 | 4 |
| Robustness | MIPNeRF-360 unseen views 2 (test) | Hamming Distance (x100) | 1.2 | 4 |
| Robustness | Object-centric (test) | Hamming Distance (x100) | 0.3 | 4 |
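The consistency and robustness rows above report a Hamming distance between binary selection masks (some scaled by 100). As an illustration of what such a metric measures, the fraction of disagreeing pixels between two masks can be computed as below; this is a generic sketch of the metric family, not necessarily the benchmark's exact evaluation protocol, and the function name `hamming_x100` is assumed.

```python
import numpy as np

def hamming_x100(mask_a, mask_b):
    """Mean per-pixel disagreement between two binary masks, scaled by 100.
    0 means identical masks; 100 means they disagree everywhere."""
    a = np.asarray(mask_a, dtype=bool)
    b = np.asarray(mask_b, dtype=bool)
    return 100.0 * np.mean(a ^ b)   # XOR marks disagreeing pixels

# Two 4x4 masks disagreeing on 1 of 16 pixels:
a = np.zeros((4, 4), dtype=bool)
b = a.copy()
b[0, 0] = True
print(hamming_x100(a, b))  # 6.25
```

Lower is better: a multiview-consistent method renders nearly identical selection masks from unseen views, so its per-pixel disagreement stays close to zero.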
