MME: Mixture of Mesh Experts with Random Walk Transformer Gating

About

In recent years, various methods have been proposed for mesh analysis, each offering distinct advantages and often excelling on different object classes. We present a novel Mixture of Experts (MoE) framework designed to harness the complementary strengths of these diverse approaches. We propose a new gate architecture that encourages each expert to specialize in the classes where it excels. Our design is guided by two key ideas: (1) random walks over the mesh surface effectively capture the regions that individual experts attend to, and (2) an attention mechanism enables the gate to focus on the areas most informative for each expert's decision-making. To further enhance performance, we introduce a dynamic loss balancing scheme that adjusts the trade-off between diversity and similarity losses throughout training, where diversity promotes expert specialization and similarity enables knowledge sharing among the experts. Our framework achieves state-of-the-art results in mesh classification, retrieval, and semantic segmentation tasks. Our code is available at: https://github.com/amirbelder/MME-Mixture-of-Mesh-Experts.
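The following is a minimal, hypothetical PyTorch sketch of the two ideas the abstract describes: a transformer gate that attends over random-walk tokens to produce per-expert weights, and a scheduled trade-off between diversity and similarity auxiliary losses. All module sizes, loss definitions, and the linear schedule are illustrative assumptions, not the authors' implementation; see the repository above for the actual code.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class RandomWalkGate(nn.Module):
    """Gate sketch: embed random-walk steps, encode them with a small
    transformer, pool the walk tokens with learned-query attention, and
    emit a softmax weight per expert. All sizes are illustrative guesses."""
    def __init__(self, step_dim=3, d_model=128, num_heads=4,
                 num_layers=2, num_experts=4):
        super().__init__()
        self.embed = nn.Linear(step_dim, d_model)
        layer = nn.TransformerEncoderLayer(d_model=d_model, nhead=num_heads,
                                           batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, num_layers=num_layers)
        # A learned query attends over the walk tokens, letting the gate
        # focus on the surface regions most informative for each expert.
        self.query = nn.Parameter(torch.randn(1, 1, d_model))
        self.pool = nn.MultiheadAttention(d_model, num_heads, batch_first=True)
        self.to_weights = nn.Linear(d_model, num_experts)

    def forward(self, walks):
        # walks: (batch, walk_len, step_dim), e.g. 3D steps along the walk
        tokens = self.encoder(self.embed(walks))
        query = self.query.expand(walks.size(0), -1, -1)
        pooled, _ = self.pool(query, tokens, tokens)      # (batch, 1, d_model)
        return F.softmax(self.to_weights(pooled.squeeze(1)), dim=-1)

def dynamic_aux_loss(gate_w, expert_logits, step, total_steps):
    """Assumed diversity/similarity balance with a linear schedule.
    gate_w: (batch, num_experts); expert_logits: (batch, num_experts, classes)."""
    # Diversity: negative entropy of the batch-averaged gate distribution,
    # pushing different experts to win on different inputs (specialization).
    mean_w = gate_w.mean(dim=0)
    diversity = (mean_w * (mean_w + 1e-8).log()).sum()
    # Similarity: pull each expert's prediction toward the experts' mean
    # prediction, enabling knowledge sharing among experts.
    mean_pred = expert_logits.mean(dim=1, keepdim=True)
    similarity = F.mse_loss(expert_logits, mean_pred.expand_as(expert_logits))
    alpha = min(step / total_steps, 1.0)  # weight shifts toward diversity
    return alpha * diversity + (1.0 - alpha) * similarity

# Usage sketch: mix three experts' class logits for a batch of 8 meshes,
# each represented by a 200-step random walk over its surface.
gate = RandomWalkGate(num_experts=3)
walks = torch.randn(8, 200, 3)
expert_logits = torch.randn(8, 3, 40)             # e.g. 40 ModelNet40 classes
w = gate(walks)                                   # (8, 3) expert weights
mixed = torch.einsum('be,bec->bc', w, expert_logits)
aux = dynamic_aux_loss(w, expert_logits, step=100, total_steps=1000)
```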

Amir Belder, Ayellet Tal • 2026

Related benchmarks

Task                      Dataset                  Metric            Result   Rank
Classification            ModelNet40 (test)        Accuracy          92.9     120
Classification            SHREC 11 (test)          Accuracy          100      19
3D Shape Classification   Cube Engraving (test)    Accuracy          100      17
3D Shape Retrieval        ShapeNet Core55 (test)   --                --       11
Semantic segmentation     COSEG                    Accuracy (Edge)   99.9     8
Image Retrieval           ModelNet40 (test)        mAP               92.9     7
Semantic segmentation     Human body               Accuracy (Edge)   99.7     7
Classification            3D-FUTURE (test)         Accuracy          86.1     6
Semantic segmentation     PartNet                  Accuracy (Face)   69.9     5
