
Towards Test-time Efficient Visual Place Recognition via Asymmetric Query Processing

About

Visual Place Recognition (VPR) has advanced significantly with high-capacity foundation models such as DINOv2, achieving remarkable performance. Nonetheless, their substantial computational cost makes deployment on resource-constrained devices impractical. In this paper, we introduce an efficient asymmetric VPR framework that combines a high-capacity gallery model for offline feature extraction with a lightweight query network for online processing. A key challenge in this setting is ensuring compatibility between these heterogeneous networks, which conventional approaches address through computationally expensive k-NN-based compatible training. To overcome this, we propose a geographical memory bank that structures gallery features using the geolocation metadata inherent in VPR databases, eliminating the need for exhaustive k-NN computations. Additionally, we introduce an implicit embedding augmentation technique that enables the query network to model feature variations despite its limited capacity. Extensive experiments demonstrate that our method not only significantly reduces computational costs but also outperforms existing asymmetric retrieval techniques, establishing a new direction for VPR in resource-limited environments. The code is available at https://github.com/jaeyoon1603/AsymVPR
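The core idea above, replacing exhaustive k-NN over gallery features with a memory bank keyed by geolocation, can be sketched as follows. This is a minimal illustration only, assuming a coarse lat/lon grid and per-cell feature centroids; the function names, grid granularity, and centroid aggregation are our assumptions, not the paper's actual implementation.

```python
import math

# Hypothetical sketch of asymmetric retrieval with a geographical memory
# bank. Gallery features come from a high-capacity offline model; query
# features come from a lightweight online model. Cell size, centroid
# aggregation, and all names here are illustrative assumptions.

def _normalize(v):
    """L2-normalize a feature vector (guarding against zero norm)."""
    n = math.sqrt(sum(x * x for x in v)) or 1.0
    return [x / n for x in v]

def build_geo_memory_bank(gallery_feats, latlons, cell_deg=0.01):
    """Group gallery features by a coarse lat/lon grid cell (using the
    geolocation metadata available in VPR databases) and keep one
    normalized centroid per cell, avoiding exhaustive k-NN grouping."""
    cells = {}
    for feat, (lat, lon) in zip(gallery_feats, latlons):
        key = (round(lat / cell_deg), round(lon / cell_deg))
        cells.setdefault(key, []).append(feat)
    bank = {}
    for key, feats in cells.items():
        centroid = [sum(col) / len(feats) for col in zip(*feats)]
        bank[key] = _normalize(centroid)
    return bank

def retrieve(query_feat, bank, top_k=1):
    """Rank memory-bank cells by cosine similarity to the query feature,
    so online cost scales with the number of cells, not gallery images."""
    q = _normalize(query_feat)
    scored = sorted(bank.items(),
                    key=lambda kv: -sum(a * b for a, b in zip(kv[1], q)))
    return [key for key, _ in scored[:top_k]]
```

For example, grouping two gallery images taken near (37.0, 127.0) into one cell and two taken near (48.0, 2.0) into another yields a two-entry bank, and a query feature close to the first group's embedding retrieves the first cell.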

Jaeyoon Kim, Yoonki Cho, Sung-Eui Yoon • 2025

Related benchmarks

| Task | Dataset | Recall@1 | Rank |
|---|---|---|---|
| Visual Place Recognition | MSLS (val) | 92.3 | 236 |
| Visual Place Recognition | Tokyo24/7 | 92.7 | 146 |
| Visual Place Recognition | Nordland | 74.6 | 112 |
| Visual Place Recognition | Pitts250k | 95.4 | 84 |
| Visual Place Recognition | AmsterTime | 48.6 | 83 |
