Shared Coupling-bridge Scheme for Weakly Supervised Local Feature Learning

Jiayuan Sun1, Luping Ji*1, Jiewen Zhu1
1 University of Electronic Science and Technology of China

IEEE Transactions on Multimedia, 2023

[Abstract]     [Code]     [Citation]

Abstract

Local feature learning plays an important role in classic vision tasks such as visual localization, image matching, and 3D reconstruction. Because large-scale annotated training samples are hard to obtain, weakly-supervised strategies have become a widely studied and effective route for local feature learning. Current schemes still have weaknesses that need further improvement, mainly the discrimination power of extracted local descriptors, the localization accuracy of detected keypoints, and the efficiency of weakly-supervised training. Aiming to improve sparse local feature learning with only camera-pose supervision, this paper proposes a Shared Coupling-bridge scheme for weakly-supervised local feature (SCFeat) learning, with four lightweight yet effective improvements: (i) a Feature-Fusion-ResUNet Backbone (F2R-Backbone) for learning local descriptors; (ii) a shared coupling-bridge normalization that improves the decoupled training of the description and detection networks; (iii) an improved detection network with a peakiness measurement for keypoint detection; and (iv) a new reward factor based on fundamental-matrix error to further optimize feature-detection training. Extensive experiments show that SCFeat is effective and adapts well across tasks: it often achieves state-of-the-art performance on classic image matching and visual localization, and it remains competitive even on 3D reconstruction.
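
For intuition, here is a minimal PyTorch sketch of the general peakiness idea behind the keypoint detection network (in the spirit of ASLFeat-style detectors, not the paper's exact formulation; the function name peakiness_score and the window size ksize are illustrative):

  import torch
  import torch.nn.functional as F

  def peakiness_score(feat: torch.Tensor, ksize: int = 3) -> torch.Tensor:
      """Keypoint score map (B, 1, H, W) from dense features (B, C, H, W).

      A location is scored as salient when its response stands out both
      against its spatial neighbourhood and against the other channels.
      """
      # Spatial peakiness: response vs. local average in a ksize x ksize window.
      local_mean = F.avg_pool2d(feat, ksize, stride=1, padding=ksize // 2)
      alpha = F.softplus(feat - local_mean)
      # Channel peakiness: response vs. the mean over all channels.
      beta = F.softplus(feat - feat.mean(dim=1, keepdim=True))
      # Combine, keeping the strongest channel at each location.
      return (alpha * beta).max(dim=1, keepdim=True)[0]

Similarly, the fundamental-matrix error behind the new reward factor can be sketched from standard epipolar geometry: with known intrinsics K1, K2 and relative pose (R, t), F = K2^{-T} [t]_x R K1^{-1}, and a first-order (Sampson) epipolar error measures how far putative matches deviate from the induced epipolar constraint. The helper names below are hypothetical; the paper defines the exact reward:

  def fundamental_from_pose(K1, K2, R, t):
      """F = K2^{-T} [t]_x R K1^{-1}; ideal matches satisfy x2^T F x1 = 0."""
      tx = torch.zeros(3, 3, dtype=R.dtype)  # [t]_x, skew-symmetric matrix of t
      tx[0, 1], tx[0, 2] = -t[2], t[1]
      tx[1, 0], tx[1, 2] = t[2], -t[0]
      tx[2, 0], tx[2, 1] = -t[1], t[0]
      return torch.linalg.inv(K2).T @ (tx @ R) @ torch.linalg.inv(K1)

  def sampson_error(F_mat, x1, x2):
      """Sampson epipolar error for (N, 3) homogeneous pixel matches."""
      Fx1 = x1 @ F_mat.T          # epipolar lines F x1 in image 2
      Ftx2 = x2 @ F_mat           # epipolar lines F^T x2 in image 1
      num = (x2 * Fx1).sum(dim=1) ** 2
      den = Fx1[:, 0]**2 + Fx1[:, 1]**2 + Ftx2[:, 0]**2 + Ftx2[:, 1]**2
      return num / den.clamp(min=1e-8)

A reward factor would then decrease with this error (e.g. exp(-sampson_error(...))); the exact form used by SCFeat is given in the article.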

Code

Please refer to our GitHub repository for the code.

Citation

  @ARTICLE{10129857,
    author={Sun, Jiayuan and Ji, Luping and Zhu, Jiewen},
    journal={IEEE Transactions on Multimedia}, 
    title={Shared Coupling-bridge Scheme for Weakly Supervised Local Feature Learning}, 
    year={2023},
    volume={},
    number={},
    pages={1-13},
    doi={10.1109/TMM.2023.3278172}}