Adaptively Marginalizing Cosine Logits for Effectively Learning Deep Visual Place Representation in Loop Closure Detection

This paper presents a method for setting the margin parameters of a margin-based softmax loss more effectively when training a visual place recognition model. We observe an inconsistency in sample emphasis under the conventional margin-based softmax loss and address it by exploiting two gradient scaling curves. By analyzing the gradient scaling curve as a function of the angle between the feature vector and its ground-truth class vector, we place greater emphasis on the samples currently being fed to the model. The curve shows that the model emphasizes mainly semi-trained samples while giving relatively little weight to well-trained ones. We therefore propose adjusting both the angular and additive margin parameters to reshape the gradient scaling curve and highlight the samples currently being trained on. Since gradient scaling is highly sensitive to the choice of optimizer and learning rate, we also examine its effect in various optimization settings and derive a configuration guideline for the proposed loss.
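As a concrete illustration of the loss family the abstract refers to, the sketch below implements a cosine-logit softmax head with both an angular margin (added to the angle) and an additive margin (subtracted from the cosine), in the spirit of ArcFace/CosFace-style losses. This is a minimal sketch, not the paper's implementation: the class name, the default margin and scale values, and the usage line are illustrative assumptions.

    import torch
    import torch.nn as nn
    import torch.nn.functional as F

    class CombinedMarginCosineSoftmax(nn.Module):
        # Margin-based softmax head with an angular margin m_ang (applied
        # inside the cosine) and an additive margin m_add (subtracted from
        # the cosine). Defaults are placeholders, not the paper's settings.
        def __init__(self, feat_dim, num_classes, scale=64.0,
                     angular_margin=0.3, additive_margin=0.2):
            super().__init__()
            self.weight = nn.Parameter(torch.empty(num_classes, feat_dim))
            nn.init.xavier_uniform_(self.weight)
            self.scale = scale
            self.m_ang = angular_margin
            self.m_add = additive_margin

        def forward(self, features, labels):
            # Cosine similarity between L2-normalized features and class vectors.
            cos = F.linear(F.normalize(features), F.normalize(self.weight))
            cos = cos.clamp(-1.0 + 1e-7, 1.0 - 1e-7)
            # Angle between the feature vector and its ground-truth class vector.
            theta = torch.acos(cos)
            # Apply both margins only to the ground-truth class logit.
            target = torch.cos(theta + self.m_ang) - self.m_add
            one_hot = F.one_hot(labels, cos.size(1)).to(cos.dtype)
            logits = self.scale * (one_hot * target + (1.0 - one_hot) * cos)
            return F.cross_entropy(logits, labels)

    # Example usage (shapes only, random data):
    head = CombinedMarginCosineSoftmax(feat_dim=512, num_classes=1000)
    loss = head(torch.randn(8, 512), torch.randint(0, 1000, (8,)))

One common way to read the "gradient scaling curve" in this setting is through the target logit's derivative: d cos(theta + m_ang) / d theta = -sin(theta + m_ang), so the gradient reaching a sample scales with sin(theta + m_ang) and peaks at theta = pi/2 - m_ang. Shifting m_ang moves this peak along the angle axis, while m_add rescales the target-class softmax probability; together they control whether well-trained samples (small theta) or semi-trained samples (mid-range theta) receive the most emphasis.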
Lecture Notes in Networks and Systems. International Conference on Robot Intelligence Technology and Applications, Taicang, December 6-8, 2023. Published 2024-11-22. 17 pages.
Article/Chapter (Book) | Electronic Resource | English