Hybrid Multi-Granularity Approach for Few-Shot Image Retrieval with Weak Features


Bibliographic Details
Main Authors: Aiguo Lu, Zican Li, Yanwei Liu, Pandi Liu, Ke Wang
Format: Article
Language: English
Published: MDPI AG 2025-05-01
Series: Algorithms
Online Access: https://www.mdpi.com/1999-4893/18/6/329
Description
Summary: This paper proposes a multi-granularity retrieval algorithm based on an unsupervised image augmentation network. The algorithm designs a feature extraction method (AugODNet_BRA) rooted in image augmentation, which efficiently captures high-level semantic features of images with few samples, small targets, and weak features through unsupervised learning. The Omni-Dimensional Dynamic Convolution module and the Bi-Level Routing Attention mechanism are introduced to enhance the model’s adaptability to complex scenes and variable features, thereby improving its capability to capture details of small targets. The Omni-Dimensional Dynamic Convolution module flexibly adjusts the dimensions of convolution kernels to accommodate small targets of varying sizes and shapes, while the Bi-Level Routing Attention mechanism adaptively focuses on key regions, boosting the model’s discriminative ability for targets in complex backgrounds. An optimized loss function further enhances the robustness and distinctiveness of the features, improving retrieval accuracy. Experimental results demonstrate that the proposed method outperforms baseline algorithms on the public CUB-200-2011 dataset and shows strong application potential and practical value in scenarios such as carrier-based aircraft tail-hook recognition.
ISSN: 1999-4893
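The core idea behind Omni-Dimensional Dynamic Convolution mentioned in the summary — attending over a set of candidate kernels so the effective kernel adapts to the input — can be illustrated with a minimal 1-D toy sketch. This is not the authors' implementation: real ODConv applies attention along four dimensions (spatial, input channel, output channel, kernel number) inside a 2-D CNN, and the attention logits are computed from pooled features rather than supplied directly. The function name `odconv_1d` and all parameters here are illustrative assumptions.

```python
import numpy as np

def softmax(x):
    """Numerically stable softmax."""
    e = np.exp(x - np.max(x))
    return e / e.sum()

def odconv_1d(signal, candidate_kernels, attn_logits):
    """Toy 1-D analogue of ODConv's kernel-number attention:
    blend several candidate kernels with attention weights,
    then convolve once with the blended, input-dependent kernel.

    candidate_kernels: (n, k) array of n candidate kernels of length k
    attn_logits: (n,) logits (in real ODConv these are derived from
                 pooled input features; here they are passed in directly)
    """
    weights = softmax(attn_logits)                        # attention over kernels
    kernel = np.tensordot(weights, candidate_kernels, 1)  # blended kernel, shape (k,)
    return np.convolve(signal, kernel, mode="valid")

signal = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
kernels = np.array([[1.0, 0.0, 0.0],
                    [0.0, 0.0, 1.0]])
out = odconv_1d(signal, kernels, np.array([0.0, 0.0]))  # equal attention
print(out)  # → [2. 3. 4.]
```

With equal logits the blended kernel is the symmetric average `[0.5, 0, 0.5]`, so the output is the mean of each sample's two neighbors; skewing the logits toward one candidate reproduces that kernel's response, which is the adaptivity the full module exploits per input.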