Towards few-shot learning with triplet metric learning and Kullback-Leibler optimization
| Main Authors: | , , , , , |
|---|---|
| Format: | Article |
| Language: | English |
| Published: | Springer, 2025-06-01 |
| Series: | Complex & Intelligent Systems |
| Subjects: | |
| Online Access: | https://doi.org/10.1007/s40747-025-01935-4 |
| Summary: | Abstract Few-shot learning has achieved great success in recent years, owing to its requirement of only a small number of labeled examples. However, most state-of-the-art few-shot learning techniques rely on transfer learning, which still requires massive labeled data to train a meta-learning system. To simulate the human learning mechanism, a deep few-shot learning model is proposed that learns from one, or a few, examples. We first analyze representative semi-supervised few-shot learning methods and observe that they neglect intra-class and inter-class scatter. To address this issue, we propose a new semi-supervised few-shot learning method with triplet metric learning and Kullback-Leibler optimization, in which KL divergence is employed to dynamically determine the inter-class margin, while triplet metric learning is employed to achieve intra-class clustering. During training, deep learning and the expectation-maximization (EM) algorithm are used to optimize the models. Extensive experiments have been conducted on three popular benchmark datasets, and the results show that this method significantly improves classification on few-shot learning tasks and achieves state-of-the-art performance. |
|---|---|
| ISSN: | 2199-4536, 2198-6053 |
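The abstract combines a triplet metric loss with a margin derived from KL divergence. The record does not give the paper's exact formulation, so the following is only a minimal NumPy sketch of that general idea: the inter-class margin of a standard triplet loss is set from the KL divergence between two (assumed) class probability distributions, rather than fixed by hand. All function names and the `scale` parameter are illustrative assumptions, not the authors' notation.

```python
import numpy as np

def kl_divergence(p, q, eps=1e-12):
    """KL divergence D(p || q) between two discrete distributions.
    eps avoids log(0); both inputs are renormalized."""
    p = np.asarray(p, dtype=float) + eps
    q = np.asarray(q, dtype=float) + eps
    p, q = p / p.sum(), q / q.sum()
    return float(np.sum(p * np.log(p / q)))

def triplet_loss_kl_margin(anchor, positive, negative,
                           p_anchor_cls, p_negative_cls, scale=1.0):
    """Triplet loss max(0, d(a,p) - d(a,n) + m) whose margin m is set
    dynamically from the KL divergence between the anchor's and the
    negative's class distributions -- a sketch of the idea only,
    not the paper's exact rule."""
    margin = scale * kl_divergence(p_anchor_cls, p_negative_cls)
    d_pos = np.linalg.norm(np.asarray(anchor) - np.asarray(positive))
    d_neg = np.linalg.norm(np.asarray(anchor) - np.asarray(negative))
    return max(0.0, d_pos - d_neg + margin)

# Illustrative usage: similar class distributions give a small margin,
# dissimilar ones enforce a larger separation between classes.
a, p, n = np.array([0.0, 0.0]), np.array([0.1, 0.0]), np.array([1.0, 0.0])
loss_same = triplet_loss_kl_margin(a, p, n, [0.5, 0.5], [0.5, 0.5])
loss_diff = triplet_loss_kl_margin(a, p, n, [0.9, 0.1], [0.1, 0.9])
```

Under this sketch, the hinge stays at zero when the class distributions coincide (KL = 0, so the fixed margin vanishes), and grows as the two class distributions diverge, which mirrors the abstract's claim that KL divergence is used to set the inter-class margin dynamically.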