Optimizing Hyperparameters in Meta-Learning for Enhanced Image Classification
This paper investigates the significance of hyperparameter optimization in meta-learning for image classification tasks. Despite advancements in deep learning, real-time image classification applications often suffer from data inadequacy. Few-shot learning addresses this challenge by enabling learning from limited samples. Meta-learning, a prominent tool for few-shot learning, learns across multiple classification tasks. We explore different types of meta-learners, with a particular focus on metric-based models. We analyze the potential of hyperparameter optimization techniques, specifically Bayesian optimization and its variants, to enhance the performance of these models. Experimental results on the Omniglot and ImageNet datasets demonstrate that incorporating Bayesian optimization, particularly its evolutionary strategy variant, into meta-learning frameworks leads to improved accuracy compared to settings without hyperparameter optimization. Here, we show that by optimizing hyperparameters for individual tasks rather than using a uniform setting, we achieve notable gains in model performance, underscoring the importance of tailored hyperparameter configurations in meta-learning.
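For readers unfamiliar with the metric-based models the abstract refers to, here is a minimal sketch of the core idea behind the Prototypical Networks family: a query image is classified by its distance to per-class prototypes, the mean embeddings of the support set. The synthetic Gaussian embeddings, dimensions, and noise scale below are illustrative assumptions only, not the paper's datasets or backbone.

```python
# Sketch of the metric-based few-shot idea (nearest-prototype classification).
# Synthetic embeddings stand in for a real feature extractor; this does not
# reproduce the paper's models or experimental setup.
import numpy as np

rng = np.random.default_rng(0)

def episode(n_way=5, k_shot=5, dim=64):
    """Build one synthetic N-way K-shot episode: support set plus one query."""
    class_means = rng.normal(0.0, 1.0, size=(n_way, dim))
    support = class_means[:, None, :] + 0.3 * rng.normal(size=(n_way, k_shot, dim))
    query_label = int(rng.integers(n_way))
    query = class_means[query_label] + 0.3 * rng.normal(size=dim)
    return support, query, query_label

def prototypical_predict(support, query):
    """Predict the class whose prototype (support-set mean) is nearest in L2."""
    prototypes = support.mean(axis=1)                   # (n_way, dim)
    dists = np.linalg.norm(prototypes - query, axis=1)  # (n_way,)
    return int(np.argmin(dists))

correct = 0
n_episodes = 1000
for _ in range(n_episodes):
    s, q, y = episode()
    correct += (prototypical_predict(s, q) == y)
print(f"episode accuracy: {correct / n_episodes:.3f}")
```

With well-separated synthetic classes, the nearest-prototype rule scores far above the 1/N chance level; exploiting that geometry of the embedding space is what metric-based meta-learners do.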
| Main Authors: | Amala Mary Vincent, P. Jidesh, A. A. Bini |
|---|---|
| Format: | Article |
| Language: | English |
| Published: | IEEE, 2025-01-01 |
| Series: | IEEE Access, Vol. 13, pp. 130816-130831 |
| ISSN: | 2169-3536 |
| DOI: | 10.1109/ACCESS.2025.3591142 |
| Subjects: | Meta-learning; few-shot learning; image classification; hyperparameter optimization; evolutionary algorithms |
| Online Access: | https://ieeexplore.ieee.org/document/11087560/ |
Author details:

| Author | ORCID | Affiliation |
|---|---|---|
| Amala Mary Vincent | https://orcid.org/0000-0002-0009-5017 | Department of Mathematical and Computational Sciences, National Institute of Technology Karnataka, Surathkal, Mangalore, India |
| P. Jidesh | https://orcid.org/0000-0001-9448-1906 | Department of Mathematical and Computational Sciences, National Institute of Technology Karnataka, Surathkal, Mangalore, India |
| A. A. Bini | https://orcid.org/0000-0002-0559-267X | Department of Electronics and Communication Engineering, National Institute of Technology Karnataka, Surathkal, Mangalore, India |
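The abstract's central claim is that tuning hyperparameters per task outperforms a single uniform setting. Below is a hedged sketch of that idea using a simple evolutionary strategy, the algorithm family the paper's evolutionary variant of Bayesian optimization draws on. The quadratic "validation score", the two-dimensional search space, and all function names here are invented stand-ins for illustration, not the authors' method.

```python
# Sketch of per-task hyperparameter optimization with a tiny evolutionary
# strategy. Each synthetic task has its own hidden optimum, so a uniform
# setting cannot be best for every task. Purely illustrative.
import numpy as np

rng = np.random.default_rng(1)

def make_task():
    """Return a scoring function with a task-specific hidden optimum."""
    opt = rng.uniform(-2, 2, size=2)           # hidden per-task optimum
    return lambda x: -np.sum((x - opt) ** 2)   # higher is better

def es_optimize(score, n_gen=30, pop=20, sigma=0.5):
    """A simple (1, lambda) evolutionary strategy over 2 hyperparameters."""
    x = np.zeros(2)
    for _ in range(n_gen):
        cand = x + sigma * rng.normal(size=(pop, 2))  # mutate the parent
        scores = np.array([score(c) for c in cand])
        x = cand[np.argmax(scores)]                   # keep the best offspring
    return x, score(x)

tasks = [make_task() for _ in range(5)]
uniform = np.zeros(2)                # one shared setting for every task
for i, t in enumerate(tasks):
    x_best, s_best = es_optimize(t)
    print(f"task {i}: uniform score {t(uniform):.2f}, per-task ES score {s_best:.2f}")
```

Running this shows the per-task ES scores consistently beating the uniform setting, which is the qualitative behavior the abstract reports for its meta-learning experiments.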