Enhanced uncertainty sampling with category information for improved active learning.
Traditional uncertainty sampling methods in active learning often neglect category information, leading to imbalanced sample selection in multi-class computer vision tasks. Our approach integrates category information with uncertainty sampling through a novel active learning framework to address this limitation.
| Main Authors: | , , , , , |
|---|---|
| Format: | Article |
| Language: | English |
| Published: | Public Library of Science (PLoS), 2025-01-01 |
| Series: | PLoS ONE |
| Online Access: | https://doi.org/10.1371/journal.pone.0327694 |
| Summary: | Traditional uncertainty sampling methods in active learning often neglect category information, leading to imbalanced sample selection in multi-class computer vision tasks. Our approach integrates category information with uncertainty sampling through a novel active learning framework to address this limitation. Our method employs a pre-trained VGG16 architecture and cosine similarity metrics to efficiently extract category features without requiring additional model training. The framework combines these features with traditional uncertainty measures to ensure balanced sampling across classes while maintaining computational efficiency. Extensive experiments across both object detection and image classification tasks validate our method's effectiveness. For object detection, our approach achieves competitive mAP scores while ensuring balanced category representation. For image classification, our method achieves accuracy comparable to state-of-the-art approaches while reducing computational overhead by up to 80%. These results confirm our approach's ability to balance sampling efficiency with dataset representativeness across different computer vision tasks. This work offers a practical, efficient solution for large-scale data annotation in domains with limited labeled data and diverse class distributions. |
| ISSN: | 1932-6203 |
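The summary describes blending uncertainty scores with category-balance information derived from pre-trained features and cosine similarity. The sketch below is NOT the paper's implementation; it is a minimal illustration of the general idea, assuming entropy as the uncertainty measure, a linear blend controlled by a hypothetical `alpha` weight, and precomputed feature vectors standing in for VGG16 embeddings:

```python
import numpy as np

def cosine_similarity(a, b):
    # Pairwise cosine similarity between rows of a and rows of b.
    a_n = a / np.linalg.norm(a, axis=1, keepdims=True)
    b_n = b / np.linalg.norm(b, axis=1, keepdims=True)
    return a_n @ b_n.T

def select_batch(probs, feats, class_prototypes, labeled_counts, k, alpha=0.5):
    """Pick k unlabeled samples by blending entropy-based uncertainty
    with a bonus for classes under-represented in the labeled pool.

    probs: (n, C) predicted class probabilities for unlabeled samples
    feats: (n, d) feature vectors (e.g. from a frozen pre-trained CNN)
    class_prototypes: (C, d) one representative feature vector per class
    labeled_counts: (C,) how many labeled samples each class has so far
    alpha: assumed blend weight between uncertainty and balance terms
    """
    # Entropy-based uncertainty, normalized to [0, 1].
    entropy = -np.sum(probs * np.log(probs + 1e-12), axis=1)
    entropy = entropy / entropy.max()

    # Assign each unlabeled sample a pseudo-category via its most
    # cosine-similar class prototype (no extra model training needed).
    sims = cosine_similarity(feats, class_prototypes)
    pseudo = sims.argmax(axis=1)

    # Rarity bonus: classes with fewer labeled examples score higher.
    counts = labeled_counts.astype(float)
    rarity = 1.0 - counts / counts.sum()
    balance = rarity[pseudo]

    # Blend the two criteria and return the top-k indices.
    score = alpha * entropy + (1.0 - alpha) * balance
    return np.argsort(score)[::-1][:k]
```

Under this sketch, a sample that is both uncertain and likely to belong to a rare class is selected first, which is the balanced-sampling behavior the abstract attributes to the framework.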