Effective Data Augmentation for Active Sonar Classification Using Attention-Based Complementary Learning With Uncertainty Measure

Bibliographic Details
Main Authors: Youngsang Hwang, Geunhwan Kim, Wooyoung Hong, Youngmin Choo
Format: Article
Language:English
Published: IEEE 2025-01-01
Series:IEEE Access
Online Access:https://ieeexplore.ieee.org/document/10942594/
Description
Summary:We enhance the performance of active sonar classification by integrating a validated attention-based complementary learning model (ABCL) with an active learning (AL) framework, following modifications to adapt the model for AL. Acoustic data from active sonar systems strongly depend on ocean environmental conditions, and deep learning (DL) models trained on limited active sonar data often perform poorly when tested on data from new environments. To enhance performance, a small number of test data samples with high uncertainties are selected to augment the existing training set. A deep ensemble approach is employed to measure the uncertainty of each sample, quantified as the variance of predictions from models with independently optimized weights. ABCL, initially designed for robust generalization with limited active sonar data, was modified to produce two predictions per sample; this modification enhances the reliability of uncertainty measurement, and the resulting model is referred to as deeper ABCL (DABCL). Two datasets from different experimental conditions serve as the training and test sets. Most high-uncertainty test samples identified through AL are found in regions with a mix of target and non-target instances, allowing DABCL to adapt effectively to the shifted test data distribution. This approach achieves superior performance compared to other DL models, including VGG16, ResNet18, and Swin Transformer, both before and after applying AL. Although mislabeling occurs in 20 percent of the high-uncertainty samples during data augmentation, the fine-tuned DABCL still outperforms the version without AL.
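The uncertainty-based selection step described above can be sketched in a few lines. This is a minimal illustration, not the authors' implementation: it assumes each ensemble member outputs a target probability per test sample, takes the variance across members as the uncertainty measure (as in the abstract), and picks the top-k samples for augmentation. The function names and the toy data are hypothetical.

```python
import numpy as np

def ensemble_uncertainty(predictions):
    """Per-sample uncertainty: variance of ensemble member predictions.

    predictions: array of shape (n_models, n_samples), each row one
    independently trained model's target probabilities.
    """
    return np.var(predictions, axis=0)

def select_high_uncertainty(predictions, k):
    """Indices of the k test samples with the largest prediction
    variance, to be labeled and added to the training set."""
    uncertainty = ensemble_uncertainty(predictions)
    return np.argsort(uncertainty)[::-1][:k]

# Toy example: 4 ensemble members scoring 6 test samples.
preds = np.array([
    [0.9, 0.1, 0.5, 0.8, 0.4, 0.2],
    [0.8, 0.2, 0.9, 0.7, 0.6, 0.1],
    [0.9, 0.1, 0.2, 0.8, 0.5, 0.3],
    [0.8, 0.0, 0.7, 0.9, 0.3, 0.2],
])
chosen = select_high_uncertainty(preds, k=2)
# Samples where the members disagree most are selected.
```

Samples on which the members agree (confident targets or non-targets) get near-zero variance, while samples in the mixed target/non-target regions the abstract describes produce high variance and are the ones selected for augmentation.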
ISSN:2169-3536