Fine-Grained Classification via Hierarchical Feature Covariance Attention Module
Fine-Grained Visual Classification (FGVC) has consistently been challenging in various domains, such as aviation and animal breeds. This is mainly because FGVC criteria differ only by small ranges or subtle pattern differences. In the deep convolutional neural network,...
Saved in:
Main Authors: | Yerim Jung, Nur Suriza Syazwany, Sujeong Kim, Sang-Chul Lee |
---|---|
Format: | Article |
Language: | English |
Published: | IEEE, 2023-01-01 |
Series: | IEEE Access |
Subjects: | Attention module; covariance; feature map; fine-grained classification |
Online Access: | https://ieeexplore.ieee.org/document/10097470/ |
_version_ | 1841533421563674624 |
---|---|
author | Yerim Jung; Nur Suriza Syazwany; Sujeong Kim; Sang-Chul Lee |
author_facet | Yerim Jung; Nur Suriza Syazwany; Sujeong Kim; Sang-Chul Lee |
author_sort | Yerim Jung |
collection | DOAJ |
description | Fine-Grained Visual Classification (FGVC) has consistently been challenging in various domains, such as aviation and animal breeds. This is mainly because FGVC criteria differ only by small ranges or subtle pattern differences. In deep convolutional neural networks, the covariance between feature maps positively affects feature selection, allowing discriminative regions to be learned automatically. In this study, we propose a fine-grained classification method that inserts an attention module exploiting covariance characteristics. Specifically, we introduce a feature covariance attention (FCA) module that extracts the feature maps between the convolution blocks of an existing classification model. The FCA module then applies the corresponding covariance-matrix values to each channel so that the network focuses on salient areas. We demonstrate the need for hierarchical fine-grained classification by focusing on representations at diverse scales. Additionally, we conducted two ablation studies to show how each suggested strategy affects classification performance. Our experiments are conducted on three datasets primarily used for fine-grained classification tasks: CUB-200-2011, Stanford Cars, and FGVC-Aircraft. Our method outperforms state-of-the-art models by margins of 0.4%, 1.1%, and 1.4% on these datasets, respectively. |
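The description above sketches the core idea: the covariance between channels of a feature map can be turned into per-channel attention weights. The record does not give the paper's exact formulation, so the following is only a minimal NumPy sketch under stated assumptions — the function name `covariance_channel_attention`, the aggregation of covariance rows by absolute sum, and the sigmoid squashing are all illustrative choices, not the authors' method.

```python
import numpy as np

def covariance_channel_attention(fmap):
    """Hypothetical sketch of covariance-based channel attention.

    fmap: array of shape (C, H, W). Each channel is flattened, the
    C x C covariance matrix across channels is computed, and each
    channel is weighted by how strongly it co-varies with the others.
    """
    C, H, W = fmap.shape
    flat = fmap.reshape(C, H * W)        # one row per channel
    cov = np.cov(flat)                   # (C, C) channel covariance matrix
    score = np.abs(cov).sum(axis=1)      # aggregate co-variation per channel (assumption)
    # Normalize and squash to (0, 1) attention weights (sigmoid; a design assumption).
    z = (score - score.mean()) / (score.std() + 1e-6)
    weight = 1.0 / (1.0 + np.exp(-z))
    return fmap * weight[:, None, None]  # reweight channels

# Toy usage: an 8-channel 4x4 feature map keeps its shape after reweighting.
x = np.random.rand(8, 4, 4)
y = covariance_channel_attention(x)
print(y.shape)  # (8, 4, 4)
```

In the paper's hierarchical setting, such a module would presumably be inserted between successive convolution blocks so that attention operates on representations at several scales.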
format | Article |
id | doaj-art-8b760ee8b9784bb8a65e42b895acb5dd |
institution | Kabale University |
issn | 2169-3536 |
language | English |
publishDate | 2023-01-01 |
publisher | IEEE |
record_format | Article |
series | IEEE Access |
spelling | doaj-art-8b760ee8b9784bb8a65e42b895acb5dd (2025-01-16T00:00:53Z); English; IEEE; IEEE Access, ISSN 2169-3536; 2023-01-01; vol. 11, pp. 35670–35679; DOI: 10.1109/ACCESS.2023.3265472; record 10097470; Fine-Grained Classification via Hierarchical Feature Covariance Attention Module; Yerim Jung (https://orcid.org/0009-0006-0756-1849), Nur Suriza Syazwany (https://orcid.org/0000-0001-8073-7974), Sujeong Kim, Sang-Chul Lee (https://orcid.org/0000-0002-6973-2416) — Department of Electrical and Computer Engineering, Inha University, Incheon, South Korea; abstract as in the description field; https://ieeexplore.ieee.org/document/10097470/; keywords: Attention module, covariance, feature map, fine-grained classification |
spellingShingle | Yerim Jung; Nur Suriza Syazwany; Sujeong Kim; Sang-Chul Lee; Fine-Grained Classification via Hierarchical Feature Covariance Attention Module; IEEE Access; Attention module; covariance; feature map; fine-grained classification |
title | Fine-Grained Classification via Hierarchical Feature Covariance Attention Module |
title_full | Fine-Grained Classification via Hierarchical Feature Covariance Attention Module |
title_fullStr | Fine-Grained Classification via Hierarchical Feature Covariance Attention Module |
title_full_unstemmed | Fine-Grained Classification via Hierarchical Feature Covariance Attention Module |
title_short | Fine-Grained Classification via Hierarchical Feature Covariance Attention Module |
title_sort | fine grained classification via hierarchical feature covariance attention module |
topic | Attention module; covariance; feature map; fine-grained classification |
url | https://ieeexplore.ieee.org/document/10097470/ |
work_keys_str_mv | AT yerimjung finegrainedclassificationviahierarchicalfeaturecovarianceattentionmodule AT nursurizasyazwany finegrainedclassificationviahierarchicalfeaturecovarianceattentionmodule AT sujeongkim finegrainedclassificationviahierarchicalfeaturecovarianceattentionmodule AT sangchullee finegrainedclassificationviahierarchicalfeaturecovarianceattentionmodule |