Chinese herbal medicine recognition network based on knowledge distillation and cross-attention

Abstract To reduce the number of parameters in a Chinese herbal medicine recognition model while maintaining accuracy, this paper takes 20 classes of Chinese herbs as the research object and proposes ShuffleCANet (ShuffleNet and Cross-Attention), a recognition network based on knowledge distillation and cross-attention...

Full description

Saved in:
Bibliographic Details
Main Authors: Qinggang Hou, Wanshuai Yang, Guizhuang Liu
Format: Article
Language: English
Published: Nature Portfolio 2025-01-01
Series: Scientific Reports
Online Access: https://doi.org/10.1038/s41598-025-85697-6
author Qinggang Hou
Wanshuai Yang
Guizhuang Liu
collection DOAJ
description Abstract To reduce the number of parameters in a Chinese herbal medicine recognition model while maintaining accuracy, this paper takes 20 classes of Chinese herbs as the research object and proposes ShuffleCANet (ShuffleNet and Cross-Attention), a recognition network based on knowledge distillation and cross-attention. First, transfer learning experiments were run on 20 classic networks, and DenseNet and RegNet were selected as dual teacher models. Then, balancing parameter count against recognition accuracy, ShuffleNet was chosen as the student model and a new cross-attention mechanism was proposed; this cross-attention module replaces Conv5 in ShuffleNet, keeping the design lightweight without sacrificing accuracy. Finally, experiments on the public NB-TCM-CHM dataset show that the proposed ShuffleCANet reaches 98.8% accuracy (ACC) and F1_score with only 128.66M model parameters. Compared with the ShuffleNet baseline, the parameter count is reduced by nearly 50% while accuracy improves by about 1.3%, demonstrating the method's effectiveness.
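The abstract describes dual-teacher knowledge distillation (DenseNet and RegNet teaching a ShuffleNet student) but this record does not give the loss formulation. Below is a minimal sketch of one common formulation, assuming temperature-scaled soft targets averaged across the two teachers and blended with hard-label cross-entropy; the function names and the hyperparameters T and alpha are illustrative assumptions, not taken from the paper:

```python
import math

def softmax(logits, T=1.0):
    # Temperature-scaled softmax over a plain list of logits.
    exps = [math.exp(z / T) for z in logits]
    total = sum(exps)
    return [e / total for e in exps]

def dual_teacher_kd_loss(student_logits, teacher1_logits, teacher2_logits,
                         true_label, T=4.0, alpha=0.7):
    """Hypothetical dual-teacher distillation loss:
    alpha * KL(avg_teacher || student, at temperature T)
    + (1 - alpha) * cross-entropy on the true label."""
    # Average the two teachers' softened predictions.
    p1 = softmax(teacher1_logits, T)
    p2 = softmax(teacher2_logits, T)
    p_teacher = [(a + b) / 2 for a, b in zip(p1, p2)]
    q = softmax(student_logits, T)
    # KL(p_teacher || q), scaled by T^2 as in standard distillation.
    kd = (T * T) * sum(p * math.log(p / qi)
                       for p, qi in zip(p_teacher, q) if p > 0)
    # Hard-label cross-entropy at temperature 1.
    ce = -math.log(softmax(student_logits)[true_label])
    return alpha * kd + (1 - alpha) * ce

# Toy 3-class example: the student is pulled toward the teachers' consensus.
loss = dual_teacher_kd_loss([2.0, 0.5, 0.1], [3.0, 0.2, 0.0],
                            [2.5, 0.4, 0.1], true_label=0)
```

When the student's logits exactly match both teachers', the KL term vanishes and only the hard-label term remains, which is the usual sanity check for a distillation loss.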
format Article
id doaj-art-085329bfa6fc46f0bf689bc5b356bc49
institution Kabale University
issn 2045-2322
language English
publishDate 2025-01-01
publisher Nature Portfolio
record_format Article
series Scientific Reports
spelling doaj-art-085329bfa6fc46f0bf689bc5b356bc49 2025-01-12T12:15:26Z eng Nature Portfolio Scientific Reports 2045-2322 2025-01-01 151115 10.1038/s41598-025-85697-6
Chinese herbal medicine recognition network based on knowledge distillation and cross-attention
Qinggang Hou (School of Information Engineering, Shandong Huayu University of Technology)
Wanshuai Yang (School of Information Engineering, Shandong Huayu University of Technology)
Guizhuang Liu (School of Information Engineering, Shandong Huayu University of Technology)
https://doi.org/10.1038/s41598-025-85697-6
title Chinese herbal medicine recognition network based on knowledge distillation and cross-attention
url https://doi.org/10.1038/s41598-025-85697-6