MDCNN: Multi-Teacher Distillation-Based CNN for News Text Classification
News text classification is crucial for efficient information acquisition and dissemination. While deep learning models, such as BERT and BiGRU, excel in accuracy for text classification, their high complexity and resource demands hinder practical deployment. To address these challenges, we propose MDCNN (Multi-Teacher Distillation-Based CNN), which leverages knowledge distillation with BERT and BiGRU as teacher models and TextCNN as the student model. Experiments on three benchmark news datasets demonstrate that MDCNN improves classification accuracy by nearly 2% while significantly reducing computational overhead, offering a practical solution for real-world applications.
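The abstract gives only the high-level recipe (BERT and BiGRU as teachers, TextCNN as the student). As a rough illustration of how such a multi-teacher distillation objective is typically assembled, the sketch below combines a hard-label cross-entropy term with a soft-label term averaged over the teachers' softened outputs; the temperature, the weighting `alpha`, and the equal teacher averaging are illustrative assumptions, not details taken from the paper.

```python
# Minimal sketch of multi-teacher knowledge distillation in the spirit of the
# abstract (BERT and BiGRU teachers, TextCNN student). Loss weighting,
# temperature, and equal teacher averaging are assumptions for illustration,
# not the paper's reported configuration.
import torch
import torch.nn.functional as F

def multi_teacher_kd_loss(student_logits, teacher_logits_list, labels,
                          temperature=3.0, alpha=0.5):
    """Combine hard-label cross-entropy with a soft-label distillation term.

    student_logits:      (batch, num_classes) logits from the student model
    teacher_logits_list: list of (batch, num_classes) logits, one per teacher,
                         assumed precomputed with torch.no_grad()
    labels:              (batch,) ground-truth class indices
    """
    # Standard supervised loss on the ground-truth labels.
    ce_loss = F.cross_entropy(student_logits, labels)

    # Average the teachers' softened distributions (equal weighting assumed).
    soft_targets = torch.stack(
        [F.softmax(t / temperature, dim=-1) for t in teacher_logits_list]
    ).mean(dim=0)

    # KL divergence between the student's softened prediction and the averaged
    # teacher distribution, scaled by T^2 as is conventional in distillation.
    kd_loss = F.kl_div(
        F.log_softmax(student_logits / temperature, dim=-1),
        soft_targets,
        reduction="batchmean",
    ) * temperature ** 2

    return alpha * ce_loss + (1.0 - alpha) * kd_loss
```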
| Main Authors: | Xiaolei Guo, Qingyang Liu, Yanrong Hu, Hongjiu Liu |
|---|---|
| Format: | Article |
| Language: | English |
| Published: | IEEE, 2025-01-01 |
| Series: | IEEE Access |
| Subjects: | Text classification; knowledge distillation; BERT; BiGRU; TextCNN |
| Online Access: | https://ieeexplore.ieee.org/document/10943179/ |
| Field | Value |
|---|---|
| author | Xiaolei Guo; Qingyang Liu; Yanrong Hu; Hongjiu Liu |
| collection | DOAJ |
| description | News text classification is crucial for efficient information acquisition and dissemination. While deep learning models, such as BERT and BiGRU, excel in accuracy for text classification, their high complexity and resource demands hinder practical deployment. To address these challenges, we propose MDCNN (Multi-Teacher Distillation-Based CNN), which leverages knowledge distillation with BERT and BiGRU as teacher models and TextCNN as the student model. Experiments on three benchmark news datasets demonstrate that MDCNN improves classification accuracy by nearly 2% while significantly reducing computational overhead, offering a practical solution for real-world applications. |
| format | Article |
| id | doaj-art-c30d99f97ec34c85a62ace484a73c2f9 |
| institution | DOAJ |
| issn | 2169-3536 |
| language | English |
| publishDate | 2025-01-01 |
| publisher | IEEE |
| record_format | Article |
| series | IEEE Access |
| spelling | IEEE Access, vol. 13, pp. 56631-56641, 2025-01-01; DOI: 10.1109/ACCESS.2025.3555224; IEEE document 10943179. MDCNN: Multi-Teacher Distillation-Based CNN for News Text Classification. Authors: Xiaolei Guo (ORCID 0009-0001-1390-1479), College of Mathematics and Computer Science, Zhejiang A&F University, Hangzhou, China; Qingyang Liu (ORCID 0000-0003-0491-3248), Institute of Informatics, Georg-August-Universität Göttingen, Göttingen, Germany; Yanrong Hu (ORCID 0000-0002-9826-5212), College of Mathematics and Computer Science, Zhejiang A&F University, Hangzhou, China; Hongjiu Liu (ORCID 0000-0001-8175-264X), College of Mathematics and Computer Science, Zhejiang A&F University, Hangzhou, China. Record doaj-art-c30d99f97ec34c85a62ace484a73c2f9 (indexed 2025-08-20T03:17:44Z). Online access: https://ieeexplore.ieee.org/document/10943179/. Keywords: Text classification; knowledge distillation; BERT; BiGRU; TextCNN. |
| title | MDCNN: Multi-Teacher Distillation-Based CNN for News Text Classification |
| topic | Text classification; knowledge distillation; BERT; BiGRU; TextCNN |
| url | https://ieeexplore.ieee.org/document/10943179/ |