MDCNN: Multi-Teacher Distillation-Based CNN for News Text Classification
News text classification is crucial for efficient information acquisition and dissemination. While deep learning models, such as BERT and BiGRU, excel in accuracy for text classification, their high complexity and resource demands hinder practical deployment. To address these challenges, we proposed...
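The abstract names multi-teacher knowledge distillation as the compression technique but is truncated before describing the method. As a general illustration only, and not the authors' MDCNN formulation, the PyTorch-style sketch below shows the common form of a multi-teacher distillation objective: softened predictions from several teacher models (e.g., a BERT and a BiGRU classifier) are averaged and combined with the hard-label cross-entropy to train a compact student such as a CNN. The function name, temperature, and weighting factor are illustrative assumptions.

```python
import torch
import torch.nn.functional as F

def multi_teacher_distillation_loss(student_logits, teacher_logits_list, labels,
                                    temperature=2.0, alpha=0.5):
    """Generic multi-teacher KD loss: averaged soft targets from the teachers
    plus supervised cross-entropy on the ground-truth labels."""
    # Average the teachers' temperature-softened probability distributions.
    soft_targets = torch.stack(
        [F.softmax(t / temperature, dim=-1) for t in teacher_logits_list]
    ).mean(dim=0)
    # KL divergence between the student's softened log-probabilities and the averaged targets.
    log_student = F.log_softmax(student_logits / temperature, dim=-1)
    kd_loss = F.kl_div(log_student, soft_targets, reduction="batchmean") * (temperature ** 2)
    # Standard cross-entropy on the hard labels.
    ce_loss = F.cross_entropy(student_logits, labels)
    return alpha * kd_loss + (1.0 - alpha) * ce_loss

# Hypothetical usage: an 8-example batch over 10 news categories,
# with logits from two teachers (e.g., BERT and BiGRU) and a CNN student.
student_logits = torch.randn(8, 10)
teacher_logits = [torch.randn(8, 10), torch.randn(8, 10)]
labels = torch.randint(0, 10, (8,))
loss = multi_teacher_distillation_loss(student_logits, teacher_logits, labels)
```

In this generic setup, the loss would be computed per batch and backpropagated only through the student; the teachers are frozen and supply soft targets.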
| Main Authors: | Xiaolei Guo, Qingyang Liu, Yanrong Hu, Hongjiu Liu |
|---|---|
| Format: | Article |
| Language: | English |
| Published: | IEEE, 2025-01-01 |
| Series: | IEEE Access |
| Subjects: | |
| Online Access: | https://ieeexplore.ieee.org/document/10943179/ |
Similar Items
- ERNIE-TextCNN: research on classification methods of Chinese news headlines in different situations
  by: Yumin Yan
  Published: (2025-08-01)
- A k-nearest text similarity-BiGRU approach for duration prediction of traffic accidents on expressways
  by: Jiaona Chen, et al.
  Published: (2025-07-01)
- Intelligent cloud guidance technology based on natural language learning
  by: Renjie TANG, et al.
  Published: (2019-04-01)
- Design and implementation of information system for vocational education quality assessment based on CNN-BiGRU
  by: Yuhui Xu, et al.
  Published: (2025-12-01)
- Hybrid Transformer-Based Large Language Models for Word Sense Disambiguation in the Low-Resource Sesotho sa Leboa Language
  by: Hlaudi Daniel Masethe, et al.
  Published: (2025-03-01)