Autocorrelation Matrix Knowledge Distillation: A Task-Specific Distillation Method for BERT Models
Pre-trained language models perform well on a wide range of natural language processing tasks. However, their large number of parameters poses significant challenges for resource-constrained edge devices, which greatly limits their practical deployment. This paper introduces a simple and effici...
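The abstract is truncated here and does not spell out the paper's autocorrelation-matrix formulation. As a rough illustration only, the sketch below shows one generic way a task-specific distillation loss for a BERT student might combine a soft-label term with an alignment term between teacher and student hidden-state autocorrelation (Gram) matrices; the function names, loss weighting, and hyperparameters (`autocorrelation`, `alpha`, `temperature`) are assumptions for this example, not the authors' implementation.

```python
# Illustrative sketch only -- NOT the paper's method.
# Assumes logits of shape (batch, num_classes) and hidden states of
# shape (batch, seq_len, hidden); `alpha` and `temperature` are
# hypothetical hyperparameters chosen for this example.
import torch
import torch.nn.functional as F


def autocorrelation(hidden: torch.Tensor) -> torch.Tensor:
    """Token-level autocorrelation (Gram) matrix per example:
    (batch, seq_len, hidden) -> (batch, seq_len, seq_len)."""
    hidden = F.normalize(hidden, dim=-1)  # scale-invariant comparison
    return hidden @ hidden.transpose(-1, -2)


def distillation_loss(student_logits, teacher_logits,
                      student_hidden, teacher_hidden,
                      labels, temperature: float = 2.0, alpha: float = 0.5):
    # Hard-label task loss on the downstream classification objective.
    task = F.cross_entropy(student_logits, labels)

    # Soft-label knowledge-distillation term on temperature-scaled logits.
    kd = F.kl_div(
        F.log_softmax(student_logits / temperature, dim=-1),
        F.softmax(teacher_logits / temperature, dim=-1),
        reduction="batchmean",
    ) * temperature ** 2

    # Align student and teacher hidden-state autocorrelation matrices.
    corr = F.mse_loss(autocorrelation(student_hidden),
                      autocorrelation(teacher_hidden))

    return task + alpha * kd + (1 - alpha) * corr
```

Because the Gram matrices are seq_len x seq_len, a term of this kind does not require the student and teacher hidden sizes to match, which is one common reason correlation-style losses are used when compressing BERT-like models.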
| Main Authors: | Kai Zhang, Jinqiu Li, Bingqian Wang, Haoran Meng |
|---|---|
| Format: | Article |
| Language: | English |
| Published: | MDPI AG, 2024-10-01 |
| Series: | Applied Sciences |
| Online Access: | https://www.mdpi.com/2076-3417/14/20/9180 |
Similar Items
- Optimised knowledge distillation for efficient social media emotion recognition using DistilBERT and ALBERT
  by: Muhammad Hussain, et al.
  Published: (2025-08-01)
- Evaluating sentiment analysis models: A comparative analysis of vaccination tweets during the COVID-19 phase leveraging DistilBERT for enhanced insights
  by: Renuka Agrawal, et al.
  Published: (2025-06-01)
- Enhancing consistency in piping and instrumentation diagrams using DistilBERT and smart PID systems
  by: F.S. Gómez-Vega, et al.
  Published: (2025-12-01)
- Leveraging BERT, DistilBERT, and TinyBERT for Rumor Detection
  by: Aijazahamed Qazi, et al.
  Published: (2025-01-01)
- MDCNN: Multi-Teacher Distillation-Based CNN for News Text Classification
  by: Xiaolei Guo, et al.
  Published: (2025-01-01)