Optimal Knowledge Distillation through Non-Heuristic Control of Dark Knowledge
This paper introduces a method to control the dark knowledge values, also known as soft targets, in order to improve training by knowledge distillation for multi-class classification tasks. Knowledge distillation effectively transfers knowledge from a larger model to a smaller mo...
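The record excerpt does not describe the paper's non-heuristic control scheme, so as background only, below is a minimal sketch of the standard temperature-scaled soft-target ("dark knowledge") distillation loss in the style of Hinton et al. (2015). The function name `distillation_loss`, the temperature `T`, and the weighting `alpha` are illustrative assumptions, not the authors' method.

```python
# Minimal sketch of standard knowledge distillation (Hinton et al., 2015).
# NOTE: illustrative only; the paper's non-heuristic control of the soft
# targets is not reproduced here. T and alpha are assumed example values.
import torch
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, labels, T=4.0, alpha=0.5):
    """Combine hard-label cross-entropy with a soft-target KL term."""
    # Soft targets ("dark knowledge"): teacher probabilities at temperature T.
    soft_targets = F.softmax(teacher_logits / T, dim=1)
    log_student = F.log_softmax(student_logits / T, dim=1)
    # KL divergence between student and teacher distributions, scaled by T^2
    # so its gradient magnitude matches the hard-label term.
    kd = F.kl_div(log_student, soft_targets, reduction="batchmean") * (T * T)
    # Standard cross-entropy against the ground-truth class indices.
    ce = F.cross_entropy(student_logits, labels)
    return alpha * kd + (1.0 - alpha) * ce
```

A higher `T` flattens the teacher distribution, exposing more of the relative probabilities over wrong classes; heuristically tuning `T` and `alpha` is exactly the kind of choice the paper's title suggests it aims to replace with a non-heuristic control.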
| Main Authors: | Darian Onchis, Codruta Istin, Ioan Samuila |
|---|---|
| Format: | Article |
| Language: | English |
| Published: | MDPI AG, 2024-08-01 |
| Series: | Machine Learning and Knowledge Extraction |
| Online Access: | https://www.mdpi.com/2504-4990/6/3/94 |
Similar Items
- Class and Data-Incremental Learning Framework for Baggage Threat Segmentation via Knowledge Distillation
  by: Ammara Nasim, et al. Published: (2025-01-01)
- Optimizing Deep Learning Models for Resource-Constrained Environments With Cluster-Quantized Knowledge Distillation
  by: Niaz Ashraf Khan, et al. Published: (2025-05-01)
- Decoupled Time-Dimensional Progressive Self-Distillation With Knowledge Calibration for Edge Computing-Enabled AIoT
  by: Yingchao Wang, et al. Published: (2024-01-01)
- Correlation-Based Knowledge Distillation in Exemplar-Free Class-Incremental Learning
  by: Zijian Gao, et al. Published: (2025-01-01)
- Leveraging logit uncertainty for better knowledge distillation
  by: Zhen Guo, et al. Published: (2024-12-01)