Two-stage optimization based on heterogeneous branch fusion for knowledge distillation.

Knowledge distillation transfers knowledge from a teacher model to a student model, effectively improving the student model's performance. However, guidance that relies solely on the teacher model's fixed knowledge offers no supplementation or expansion of that knowledge, which limits the...
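For context, the abstract refers to the standard teacher-student distillation objective. Below is a minimal PyTorch sketch of a vanilla temperature-scaled distillation loss (Hinton-style), offered only as a general illustration and not as the two-stage heterogeneous-branch-fusion method this article proposes; the function name, temperature, and weighting factor are illustrative assumptions.

import torch
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, labels,
                      temperature=4.0, alpha=0.5):
    # Hard-label supervision: cross-entropy against ground-truth labels.
    ce = F.cross_entropy(student_logits, labels)
    # Soft-label supervision: KL divergence between temperature-softened
    # student and teacher distributions, scaled by T^2 as is conventional.
    kd = F.kl_div(
        F.log_softmax(student_logits / temperature, dim=1),
        F.softmax(teacher_logits / temperature, dim=1),
        reduction="batchmean",
    ) * (temperature ** 2)
    return alpha * ce + (1 - alpha) * kd

if __name__ == "__main__":
    # Random tensors stand in for model outputs (batch of 8, 10 classes).
    student_logits = torch.randn(8, 10)
    teacher_logits = torch.randn(8, 10)
    labels = torch.randint(0, 10, (8,))
    print(distillation_loss(student_logits, teacher_logits, labels).item())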

Bibliographic Details
Main Authors: Gang Li, Pengfei Lv, Yang Zhang, Chuanyun Xu, Zihan Ruan, Zheng Zhou, Xinyu Fan, Ru Wang, Pan He
Format: Article
Language: English
Published: Public Library of Science (PLoS) 2025-01-01
Series: PLoS ONE
Online Access: https://doi.org/10.1371/journal.pone.0326711