Feature fusion-based collaborative learning for knowledge distillation
Deep neural networks have achieved great success in a variety of applications, such as self-driving cars and intelligent robotics. Meanwhile, knowledge distillation has received increasing attention as an effective model compression technique for training very efficient deep models. The performanc...
Main Authors: Yiting Li, Liyuan Sun, Jianping Gou, Lan Du, Weihua Ou
Format: Article
Language: English
Published: Wiley, 2021-11-01
Series: International Journal of Distributed Sensor Networks
Online Access: https://doi.org/10.1177/15501477211057037
Similar Items
- A Multi-Robot Collaborative Exploration Method Based on Deep Reinforcement Learning and Knowledge Distillation
  by: Rui Wang, et al. Published: (2025-01-01)
- Type 2 diabetes prediction method based on dual-teacher knowledge distillation and feature enhancement
  by: Jian Zhao, et al. Published: (2025-01-01)
- Adversarial examples defense method based on multi-dimensional feature maps knowledge distillation
  by: Baolin QIU, et al. Published: (2022-04-01)
- Real world federated learning with a knowledge distilled transformer for cardiac CT imaging
  by: Malte Tölle, et al. Published: (2025-02-01)
- Predicting Subsurface Layer Thickness and Seismic Wave Velocity Using Deep Learning: Knowledge Distillation Approach
  by: Amir Moslemi, et al. Published: (2025-01-01)