Distilling Diverse Knowledge for Deep Ensemble Learning
Bidirectional knowledge distillation improves performance by sharing knowledge among networks while multiple networks are trained jointly. Performance is further improved by using an ensemble of the trained networks during inference. However, the performance improvement achieved by...
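The abstract describes two mechanisms: bidirectional (mutual) distillation losses during joint training, and averaging the networks' predictions at inference. A minimal sketch of both, assuming a deep-mutual-learning-style KL loss between softened softmax outputs (function names and the temperature `T` are illustrative, not from the paper):

```python
import numpy as np

def softmax(logits, T=1.0):
    # Temperature-scaled softmax; T > 1 softens the distribution.
    z = np.asarray(logits, dtype=float) / T
    z = z - z.max(axis=-1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def kl_divergence(p, q, eps=1e-12):
    # KL(p || q) between two probability vectors.
    return float(np.sum(p * (np.log(p + eps) - np.log(q + eps))))

def mutual_distillation_losses(logits_a, logits_b, T=2.0):
    # Bidirectional KD: each network treats the other's softened
    # prediction as a teacher target; each term is added to that
    # network's usual cross-entropy loss during joint training.
    pa, pb = softmax(logits_a, T), softmax(logits_b, T)
    return kl_divergence(pb, pa), kl_divergence(pa, pb)

def ensemble_predict(logits_list):
    # Inference-time ensemble: average the softmax outputs
    # of all trained networks, then take the argmax.
    probs = np.mean([softmax(l) for l in logits_list], axis=0)
    return int(np.argmax(probs))
```

This is only a sketch of the generic technique the abstract names; the paper's specific contribution (how diverse knowledge is distilled) is behind the linked full text.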
| Main Authors: | , , , |
|---|---|
| Format: | Article |
| Language: | English |
| Published: | IEEE, 2025-01-01 |
| Series: | IEEE Access |
| Subjects: | |
| Online Access: | https://ieeexplore.ieee.org/document/11028994/ |