A dynamic dropout self-distillation method for object segmentation

Abstract In knowledge distillation, better teachers do not always produce better students, owing to the capacity mismatch between the two models. This is especially true in pixel-level object segmentation, where some challenging pixels are difficult for the student model to learn. Even if the student model l...

Bibliographic Details
Main Authors: Lei Chen, Tieyong Cao, Yunfei Zheng, Yang Wang, Bo Zhang, Jibin Yang
Format: Article
Language: English
Published: Springer 2024-12-01
Series: Complex & Intelligent Systems
Online Access:https://doi.org/10.1007/s40747-024-01705-8