A dynamic dropout self-distillation method for object segmentation
Abstract In knowledge distillation, a better teacher does not necessarily produce a better student because of the capacity mismatch between the two models. This is especially true in pixel-level object segmentation, where some challenging pixels are difficult for the student model to learn. Even if the student model l...
| Main Authors: | Lei Chen, Tieyong Cao, Yunfei Zheng, Yang Wang, Bo Zhang, Jibin Yang |
|---|---|
| Format: | Article |
| Language: | English |
| Published: | Springer, 2024-12-01 |
| Series: | Complex & Intelligent Systems |
| Online Access: | https://doi.org/10.1007/s40747-024-01705-8 |
Similar Items
- Cognitive motivational variables and dropout intention as precursors of university dropout
  by: Yaranay López-Angulo, et al.
  Published: (2024-11-01)
- Ethnic Minorities’ Dropout Decisions in Higher Education
  by: Souksakhone Sengsouliya, et al.
  Published: (2023-04-01)
- Relationship between self-efficacy and university dropout: a systematic review
  by: Ana B. Bernardo, et al.
  Published: (2025-07-01)
- Knowledge Distillation in Object Detection for Resource-Constrained Edge Computing
  by: Arief Setyanto, et al.
  Published: (2025-01-01)
- Dropout in adult education as a phenomenon of fit
  by: Veronika Thalhammer, et al.
  Published: (2022-10-01)