Global Structural Knowledge Distillation for Semantic Segmentation
Knowledge distillation (KD) has become a cornerstone for compressing deep neural networks, allowing a smaller student model to learn from a larger teacher model. In the context of semantic segmentation, traditional KD methods primarily focus on pixel-level feature alignment, where the student model...
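The pixel-level feature alignment the abstract mentions is commonly realized as a per-pixel distillation loss between softened teacher and student class distributions. The sketch below is a minimal, framework-free illustration of that idea (not the paper's actual method): it computes the temperature-softened KL divergence averaged over all pixels, with NumPy arrays standing in for segmentation logits of shape (N, C, H, W). The temperature value and the `T^2` scaling follow the standard KD convention.

```python
import numpy as np

def _softmax(logits, T):
    # Temperature-softened softmax over the class axis (axis=1),
    # computed in a numerically stable way.
    z = logits / T
    z = z - z.max(axis=1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=1, keepdims=True)

def pixelwise_kd_loss(student_logits, teacher_logits, T=4.0):
    """Average per-pixel KL(teacher || student) between softened class
    distributions. Both logit arrays have shape (N, C, H, W)."""
    p_t = _softmax(teacher_logits, T)
    p_s = _softmax(student_logits, T)
    # KL divergence summed over classes, then averaged over batch and pixels.
    kl = (p_t * (np.log(p_t + 1e-8) - np.log(p_s + 1e-8))).sum(axis=1)
    return float(kl.mean()) * T * T  # T^2 rescaling, as in standard KD

# Usage sketch: 19 classes (e.g. Cityscapes), a 4x4 spatial grid.
rng = np.random.default_rng(0)
student = rng.normal(size=(2, 19, 4, 4))
teacher = rng.normal(size=(2, 19, 4, 4))
loss = pixelwise_kd_loss(student, teacher)
```

A global structural approach, by contrast, would supplement or replace this per-pixel term with relations computed across the whole feature map, which the abstract argues pixel-level alignment alone misses.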
| Main Authors: | Hyejin Park, Keonhee Ahn, Hyesong Choi, Dongbo Min |
|---|---|
| Format: | Article |
| Language: | English |
| Published: | IEEE, 2025-01-01 |
| Series: | IEEE Access |
| Online Access: | https://ieeexplore.ieee.org/document/11018413/ |
Similar Items
- Semantic Segmentation-Driven Knowledge Distillation-Based Infrared Visible Image Fusion Framework
  by: Xingshuo Wang
  Published: (2025-01-01)
- Pseudo Multi-Modal Approach to LiDAR Semantic Segmentation
  by: Kyungmin Kim
  Published: (2024-12-01)
- Logitwise Distillation Network: Improving Knowledge Distillation via Introducing Sample Confidence
  by: Teng Shen, et al.
  Published: (2025-02-01)
- Cross-modal unsupervised domain adaptation for 3D semantic segmentation via multi-scale fusion-then-distillation
  by: Maomao Sun, et al.
  Published: (2025-08-01)
- Class and Data-Incremental Learning Framework for Baggage Threat Segmentation via Knowledge Distillation
  by: Ammara Nasim, et al.
  Published: (2025-01-01)