NMS-KSD: Efficient Knowledge Distillation for Dense Object Detection via Non-Maximum Suppression and Feature Storage
Recently, many studies have proposed knowledge distillation (KD) frameworks for object detection. However, these frameworks do not account for the inefficiency introduced by the teacher detector, i.e., the computational cost incurred when passing input data...
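The title and abstract name non-maximum suppression (NMS) as a core component of the proposed framework. For readers unfamiliar with the operation, the following is a minimal sketch of standard greedy NMS; the record does not specify which NMS variant NMS-KSD uses, so this is only the textbook baseline, not the paper's implementation.

```python
import numpy as np

def nms(boxes, scores, iou_thresh=0.5):
    """Greedy non-maximum suppression.

    boxes:  (N, 4) array of [x1, y1, x2, y2] corners.
    scores: (N,) array of confidence scores.
    Returns the indices of the boxes kept, highest score first.
    """
    x1, y1, x2, y2 = boxes[:, 0], boxes[:, 1], boxes[:, 2], boxes[:, 3]
    areas = (x2 - x1) * (y2 - y1)
    order = scores.argsort()[::-1]  # candidate indices, descending score
    keep = []
    while order.size > 0:
        i = order[0]          # highest-scoring remaining box survives
        keep.append(int(i))
        # Intersection of box i with every other remaining candidate.
        xx1 = np.maximum(x1[i], x1[order[1:]])
        yy1 = np.maximum(y1[i], y1[order[1:]])
        xx2 = np.minimum(x2[i], x2[order[1:]])
        yy2 = np.minimum(y2[i], y2[order[1:]])
        inter = np.maximum(0.0, xx2 - xx1) * np.maximum(0.0, yy2 - yy1)
        iou = inter / (areas[i] + areas[order[1:]] - inter)
        # Suppress candidates overlapping box i beyond the threshold.
        order = order[1:][iou <= iou_thresh]
    return keep
```

For example, two heavily overlapping boxes collapse to the higher-scoring one, while a distant box survives: with boxes `[0,0,10,10]`, `[1,1,11,11]`, `[20,20,30,30]` and scores `0.9, 0.8, 0.7`, the result is `[0, 2]`.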
Saved in:
| Main Authors: | Suho Son, Byung Cheol Song |
|---|---|
| Format: | Article |
| Language: | English |
| Published: | IEEE, 2025-01-01 |
| Series: | IEEE Access |
| Subjects: | |
| Online Access: | https://ieeexplore.ieee.org/document/10988601/ |
Similar Items
- A Survey of Dense Object Detection Methods Based on Deep Learning
  by: Yang Zhou, et al. Published: (2024-01-01)
- Non-Maximum Suppression for Rotated Object Detection During Merging Slices of High-Resolution Images
  by: Lei Ge, et al. Published: (2024-01-01)
- Autocorrelation Matrix Knowledge Distillation: A Task-Specific Distillation Method for BERT Models
  by: Kai Zhang, et al. Published: (2024-10-01)
- Multi-convolutional neural network brain image denoising study based on feature distillation learning and dense residual attention
  by: Huimin Qu, et al. Published: (2025-03-01)
- Category semantic and global relation distillation for object detection
  by: Yanpeng LIANG, et al. Published: (2025-04-01)