Mixed precision quantization based on information entropy
Abstract: Mixed precision quantization is a technique that markedly reduces a model's computational and memory demands by lowering the bit width of its parameters. However, in practical applications, an improper allocation strategy can fail to leverage the advantages of quantizat...
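The abstract describes assigning different bit widths across a model, with allocation guided by information entropy. The paper's exact method is not shown in this record, so the sketch below is only an illustration of the general idea: compute the Shannon entropy of each layer's weight histogram and give higher-entropy layers more bits. The layer names, the 4-to-8-bit range, and the linear mapping are all assumptions for the example, not details from the paper.

```python
import numpy as np

def layer_entropy(weights, bins=256):
    # Shannon entropy (in bits) of the layer's weight histogram.
    hist, _ = np.histogram(weights, bins=bins)
    p = hist / hist.sum()
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

def allocate_bits(layers, low=4, high=8):
    # Illustrative allocation: linearly map each layer's entropy
    # onto [low, high] bits, so more "informative" layers keep
    # higher precision. (Assumed scheme, not the paper's.)
    ents = {name: layer_entropy(w) for name, w in layers.items()}
    lo_e, hi_e = min(ents.values()), max(ents.values())
    span = (hi_e - lo_e) or 1.0
    return {name: int(round(low + (e - lo_e) / span * (high - low)))
            for name, e in ents.items()}

rng = np.random.default_rng(0)
layers = {
    # Uniform weights spread probability mass evenly -> high entropy.
    "conv1": rng.uniform(-1.0, 1.0, 10_000),
    # Gaussian weights concentrate mass near zero -> lower entropy.
    "fc": rng.normal(0.0, 1.0, 10_000),
}
print(allocate_bits(layers))
```

Here the uniformly distributed layer receives more bits than the Gaussian one, since its histogram is closer to maximum entropy for the chosen bin count.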
| Main Authors: | Ting Qin, Zhao Li, Jiaqi Zhao, Yuting Yan, Yafei Du |
|---|---|
| Format: | Article |
| Language: | English |
| Published: | Nature Portfolio, 2025-04-01 |
| Series: | Scientific Reports |
| Online Access: | https://doi.org/10.1038/s41598-025-91684-8 |
Similar Items
- A Novel Mixed-Precision Quantization Approach for CNNs
  by: Dan Wu, et al.
  Published: (2025-01-01)
- Fully Quantized Neural Networks for Audio Source Separation
  by: Elad Cohen, et al.
  Published: (2024-01-01)
- Hierarchical Mixed-Precision Post-Training Quantization for SAR Ship Detection Networks
  by: Hang Wei, et al.
  Published: (2024-10-01)
- Optimizing Deep Learning Models for Resource-Constrained Environments With Cluster-Quantized Knowledge Distillation
  by: Niaz Ashraf Khan, et al.
  Published: (2025-05-01)
- HLQ: Hardware-Friendly Logarithmic Quantization Aware Training for Power-Efficient Low-Precision CNN Models
  by: Dahun Choi, et al.
  Published: (2024-01-01)