Progressive Bitwidth Assignment Approaches for Efficient Capsule Networks Quantization
Capsule Networks (CapsNets) are a class of neural network architectures that model hierarchical relationships more accurately thanks to their capsule structure and dynamic routing algorithms. However, this high accuracy comes at the cost of significant memory and computational resources...
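The abstract's theme of assigning bitwidths for quantization can be illustrated with a minimal sketch. This is *not* the paper's progressive assignment method, only generic per-layer symmetric uniform quantization; the layer names and bitwidths below are hypothetical placeholders:

```python
import numpy as np

def uniform_quantize(x, bits):
    """Symmetric per-tensor uniform quantization to the given bitwidth,
    returned in dequantized (float) form."""
    levels = 2 ** (bits - 1) - 1              # e.g. 127 representable steps at 8 bits
    scale = np.max(np.abs(x)) / levels        # map the largest magnitude to the top level
    q = np.round(x / scale).clip(-levels, levels)
    return q * scale

# Hypothetical layers of a CapsNet-like model with different bitwidths per layer:
# the idea is that some layers tolerate aggressive quantization better than others.
rng = np.random.default_rng(0)
layers = {
    "conv1": rng.standard_normal(1000),
    "primary_caps": rng.standard_normal(1000),
    "digit_caps": rng.standard_normal(1000),
}
bitwidths = {"conv1": 4, "primary_caps": 6, "digit_caps": 8}

for name, w in layers.items():
    w_q = uniform_quantize(w, bitwidths[name])
    mse = np.mean((w - w_q) ** 2)
    print(f"{name}: {bitwidths[name]}-bit, MSE {mse:.2e}")
```

Lower bitwidths shrink memory roughly proportionally but raise quantization error, which is the accuracy/resource trade-off the record's abstract refers to.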
Main Authors: Mohsen Raji, Amir Ghazizadeh Ahsaei, Kimia Soroush, Behnam Ghavami

Format: Article

Language: English

Published: IEEE, 2025-01-01

Series: IEEE Access

Online Access: https://ieeexplore.ieee.org/document/10854429/
Similar Items
- Subset-Selection Weight Post-Training Quantization Method for Learned Image Compression Task
  by: Jinru Yang, et al. Published: (2025-01-01)
- Enhanced Vector Quantization for Embedded Machine Learning: A Post-Training Approach With Incremental Clustering
  by: Thommas K. S. Flores, et al. Published: (2025-01-01)
- Uncertainty-based quantization method for stable training of binary neural networks
  by: A.V. Trusov, et al. Published: (2024-08-01)
- Enhancing Image-Based JPEG Compression: ML-Driven Quantization via DCT Feature Clustering
  by: Shahrzad Sabzavi, et al. Published: (2025-01-01)
- Uniform Quantization for Multi-Antenna Amplify–Quantize–Forward Relay
  by: Gangsan Jeong, et al. Published: (2025-01-01)