Enhancing Evolution of Neural Network Assisted Compact Model Through Continual Learning
Continual learning (CL) in neural-network-assisted compact modeling (NNCM) provides a sustainable approach to model evolution in novel device development, since knowledge transfer from a prior compact model can reduce model training cost and shorten the development cycle. In this work, CL metho...
| Main Authors: | Shuhan Wang, Zheng Zhou, Guihai Yu, Zili Tang, Jinghan Xu, Xiaoyan Liu, Xing Zhang |
|---|---|
| Format: | Article |
| Language: | English |
| Published: | IEEE, 2025-01-01 |
| Series: | IEEE Journal of the Electron Devices Society |
| Online Access: | https://ieeexplore.ieee.org/document/10752563/ |
Similar Items
- A Novel Approach to Continual Knowledge Transfer in Multilingual Neural Machine Translation Using Autoregressive and Non-Autoregressive Models for Indic Languages
  by: Shailashree K. Sheshadri, et al.
  Published: (2025-01-01)
- c-Continuity, c-Compact and c-Separation Axioms via Soft Sets
  by: Gugu Narzary, et al.
  Published: (2024-11-01)
- Net-Compact Hausdorff Topologies and Continuous Multi-Utility Representations for Closed Preorders
  by: Gianni Bosi, et al.
  Published: (2025-03-01)
- Large-Scale Training in Neural Compact Models for Accurate and Adaptable MOSFET Simulation
  by: Chanwoo Park, et al.
  Published: (2024-01-01)
- Impact of mechanical compaction on crop growth and sustainable agriculture
  by: Zijian LONG, et al.
  Published: (2024-06-01)