Enhancing Evolution of Neural Network Assisted Compact Model Through Continual Learning
| Main Authors: | , , , , , , |
|---|---|
| Format: | Article |
| Language: | English |
| Published: | IEEE, 2025-01-01 |
| Series: | IEEE Journal of the Electron Devices Society |
| Subjects: | |
| Online Access: | https://ieeexplore.ieee.org/document/10752563/ |
| Summary: | Continual learning (CL) in neural network assisted compact modeling (NNCM) provides a sustainable approach to model evolution in novel device development, since knowledge transfer from a prior compact model can reduce model training cost and shorten the development cycle. In this work, CL methods for NNCM are proposed and verified on a practical case: Silicon-on-nothing (SON) MOSFET modeling. Continuously generated samples, expanded in range and dimension and covering additional physical effects, are used to evaluate the CL approaches. In this process, the continual-training network and the block-modular network successfully expand the device modeling parameters, showing excellent knowledge transfer from the prior model during evolution. The evolved model accurately predicts device and circuit characteristics throughout development, reducing prediction errors from above 20% to less than 1%. |
| ISSN: | 2168-6734 |
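
The record's summary describes a block-modular continual-learning scheme in which a new network block extends a prior compact model to added input dimensions, but it gives no implementation details. The following is a minimal PyTorch sketch of that general idea under stated assumptions: the module names, layer sizes, input meanings (Vgs, Vds, an added geometry parameter), and the additive way the new block is combined with the frozen prior model are all illustrative choices, not details from the paper.

```python
import torch
import torch.nn as nn

class PriorCompactModel(nn.Module):
    """Hypothetical prior NN compact model: maps (Vgs, Vds) to a drain current."""
    def __init__(self, in_dim=2, hidden=32):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(in_dim, hidden), nn.Tanh(),
            nn.Linear(hidden, hidden), nn.Tanh(),
            nn.Linear(hidden, 1),
        )

    def forward(self, x):
        return self.net(x)

class BlockModularModel(nn.Module):
    """Assumed block-modular extension: the prior model is frozen and a new block
    handles the expanded input dimension (e.g. an added device parameter),
    contributing a correction on top of the prior prediction."""
    def __init__(self, prior: PriorCompactModel, new_in_dim=3, hidden=32):
        super().__init__()
        self.prior = prior
        for p in self.prior.parameters():   # keep prior knowledge fixed
            p.requires_grad = False
        self.new_block = nn.Sequential(
            nn.Linear(new_in_dim, hidden), nn.Tanh(),
            nn.Linear(hidden, 1),
        )

    def forward(self, x_new):
        # The first two columns are the inputs the prior model was trained on.
        base = self.prior(x_new[:, :2])
        correction = self.new_block(x_new)
        return base + correction

# Usage sketch: train only the new block on newly generated samples.
prior = PriorCompactModel()
model = BlockModularModel(prior)
opt = torch.optim.Adam(model.new_block.parameters(), lr=1e-3)
x = torch.rand(64, 3)   # placeholder samples: (Vgs, Vds, new parameter)
y = torch.rand(64, 1)   # placeholder targets, e.g. drain current
loss = nn.functional.mse_loss(model(x), y)
opt.zero_grad(); loss.backward(); opt.step()
```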