NeuAFG: Neural Network-Based Analog Function Generator for Inference in CIM
| Main Authors: | , , , , , , , , , , , |
|---|---|
| Format: | Article |
| Language: | English |
| Published: | IEEE, 2025-01-01 |
| Series: | IEEE Access |
| Subjects: | |
| Online Access: | https://ieeexplore.ieee.org/document/10856149/ |
| Summary: | Resistive Random-Access Memory (RRAM)-based Compute-in-Memory (CIM) architectures offer promising solutions for energy-efficient deep neural network (DNN) inference. However, conventional CIM accelerators suffer from high energy consumption due to frequent analog-to-digital (AD) and digital-to-analog (DA) signal conversions, especially when computing nonlinear activation functions (NAFs). This paper presents NeuAFG, a time-domain analog function generator designed to directly compute arbitrary NAFs in the analog domain. NeuAFG utilizes a one-hidden-layer ReLU neural network (ReLUNet) to approximate a variety of activation functions, incorporating two novel optimization algorithms: Low-Discrepancy Search (LDSearch) for efficient parameter initialization and KDE-FocusSampler for focused sampling of regions that are difficult to approximate. The hardware implementation integrates RRAM-based components and employs a Pulse-In-Pulse-Out method for robust time-domain computation. Experimental results demonstrate that NeuAFG reduces energy consumption in mainstream CIM accelerators such as ISAAC and RAELLA by $1.03\times$ to $2.20\times$, with less than 1% accuracy loss in DNN inference tasks. |
| ISSN: | 2169-3536 |
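
To illustrate the core idea named in the abstract (a one-hidden-layer ReLU network approximating a nonlinear activation function), here is a minimal, self-contained sketch. It is not taken from the paper: it fixes the hidden-unit breakpoints on a uniform grid and fits only the output weights by least squares, and it does not reproduce the paper's LDSearch initialization or KDE-FocusSampler sampling; the function names and parameters are illustrative assumptions.

```python
# Illustrative sketch only: fit f(x) = sum_i w_i * ReLU(x - t_i) + c to a target
# activation function (here, sigmoid) on a bounded input range. Breakpoints t_i are
# fixed on a uniform grid; only the output weights are solved, via least squares.
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def fit_relu_net(target_fn, x_min=-4.0, x_max=4.0, n_hidden=16, n_samples=2048):
    """Fit a one-hidden-layer ReLU approximator to target_fn on [x_min, x_max]."""
    x = np.linspace(x_min, x_max, n_samples)
    t = np.linspace(x_min, x_max, n_hidden, endpoint=False)      # fixed breakpoints
    # Design matrix: one ReLU feature per breakpoint, plus a constant bias column.
    phi = np.maximum(x[:, None] - t[None, :], 0.0)
    phi = np.hstack([phi, np.ones((n_samples, 1))])
    coeff, *_ = np.linalg.lstsq(phi, target_fn(x), rcond=None)
    w, c = coeff[:-1], coeff[-1]
    return t, w, c

def relu_net(x, t, w, c):
    """Evaluate the fitted piecewise-linear approximation at points x."""
    return np.maximum(x[:, None] - t[None, :], 0.0) @ w + c

if __name__ == "__main__":
    t, w, c = fit_relu_net(sigmoid)
    x_test = np.linspace(-4.0, 4.0, 501)
    err = np.max(np.abs(relu_net(x_test, t, w, c) - sigmoid(x_test)))
    print(f"max |error| over [-4, 4] with 16 hidden units: {err:.4e}")
```

Because a one-hidden-layer ReLU network is a piecewise-linear function, increasing the number of hidden units (breakpoints) tightens the approximation; the paper's contribution lies in how those parameters are initialized, sampled, and realized in RRAM-based time-domain hardware, which this sketch does not attempt to model.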