Disturbance‐Aware On‐Chip Training with Mitigation Schemes for Massively Parallel Computing in Analog Deep Learning Accelerator
Abstract: On‐chip training in analog in‐memory computing (AIMC) holds great promise for reducing data latency and enabling user‐specific learning. However, analog synaptic devices face significant challenges, particularly during parallel weight updates in crossbar arrays, where non‐uniform programmin...
| Main Authors: | Jaehyeon Kang, Jongun Won, Narae Han, Sangjun Hong, Jee‐Eun Yang, Sangwook Kim, Sangbum Kim |
|---|---|
| Format: | Article |
| Language: | English |
| Published: | Wiley, 2025-06-01 |
| Series: | Advanced Science |
| Online Access: | https://doi.org/10.1002/advs.202417635 |
Similar Items
- Enabling Selective and Tunable Weight Updates in All-InGaZnO 3-Transistor 1-Capacitor Synaptic Circuits for On-Chip Training
  by: Minseung Kang, et al. Published: (2025-01-01)
- Low-Power a-IGZO TFT Emission Driver With Shoot-Through Current-Free QB Control Block
  by: Won-Been Jeong, et al. Published: (2025-01-01)
- Performance Improvement of Vertical Channel Indium–Gallium–Zinc Oxide Thin-Film Transistors Using Porous MXene Electrode
  by: Wanqiang Fu, et al. Published: (2025-05-01)
- Low Power Emission Pulse Generation Circuit Based on n-Type Amorphous In-Ga-Zn-Oxide Transistors for Active-Matrix Organic Light-Emitting Diode Displays
  by: Min-Kyu Chang, et al. Published: (2024-10-01)
- Mechanism of Threshold Voltage Instability in Double Gate α-IGZO Nanosheet TFT Under Bias and Temperature Stress
  by: Muhammad Aslam, et al. Published: (2024-01-01)