An In-Memory-Computing Binary Neural Network Architecture With In-Memory Batch Normalization

This paper describes an in-memory computing architecture that uses full-precision computation for the first and last layers of a neural network and binary weights and input activations for the intermediate layers. This approach presents an efficient and effective solution for...
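Only the abstract's high-level description is available in this record, but the layer arrangement it names can be illustrated with a minimal sketch. The Python below assumes a simple fully connected network: the first and last layers use full-precision multiply-accumulates, the intermediate layers binarize both weights and input activations, and batch normalization is modeled as an ordinary inference-time scale-and-shift (the paper performs this step in memory). All function names, layer sizes, and parameters here are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def binarize(x):
    """Binarize to {-1, +1} via the sign function (0 maps to +1)."""
    return np.where(x >= 0, 1.0, -1.0)

def bnn_forward(x, weights, bn_params):
    """Forward pass: full-precision first/last layers, binary intermediate
    layers, batch normalization applied after each layer (inference form)."""
    num_layers = len(weights)
    h = x
    for i, w in enumerate(weights):
        if i == 0 or i == num_layers - 1:
            # Full-precision multiply-accumulate for first and last layers.
            h = h @ w
        else:
            # Binary weights and binary input activations: in hardware the
            # MAC reduces to XNOR/popcount-style arithmetic.
            h = binarize(h) @ binarize(w)
        # Batch normalization as a scale-and-shift (toy stand-in for the
        # paper's in-memory batch normalization).
        gamma, beta, mean, var = bn_params[i]
        h = gamma * (h - mean) / np.sqrt(var + 1e-5) + beta
    return h

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    dims = [16, 32, 32, 10]  # hypothetical layer sizes
    weights = [rng.standard_normal((a, b)) for a, b in zip(dims[:-1], dims[1:])]
    bn_params = [(1.0, 0.0, 0.0, 1.0) for _ in weights]
    x = rng.standard_normal((4, dims[0]))  # batch of 4 inputs
    print(bnn_forward(x, weights, bn_params).shape)  # (4, 10)
```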

Bibliographic Details
Main Authors: Prathamesh Prashant Rege, Ming Yin, Sanjay Parihar, Joseph Versaggi, Shashank Nemawarkar
Format: Article
Language: English
Published: IEEE 2024-01-01
Series: IEEE Access
Online Access: https://ieeexplore.ieee.org/document/10639963/