A low-voltage split memory architecture for binary neural networks
Source
Proceedings IEEE International Symposium on Circuits and Systems
ISSN
0271-4310
Date Issued
2020-01-01
Author(s)
Volume
2020-October
Abstract
This paper performs an in-depth study of the error resiliency of Neural Networks (NNs). Our investigation has resulted in two important findings. First, we found that Binary Neural Networks (BNNs) are more error-tolerant than 32-bit NNs. Second, in BNNs the network accuracy is more sensitive to errors in the Batch Normalization Parameters (BNPs) than to errors in the binary weights. A detailed discussion of these findings is presented in the paper. Based on them, we propose a split memory architecture for low-power BNNs, suitable for IoT devices. In the proposed split memory architecture, the weights are stored in area-efficient 6T SRAM, and the BNPs are stored in robust 12T SRAM. The proposed split memory architecture for BNNs, synthesized in UMC 28nm, is highly energy efficient, as Vmin (the minimum operating voltage) can be reduced to 0.36 V, 0.52 V, and 0.52 V for the MNIST, CIFAR10, and ImageNet datasets, respectively, with an accuracy drop of less than 1%.
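
The paper's own fault-injection setup is not reproduced here, but the following NumPy sketch illustrates the kind of experiment behind the first two findings: random bit flips are injected at a given bit-error rate (BER) into the 1-bit binarized weights and, separately, into the 32-bit batch-normalization parameters of a toy fully connected BNN layer, and the resulting output disturbance is compared. All layer sizes, parameter values, and function names are illustrative assumptions, not the authors' code.

    import numpy as np

    rng = np.random.default_rng(0)

    def flip_float32_bits(x, ber):
        """Flip each bit of the IEEE-754 float32 representation with
        probability `ber`; models random bit errors in an SRAM holding
        32-bit batch-normalization parameters."""
        bits = x.astype(np.float32).view(np.uint32)
        masks = np.zeros_like(bits)
        for b in range(32):
            flips = rng.random(bits.shape) < ber
            masks |= flips.astype(np.uint32) << b
        return (bits ^ masks).view(np.float32)

    def flip_binary_weights(w, ber):
        """Flip the sign of each {-1, +1} weight with probability `ber`;
        each binary weight occupies a single SRAM bit."""
        flips = rng.random(w.shape) < ber
        return np.where(flips, -w, w)

    # Toy BNN layer with hypothetical sizes and parameter values:
    # y = sign(gamma * (x @ W - mu) / sigma + beta)
    n_in, n_out, n_samples = 256, 64, 1000
    W = np.sign(rng.standard_normal((n_in, n_out)))      # binarized weights
    x = np.sign(rng.standard_normal((n_samples, n_in)))  # binarized inputs
    gamma = np.abs(rng.standard_normal(n_out)) + 0.5     # arbitrary BNPs
    beta = rng.standard_normal(n_out)
    mu = rng.standard_normal(n_out)
    sigma = np.abs(rng.standard_normal(n_out)) + 1.0

    def forward(x, W, gamma, beta, mu, sigma):
        return np.sign(gamma * (x @ W - mu) / sigma + beta)

    ref = forward(x, W, gamma, beta, mu, sigma)
    for ber in (1e-4, 1e-3, 1e-2):
        y_w = forward(x, flip_binary_weights(W, ber),
                      gamma, beta, mu, sigma)
        y_bnp = forward(x, W,
                        flip_float32_bits(gamma, ber),
                        flip_float32_bits(beta, ber),
                        flip_float32_bits(mu, ber),
                        flip_float32_bits(sigma, ber))
        print(f"BER={ber:.0e}  weight errors: {np.mean(y_w != ref):.4f}  "
              f"BNP errors: {np.mean(y_bnp != ref):.4f} output mismatch")

In a sketch of this kind, a flipped binary weight perturbs a single ±1 term of one dot product, whereas a flipped high-order bit of a 32-bit BNP can shift the activation threshold of an entire output channel, which is consistent with the abstract's observation that accuracy is more sensitive to BNP errors and motivates protecting only the BNPs with robust 12T SRAM.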
Subjects
Approximate Memory | BER | BNN | CNN | Neural network | Quantization | SRAM
