Authors: Devnath, Joydeep Kumar; Surana, Neelam; Mekie, Joycee
Accessioned/Available: 2025-08-31
Date of issue: 2020-01-01
ISBN: 9781728133201
Scopus ID: 2-s2.0-85109312420
URI: http://repository.iitgn.ac.in/handle/IITG2025/25703
Title: A low-voltage split memory architecture for binary neural networks
Type: Conference Paper
Keywords: Approximate Memory; BER; BNN; CNN; Neural network; Quantization; SRAM

Abstract: This paper performs an in-depth study of the error resiliency of Neural Networks (NNs). Our investigation has resulted in two important findings. First, we found that Binary Neural Networks (BNNs) are more error-tolerant than 32-bit NNs. Second, in BNNs the network accuracy is more sensitive to errors in Batch Normalization Parameters (BNPs) than to errors in the binary weights. A detailed discussion of these findings is presented in the paper. Based on them, we propose a split memory architecture for low-power BNNs, suitable for IoT devices. In the proposed split memory architecture, weights are stored in area-efficient 6T SRAM, and BNPs are stored in robust 12T SRAM. The proposed split memory architecture for BNNs, synthesized in UMC 28 nm, is highly energy efficient: Vmin (the minimum operating voltage) can be reduced to 0.36 V, 0.52 V, and 0.52 V for the MNIST, CIFAR-10, and ImageNet datasets respectively, with an accuracy drop of less than 1%.
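The abstract's second finding — that BNN accuracy is more sensitive to errors in BNPs than in binary weights — can be motivated by comparing the worst-case effect of a single stored-bit error on each parameter type. The sketch below is not from the paper; it is a minimal illustration, assuming binary weights in {-1, +1} and BNPs stored as IEEE-754 float32, of why one bit flip perturbs a float32 BNP far more than a binary weight:

```python
import struct

def flip_bit_f32(x: float, bit: int) -> float:
    """Flip one bit (0 = mantissa LSB, 31 = sign) in the IEEE-754
    float32 encoding of x and return the perturbed value."""
    (i,) = struct.unpack("<I", struct.pack("<f", x))
    (y,) = struct.unpack("<f", struct.pack("<I", i ^ (1 << bit)))
    return y

# A binary weight lives in {-1, +1}: a single stored-bit error can at
# worst flip its sign, so the perturbation magnitude is bounded by 2.
w = 1
w_err = -w                      # worst case: sign flip
assert abs(w_err - w) == 2

# A BNP (e.g. a batch-norm scale, hypothetical value 2.0) is a full
# float32: an error in an exponent bit (bits 23..30) rescales it by a
# power of two, changing it by orders of magnitude.
gamma = 2.0
gamma_err = flip_bit_f32(gamma, 26)   # flip one exponent bit
print(gamma_err)                      # 512.0 -- a 256x blow-up
```

The bounded, sign-only error on a weight is exactly what makes it safe to keep weights in the less robust (but denser) 6T SRAM, while the unbounded exponent-bit blow-up on a BNP motivates storing BNPs in the robust 12T SRAM.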