SpinDrop: Dropout-Based Bayesian Binary Neural Networks With Spintronic Implementation. Ahmed, S. T., Danouchi, K., Münch, C., Prenat, G., Anghel, L., & Tahoori, M. B. IEEE Journal on Emerging and Selected Topics in Circuits and Systems, 13(1):150–164, March 2023.
Neural Networks (NNs) provide an effective solution in numerous application domains, including autonomous driving and medicine. Nevertheless, NN predictions can be incorrect if the input sample is outside the training distribution or contaminated by noise. Consequently, quantifying the uncertainty of an NN prediction allows the system to make more insightful decisions by avoiding blind predictions, which makes uncertainty quantification crucial for a variety of applications, particularly safety-critical ones. Bayesian NNs (BayNNs) using Dropout-based approximation provide a systematic approach for estimating the uncertainty of predictions. Despite this merit, BayNNs are not suitable for implementation on embedded devices and cannot meet the high-performance demands of certain applications. Computation-in-memory (CiM) architectures with emerging non-volatile memories (NVMs) are strong candidates for high-performance, low-power acceleration of BayNNs in hardware. Among NVMs, Magnetic Tunnel Junctions (MTJs) offer many benefits, but they also suffer from various non-idealities and limited bit-level resolution. As a result, binarizing BayNNs is an attractive option: a binarized BayNN can be implemented directly in a CiM architecture, achieving the benefits of both CiM and BayNNs at the same time. Conventional in-memory hardware implementations target conventional NNs, which can only make point predictions and account for neither device nor input uncertainty, thus reducing both reliability and performance. In this paper, we propose for the first time Binary Bayesian NNs (BinBayNN) with an end-to-end approach (from the algorithmic level to the device level) for their implementation. Our approach exploits the inherent stochastic properties of MTJs as a feature to implement Dropout-based Bayesian Neural Networks. We provide an extensive evaluation of our approach, from the device level up to the algorithmic level, on various benchmark datasets.
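For context, the Dropout-based Bayesian approximation the abstract builds on is conventional Monte Carlo dropout (Gal & Ghahramani): dropout is kept active at inference, several stochastic forward passes are averaged, and the spread across passes serves as the uncertainty estimate. The sketch below is a minimal software illustration of that baseline only, not the paper's binarized or spintronic implementation; the layer sizes, dropout rate, and sample count are illustrative assumptions.

import torch
import torch.nn as nn

# Illustrative network; the paper realizes the dropout sampling with
# stochastic MTJ switching in a CiM array rather than software RNG.
model = nn.Sequential(
    nn.Linear(784, 256),
    nn.ReLU(),
    nn.Dropout(p=0.2),   # kept active at inference to sample from the
    nn.Linear(256, 10),  # approximate Bayesian posterior
)

def mc_dropout_predict(model, x, n_samples=20):
    """Average n_samples stochastic forward passes; the variability
    across samples provides the uncertainty estimate."""
    model.train()  # keep dropout stochastic (model.eval() would disable it)
    with torch.no_grad():
        probs = torch.stack(
            [torch.softmax(model(x), dim=-1) for _ in range(n_samples)]
        )
    mean = probs.mean(dim=0)  # predictive distribution per input
    # Predictive entropy as a scalar uncertainty measure per input.
    entropy = -(mean * mean.clamp_min(1e-12).log()).sum(dim=-1)
    return mean, entropy

x = torch.randn(4, 784)  # dummy batch
mean, uncertainty = mc_dropout_predict(model, x)

High entropy flags inputs (e.g., out-of-distribution or noisy samples) on which the prediction should not be trusted blindly, which is the behavior the abstract motivates.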
@article{ahmed_spindrop_2023,
	title = {{SpinDrop}: {Dropout}-{Based} {Bayesian} {Binary} {Neural} {Networks} {With} {Spintronic} {Implementation}},
	volume = {13},
	issn = {2156-3365},
	shorttitle = {{SpinDrop}},
	doi = {10.1109/JETCAS.2023.3242146},
	abstract = {Neural Networks (NNs) provide an effective solution in numerous application domains, including autonomous driving and medicine. Nevertheless, NN predictions can be incorrect if the input sample is outside the training distribution or contaminated by noise. Consequently, quantifying the uncertainty of an NN prediction allows the system to make more insightful decisions by avoiding blind predictions, which makes uncertainty quantification crucial for a variety of applications, particularly safety-critical ones. Bayesian NNs (BayNNs) using Dropout-based approximation provide a systematic approach for estimating the uncertainty of predictions. Despite this merit, BayNNs are not suitable for implementation on embedded devices and cannot meet the high-performance demands of certain applications. Computation-in-memory (CiM) architectures with emerging non-volatile memories (NVMs) are strong candidates for high-performance, low-power acceleration of BayNNs in hardware. Among NVMs, Magnetic Tunnel Junctions (MTJs) offer many benefits, but they also suffer from various non-idealities and limited bit-level resolution. As a result, binarizing BayNNs is an attractive option: a binarized BayNN can be implemented directly in a CiM architecture, achieving the benefits of both CiM and BayNNs at the same time. Conventional in-memory hardware implementations target conventional NNs, which can only make point predictions and account for neither device nor input uncertainty, thus reducing both reliability and performance. In this paper, we propose for the first time Binary Bayesian NNs (BinBayNN) with an end-to-end approach (from the algorithmic level to the device level) for their implementation. Our approach exploits the inherent stochastic properties of MTJs as a feature to implement Dropout-based Bayesian Neural Networks. We provide an extensive evaluation of our approach, from the device level up to the algorithmic level, on various benchmark datasets.},
	number = {1},
	journal = {IEEE Journal on Emerging and Selected Topics in Circuits and Systems},
	author = {Ahmed, Soyed Tuhin and Danouchi, Kamal and Münch, Christopher and Prenat, Guillaume and Anghel, Lorena and Tahoori, Mehdi B.},
	month = mar,
	year = {2023},
	keywords = {Artificial neural networks, Bayes methods, Bayesian neural network, Magnetic tunneling, Neural networks, Resistance, Switches, Uncertainty, binary bayesian neural network, binary neural network, monte carlo dropout, spintronic},
	pages = {150--164},
}
