Exploring the impact of variability in resistance distributions of RRAM on the prediction accuracy of deep learning neural networks
In this work, we explore the use of the resistive random access memory (RRAM) device as a synapse for mimicking the trained weights linking neurons in a deep learning neural network (DNN) (AlexNet). The RRAM devices were fabricated in-house and subjected to 1000 bipolar read-write cycles to measure the resistances recorded for Logic-0 and Logic-1 (we demonstrate the feasibility of achieving eight discrete resistance states in the same device depending on the RESET stop voltage). DNN simulations have been performed to compare the relative error between the output of AlexNet Layer 1 (Convolution) implemented with the standard backpropagation (BP) algorithm trained weights versus the weights that are encoded using the measured resistance distributions from RRAM. The IMAGENET dataset is used for classification purposes here. We focus only on the Layer 1 weights in the AlexNet framework, with 11 × 11 × 96 filter values coded into binary floating point and substituted with the RRAM resistance values corresponding to Logic-0 and Logic-1. The impact of variability in the low and high resistance states of RRAM on the accuracy of image classification is studied by formulating a look-up table (LUT) for the RRAM (from measured I-V data) and comparing the convolution computation output of AlexNet Layer 1 with the standard outputs from the BP-based pre-trained weights. This is one of the first studies dedicated to exploring the impact of RRAM device resistance variability on the prediction accuracy of a convolutional neural network (CNN) on an AlexNet platform through a framework that requires limited actual device switching test data.
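The encoding scheme the abstract describes, where each weight is coded as binary floating point, each bit is stored as an RRAM resistance drawn from the Logic-0 or Logic-1 distribution, and the bit is read back by thresholding, can be sketched in numpy as below. The log-normal resistance distributions, the read threshold, and the single toy filter are illustrative assumptions; the paper itself builds its LUT from measured I-V data for the full 11 × 11 × 96 filter bank.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical resistance distributions (ohms) for the two logic states;
# assumed log-normal here for illustration -- the paper derives the real
# distributions from measured I-V data.
LRS_MEAN, LRS_SIGMA = np.log(1e4), 0.25   # Logic-1: low-resistance state
HRS_MEAN, HRS_SIGMA = np.log(1e6), 0.35   # Logic-0: high-resistance state
R_THRESHOLD = 1e5                          # assumed read threshold between states

def store_and_read_bits(bits):
    """Map each bit to a sampled RRAM resistance, then read it back by
    thresholding; variability flips a bit whenever the sampled resistance
    lands on the wrong side of the threshold."""
    r = np.where(bits == 1,
                 rng.lognormal(LRS_MEAN, LRS_SIGMA, bits.shape),
                 rng.lognormal(HRS_MEAN, HRS_SIGMA, bits.shape))
    return (r < R_THRESHOLD).astype(np.uint32)

def weights_through_rram(w):
    """Round-trip float32 weights through the bit-level RRAM model."""
    packed = w.astype(np.float32).view(np.uint32)          # 32 bits per weight
    bits = (packed[:, None] >> np.arange(32)) & 1          # unpack bit planes
    read = store_and_read_bits(bits)
    repacked = (read << np.arange(32, dtype=np.uint32)).sum(axis=1,
                                                            dtype=np.uint32)
    return repacked.view(np.float32)

# Toy single filter and input patch standing in for AlexNet's Layer 1 bank.
w_ideal = rng.standard_normal(121).astype(np.float32) * 0.05
x = rng.standard_normal(121).astype(np.float32)

w_rram = weights_through_rram(w_ideal)
y_ideal = float(w_ideal @ x)    # output with BP-trained weights
y_rram = float(w_rram @ x)      # output with RRAM-encoded weights
rel_err = abs(y_rram - y_ideal) / abs(y_ideal)
print(f"relative error of convolution output: {rel_err:.3e}")
```

Widening the assumed sigmas until the two distributions overlap is the knob that reproduces the variability study: once high-order exponent bits start to flip, the relative error of the convolution output grows sharply.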
Main Authors: Prabhu, Nagaraj Lakshmana; Loy, Desmond Jia Jun; Dananjaya, Putu Andhita; Lew, Wen Siang; Toh, Eng Huat; Raghavan, Nagarajan
Other Authors: School of Physical and Mathematical Sciences
Format: Article
Language: English
Published: 2021
Subjects: Science::Physics; Convolutional Neural Network; Look-up-table
Online Access: https://hdl.handle.net/10356/148658
Institution: Nanyang Technological University
id |
sg-ntu-dr.10356-148658 |
record_format |
dspace |
spelling |
sg-ntu-dr.10356-148658 (last indexed 2023-02-28T19:55:49Z) |
funding |
Agency for Science, Technology and Research (A*STAR); Economic Development Board (EDB); National Research Foundation (NRF). This research was funded by A*STAR BRENAIC Research Project No. A18A5b0056, which also covered the article processing charge. Funding support for the fabrication and characterization of devices was provided by the Economic Development Board EDB-IPP (RCA – 16/216) program and the Industry-IHL Partnership Program (NRF2015-IIP001-001). |
citation |
Prabhu, N. L., Loy, D. J. J., Dananjaya, P. A., Lew, W. S., Toh, E. H. & Raghavan, N. (2020). Exploring the impact of variability in resistance distributions of RRAM on the prediction accuracy of deep learning neural networks. Electronics, 9(3). https://dx.doi.org/10.3390/electronics9030414 |
identifiers |
ISSN 2079-9292; DOI 10.3390/electronics9030414; Scopus 2-s2.0-85081022579; https://hdl.handle.net/10356/148658 |
license |
© 2020 The Author(s). Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (http://creativecommons.org/licenses/by/4.0/). Published version (application/pdf); deposited 2021-05-31. |
institution |
Nanyang Technological University |
building |
NTU Library |
continent |
Asia |
country |
Singapore |
content_provider |
NTU Library |
collection |
DR-NTU |
language |
English |
topic |
Science::Physics; Convolutional Neural Network; Look-up-table |
description |
In this work, we explore the use of the resistive random access memory (RRAM) device as a synapse for mimicking the trained weights linking neurons in a deep learning neural network (DNN) (AlexNet). The RRAM devices were fabricated in-house and subjected to 1000 bipolar read-write cycles to measure the resistances recorded for Logic-0 and Logic-1 (we demonstrate the feasibility of achieving eight discrete resistance states in the same device depending on the RESET stop voltage). DNN simulations have been performed to compare the relative error between the output of AlexNet Layer 1 (Convolution) implemented with the standard backpropagation (BP) algorithm trained weights versus the weights that are encoded using the measured resistance distributions from RRAM. The IMAGENET dataset is used for classification purposes here. We focus only on the Layer 1 weights in the AlexNet framework, with 11 × 11 × 96 filter values coded into binary floating point and substituted with the RRAM resistance values corresponding to Logic-0 and Logic-1. The impact of variability in the low and high resistance states of RRAM on the accuracy of image classification is studied by formulating a look-up table (LUT) for the RRAM (from measured I-V data) and comparing the convolution computation output of AlexNet Layer 1 with the standard outputs from the BP-based pre-trained weights. This is one of the first studies dedicated to exploring the impact of RRAM device resistance variability on the prediction accuracy of a convolutional neural network (CNN) on an AlexNet platform through a framework that requires limited actual device switching test data. |
author2 |
School of Physical and Mathematical Sciences |
format |
Article |
author |
Prabhu, Nagaraj Lakshmana; Loy, Desmond Jia Jun; Dananjaya, Putu Andhita; Lew, Wen Siang; Toh, Eng Huat; Raghavan, Nagarajan |
author_sort |
Prabhu, Nagaraj Lakshmana |
title |
Exploring the impact of variability in resistance distributions of RRAM on the prediction accuracy of deep learning neural networks |
publishDate |
2021 |
url |
https://hdl.handle.net/10356/148658 |