SRAM based computing-in-memory for tiny machine learning

This dissertation investigates the potential of Computing-In-Memory (CIM) using Static Random-Access Memory (SRAM) to address the limitations of the Von Neumann architecture and to support further miniaturisation. The research aims to overcome the Von Neumann bottleneck by enabling in-memory computation for tiny machine learning applications such as Binary Neural Networks (BNNs). Two SRAM cell architectures, 6-transistor (6T) and 8-transistor (8T), are investigated. Simulations performed in Cadence Virtuoso with the TSMC 65 nm library demonstrate that both cells operate correctly for write, hold, and read operations, but for multiply-and-accumulate (MAC) operations the 8T cell exhibits superior stability compared to the 6T design. Furthermore, the implementation of a 64-bit memory array capable of performing 8-row MAC operations was investigated. This paves the way for efficient in-memory computing suitable for tiny machine learning applications.
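The MAC operation named in the abstract is the core CIM primitive: several word lines are activated at once and each column accumulates the products of its stored weight bits with the applied input bits. As a purely behavioural reference, and assuming the +1/-1 XNOR-and-popcount convention commonly used for BNNs (the abstract itself only specifies BNNs and an 8-row MAC over a 64-bit array), the following Python sketch models that ideal result; the array size follows the 64-bit / 8-row figures above, while variable names and the random test data are illustrative.

```python
import numpy as np

# Behavioural model of an 8-row MAC in an 8 x 8 (64-bit) binary SRAM array.
# Weights are stored one bit per cell; inputs activate up to 8 rows at once.
# With the +1/-1 encoding typical of BNNs, a multiply is an XNOR and the
# accumulate is a popcount, which a CIM array realises as a summation on the
# bit line (modelled here as a plain integer sum).

RNG = np.random.default_rng(0)

ROWS, COLS = 8, 8                              # 8 rows x 8 columns = 64 cells
weights = RNG.integers(0, 2, (ROWS, COLS))     # stored bits (0 or 1)
inputs = RNG.integers(0, 2, ROWS)              # one binary activation per row

def bnn_mac(w_bits: np.ndarray, x_bits: np.ndarray) -> np.ndarray:
    """Per-column MAC over all activated rows, in the +1/-1 domain."""
    w = 2 * w_bits - 1                         # map {0, 1} -> {-1, +1}
    x = 2 * x_bits - 1
    return x @ w                               # column-wise sum, range [-8, +8]

def bnn_mac_xnor(w_bits: np.ndarray, x_bits: np.ndarray) -> np.ndarray:
    """Same result expressed as XNOR + popcount, the form a CIM macro computes."""
    xnor = ~(w_bits ^ x_bits[:, None]) & 1     # 1 where weight and input agree
    return 2 * xnor.sum(axis=0) - ROWS         # popcount rescaled to [-8, +8]

assert np.array_equal(bnn_mac(weights, inputs), bnn_mac_xnor(weights, inputs))
print(bnn_mac(weights, inputs))
```

In a physical SRAM CIM macro the per-column accumulation typically happens in the analogue domain on the bit line, and the stability advantage reported for the 8T cell usually stems from its decoupled read port, which protects the stored bit when many rows are read simultaneously; the integer sum above only reproduces the ideal result of that operation.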


Bibliographic Details
Main Author: Gupta, Shini
Other Authors: Kim Tae Hyoung
Format: Thesis-Master by Coursework
Language: English
Published: Nanyang Technological University 2024
Subjects: Engineering; SRAM; Computing in memory; Tiny machine learning
Online Access:https://hdl.handle.net/10356/175498
Institution: Nanyang Technological University
School: School of Electrical and Electronic Engineering
Degree: Master's degree
Citation: Gupta, S. (2024). SRAM based computing-in-memory for tiny machine learning. Master's thesis, Nanyang Technological University, Singapore. https://hdl.handle.net/10356/175498