Design novel DNN models with neural architecture search
| Main Author: | |
| --- | --- |
| Other Authors: | |
| Format: | Final Year Project |
| Language: | English |
| Published: | Nanyang Technological University, 2024 |
| Subjects: | |
| Online Access: | https://hdl.handle.net/10356/175370 |
| Institution: | Nanyang Technological University |
Summary: Current bit-shift techniques for building lightweight, energy-efficient neural networks convert existing Convolutional Neural Networks (CNNs) directly into the bit-shift domain. Although this improves hardware efficiency by replacing floating-point multiplications with binary bit-shifts, it often degrades accuracy and can cause training to fail to converge. This work proposes ShiftNAS, a Neural Architecture Search (NAS) framework that generates CNNs optimized for the bit-shift domain. ShiftNAS searches a shift-oriented search space with a decoupled operation and topology search strategy, enhanced with suitable regularization schemes, and thereby overcomes the limitations that handcrafted CNNs and other NAS frameworks face in the bit-shift domain. ShiftNAS-designed networks improve accuracy by 0.22-16.55% on CIFAR-10 and 1.36-29.33% on CIFAR-100 compared with existing approaches.
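
The bit-shift idea the summary refers to constrains each weight to a signed power of two, so a floating-point multiplication becomes a sign flip plus a binary shift. The sketch below illustrates that general mechanism in NumPy; the function names, the simple rounding quantizer, and the fixed-point activation format are illustrative assumptions, not ShiftNAS's actual implementation.

```python
import numpy as np

def quantize_to_power_of_two(w, max_shift=7):
    """Round each weight to a signed power of two: w ~ sign * 2^(-shift).

    Hypothetical helper for illustration; the thesis's quantizer may differ.
    """
    sign = np.sign(w)
    magnitude = np.abs(w)
    # Avoid log2(0): map zero weights to the smallest representable magnitude.
    magnitude = np.where(magnitude == 0, 2.0 ** -max_shift, magnitude)
    shift = np.clip(np.round(-np.log2(magnitude)), 0, max_shift).astype(int)
    return sign, shift

def shift_multiply(x_fixed, sign, shift):
    """Multiply fixed-point integer activations by power-of-two weights using
    only a sign flip and an arithmetic right shift (no floating-point multiply).
    Note the right shift floors toward negative infinity for negative values."""
    return (sign.astype(int) * x_fixed) >> shift

# Example: a 3-tap dot product computed entirely with shifts and adds.
weights = np.array([0.24, -0.51, 0.12])    # original float weights
activations = np.array([64, 128, 32])      # fixed-point integer activations
sign, shift = quantize_to_power_of_two(weights)
result = np.sum(shift_multiply(activations, sign, shift))
print(sign, shift, result)                 # weights round to 0.25, -0.5, 0.125; result = -44
```

The printed result matches the floating-point dot product taken with the rounded weights (64*0.25 + 128*(-0.5) + 32*0.125 = -44), which is why accuracy in the bit-shift domain depends on how well a network tolerates power-of-two weight quantization.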