Neural architecture search as sparse supernet

This paper aims at enlarging the problem of Neural Architecture Search (NAS) from Single-Path and Multi-Path Search to automated Mixed-Path Search. In particular, we model the NAS problem as a sparse supernet using a new continuous architecture representation with a mixture of sparsity constraints....


Bibliographic Details
Main Authors: WU, Y., LIU, A., HUANG, Zhiwu, ZHANG, S., VAN GOOL, L.
Format: text
Language: English
Published: Institutional Knowledge at Singapore Management University 2021
Subjects: OS and Networks; Systems Architecture
Online Access:https://ink.library.smu.edu.sg/sis_research/6411
https://ink.library.smu.edu.sg/context/sis_research/article/7414/viewcontent/Neural_architecture_search_as_sparse_supernet.pdf
Institution: Singapore Management University
id sg-smu-ink.sis_research-7414
record_format dspace
last_modified 2021-11-23T01:58:55Z
publishDate 2021-02-01
format text (application/pdf)
license http://creativecommons.org/licenses/by-nc-nd/4.0/
collection Research Collection School Of Computing and Information Systems
institution Singapore Management University
building SMU Libraries
continent Asia
country Singapore
content_provider SMU Libraries
collection InK@SMU
language English
topic OS and Networks
Systems Architecture
description This paper aims at enlarging the problem of Neural Architecture Search (NAS) from Single-Path and Multi-Path Search to automated Mixed-Path Search. In particular, we model the NAS problem as a sparse supernet using a new continuous architecture representation with a mixture of sparsity constraints. The sparse supernet enables us to automatically achieve sparsely-mixed paths upon a compact set of nodes. To optimize the proposed sparse supernet, we exploit a hierarchical accelerated proximal gradient algorithm within a bi-level optimization framework. Extensive experiments on Convolutional Neural Network and Recurrent Neural Network search demonstrate that the proposed method is capable of searching for compact, general and powerful neural architectures.
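The description above mentions a mixture of sparsity constraints on a continuous architecture representation, optimized with an accelerated proximal gradient method. A minimal NumPy sketch of that general idea — a FISTA-style accelerated proximal gradient loop with an L1 penalty that zeroes out architecture weights, yielding a sparse mixture of candidate operations — is given below. This is a generic illustration on a toy quadratic objective, not the paper's hierarchical bi-level algorithm; the function names, penalty weight, and target values are all illustrative assumptions.

```python
import numpy as np

def soft_threshold(x, t):
    # Proximal operator of t * ||x||_1: shrinks every entry toward zero and
    # zeroes those with magnitude <= t; this is what induces sparse paths.
    return np.sign(x) * np.maximum(np.abs(x) - t, 0.0)

def accelerated_proximal_gradient(grad, alpha0, lam=0.1, lr=0.05, steps=100):
    # FISTA-style accelerated proximal gradient: gradient step on the smooth
    # loss, proximal (soft-threshold) step for the L1 sparsity penalty,
    # plus Nesterov-style momentum extrapolation.
    alpha, y, t = alpha0.copy(), alpha0.copy(), 1.0
    for _ in range(steps):
        alpha_next = soft_threshold(y - lr * grad(y), lr * lam)
        t_next = (1 + np.sqrt(1 + 4 * t * t)) / 2
        y = alpha_next + ((t - 1) / t_next) * (alpha_next - alpha)
        alpha, t = alpha_next, t_next
    return alpha

# Toy smooth objective: pull alpha toward a mostly-zero target, mimicking
# architecture weights over candidate operations on one supernet edge.
target = np.array([0.0, 0.9, 0.0, 0.4, 0.0])
grad = lambda a: a - target          # gradient of 0.5 * ||a - target||^2
alpha = accelerated_proximal_gradient(grad, np.zeros(5))
# Weights at the zero-target positions stay exactly zero, so only a sparse
# subset of candidate operations (paths) survives.
```

In this toy setting the minimizer is the soft-thresholded target, so the surviving weights shrink slightly (0.9 toward 0.8, 0.4 toward 0.3) while the rest are exactly zero — the mechanism by which an L1-type constraint selects a sparse set of mixed paths.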
format text
author WU, Y.
LIU, A.
HUANG, Zhiwu
ZHANG, S.
VAN GOOL, L.
title Neural architecture search as sparse supernet
publisher Institutional Knowledge at Singapore Management University
publishDate 2021
url https://ink.library.smu.edu.sg/sis_research/6411
https://ink.library.smu.edu.sg/context/sis_research/article/7414/viewcontent/Neural_architecture_search_as_sparse_supernet.pdf