TinyNAD: tiny network with augmentation and distillation on point cloud learning model

The development of practical applications such as autonomous driving and robotics has made 3D point cloud data from LiDAR or RGB-D cameras a valuable complement to plain images for sensing the environment. Applying deep learning models to point clouds is referred to as point-cloud learning. However, deploying point-cloud learning models on IoT or edge devices with limited memory and computational resources remains challenging. Rather than designing efficient networks from scratch, our work applies model compression techniques to compress existing models directly, with little drop in accuracy. We propose a two-stage tiny model trained with Network Augmentation and Distillation (TinyNAD), and find that a tiny model trained with network augmentation is much easier for a teacher to distill. Instead of shrinking parameters step by step as in pruning or quantization, TinyNAD pre-defines a tiny model and improves its performance by introducing auxiliary supervision from augmented networks and from the original model. We verify our method on PointNet++ using the ModelNet40 3D shape classification dataset. Our tiny model is 58 times smaller than the original model, with only a 1.4% drop in accuracy.
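The abstract outlines a two-stage scheme: the tiny model is first trained with auxiliary supervision from augmented networks that embed it, and is then distilled from the original PointNet++. Below is a minimal PyTorch sketch of the two losses this implies; the function names, the `aug_logits` handle, and the hyperparameters (`beta`, `T`, `alpha`) are illustrative assumptions, not the thesis's actual implementation.

```python
# Minimal sketch of the two training stages implied by the abstract.
# All names and hyperparameters are illustrative assumptions, not the
# thesis's actual TinyNAD/PointNet++ code.
import torch
import torch.nn.functional as F

def augmentation_loss(tiny_logits, aug_logits, labels, beta=0.5):
    """Stage 1: cross-entropy on the tiny model's output plus auxiliary
    cross-entropy on the outputs of augmented (wider) networks."""
    loss = F.cross_entropy(tiny_logits, labels)
    for logits in aug_logits:  # one logits tensor per augmented network
        loss = loss + beta * F.cross_entropy(logits, labels)
    return loss

def distillation_loss(student_logits, teacher_logits, labels,
                      T=4.0, alpha=0.7):
    """Stage 2: Hinton-style distillation from the original model
    (teacher) into the augmented tiny model (student)."""
    # KL divergence between temperature-softened distributions,
    # scaled by T^2 to keep gradient magnitudes comparable.
    soft = F.kl_div(
        F.log_softmax(student_logits / T, dim=1),
        F.softmax(teacher_logits / T, dim=1),
        reduction="batchmean",
    ) * (T * T)
    hard = F.cross_entropy(student_logits, labels)
    return alpha * soft + (1.0 - alpha) * hard

# Hypothetical usage: teacher = original PointNet++, student = tiny model.
# loss = distillation_loss(student(points), teacher(points).detach(), labels)
```

Per the abstract, the point of stage 1 is that the augmented tiny model becomes an easier student for the teacher to distill in stage 2.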

Bibliographic Details
Main Author: Yang, Zhiyuan
Other Authors: Xie, Lihua (School of Electrical and Electronic Engineering)
Format: Thesis-Master by Coursework
Degree: Master of Science (Computer Control and Automation)
Language: English
Published: Nanyang Technological University, 2022
Subjects: Engineering::Electrical and electronic engineering
Online Access: https://hdl.handle.net/10356/159551
Citation: Yang, Z. (2022). TinyNAD: tiny network with augmentation and distillation on point cloud learning model. Master's thesis, Nanyang Technological University, Singapore. https://hdl.handle.net/10356/159551
Institution: Nanyang Technological University