TinyNAD: tiny network with augmentation and distillation on point cloud learning model


Bibliographic Details
Main Author: Yang, Zhiyuan
Other Authors: Xie, Lihua
Format: Thesis-Master by Coursework
Language: English
Published: Nanyang Technological University, 2022
Online Access: https://hdl.handle.net/10356/159551
Institution: Nanyang Technological University
Description
Summary: The development of practical applications such as autonomous driving and robotics has brought 3D point cloud data from LiDAR or RGB-D cameras into use as a valuable complement to pure images for sensing the environment. The use of point clouds with deep learning models is referred to as point-cloud learning. However, deploying point-cloud learning models on IoT or edge devices with limited memory and computational resources is challenging. Rather than designing an efficient network from scratch, our work applies model compression techniques to directly compress existing models with little loss of accuracy. We propose a two-stage tiny model with Network Augmentation and Distillation (TinyNAD) and find that the tiny model is much easier for a teacher to distill after network augmentation. Unlike methods that shrink the parameters step by step, such as pruning or quantization, TinyNAD pre-defines a tiny model and improves its performance by introducing auxiliary supervision from augmented networks and from the original model. We verify our method on PointNet++ using the ModelNet40 3D shape classification dataset. Our tiny model is 58 times smaller than the original model, with only a 1.4% drop in accuracy.
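
The abstract outlines two stages: network augmentation, then distillation from the original model into the pre-defined tiny model. As a rough, hypothetical illustration of the distillation stage only (not code from the thesis), the PyTorch-style sketch below combines a temperature-softened soft-target loss against the teacher with the usual hard-label cross-entropy; the temperature T and mixing weight alpha are illustrative assumptions.

    import torch.nn.functional as F

    def distillation_loss(student_logits, teacher_logits, labels, T=4.0, alpha=0.7):
        # Soft targets: the tiny student matches the temperature-softened
        # class distribution predicted by the original (teacher) model.
        soft = F.kl_div(
            F.log_softmax(student_logits / T, dim=-1),
            F.softmax(teacher_logits / T, dim=-1),
            reduction="batchmean",
        ) * (T * T)  # standard T^2 rescaling keeps gradient magnitudes comparable
        # Hard targets: ordinary cross-entropy against the ground-truth labels.
        hard = F.cross_entropy(student_logits, labels)
        return alpha * soft + (1.0 - alpha) * hard

In such a setup the teacher (e.g., the original PointNet++) runs in evaluation mode with gradients disabled, and only the tiny student's parameters are updated by this loss.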