Please use this identifier to cite or link to this item: https://hdl.handle.net/10356/159551
Full metadata record
DC Field | Value | Language
dc.contributor.author: Yang, Zhiyuan (en_US)
dc.date.accessioned: 2022-06-24T04:59:20Z
dc.date.available: 2022-06-24T04:59:20Z
dc.date.issued: 2022
dc.identifier.citation: Yang, Z. (2022). TinyNAD: tiny network with augmentation and distillation on point cloud learning model. Master's thesis, Nanyang Technological University, Singapore. https://hdl.handle.net/10356/159551 (en_US)
dc.identifier.uri: https://hdl.handle.net/10356/159551
dc.description.abstract: The development of practical applications such as autonomous driving and robotics has made 3D point cloud data from LiDAR or RGB-D cameras a valuable supplement to pure images for sensing the environment. The use of point clouds with deep learning models is referred to as point-cloud learning. However, deploying point-cloud learning models on IoT or edge devices with limited memory and computational resources is challenging. Rather than designing efficient networks from scratch, our work applies model compression techniques to directly compress existing models with little loss of accuracy. We propose a two-stage tiny model with Network Augmentation and Distillation (TinyNAD) and find that a tiny model trained with network augmentation is much easier for a teacher to distill. Instead of shrinking the parameters step by step, as in pruning or quantization, TinyNAD pre-defines a tiny model and improves its performance by introducing auxiliary supervision from augmented networks and the original model. We verify our method on PointNet++ using the ModelNet40 3D shape classification dataset. Our tiny model is 58 times smaller than the original model, with only a 1.4% drop in accuracy. (en_US)
dc.language.iso: en (en_US)
dc.publisher: Nanyang Technological University (en_US)
dc.subject: Engineering (en_US)
dc.title: TinyNAD: tiny network with augmentation and distillation on point cloud learning model (en_US)
dc.type: Thesis - Master by Coursework (en_US)
dc.contributor.supervisor: Xie Lihua (en_US)
dc.contributor.school: School of Electrical and Electronic Engineering (en_US)
dc.description.degree: Master's degree (en_US)
dc.contributor.supervisoremail: ELHXIE@ntu.edu.sg (en_US)
item.grantfulltext: embargo_restricted_20260630
item.fulltext: With Fulltext
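The distillation stage named in the abstract can be illustrated with a minimal sketch. This is a generic Hinton-style temperature-scaled knowledge-distillation loss, written in plain Python for clarity; the thesis's exact objective (including the auxiliary supervision from augmented networks) is not given on this record page, so the temperature value and loss form here are assumptions for illustration only.

```python
import math

def softmax(logits, temperature=1.0):
    """Temperature-scaled softmax over a list of logits."""
    scaled = [z / temperature for z in logits]
    m = max(scaled)  # subtract the max for numerical stability
    exps = [math.exp(z - m) for z in scaled]
    total = sum(exps)
    return [e / total for e in exps]

def distillation_loss(student_logits, teacher_logits, temperature=4.0):
    """KL divergence between the teacher's and student's softened
    class distributions, scaled by T^2 so gradient magnitudes stay
    comparable across temperatures (standard KD practice)."""
    p = softmax(teacher_logits, temperature)  # soft targets
    q = softmax(student_logits, temperature)  # student predictions
    return temperature ** 2 * sum(
        pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0
    )
```

In a full training loop this term would be combined with the ordinary cross-entropy on ground-truth labels; the loss is zero when the student exactly matches the teacher and grows as their softened distributions diverge.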
Appears in Collections:EEE Theses
Files in This Item:
File: Amended_NTU_Master_Dissertation_Yang_Zhiyuan(2).pdf
Size: 3.29 MB
Format: Adobe PDF
Access: Under embargo until Jun 30, 2026

Items in DR-NTU are protected by copyright, with all rights reserved, unless otherwise indicated.