Title: Neural network compression techniques for out-of-distribution detection
Authors: Bansal, Aditya
Keywords: Engineering::Computer science and engineering
Issue Date: 2022
Publisher: Nanyang Technological University
Source: Bansal, A. (2022). Neural network compression techniques for out-of-distribution detection. Final Year Project (FYP), Nanyang Technological University, Singapore.
Abstract: One of the key challenges in deploying ML models on embedded systems is the set of resource constraints they impose, for instance, memory footprint, response time, and power consumption. Such real-time systems require resource-efficient models with low inference time that still maintain reasonable accuracy. In the context of out-of-distribution (OOD) detection, even a detection model with high classification accuracy can render the system ineffectual if its inference time is too high. There is a significant body of literature on neural network compression techniques, but the majority of studies have performed offline testing on datasets such as CIFAR; only a few works have targeted dedicated hardware or FPGAs. By implementing these techniques on a real-time embedded system, the DuckieBot, we studied their performance specifically for the task of OOD detection. The compression techniques of pruning, quantization, and knowledge distillation were applied and analyzed on several metrics: execution time, memory usage, reconstruction loss, and OOD metrics such as the ROC curve and true and false positive rates.
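The abstract names three compression techniques studied in the project. As a hedged illustration of the first, the sketch below shows unstructured magnitude pruning in NumPy: the smallest-magnitude weights are zeroed until a target sparsity is reached. This is a minimal, generic sketch of the technique, not the thesis's actual implementation, and the function name and threshold rule are illustrative assumptions.

```python
import numpy as np

def magnitude_prune(weights: np.ndarray, sparsity: float) -> np.ndarray:
    """Illustrative unstructured magnitude pruning (not the thesis's code):
    zero out the smallest-magnitude entries until at least `sparsity`
    fraction of the weights are zero."""
    flat = np.abs(weights).ravel()
    k = int(sparsity * flat.size)  # number of weights to remove
    if k == 0:
        return weights.copy()
    # k-th smallest magnitude becomes the pruning threshold
    threshold = np.partition(flat, k - 1)[k - 1]
    pruned = weights.copy()
    pruned[np.abs(pruned) <= threshold] = 0.0
    return pruned

# Example: prune half the weights of a small random matrix
rng = np.random.default_rng(0)
w = rng.normal(size=(4, 4))
pw = magnitude_prune(w, 0.5)
```

Sparse weight matrices like `pw` can then be stored in compressed formats or skipped during multiply-accumulate operations, which is where the memory and inference-time savings measured in the report would come from.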
Schools: School of Computer Science and Engineering 
Fulltext Permission: restricted
Fulltext Availability: With Fulltext
Appears in Collections:SCSE Student Reports (FYP/IA/PA/PI)

Files in This Item:
  Restricted Access, Adobe PDF, 2.93 MB


Items in DR-NTU are protected by copyright, with all rights reserved, unless otherwise indicated.