Title: An adaptive dropout based deep metric learning algorithm
Authors: Tan, Ronald Tay Siang
Keywords: Engineering::Computer science and engineering::Computing methodologies::Pattern recognition
Issue Date: 2022
Publisher: Nanyang Technological University
Source: Tan, R. T. S. (2022). An adaptive dropout based deep metric learning algorithm. Final Year Project (FYP), Nanyang Technological University, Singapore.
Project: SCSE21-0295
Abstract: The key idea of Deep Metric Learning (DML) is to learn a set of hierarchical non-linear mappings using deep neural networks and project the data samples into a new feature space for comparison or matching. As its name suggests, DML combines deep learning, a machine learning technique built on large neural networks, with metric learning, a technique that trains and tests models using distances between data points. DML is theorised to be an effective way to use a distance metric to learn the similarity between two data samples, allowing the model to predict whether two input samples belong to the same class. This leads to the idea of contrastive learning: a machine learning technique that learns the general features of a dataset without labels by teaching the model which data points are similar and which are different. This is very useful in this project, as such techniques allow the model to be trained without any annotations or labels, turning a supervised learning task into a self-supervised one. This is particularly beneficial because the labels of the dataset are not given in a one-to-one relationship. The application of DML has achieved many practical successes, owing to the many modifications made to its various aspects: changes to the model architecture, optimisation of the objective function, the addition of new training tasks to improve the training process, and so on. Each improvement has its own practical successes because of the different datasets the algorithm is applied to; naturally, a different dataset requires a different optimisation for the model to fully capture its data points.
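The pairwise idea described above can be illustrated with a minimal contrastive loss sketch. This is not the project's actual code; the embeddings and the function below are hypothetical stand-ins for a deep network's output and training objective:

```python
import numpy as np

def contrastive_loss(z1, z2, same_class, margin=1.0):
    """Pairwise contrastive loss: pull same-class embeddings together,
    push different-class embeddings at least `margin` apart."""
    d = np.linalg.norm(z1 - z2)             # distance in the learned feature space
    if same_class:
        return 0.5 * d ** 2                 # similar pair: penalise distance
    return 0.5 * max(0.0, margin - d) ** 2  # dissimilar pair: penalise closeness

# Toy embeddings (stand-ins for the network's projected samples).
a = np.array([0.10, 0.20])
b = np.array([0.15, 0.25])
c = np.array([2.00, 2.00])

print(contrastive_loss(a, b, same_class=True))   # small: similar pair already close
print(contrastive_loss(a, c, same_class=False))  # zero: dissimilar pair beyond margin
```

The key design choice is that the loss depends only on distances between pairs, so training needs similar/dissimilar pair information rather than per-sample class labels.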
Despite the many modifications and optimisations made to DML, no existing work has theoretically analysed its generalization error bound, which would be a good way to measure how well a learned DML model performs on unseen data. Therefore, a novel DML method, Adaptive Dropout based DML (ADroDML), is proposed, which can adaptively learn the retention rates of DML models with dropout in a theoretically justified way. This contrasts with traditional DML methods, whose predefined retention rates remain unchanged throughout training. ADroDML can learn the retention rates in a theoretically optimal way and update them iteratively to achieve better performance. Experiments on the given dataset found that the algorithm produced a substantial accuracy score, comparable to the results reported in the research paper proposing ADroDML [1]. The algorithm also produced similar results on unseen data when tested on other months' data. This paper therefore demonstrates the success of the proposed ADroDML algorithm on the given dataset.
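The contrast between a fixed retention rate and an adaptively updated one can be sketched as follows. This is an illustrative toy only, not ADroDML's derived update rule: the gradient placeholder and the update step here are hypothetical:

```python
import numpy as np

rng = np.random.default_rng(0)

def dropout(x, p):
    """Inverted dropout with retention rate p: keep each unit with probability p,
    rescaling by 1/p so the expected activation is unchanged."""
    mask = rng.random(x.shape) < p
    return x * mask / p

# Traditional dropout fixes p before training; an adaptive scheme like ADroDML
# instead updates p iteratively. Hypothetical illustration: nudge p with a
# placeholder gradient (ADroDML's actual update is theoretically derived).
p = 0.5
for step in range(3):
    grad_p = -0.1                                # placeholder loss gradient w.r.t. p
    p = float(np.clip(p - 0.5 * grad_p, 0.05, 0.95))  # keep p a valid probability
print(p)  # retention rate drifts across iterations instead of staying fixed
```

The point of the sketch is only that the retention rate becomes a trained quantity, clipped to remain a valid probability, rather than a constant hyperparameter.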
Schools: School of Computer Science and Engineering 
Fulltext Permission: restricted
Fulltext Availability: With Fulltext
Appears in Collections:SCSE Student Reports (FYP/IA/PA/PI)

Files in This Item:
File: Restricted Access
Size: 888.87 kB
Format: Adobe PDF

Page view(s)

Updated on Sep 30, 2023


Items in DR-NTU are protected by copyright, with all rights reserved, unless otherwise indicated.