Please use this identifier to cite or link to this item: https://hdl.handle.net/10356/147934
Title: Attack on training effort of deep learning
Authors: How, Kevin Kai-Wen
Keywords: Engineering::Computer science and engineering::Computing methodologies::Artificial intelligence
Issue Date: 2021
Publisher: Nanyang Technological University
Source: How, K. K. (2021). Attack on training effort of deep learning. Final Year Project (FYP), Nanyang Technological University, Singapore. https://hdl.handle.net/10356/147934
Project: SCSE20-0189
Abstract: The objective of this project is to develop an attack that hinders the tracking results of state-of-the-art Visual Object Trackers. After code development and testing, an evaluation is carried out to assess the performance of the attack and to draw conclusions. Visual Object Tracking is a relatively new technology seeing increasing use in modern systems. Because Visual Object Trackers are built on deep learning models, they inherit the same vulnerabilities, which gives rise to the need to properly secure such systems. This project attacks Visual Object Trackers by poisoning their data with adversarial examples. An attack script was developed that uses consecutive frames from a video to synthesize motion-blurred images, which are then used to poison the dataset that the object tracker operates on. The implemented mechanisms and their inner workings are detailed, and the performance of the developed attack script is evaluated. The attack script performed to expectation and successfully achieved the goals set out for this project. This allows further research to explore similar attacks in detail and to devise appropriate protection and counter mechanisms against them.
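The abstract describes synthesizing motion-blurred poison images from consecutive video frames. The report's exact synthesis method is not stated here; the sketch below assumes a simple temporal average of frames, which is one common way to approximate motion blur. The function name `motion_blur` and the demo frames are illustrative, not taken from the project's code.

```python
import numpy as np

def motion_blur(frames):
    """Synthesize a motion-blurred image by averaging consecutive frames.

    frames: list of HxWxC uint8 arrays taken from a video sequence.
    Returns a uint8 array of the same shape as one frame.
    """
    stack = np.stack([f.astype(np.float32) for f in frames], axis=0)
    return stack.mean(axis=0).round().astype(np.uint8)

# Demo with synthetic "frames": a bright square shifting right each frame,
# mimicking an object moving across the scene.
frames = []
for shift in range(5):
    f = np.zeros((32, 32, 3), dtype=np.uint8)
    f[8:24, 4 + shift * 4 : 12 + shift * 4] = 255
    frames.append(f)

blurred = motion_blur(frames)
print(blurred.shape)  # (32, 32, 3)
```

Averaging smears the moving object's intensity along its trajectory, so the poisoned frame no longer presents the sharp appearance the tracker's model was trained on.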
URI: https://hdl.handle.net/10356/147934
Fulltext Permission: restricted
Fulltext Availability: With Fulltext
Appears in Collections:SCSE Student Reports (FYP/IA/PA/PI)

Files in This Item:
File: FYP_Final_Report.pdf (Restricted Access), 1.79 MB, Adobe PDF

Page view(s): 162 (updated on May 20, 2022)
Download(s): 32 (updated on May 20, 2022)


Items in DR-NTU are protected by copyright, with all rights reserved, unless otherwise indicated.