Please use this identifier to cite or link to this item: https://hdl.handle.net/10356/69497
Title: Attire detection and retrieval based on region proposals with convolutional neural network
Authors: Mao, Shangbo
Keywords: DRNTU::Engineering::Electrical and electronic engineering
Issue Date: 2017
Abstract: Region Proposals with Convolutional Neural Network Features (RCNN), an object detection algorithm, performs well on the Visual Object Classes Challenge 2012 [1]. There are two main approaches to improving its performance. The first is to apply a high-capacity Convolutional Neural Network (CNN) to region proposals in order to localize and segment objects. The other is to perform supervised pre-training when labelled data is insufficient. The goal of this project is to build an attire detection system using Region Proposals with Convolutional Neural Network Features. To provide background for RCNN, we introduce the concepts related to it, explaining object detection, Neural Networks (NN) and Convolutional Neural Networks (CNN) in detail. The description of RCNN has two parts: the first is the region proposal method, and the second is the CNN architecture. We then describe the attire detection system and the dataset construction process in detail. Finally, we summarize and discuss the testing results, which show that RCNN performs well on attire detection: the mean average precision (mAP) over all categories is 57.26%. Based on these results, we find that the quality and quantity of the training data have a great effect on the performance of the attire detection system.
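The mAP figure quoted in the abstract averages per-category average precision over detector outputs. A minimal sketch of that evaluation, assuming VOC-style scoring in which each detection has already been marked correct or incorrect against the ground truth (the function and data names here are illustrative, not taken from the thesis):

```python
def average_precision(detections, num_gt):
    """AP for one category.

    detections: list of (confidence_score, is_correct) pairs.
    num_gt: number of ground-truth objects in this category.
    """
    # Rank detections by descending confidence, as the detector would.
    ranked = sorted(detections, key=lambda d: d[0], reverse=True)
    true_positives = 0
    precision_sum = 0.0
    for rank, (_, correct) in enumerate(ranked, start=1):
        if correct:
            true_positives += 1
            # Accumulate precision at each recall point.
            precision_sum += true_positives / rank
    return precision_sum / num_gt if num_gt else 0.0

def mean_average_precision(per_category):
    """mAP: unweighted mean of per-category AP values.

    per_category: dict mapping category name -> (detections, num_gt).
    """
    aps = [average_precision(dets, n) for dets, n in per_category.values()]
    return sum(aps) / len(aps)

# Hypothetical two-category example (shirts and trousers are placeholders):
results = {
    "shirt":   ([(0.9, True), (0.8, False), (0.7, True)], 2),
    "trouser": ([(0.95, True), (0.6, False)], 1),
}
print(mean_average_precision(results))
```

A reported mAP of 57.26% would correspond to `mean_average_precision` returning roughly 0.5726 over all attire categories.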
URI: http://hdl.handle.net/10356/69497
Fulltext Permission: restricted
Fulltext Availability: With Fulltext
Appears in Collections:EEE Theses

Files in This Item:
File: MaoShangbo16.pdf (Restricted Access) — 3.24 MB, Adobe PDF


Items in DR-NTU are protected by copyright, with all rights reserved, unless otherwise indicated.