Please use this identifier to cite or link to this item:
https://hdl.handle.net/10356/67028
Title: Near Duplicate Image Identification
Authors: Li, Zhiling
Keywords: DRNTU::Engineering::Computer science and engineering::Computing methodologies::Image processing and computer vision
Issue Date: 2016
Abstract: Near Duplicate Image Identification (NDII) is a Content-based Image Retrieval technique that recognizes and retrieves near-duplicate images based on visual content such as shape, color, and texture. With growing interest in NDII, several frameworks have been developed to retrieve near-duplicate images efficiently and robustly. One such framework, Spatially Aligned Pyramid Matching (SAPM), was proposed to robustly handle images with spatial shifts and scale variations. This project implements the SAPM framework and investigates its performance. Several new techniques are applied to the framework to improve its efficiency and accuracy. Experiments on two collected databases show that the K-means++ clustering technique improves retrieval accuracy, and that SURF descriptors combined with a candidate-list ranking scheme outperform SIFT descriptors with equal-weighted fusion. However, several limitations remain in the author's work, and possible future research directions are suggested at the end of this report.
URI: http://hdl.handle.net/10356/67028
Schools: School of Computer Engineering
Rights: Nanyang Technological University
Fulltext Permission: restricted
Fulltext Availability: With Fulltext
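The abstract credits K-means++ with the retrieval-accuracy gain over plain K-means; the difference lies entirely in how the initial cluster centers (visual-word prototypes) are seeded. As a minimal sketch of that seeding step (the function name, data layout, and use of pure Python here are illustrative assumptions, not the report's actual code), each new center is sampled with probability proportional to its squared distance from the nearest center already chosen:

```python
import random

def kmeanspp_seeds(points, k, seed=0):
    """Pick k initial centers by D^2 weighting (K-means++ seeding).

    points: list of equal-length numeric tuples (e.g. feature descriptors);
    illustrative sketch, not the report's implementation.
    """
    rng = random.Random(seed)
    centers = [rng.choice(points)]  # first center: uniform at random
    while len(centers) < k:
        # Squared distance of each point to its nearest chosen center.
        d2 = [min(sum((p - c) ** 2 for p, c in zip(pt, ct)) for ct in centers)
              for pt in points]
        total = sum(d2)
        if total == 0.0:  # degenerate case: all points already covered
            centers.append(rng.choice(points))
            continue
        # Sample the next center with probability proportional to d2.
        r = rng.random() * total
        acc = 0.0
        for pt, w in zip(points, d2):
            acc += w
            if acc >= r:
                centers.append(pt)
                break
    return centers
```

Because far-away points are exponentially more likely to be chosen, the seeds tend to spread across the descriptor space, which typically yields tighter visual-word clusters than uniform random initialization.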
Appears in Collections: SCSE Student Reports (FYP/IA/PA/PI)
Files in This Item:

File | Description | Size | Format
---|---|---|---
FYP_Final Report_LiZhiling.pdf (Restricted Access) | | 1.51 MB | Adobe PDF
Items in DR-NTU are protected by copyright, with all rights reserved, unless otherwise indicated.