Please use this identifier to cite or link to this item: https://hdl.handle.net/10356/137941
Title: Rapid facial recognition through wearable cameras
Authors: Sim, Jun Kai
Keywords: Engineering::Computer science and engineering
Engineering::Computer science and engineering::Computing methodologies::Image processing and computer vision
Issue Date: 2020
Publisher: Nanyang Technological University
Project: SCSE19-0352
Abstract: Face recognition has been one of the most popular topics in industry over the past decades. It is a biometric technology with many creative uses: for example, smartphone cameras use it to focus automatically on a person's face, and in China cameras use it to capture the faces of jaywalkers. The goal of this report is to present a face recognition system that uses k-Nearest Neighbors (k-NN) to rapidly recognise everyone who appears on screen; the system can also serve as a memory aid for its users. The report explains in detail the software used, including the face recognition API, Python libraries, and the pre-trained model, as well as the reasons for choosing these techniques and methods to achieve the project's goals. In addition, a flow chart and decision tree of the program illustrate how the face recognition system works. Lastly, the report proposes further improvements to raise user satisfaction and enhance the system's performance. (An illustrative sketch of the k-NN approach is given after the item record below.)
URI: https://hdl.handle.net/10356/137941
Schools: School of Computer Science and Engineering 
Fulltext Permission: restricted
Fulltext Availability: With Fulltext
Appears in Collections:SCSE Student Reports (FYP/IA/PA/PI)
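
The abstract describes k-NN classification over face encodings. Below is a minimal sketch of that approach, assuming the open-source face_recognition Python library (dlib's 128-dimensional face encodings) and scikit-learn's KNeighborsClassifier; the item record does not name these exact packages, and the directory layout, function names, and distance threshold are illustrative assumptions only.

# Minimal sketch of k-NN face recognition, assuming the open-source
# `face_recognition` library (dlib 128-d encodings) and scikit-learn.
# Directory layout, names and the 0.5 threshold are illustrative only.
import os
import face_recognition
from sklearn.neighbors import KNeighborsClassifier

def train_knn(train_dir, n_neighbors=3):
    # Expects train_dir/<person_name>/<image>.jpg with one face per image.
    encodings, labels = [], []
    for person in os.listdir(train_dir):
        person_dir = os.path.join(train_dir, person)
        if not os.path.isdir(person_dir):
            continue
        for img_name in os.listdir(person_dir):
            image = face_recognition.load_image_file(os.path.join(person_dir, img_name))
            boxes = face_recognition.face_locations(image)
            if len(boxes) != 1:
                continue  # skip images with zero or several faces
            encodings.append(face_recognition.face_encodings(image, known_face_locations=boxes)[0])
            labels.append(person)
    knn = KNeighborsClassifier(n_neighbors=n_neighbors, weights="distance")
    knn.fit(encodings, labels)
    return knn

def recognise(knn, frame, threshold=0.5):
    # Returns [(bounding_box, name), ...] for every face found in one frame.
    boxes = face_recognition.face_locations(frame)
    encodings = face_recognition.face_encodings(frame, known_face_locations=boxes)
    results = []
    for box, enc in zip(boxes, encodings):
        distance, _ = knn.kneighbors([enc], n_neighbors=1)
        name = knn.predict([enc])[0] if distance[0][0] <= threshold else "Unknown"
        results.append((box, name))
    return results

For dlib-style encodings, a Euclidean distance of roughly 0.5 to 0.6 is a common heuristic cut-off for deciding whether the nearest neighbour is a genuine match or an unknown face; tightening the threshold trades recall for fewer false identifications.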

Files in This Item:
  File: FYP_REPORT_SIMJUNKAI_U1721530F.pdf (Restricted Access)
  Size: 1.74 MB
  Format: Adobe PDF

Items in DR-NTU are protected by copyright, with all rights reserved, unless otherwise indicated.