Please use this identifier to cite or link to this item: https://hdl.handle.net/10356/49772
Title: User specific learning and decision for face authentication
Authors: Dong, Zhan
Keywords: DRNTU::Engineering::Electrical and electronic engineering::Control and instrumentation::Control engineering
DRNTU::Engineering::Computer science and engineering::Computing methodologies::Artificial intelligence
Issue Date: 2012
Abstract: The ability to recognize human faces is a demonstration of remarkable human intelligence. Over the last three decades, researchers from diverse areas have attempted to replicate this outstanding visual perception of human beings in machine recognition of faces. Face recognition has attracted many researchers and engineers in image processing, pattern recognition and computer vision because of its immense application potential. Although human beings can easily recognize face images, the challenge of face recognition is that we do not know what features or image structures human intelligence uses for this recognition task. Machine learning techniques provide a powerful tool for learning such features from sample images. Face verification concerns authenticating a claimed identity posed by a person, while face identification focuses on recognizing the identity of a person from a database of known individuals. Thus, a face authentication or verification engine can be designed differently for different users, exploiting each user's particular characteristics to maximize verification accuracy. This project investigates methods that utilize user-specific features to enhance verification accuracy.
URI: http://hdl.handle.net/10356/49772
Rights: Nanyang Technological University
Fulltext Permission: restricted
Fulltext Availability: With Fulltext
Appears in Collections: EEE Student Reports (FYP/IA/PA/PI)

Files in This Item:
File: eP3019-101.pdf (Restricted Access)
Size: 1.34 MB
Format: Adobe PDF

Items in DR-NTU are protected by copyright, with all rights reserved, unless otherwise indicated.