Title: Age group classification based on facial images
Authors: Sai, Phyo Kyaw
Keywords: DRNTU::Engineering::Computer science and engineering::Computing methodologies::Image processing and computer vision
DRNTU::Engineering::Computer science and engineering::Computing methodologies::Artificial intelligence
Issue Date: 2014
Source: Sai, P. -K. (2014). Age group classification based on facial images. Master’s thesis, Nanyang Technological University, Singapore.
Abstract: Age estimation has many potential real-world applications, yet estimating age from face images is a challenging problem: aging is a personalized process and is affected by many factors. Interest in age estimation has grown recently, but most researchers use publicly available aging face datasets, many of which are not well suited to age estimation work. First, some datasets contain face images of low quality, with pose variation and extreme expressions, and they hold only a limited number of images. Second, some datasets contain several images of the same person at a given age. These conditions make age estimation performance on those datasets irrelevant to real-world applications. In this thesis, we propose a framework for collecting a large set of aging face images over the Internet. We collected a large aging face dataset containing many higher-quality images of different people for each age. In our approach, the dataset was gathered using the Microsoft Bing Search API with age-related text queries. By using different combinations of the API's filter options, a large number of images could be collected, and by looping through ages, images of different ages could be obtained. This automated method of age-labelled image collection provided the dataset required for our research with few resources of our own. Our second contribution is an evaluation of age estimation methods on both the existing and the new datasets. Performance improvement was emphasized in this work since our initial accuracy was low. Our initial age estimation system employed an AdaBoost classifier with tree classifiers as weak learners, taking Gabor features as inputs. Since each tree classifier operates on local points, AdaBoost acts as a local feature selector in this method. Biologically Inspired Features were then introduced to improve the performance of this framework.
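The query-looping step described above can be sketched as follows. This is a minimal illustration of generating (age, query, filter) combinations for an image-search API; the function name, query templates, and filter strings are assumptions for illustration, not the thesis's actual Bing API parameters.

```python
def age_queries(min_age=1, max_age=70, filters=("Size:Large", "Face:Face")):
    """Yield (age, query, filter) tuples for an image-search API.

    Hypothetical sketch: loops through ages and combines each
    age-related text query with each filter option, so that one
    crawl pass covers every age label and filter combination.
    """
    # Illustrative age-related query templates (assumed, not from the thesis).
    templates = ["{age} years old", "{age}th birthday"]
    for age in range(min_age, max_age + 1):
        for tmpl in templates:
            for flt in filters:
                yield age, tmpl.format(age=age), flt
```

Each tuple would then be submitted to the search API, and the returned images stored under the age label of the query that retrieved them.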
Then a new framework was introduced in which an Extreme Learning Machine classifier takes as input the output of dimension-reduction methods such as PCA applied to local features, yielding much better performance. In this way, the new method works on global representations of local features, whereas the first method missed out global features entirely. In our experiments, the Local Gabor Binary Pattern was found to be a better feature than the Biologically Inspired Feature in the proposed framework. Moreover, both performance and learning speed improved significantly compared to the previous framework. Although some performance improvement has been achieved in our age group classification work, the aging-feature extraction method still needs to improve before practical real-world applications can be built. For our work on four-group age classification, average overall accuracy improved from 40-50% in the previous work to 70-80% in the current work. Individual class performance on the middle age groups remained unsatisfactory, even though performance on the younger and older age groups was high. This result suggests that a good feature extraction method is still lacking in the age estimation domain, or that no distinct features exist to differentiate the middle age groups.
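The classifier stage of the second framework can be sketched as a minimal Extreme Learning Machine: hidden-layer weights are random and fixed, and only the output weights are solved in closed form by least squares, which is what makes ELM training fast. This sketch assumes the inputs are already dimension-reduced feature vectors (e.g. PCA of local features); the hyperparameters and helper names are illustrative, not the thesis's implementation.

```python
import numpy as np

def elm_train(X, y, n_hidden=64, n_classes=4, seed=0):
    """Train a minimal ELM: random hidden layer + least-squares output weights."""
    rng = np.random.default_rng(seed)
    W = rng.normal(size=(X.shape[1], n_hidden))  # random, fixed input weights
    b = rng.normal(size=n_hidden)                # random, fixed hidden biases
    H = 1.0 / (1.0 + np.exp(-(X @ W + b)))       # sigmoid hidden activations
    T = np.eye(n_classes)[y]                     # one-hot class targets
    beta = np.linalg.pinv(H) @ T                 # closed-form output weights
    return W, b, beta

def elm_predict(X, model):
    """Classify rows of X with a trained ELM (argmax over class scores)."""
    W, b, beta = model
    H = 1.0 / (1.0 + np.exp(-(X @ W + b)))
    return np.argmax(H @ beta, axis=1)
```

Because no iterative weight updates are needed, training reduces to one matrix pseudo-inverse, which is consistent with the learning-speed improvement reported over the AdaBoost-based framework.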
DOI: 10.32657/10356/59908
Fulltext Permission: open
Fulltext Availability: With Fulltext
Appears in Collections:EEE Theses

Files in This Item:
File: MEng Thesis - Sai Phyo Kyaw - G0902789J - EEE - NTU.pdf
Size: 3.52 MB
Format: Adobe PDF

Items in DR-NTU are protected by copyright, with all rights reserved, unless otherwise indicated.