Full metadata record
DC Field | Value | Language
dc.contributor.author | Li, Zhen | en
dc.contributor.author | Yap, Kim-Hui | en
dc.identifier.citation | Li, Z., & Yap, K.-H. (2013). An efficient approach for scene categorization based on discriminative codebook learning in bag-of-words framework. Image and Vision Computing, 31(10), 748-755. | en
dc.description.abstract | This paper proposes an efficient technique for learning a discriminative codebook for scene categorization. A state-of-the-art approach for scene categorization is the Bag-of-Words (BoW) framework, in which codebook generation plays an important role in determining the performance of the system. Traditionally, the codebook generation methods adopted in BoW techniques are designed to minimize the quantization error rather than to optimize classification accuracy. In view of this, this paper addresses the issue through careful design of the codewords, so that the resulting image histograms for each category retain strong discriminating power while online categorization of a test image remains as efficient as in the baseline BoW. The codewords are refined iteratively offline to improve their discriminative power. The proposed method is validated on the UIUC Scene-15 and NTU Scene-25 datasets, and it is shown to outperform other state-of-the-art codebook generation methods in scene categorization. | en
dc.description.sponsorship | ASTAR (Agency for Sci., Tech. and Research, S’pore) | en
dc.relation.ispartofseries | Image and Vision Computing | en
dc.rights | © 2013 Elsevier. | en
dc.subject | Scene categorization | en
dc.subject | Codebook learning | en
dc.title | An efficient approach for scene categorization based on discriminative codebook learning in bag-of-words framework | en
dc.type | Journal Article | en
dc.contributor.school | School of Electrical and Electronic Engineering | en
item.fulltext | No Fulltext | -
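For context, the baseline Bag-of-Words pipeline that the abstract contrasts against (a codebook learned by minimizing quantization error, then per-image histograms of codeword assignments) can be sketched as follows. This is a minimal illustration under assumed choices, not the authors' discriminative method: plain k-means stands in for codebook generation, and the random vectors stand in for local descriptors such as SIFT.

```python
import numpy as np

rng = np.random.default_rng(0)

def learn_codebook(descriptors, k, iters=20):
    """Baseline codebook via plain k-means: minimizes quantization
    error only, with no regard for class discriminability."""
    centers = descriptors[rng.choice(len(descriptors), k, replace=False)]
    for _ in range(iters):
        # assign each descriptor to its nearest codeword (Euclidean)
        dists = np.linalg.norm(descriptors[:, None, :] - centers[None, :, :], axis=2)
        labels = dists.argmin(axis=1)
        # move each codeword to the mean of its assigned descriptors
        for j in range(k):
            members = descriptors[labels == j]
            if len(members):
                centers[j] = members.mean(axis=0)
    return centers

def bow_histogram(descriptors, codebook):
    """Quantize one image's descriptors against the codebook and
    return an L1-normalized codeword-frequency histogram."""
    dists = np.linalg.norm(descriptors[:, None, :] - codebook[None, :, :], axis=2)
    hist = np.bincount(dists.argmin(axis=1), minlength=len(codebook)).astype(float)
    return hist / hist.sum()

# toy "local descriptors" for one image (stand-ins for e.g. SIFT vectors)
descs = rng.normal(size=(200, 8))
codebook = learn_codebook(descs, k=16)
h = bow_histogram(descs, codebook)
```

The histogram `h` is what a classifier would consume; the paper's contribution is to refine the codewords iteratively offline so these histograms become more discriminative per category, while this online quantization step stays unchanged.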
Appears in Collections:EEE Journal Articles

Items in DR-NTU are protected by copyright, with all rights reserved, unless otherwise indicated.