Please use this identifier to cite or link to this item: https://hdl.handle.net/10356/81682
Title: An efficient approach for scene categorization based on discriminative codebook learning in bag-of-words framework
Authors: Li, Zhen
Yap, Kim-Hui
Keywords: Scene categorization
Codebook learning
Issue Date: 2013
Source: Li, Z., & Yap, K.-H. (2013). An efficient approach for scene categorization based on discriminative codebook learning in bag-of-words framework. Image and Vision Computing, 31(10), 748-755.
Series/Report no.: Image and Vision Computing
Abstract: This paper proposes an efficient technique for learning a discriminative codebook for scene categorization. A state-of-the-art approach to scene categorization is the Bag-of-Words (BoW) framework, in which codebook generation plays an important role in determining system performance. Traditionally, the codebook generation methods adopted in BoW techniques are designed to minimize quantization error rather than to optimize classification accuracy. To address this issue, this paper carefully designs the codewords so that the resulting image histograms for each category retain strong discriminative power, while online categorization of a test image remains as efficient as in the baseline BoW. The codewords are refined iteratively offline to improve their discriminative power. The proposed method is validated on the UIUC Scene-15 and NTU Scene-25 datasets and is shown to outperform other state-of-the-art codebook generation methods in scene categorization.
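For readers unfamiliar with the baseline pipeline the abstract refers to, the standard BoW approach can be sketched as follows: local descriptors from training images are clustered into a codebook (here with plain k-means, which minimizes quantization error, i.e., the baseline the paper improves upon), and each image is then represented by a normalized histogram of codeword assignments. This is a minimal illustrative sketch in NumPy; the function names and parameters are assumptions for exposition, not the authors' implementation.

```python
import numpy as np

def learn_codebook(descriptors, k, iters=20, seed=0):
    """Plain k-means codebook learning (minimizes quantization error,
    the conventional objective that the paper argues is suboptimal for
    classification)."""
    rng = np.random.default_rng(seed)
    # Initialize codewords from randomly chosen descriptors.
    codebook = descriptors[rng.choice(len(descriptors), k, replace=False)].astype(float)
    for _ in range(iters):
        # Assign every descriptor to its nearest codeword.
        dists = np.linalg.norm(descriptors[:, None, :] - codebook[None, :, :], axis=2)
        assign = dists.argmin(axis=1)
        # Move each codeword to the mean of its assigned descriptors.
        for j in range(k):
            members = descriptors[assign == j]
            if len(members):
                codebook[j] = members.mean(axis=0)
    return codebook

def bow_histogram(descriptors, codebook):
    """Quantize an image's local descriptors against the codebook and
    return an L1-normalized codeword histogram (the BoW representation)."""
    dists = np.linalg.norm(descriptors[:, None, :] - codebook[None, :, :], axis=2)
    assign = dists.argmin(axis=1)
    hist = np.bincount(assign, minlength=len(codebook)).astype(float)
    return hist / hist.sum()
```

In the baseline, the histograms above would be fed to a classifier (e.g., an SVM). The paper's contribution is to refine the codewords offline for discriminative power rather than quantization error, leaving the online quantization step, and hence test-time cost, unchanged.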
URI: https://hdl.handle.net/10356/81682
http://hdl.handle.net/10220/40930
ISSN: 0262-8856
DOI: 10.1016/j.imavis.2013.07.001
Rights: © 2013 Elsevier.
Fulltext Permission: none
Fulltext Availability: No Fulltext
Appears in Collections:EEE Journal Articles

SCOPUS™ Citations: 8 (checked on Aug 31, 2020)
Web of Science™ Citations: 7 (checked on Oct 24, 2020)
Page view(s): 234 (checked on Oct 29, 2020)


Items in DR-NTU are protected by copyright, with all rights reserved, unless otherwise indicated.