Please use this identifier to cite or link to this item: https://hdl.handle.net/10356/147489
Title: Group cost-sensitive boosting with multi-scale decorrelated filters for pedestrian detection
Authors: Zhou, Chengju; Wu, Meiqing; Lam, Siew-Kei
Keywords: Engineering
Issue Date: 2017
Source: Zhou, C., Wu, M. & Lam, S. (2017). Group cost-sensitive boosting with multi-scale decorrelated filters for pedestrian detection. The British Machine Vision Conference 2017. https://dx.doi.org/10.5244/C.31.48
Conference: The British Machine Vision Conference 2017
Abstract: We propose a novel two-stage pedestrian detection framework that combines multi-scale decorrelated filters, which extract more discriminative features, with a novel group cost-sensitive boosting algorithm. The proposed boosting algorithm is based on a mixture loss to alleviate the influence of annotation errors in the training data, and it assigns varying costs to different types of misclassification. Experiments on the Caltech and INRIA datasets show that the proposed framework achieves the best detection performance among state-of-the-art non-deep-learning methods. In addition, the proposed approach runs 88X faster than the best-performing method from the widely known Filtered Channel Features framework.
URI: https://hdl.handle.net/10356/147489
DOI: 10.5244/C.31.48
Schools: School of Computer Science and Engineering
Rights: © 2017 The Authors. The copyright of this document resides with its authors. It may be distributed unchanged freely in print or electronic forms.
Fulltext Permission: open
Fulltext Availability: With Fulltext
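As a rough illustration of the group cost-sensitive boosting idea summarised in the abstract above, the sketch below implements an AdaBoost-style learner in which misclassifying a pedestrian (a miss) and misclassifying background (a false alarm) carry different costs. The stump learner, the specific cost values `cost_miss` and `cost_fa`, and the cost-weighted update are illustrative assumptions; the paper's mixture loss and its exact grouping of misclassification types are not reproduced here.

```python
# Minimal sketch of cost-sensitive boosting with threshold stumps.
# Assumptions: labels y in {-1, +1} (+1 = pedestrian), and hypothetical
# costs cost_miss / cost_fa; these are NOT the paper's exact formulation.
import numpy as np

def train_stump(X, y, w):
    """Exhaustively pick the threshold stump with the lowest weighted error."""
    best = (np.inf, 0, 0.0, 1)  # (error, feature index, threshold, polarity)
    for j in range(X.shape[1]):
        for t in np.unique(X[:, j]):
            for pol in (1, -1):
                pred = np.where(pol * (X[:, j] - t) > 0, 1, -1)
                err = w[pred != y].sum()
                if err < best[0]:
                    best = (err, j, t, pol)
    return best

def cost_sensitive_boost(X, y, T=10, cost_miss=2.0, cost_fa=1.0):
    """Boosting where missing a positive costs more than a false alarm."""
    cost = np.where(y == 1, cost_miss, cost_fa)  # per-example cost (assumed values)
    w = cost / cost.sum()                        # cost-proportional initial weights
    ensemble = []
    for _ in range(T):
        err, j, t, pol = train_stump(X, y, w)
        err = np.clip(err, 1e-10, 1 - 1e-10)
        alpha = 0.5 * np.log((1 - err) / err)
        pred = np.where(pol * (X[:, j] - t) > 0, 1, -1)
        # Cost-weighted exponential update: costly mistakes regain weight faster.
        w = w * np.exp(-alpha * y * pred * cost)
        w /= w.sum()
        ensemble.append((alpha, j, t, pol))
    return ensemble

def predict(ensemble, X):
    score = sum(a * np.where(p * (X[:, j] - t) > 0, 1, -1)
                for a, j, t, p in ensemble)
    return np.sign(score)

# Tiny smoke test on synthetic data.
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 3))
y = np.where(X[:, 0] + 0.5 * X[:, 1] > 0, 1, -1)
model = cost_sensitive_boost(X, y, T=5)
print("train accuracy:", (predict(model, X) == y).mean())
```

In the paper, the costs are organised over groups of misclassification types rather than a single miss/false-alarm split; the two-cost setup above is only the simplest instance of that idea.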
Appears in Collections: SCSE Conference Papers
Files in This Item:

File | Description | Size | Format
---|---|---|---
paper048.pdf | | 1.71 MB | Adobe PDF