Title: Batch mode adaptive multiple instance learning for computer vision tasks
Authors: Li, Wen
Duan, Lixin
Tsang, Ivor Wai-Hung
Xu, Dong
Keywords: DRNTU::Engineering::Computer science and engineering
Issue Date: 2012
Source: Li, W., Duan, L., Tsang, I. W.-H., & Xu, D. (2012). Batch Mode Adaptive Multiple Instance Learning for Computer Vision Tasks. 2012 IEEE Conference on Computer Vision and Pattern Recognition (CVPR), 2368-2375.
Abstract: Multiple Instance Learning (MIL) has been widely exploited in many computer vision tasks, such as image retrieval, object tracking and so on. To handle the ambiguity of instance labels in positive bags, the training process of traditional MIL methods is usually computationally expensive, which limits the application of MIL to more computer vision tasks. In this paper, we propose a novel batch mode framework, namely Batch mode Adaptive Multiple Instance Learning (BAMIL), to accelerate instance-level MIL methods. Specifically, instead of using all training bags at once, we divide the training bags into several sets of bags (i.e., batches). At each step, we use one batch of training bags to train a new classifier, which is adapted from the latest pre-learned classifier. Such a batch mode framework significantly accelerates traditional MIL methods for large scale applications and can also be used in dynamic environments such as object tracking. The experimental results show that our BAMIL is much faster than the recently developed MIL with constrained positive bags, while achieving comparable performance for text-based web image retrieval. In dynamic settings, BAMIL also achieves better overall performance for object tracking when compared with other online MIL methods.
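The batch-mode idea described in the abstract can be sketched as follows. This is an illustrative simplification, not the authors' implementation: the MIL ambiguity handling is reduced to propagating each bag's label to its instances, and the learner is a plain perceptron rather than the SVM-based instance-level MIL solver the paper builds on. The function names and toy data are hypothetical.

```python
def train_batch(weights, batch, lr=0.1):
    """Adapt the previously learned classifier (weights) using one batch of bags.

    Each bag is (bag_label, list_of_instances); here instances simply inherit
    the bag label, a simplification of true instance-level MIL training.
    """
    w = list(weights)
    for bag_label, instances in batch:
        y = 1 if bag_label else -1
        for x in instances:
            score = sum(wi * xi for wi, xi in zip(w, x))
            if y * score <= 0:  # misclassified -> perceptron update
                w = [wi + lr * y * xi for wi, xi in zip(w, x)]
    return w

def bamil_style_training(bags, batch_size=2, dim=2):
    """Process bags batch by batch; each batch adapts the latest classifier
    instead of retraining on all bags at once."""
    w = [0.0] * dim
    for i in range(0, len(bags), batch_size):
        w = train_batch(w, bags[i:i + batch_size])
    return w

# Hypothetical toy data: 2-D instances grouped into positive/negative bags.
bags = [
    (True,  [( 1.0,  1.0), (0.2, -0.1)]),
    (False, [(-1.0, -1.0), (-0.5, -0.8)]),
    (True,  [( 0.9,  1.2)]),
    (False, [(-1.2, -0.7)]),
]
w = bamil_style_training(bags)
```

Because every batch starts from the latest classifier rather than from scratch, each update step touches only one batch of bags, which is the source of the speedup the abstract claims and what makes the scheme usable in streaming settings such as tracking.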
DOI: 10.1109/CVPR.2012.6247949
Rights: © 2012 IEEE.
Fulltext Permission: none
Fulltext Availability: No Fulltext
Appears in Collections:SCSE Conference Papers

Items in DR-NTU are protected by copyright, with all rights reserved, unless otherwise indicated.