Please use this identifier to cite or link to this item: https://hdl.handle.net/10356/84500
Title: Online multiple kernel classification
Authors: Jin, Rong.
Zhao, Peilin.
Yang, Tianbao.
Hoi, Steven C. H.
Keywords: DRNTU::Engineering::Computer science and engineering
Issue Date: 2012
Source: Hoi, S. C. H., Jin, R., Zhao, P., & Yang, T. (2013). Online Multiple Kernel Classification. Machine Learning, 90(2), 289-316.
Series/Report no.: Machine learning
Abstract: Although both online learning and kernel learning have been studied extensively in machine learning, limited effort has been devoted to the research problems at their intersection. As an attempt to fill this gap, we address a new research problem, termed Online Multiple Kernel Classification (OMKC), which learns a kernel-based prediction function by selecting a subset of predefined kernel functions in an online learning fashion. OMKC is generally more challenging than typical online learning because both the kernel classifiers and the subset of selected kernels are unknown, and, more importantly, the solutions for the kernel classifiers and their combination weights are correlated. The proposed algorithms are based on the fusion of two online learning algorithms: the Perceptron algorithm, which learns a classifier for a given kernel, and the Hedge algorithm, which combines classifiers by linear weights. We further develop stochastic selection strategies that randomly select a subset of kernels for combination and model updating, thus improving learning efficiency. Our empirical study on 15 data sets shows promising performance of the proposed OMKC algorithms in both learning efficiency and prediction accuracy.
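The abstract describes fusing per-kernel Perceptrons with a Hedge-style weighted combination. The following is a minimal sketch of that idea, not the authors' exact formulation: the two kernel functions, the Hedge discount factor `beta`, and the toy data are illustrative assumptions.

```python
import numpy as np

class OMKCSketch:
    """Toy online multiple-kernel classifier: one kernel Perceptron per
    predefined kernel, combined by Hedge-discounted linear weights.
    (Illustrative sketch; parameter choices are assumptions.)"""

    def __init__(self, kernels, beta=0.8):
        self.kernels = kernels
        self.beta = beta                        # Hedge discount factor (assumed)
        self.w = np.ones(len(kernels))          # combination weights
        self.sv = [[] for _ in kernels]         # support vectors per kernel
        self.alpha = [[] for _ in kernels]      # Perceptron coefficients per kernel

    def _f(self, i, x):
        # Kernel Perceptron decision value for kernel i.
        return sum(a * self.kernels[i](v, x)
                   for a, v in zip(self.alpha[i], self.sv[i]))

    def predict(self, x):
        # Weighted vote over the per-kernel sign predictions.
        p = np.array([1.0 if self._f(i, x) >= 0 else -1.0
                      for i in range(len(self.kernels))])
        q = self.w / self.w.sum()
        return 1.0 if q @ p >= 0 else -1.0

    def fit_one(self, x, y):
        # Online step: each mistaken kernel Perceptron stores (x, y) as a
        # support vector, and Hedge discounts that kernel's weight.
        yhat = self.predict(x)
        for i in range(len(self.kernels)):
            if (1.0 if self._f(i, x) >= 0 else -1.0) != y:
                self.sv[i].append(x)
                self.alpha[i].append(y)
                self.w[i] *= self.beta
        return yhat

# Demo on synthetic linearly separable data (assumed example).
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 2))
y = np.where(X[:, 0] + 0.5 * X[:, 1] > 0, 1.0, -1.0)

model = OMKCSketch([
    lambda a, b: a @ b,                            # linear kernel
    lambda a, b: np.exp(-np.sum((a - b) ** 2)),    # RBF kernel, gamma=1
])
for _ in range(2):                 # two online passes over the stream
    for x, t in zip(X, y):
        model.fit_one(x, t)
acc = np.mean([model.predict(x) == t for x, t in zip(X, y)])
```

The paper's stochastic variants would additionally sample which kernels to query and update at each round; this deterministic sketch updates every kernel on every mistake.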
URI: https://hdl.handle.net/10356/84500
http://hdl.handle.net/10220/17285
DOI: 10.1007/s10994-012-5319-2
Fulltext Permission: none
Fulltext Availability: No Fulltext
Appears in Collections:SCSE Journal Articles

Items in DR-NTU are protected by copyright, with all rights reserved, unless otherwise indicated.