Please use this identifier to cite or link to this item: https://hdl.handle.net/10356/162582
Full metadata record
DC Field: Value (Language)
dc.contributor.author: Liu, Haitao (en_US)
dc.contributor.author: Ong, Yew-Soon (en_US)
dc.contributor.author: Jiang, Xiaomo (en_US)
dc.contributor.author: Wang, Xiaofang (en_US)
dc.date.accessioned: 2022-10-31T05:34:18Z
dc.date.available: 2022-10-31T05:34:18Z
dc.date.issued: 2021
dc.identifier.citation: Liu, H., Ong, Y., Jiang, X. & Wang, X. (2021). Modulating scalable Gaussian processes for expressive statistical learning. Pattern Recognition, 120, 108121. https://dx.doi.org/10.1016/j.patcog.2021.108121 (en_US)
dc.identifier.issn: 0031-3203 (en_US)
dc.identifier.uri: https://hdl.handle.net/10356/162582
dc.description.abstract: For a learning task, the Gaussian process (GP) learns the statistical relationship between inputs and outputs, offering not only the prediction mean but also the associated variability. The vanilla GP, however, struggles to learn complicated distributions exhibiting, e.g., heteroscedastic noise, multi-modality, and non-stationarity from massive data, due to its Gaussian marginals and cubic complexity. To this end, this article studies new scalable GP paradigms, including the non-stationary heteroscedastic GP, the mixture of GPs, and the latent GP, which introduce additional latent variables to modulate the outputs or inputs in order to learn richer, non-Gaussian statistical representations. In particular, we resort to different variational inference strategies to arrive at analytical or tighter evidence lower bounds (ELBOs) on the marginal likelihood for efficient and effective model training. Extensive numerical experiments against state-of-the-art GP and neural network (NN) counterparts on various tasks verify the superiority of these scalable modulated GPs, especially the scalable latent GP, for learning diverse data distributions. (en_US)
dc.language.iso: en (en_US)
dc.relation.ispartof: Pattern Recognition (en_US)
dc.rights: © 2021 Elsevier Ltd. All rights reserved. (en_US)
dc.subject: Engineering::Computer science and engineering (en_US)
dc.title: Modulating scalable Gaussian processes for expressive statistical learning (en_US)
dc.type: Journal Article (en)
dc.contributor.school: School of Computer Science and Engineering (en_US)
dc.identifier.doi: 10.1016/j.patcog.2021.108121
dc.identifier.scopus: 2-s2.0-85108971042
dc.identifier.volume: 120 (en_US)
dc.identifier.spage: 108121 (en_US)
dc.subject.keywords: Gaussian Process (en_US)
dc.subject.keywords: Modulation (en_US)
dc.description.acknowledgement: This work was supported by the National Key Research and Development Program of China (2020YFA0714403), the National Natural Science Foundation of China (52005074), and the Fundamental Research Funds for the Central Universities (DUT19RC(3)070). It was also partially supported by the Research and Innovation in Science and Technology Major Project of Liaoning Province (2019JH1-10100024) and the MIIT Marine Welfare Project (Z135060009002). (en_US)
item.grantfulltext: none
item.fulltext: No Fulltext
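As context for the abstract above: it contrasts the vanilla GP's cubic training cost with the scalable modulated variants the article proposes. A minimal NumPy sketch (not taken from the article; kernel, hyperparameters, and toy data are illustrative assumptions) of the exact GP posterior shows where that O(n^3) cost arises, namely in factorizing the n x n kernel matrix:

```python
import numpy as np

def rbf_kernel(X1, X2, lengthscale=0.2, variance=1.0):
    """Squared-exponential kernel k(x, x') = s^2 exp(-(x - x')^2 / (2 l^2))."""
    d2 = (X1[:, None] - X2[None, :]) ** 2
    return variance * np.exp(-0.5 * d2 / lengthscale**2)

def gp_posterior(X_train, y_train, X_test, noise=1e-2):
    """Exact GP posterior mean and variance at X_test.

    The Cholesky factorization of the n x n kernel matrix is the O(n^3)
    step that motivates the scalable approximations discussed in the abstract.
    """
    K = rbf_kernel(X_train, X_train) + noise * np.eye(len(X_train))
    L = np.linalg.cholesky(K)                              # O(n^3)
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, y_train))
    K_s = rbf_kernel(X_train, X_test)
    mean = K_s.T @ alpha                                   # predictive mean
    v = np.linalg.solve(L, K_s)
    var = rbf_kernel(X_test, X_test).diagonal() - np.sum(v**2, axis=0)
    return mean, var                                       # noise-free latent variance

# Toy data: one period of a sine, 20 evenly spaced observations.
X = np.linspace(0.0, 1.0, 20)
y = np.sin(2 * np.pi * X)
mu, var = gp_posterior(X, y, np.array([0.25, 0.75]))
```

Because the vanilla GP returns both `mu` and `var`, it captures predictive uncertainty, but only with a Gaussian marginal and homoscedastic noise; the modulated paradigms in the article attach latent variables to the inputs or outputs precisely to relax those two restrictions.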
Appears in Collections:SCSE Journal Articles

SCOPUS™ Citations: 3 (updated on Jan 31, 2023)
Web of Science™ Citations: 3 (updated on Feb 4, 2023)
Page view(s): 13 (updated on Feb 7, 2023)

Items in DR-NTU are protected by copyright, with all rights reserved, unless otherwise indicated.