Please use this identifier to cite or link to this item: https://hdl.handle.net/10356/99545
Title: Discretized-Vapnik-Chervonenkis dimension for analyzing complexity of real function classes
Authors: Zhang, Chao
Bian, Wei
Tao, Dacheng
Lin, Weisi
Keywords: DRNTU::Engineering::Computer science and engineering
Issue Date: 2012
Source: Zhang, C., Bian, W., Tao, D., & Lin, W. (2012). Discretized-Vapnik-Chervonenkis dimension for analyzing complexity of real function classes. IEEE transactions on neural networks and learning systems, 23(9), 1461-1472.
Series/Report no.: IEEE transactions on neural networks and learning systems
Abstract: In this paper, we introduce the discretized-Vapnik-Chervonenkis (VC) dimension for studying the complexity of a real function class, and then analyze properties of real function classes and neural networks. We first prove that a countable traversal set suffices to achieve the VC dimension of a real function class, whereas the classical definition takes the traversal set to be the entire output range of the function class. Based on this result, we propose the discretized-VC dimension, defined using a countable traversal set consisting of rational numbers in the range of a real function class. Using the discretized-VC dimension, we show that if a real function class has a finite VC dimension, a finite traversal set is enough to achieve the VC dimension. We then point out that real function classes with infinite VC dimension can be grouped into two categories: TYPE-A and TYPE-B. Subsequently, based on the obtained results, we discuss the relationship between the VC dimension of an indicator-output network and that of a real-output network, when the two networks share the same structure except for the output activation functions. Finally, we present a risk bound, based on the discretized-VC dimension, for a real function class that has infinite VC dimension and is of TYPE-A. We prove that, for such a function class, the empirical risk minimization (ERM) principle is still consistent with overwhelming probability. This develops the existing result that ERM learning is consistent if and only if the function class has a finite VC dimension.
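The core idea in the abstract, that shattering can be tested against a countable (rational) traversal set of thresholds rather than the whole output range, can be illustrated with a minimal sketch. This is not code from the paper; the toy function class, the rational grid, and the helper name `shatters` are all illustrative assumptions.

```python
from fractions import Fraction
from itertools import product

def shatters(funcs, points, thresholds):
    # A sign pattern is the tuple of labels sign(f(x_i) - t_i) over the
    # points; `points` are shattered at `thresholds` if every one of the
    # 2^n possible patterns is realized by some function in the class.
    patterns = set()
    for f in funcs:
        patterns.add(tuple(f(x) > t for x, t in zip(points, thresholds)))
    return len(patterns) == 2 ** len(points)

# Toy real function class: linear functions f_a(x) = a * x for a few slopes.
funcs = [lambda x, a=a: a * x for a in (-2, -1, 1, 2)]

# Discretized traversal set: rational thresholds covering the output range,
# standing in for the countable set used by the discretized-VC dimension.
grid = [Fraction(k, 2) for k in range(-4, 5)]

# Search the finite rational grid for threshold assignments that witness
# shattering of a single point (here the threshold t = 0 succeeds).
points = [1]
found = any(shatters(funcs, points, ts)
            for ts in product(grid, repeat=len(points)))
```

The point of the sketch is that the search ranges over a finite rational grid only, mirroring the abstract's claim that for a class of finite VC dimension a finite traversal set already achieves the dimension.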
URI: https://hdl.handle.net/10356/99545
http://hdl.handle.net/10220/13524
ISSN: 2162-237X
DOI: http://dx.doi.org/10.1109/TNNLS.2012.2204773
Rights: © 2012 IEEE
Fulltext Permission: none
Fulltext Availability: No Fulltext
Appears in Collections:SCSE Journal Articles
Items in DR-NTU are protected by copyright, with all rights reserved, unless otherwise indicated.