Computationally efficient models for high-dimensional and large-scale classification problems
Date of Issue: 2009
School of Computer Engineering
Centre for Computational Intelligence
Generally, there are two main objectives in designing modern learning models for problems with high-dimensional input spaces and large amounts of data. First, the model must be effective, achieving good accuracy; second, it must be efficient in terms of scalability and computational complexity. In practice, these objectives require different types of learning models to address different difficulties. For parametric models such as the radial basis function (RBF) network, the main difficulty is the deterioration in accuracy and the increase in computational complexity on high-dimensional data, which can be caused by the inductive nature of learning problems and the curse of dimensionality. For nonparametric models such as the Gaussian process (GP), the computational demand can become extremely high when there is a large amount of data to process. These difficulties are the main obstacles preventing many successful traditional models from being applied to high-dimensional, large-scale applications.
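The scalability difficulty mentioned for the GP can be made concrete with a minimal sketch (not taken from the thesis; the kernel, data, and hyperparameters below are illustrative assumptions): exact GP regression requires factorizing an n × n kernel matrix, whose Cholesky decomposition costs O(n³) time and O(n²) memory, which is the bottleneck for large n.

```python
import numpy as np

def rbf_kernel(X1, X2, lengthscale=1.0):
    # Squared-exponential (RBF) kernel matrix between two point sets.
    d2 = ((X1[:, None, :] - X2[None, :, :]) ** 2).sum(-1)
    return np.exp(-0.5 * d2 / lengthscale ** 2)

def gp_predict(X, y, Xs, noise=1e-2):
    # Exact GP regression. The Cholesky of the n x n kernel matrix
    # costs O(n^3) time and O(n^2) memory -- the scaling bottleneck
    # that motivates approximate/sparse GP methods for large n.
    K = rbf_kernel(X, X) + noise * np.eye(len(X))
    L = np.linalg.cholesky(K)
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, y))
    return rbf_kernel(Xs, X) @ alpha

# Toy 1-D regression problem (illustrative data, not from the thesis).
rng = np.random.default_rng(0)
X = rng.uniform(-3, 3, size=(200, 1))
y = np.sin(X[:, 0])
mu = gp_predict(X, y, np.array([[0.5]]))
```

For n training points the kernel matrix alone holds n² entries, so doubling the data quadruples memory and increases factorization time roughly eightfold, which is why exact GP inference becomes impractical at large scale.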
DRNTU::Engineering::Computer science and engineering::Computing methodologies::Pattern recognition