Computationally efficient models for high-dimensional and large-scale classification problems

Bibliographic Details
Main Author: Ma, Li
Other Authors: Abdul Wahab Bin Abdul Rahman
Format: Theses and Dissertations
Language: English
Published: 2009
Subjects:
Online Access: https://hdl.handle.net/10356/19093
Institution: Nanyang Technological University
Description
Summary: Generally, there are two main objectives in designing modern learning models for problems with high-dimensional input spaces and large amounts of data. First, the model must be effective, achieving good accuracy; second, it must be efficient in terms of scalability and computational complexity. In practice, these objectives call for different types of learning models to address different difficulties. For parametric models such as the radial basis function (RBF) network, the main difficulty is the deterioration in accuracy and the increase in computational complexity on high-dimensional data, which can be caused by the inductive nature of learning problems and the curse of dimensionality. For nonparametric models such as the Gaussian process (GP), the computational demand can become extremely high when a large amount of data must be processed. These difficulties are the main obstacles preventing many successful traditional models from being applied to high-dimensional and large-scale data applications.
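
As a rough illustration of the second difficulty (not taken from the thesis itself), the following minimal Python sketch of exact GP regression, assuming an RBF kernel and synthetic data, shows where the cost comes from: the n x n kernel matrix over the training set must be built and factorised, so training time grows roughly as O(n^3) and memory as O(n^2) in the number of training points n.

import numpy as np

def rbf_kernel(X1, X2, lengthscale=1.0):
    # Squared-exponential (RBF) kernel between two sets of points.
    d2 = np.sum(X1**2, 1)[:, None] + np.sum(X2**2, 1)[None, :] - 2 * X1 @ X2.T
    return np.exp(-0.5 * d2 / lengthscale**2)

def gp_predict(X_train, y_train, X_test, noise=1e-2):
    n = X_train.shape[0]
    K = rbf_kernel(X_train, X_train) + noise * np.eye(n)   # n x n kernel matrix: O(n^2) memory
    L = np.linalg.cholesky(K)                               # factorisation: O(n^3) time
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, y_train))  # alpha = K^{-1} y
    K_s = rbf_kernel(X_test, X_train)                       # cross-covariances, m x n
    return K_s @ alpha                                      # predictive mean

# Toy usage with assumed synthetic data: already at a few thousand points
# the cubic solve dominates, which is why exact GPs struggle at large scale.
rng = np.random.default_rng(0)
X = rng.uniform(-3, 3, size=(2000, 5))
y = np.sin(X).sum(axis=1) + 0.1 * rng.standard_normal(2000)
X_new = rng.uniform(-3, 3, size=(10, 5))
print(gp_predict(X, y, X_new))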