Global-Local Partial Least Squares Discriminant Analysis And Its Extension In Reproducing Kernel Hilbert Space
Main Author: Muhammad, Aminu
Format: Thesis
Language: English
Published: 2021
Subjects: QA1-939 Mathematics
Online Access: http://eprints.usm.my/51628/1/AMINU%20MUHAMMAD.pdf http://eprints.usm.my/51628/
Institution: Universiti Sains Malaysia
id: my.usm.eprints.51628
record_format: eprints
spelling: my.usm.eprints.51628 http://eprints.usm.my/51628/ Global-Local Partial Least Squares Discriminant Analysis And Its Extension In Reproducing Kernel Hilbert Space Muhammad, Aminu QA1-939 Mathematics Subspace learning is an essential approach for learning a low dimensional representation of a high dimensional space. When data samples are represented as points in a high dimensional space, learning with the high dimensionality becomes challenging as the effectiveness and efficiency of the learning algorithms drop significantly as the dimensionality increases. Thus, subspace learning techniques are employed to reduce the dimensionality of the data prior to employing other learning algorithms. Recently, there has been a lot of interest in subspace learning techniques that are based on the global and local structure preserving (GLSP) framework. The main idea of the GLSP approach is to find a transformation of the high dimensional data into a lower dimensional subspace, where both the global and local structure information of the data are preserved in the lower dimensional subspace. This thesis considers the case where data is sampled from an underlying manifold embedded in a high dimensional ambient space. Two novel subspace learning algorithms called locality preserving partial least squares discriminant analysis (LPPLS-DA) and neighborhood preserving partial least squares discriminant analysis (NPPLS-DA), which are based on the GLSP framework, are proposed for discriminant subspace learning. Unlike the conventional partial least squares discriminant analysis (PLS-DA), which aims at preserving only the global Euclidean structure of the data space, the proposed LPPLS-DA and NPPLS-DA algorithms find an embedding that preserves both the global and local manifold structure. 2021-04 Thesis NonPeerReviewed application/pdf en http://eprints.usm.my/51628/1/AMINU%20MUHAMMAD.pdf Muhammad, Aminu (2021) Global-Local Partial Least Squares Discriminant Analysis And Its Extension In Reproducing Kernel Hilbert Space. PhD thesis, Perpustakaan Hamzah Sendut.
institution: Universiti Sains Malaysia
building: Hamzah Sendut Library
collection: Institutional Repository
continent: Asia
country: Malaysia
content_provider: Universiti Sains Malaysia
content_source: USM Institutional Repository
url_provider: http://eprints.usm.my/
language: English
topic: QA1-939 Mathematics
description: Subspace learning is an essential approach for learning a low dimensional representation of a high dimensional space. When data samples are represented as points in a high dimensional space, learning with the high dimensionality becomes challenging as the effectiveness and efficiency of the learning algorithms drop significantly as the dimensionality increases. Thus, subspace learning techniques are employed to reduce the dimensionality of the data prior to employing other learning algorithms. Recently, there has been a lot of interest in subspace learning techniques that are based on the global and local structure preserving (GLSP) framework. The main idea of the GLSP approach is to find a transformation of the high dimensional data into a lower dimensional subspace, where both the global and local structure information of the data are preserved in the lower dimensional subspace. This thesis considers the case where data is sampled from an underlying manifold embedded in a high dimensional ambient space. Two novel subspace learning algorithms called locality preserving partial least squares discriminant analysis (LPPLS-DA) and neighborhood preserving partial least squares discriminant analysis (NPPLS-DA), which are based on the GLSP framework, are proposed for discriminant subspace learning. Unlike the conventional partial least squares discriminant analysis (PLS-DA), which aims at preserving only the global Euclidean structure of the data space, the proposed LPPLS-DA and NPPLS-DA algorithms find an embedding that preserves both the global and local manifold structure.
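To make the global-local idea in the abstract concrete, below is a minimal, hypothetical Python sketch of a GLSP-style projection: it balances the PLS-DA between-class covariance term against a k-nearest-neighbour graph Laplacian penalty and takes the leading eigenvectors as projection directions. The function name `glsp_pls_da` and the parameters `alpha` (global/local trade-off) and `k` (neighbourhood size) are illustrative assumptions, not the thesis's exact LPPLS-DA or NPPLS-DA formulation.

```python
import numpy as np
from scipy.linalg import eigh
from scipy.spatial.distance import cdist

def glsp_pls_da(X, y, n_components=2, alpha=0.5, k=5):
    """Hypothetical GLSP-style discriminant projection (illustrative only).

    Combines the global PLS-DA covariance term X^T Y Y^T X with a local
    penalty X^T L X built from a k-NN graph Laplacian, then returns the
    leading eigenvectors as projection directions.
    """
    X = X - X.mean(axis=0)                               # centre the data
    classes = np.unique(y)
    Y = (y[:, None] == classes[None, :]).astype(float)   # one-hot class indicator

    # Global term: between-block covariance used by PLS-DA.
    S_global = X.T @ Y @ Y.T @ X

    # Local term: unnormalised Laplacian of a symmetrised k-NN graph.
    D = cdist(X, X)
    n = X.shape[0]
    W = np.zeros((n, n))
    for i in range(n):
        nn = np.argsort(D[i])[1:k + 1]                   # skip the point itself
        W[i, nn] = 1.0
    W = np.maximum(W, W.T)                               # symmetrise adjacency
    L = np.diag(W.sum(axis=1)) - W
    S_local = X.T @ L @ X

    # Trade off global covariance against local smoothness.
    M = (1.0 - alpha) * S_global - alpha * S_local
    vals, vecs = eigh(M)                                 # ascending eigenvalues
    order = np.argsort(vals)[::-1]
    return vecs[:, order[:n_components]]                 # columns = projection axes

# Toy usage: project 10-dimensional data onto a 2-D discriminant subspace.
rng = np.random.default_rng(0)
X = rng.normal(size=(60, 10))
y = np.repeat([0, 1, 2], 20)
P = glsp_pls_da(X, y, n_components=2)
Z = X @ P
print(Z.shape)  # (60, 2)
```

For the kernel extension named in the title, the same construction would presumably be carried out on kernel-mapped features in a reproducing kernel Hilbert space rather than on X directly; the sketch above stays in the original input space.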
format: Thesis
author: Muhammad, Aminu
title: Global-Local Partial Least Squares Discriminant Analysis And Its Extension In Reproducing Kernel Hilbert Space
publishDate: 2021
url: http://eprints.usm.my/51628/1/AMINU%20MUHAMMAD.pdf http://eprints.usm.my/51628/
_version_: 1725973339798765568