Global-Local Partial Least Squares Discriminant Analysis And Its Extension In Reproducing Kernel Hilbert Space

Subspace learning is an essential approach for learning a low dimensional representation of a high dimensional space. When data samples are represented as points in a high dimensional space, learning with the high dimensionality becomes challenging as the effectiveness and efficiency of the learning algorithms drop significantly as the dimensionality increases. Thus, subspace learning techniques are employed to reduce the dimensionality of the data prior to employing other learning algorithms. Recently, there has been considerable interest in subspace learning techniques that are based on the global and local structure preserving (GLSP) framework. The main idea of the GLSP approach is to find a transformation of the high dimensional data into a lower dimensional subspace, where both the global and local structure information of the data are preserved in the lower dimensional subspace. This thesis considers the case where data is sampled from an underlying manifold embedded in a high dimensional ambient space. Two novel subspace learning algorithms called locality preserving partial least squares discriminant analysis (LPPLS-DA) and neighborhood preserving partial least squares discriminant analysis (NPPLS-DA), which are based on the GLSP framework, are proposed for discriminant subspace learning. Unlike the conventional partial least squares discriminant analysis (PLS-DA), which aims at preserving only the global Euclidean structure of the data space, the proposed LPPLS-DA and NPPLS-DA algorithms find an embedding that preserves both the global and local manifold structure.

Bibliographic Details
Main Author: Muhammad, Aminu
Format: Thesis
Language:English
Published: 2021
Subjects:
Online Access:http://eprints.usm.my/51628/1/AMINU%20MUHAMMAD.pdf
id my-usm-ep.51628
record_format uketd_dc
spelling my-usm-ep.516282022-02-23T07:22:34Z Global-Local Partial Least Squares Discriminant Analysis And Its Extension In Reproducing Kernel Hilbert Space 2021-04 Muhammad, Aminu QA1-939 Mathematics Subspace learning is an essential approach for learning a low dimensional representation of a high dimensional space. When data samples are represented as points in a high dimensional space, learning with the high dimensionality becomes challenging as the effectiveness and efficiency of the learning algorithms drop significantly as the dimensionality increases. Thus, subspace learning techniques are employed to reduce the dimensionality of the data prior to employing other learning algorithms. Recently, there has been considerable interest in subspace learning techniques that are based on the global and local structure preserving (GLSP) framework. The main idea of the GLSP approach is to find a transformation of the high dimensional data into a lower dimensional subspace, where both the global and local structure information of the data are preserved in the lower dimensional subspace. This thesis considers the case where data is sampled from an underlying manifold embedded in a high dimensional ambient space. Two novel subspace learning algorithms called locality preserving partial least squares discriminant analysis (LPPLS-DA) and neighborhood preserving partial least squares discriminant analysis (NPPLS-DA), which are based on the GLSP framework, are proposed for discriminant subspace learning. Unlike the conventional partial least squares discriminant analysis (PLS-DA), which aims at preserving only the global Euclidean structure of the data space, the proposed LPPLS-DA and NPPLS-DA algorithms find an embedding that preserves both the global and local manifold structure. 2021-04 Thesis http://eprints.usm.my/51628/ http://eprints.usm.my/51628/1/AMINU%20MUHAMMAD.pdf application/pdf en public phd doctoral Perpustakaan Hamzah Sendut Pusat Pengajian Sains Matematik
institution Universiti Sains Malaysia
collection USM Institutional Repository
language English
topic QA1-939 Mathematics
spellingShingle QA1-939 Mathematics
Muhammad, Aminu
Global-Local Partial Least Squares Discriminant Analysis And Its Extension In Reproducing Kernel Hilbert Space
description Subspace learning is an essential approach for learning a low dimensional representation of a high dimensional space. When data samples are represented as points in a high dimensional space, learning with the high dimensionality becomes challenging as the effectiveness and efficiency of the learning algorithms drop significantly as the dimensionality increases. Thus, subspace learning techniques are employed to reduce the dimensionality of the data prior to employing other learning algorithms. Recently, there has been considerable interest in subspace learning techniques that are based on the global and local structure preserving (GLSP) framework. The main idea of the GLSP approach is to find a transformation of the high dimensional data into a lower dimensional subspace, where both the global and local structure information of the data are preserved in the lower dimensional subspace. This thesis considers the case where data is sampled from an underlying manifold embedded in a high dimensional ambient space. Two novel subspace learning algorithms called locality preserving partial least squares discriminant analysis (LPPLS-DA) and neighborhood preserving partial least squares discriminant analysis (NPPLS-DA), which are based on the GLSP framework, are proposed for discriminant subspace learning. Unlike the conventional partial least squares discriminant analysis (PLS-DA), which aims at preserving only the global Euclidean structure of the data space, the proposed LPPLS-DA and NPPLS-DA algorithms find an embedding that preserves both the global and local manifold structure.
format Thesis
qualification_name Doctor of Philosophy (PhD.)
qualification_level Doctorate
author Muhammad, Aminu
author_facet Muhammad, Aminu
author_sort Muhammad, Aminu
title Global-Local Partial Least Squares Discriminant Analysis And Its Extension In Reproducing Kernel Hilbert Space
title_short Global-Local Partial Least Squares Discriminant Analysis And Its Extension In Reproducing Kernel Hilbert Space
title_full Global-Local Partial Least Squares Discriminant Analysis And Its Extension In Reproducing Kernel Hilbert Space
title_fullStr Global-Local Partial Least Squares Discriminant Analysis And Its Extension In Reproducing Kernel Hilbert Space
title_full_unstemmed Global-Local Partial Least Squares Discriminant Analysis And Its Extension In Reproducing Kernel Hilbert Space
title_sort global-local partial least squares discriminant analysis and its extension in reproducing kernel hilbert space
granting_institution Perpustakaan Hamzah Sendut
granting_department Pusat Pengajian Sains Matematik
publishDate 2021
url http://eprints.usm.my/51628/1/AMINU%20MUHAMMAD.pdf
_version_ 1747822090973609984