On feature selection with principal component analysis for one-class SVM
In this short note, we demonstrate the use of principal component analysis (PCA) as a dimension-reduction tool for the one-class support vector machine (one-class SVM). However, unlike almost all other uses of PCA, which extract the eigenvectors associated with the top eigenvalues as the projection directions, here it is the eigenvectors associated with the small eigenvalues that are of interest, and in particular the null space, since the null space characterizes the common features of the training samples. Image retrieval examples are used to illustrate the effectiveness of the dimension reduction.
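To make the idea concrete, here is a minimal sketch of the approach the abstract describes, not the paper's exact procedure: project the data onto the covariance eigenvectors with the smallest eigenvalues (an approximate null space) and train a one-class SVM on those coordinates. The toy data, the `null_dim` cutoff, and the SVM hyperparameters are illustrative assumptions; only NumPy and scikit-learn are used.

```python
import numpy as np
from sklearn.svm import OneClassSVM

rng = np.random.default_rng(0)

# Toy training class: 100 samples in 50 dimensions that only vary within a
# 10-dimensional subspace, so the remaining directions are (nearly) constant
# "common features" shared by every training sample.
n, d, k = 100, 50, 10
basis = rng.standard_normal((d, k))
X_train = rng.standard_normal((n, k)) @ basis.T + 0.01 * rng.standard_normal((n, d))

# Eigendecomposition of the sample covariance of the centred data.
mean = X_train.mean(axis=0)
cov = (X_train - mean).T @ (X_train - mean) / n
eigvals, eigvecs = np.linalg.eigh(cov)      # eigenvalues in ascending order

# Keep the eigenvectors with the *smallest* eigenvalues (approximate null
# space) rather than the usual top components; null_dim is a hypothetical
# cutoff that would normally be chosen by thresholding eigvals.
null_dim = d - k
W = eigvecs[:, :null_dim]                   # (d, null_dim) projection matrix

# Fit a one-class SVM on the null-space coordinates: in-class samples project
# close to the origin there, so the SVM needs only these few directions.
ocsvm = OneClassSVM(kernel="rbf", gamma="scale", nu=0.1)
ocsvm.fit((X_train - mean) @ W)

# Sanity check: points from the class are accepted (+1), generic points are not (-1).
X_in = rng.standard_normal((20, k)) @ basis.T
X_out = rng.standard_normal((20, d))
print("in-class :", ocsvm.predict((X_in - mean) @ W))   # mostly +1
print("outliers :", ocsvm.predict((X_out - mean) @ W))  # mostly -1
```

Keeping the small-eigenvalue directions may look backwards, but for a single class it is exactly the directions along which the training data do not vary that describe what all class members have in common, which is what a one-class boundary should test.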
Main author: Lian, Heng
Other authors: School of Physical and Mathematical Sciences
Format: Journal Article
Language: English
Published: 2013
Subjects: DRNTU::Science::Mathematics
Citation: Lian, H. (2012). On feature selection with principal component analysis for one-class SVM. Pattern Recognition Letters, 33(9), 1027-1031. ISSN 0167-8655.
Online access: https://hdl.handle.net/10356/105603 http://hdl.handle.net/10220/17154 http://dx.doi.org/10.1016/j.patrec.2012.01.019
Institution: Nanyang Technological University