Conditional random mapping for effective ELM feature representation
Extreme learning machine (ELM) has been extensively studied due to its fast training and good generalization. Unfortunately, existing ELM-based feature representation methods are uncompetitive with state-of-the-art deep neural networks (DNNs) on complex visual recognition tasks. This weakness is mainly caused by two critical defects: (1) random feature mapping (RFM) drawn from an ad hoc probability distribution cannot reliably project varied input data into discriminative feature spaces; (2) in ELM-based hierarchical architectures, features from the previous layer are scattered by the RFM in the current layer, which makes the abstraction of higher-level features ineffective. To address these issues, we take advantage of label information to optimize the random mapping in the ELM, using an efficient label alignment metric to learn a conditional random feature mapping (CRFM) in a supervised manner. We further propose a CRFM-based single-layer ELM (CELM) and extend it to a supervised multi-layer learning architecture (ML-CELM). Extensive experiments on widely used datasets demonstrate that our approach is more effective than existing ELM-based and DNN feature representation methods while retaining rapid training and testing speed. The proposed CELM and ML-CELM achieve discriminative and robust feature representations and show superior generalization and speed across various simulations.
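As a rough illustration of the idea in the abstract, the sketch below builds a standard single-hidden-layer ELM with a random feature mapping, then conditions that mapping on the labels by scoring several candidate random projections with a kernel-alignment-style measure and keeping the best one, before solving the output weights in closed form. This is only a minimal sketch under stated assumptions: the sigmoid activation, the candidate-selection scheme, and the `label_alignment` score are stand-ins for the paper's actual CRFM formulation, and names such as `train_celm_like` are hypothetical.

```python
import numpy as np

# Illustrative sketch only: a label-conditioned random feature mapping for an ELM.
# Not the authors' reference implementation of CRFM/CELM.

def random_feature_map(X, W, b):
    """Project inputs with random weights and a sigmoid activation."""
    return 1.0 / (1.0 + np.exp(-(X @ W + b)))

def label_alignment(H, Y):
    """Toy label-alignment score: cosine similarity between the feature Gram
    matrix H H^T and the label Gram matrix Y Y^T (a kernel-alignment-style
    proxy; the paper's label alignment metric may differ)."""
    K, L = H @ H.T, Y @ Y.T
    return np.sum(K * L) / (np.linalg.norm(K) * np.linalg.norm(L) + 1e-12)

def train_celm_like(X, Y, n_hidden=200, n_candidates=10, reg=1e-3, seed=0):
    """Pick, among several candidate random mappings, the one that best aligns
    with the labels, then solve the output weights by ridge regression as in a
    standard ELM."""
    rng = np.random.default_rng(seed)
    best = None
    for _ in range(n_candidates):
        W = rng.standard_normal((X.shape[1], n_hidden))
        b = rng.standard_normal(n_hidden)
        H = random_feature_map(X, W, b)
        score = label_alignment(H, Y)
        if best is None or score > best[0]:
            best = (score, W, b, H)
    _, W, b, H = best
    # Regularized least squares for the output weights beta: H beta ~ Y.
    beta = np.linalg.solve(H.T @ H + reg * np.eye(n_hidden), H.T @ Y)
    return W, b, beta

if __name__ == "__main__":
    # Synthetic usage example with one-hot labels; predictions are argmax scores.
    X = np.random.randn(500, 20)
    labels = (X[:, 0] + X[:, 1] > 0).astype(int)
    Y = np.eye(2)[labels]
    W, b, beta = train_celm_like(X, Y)
    pred = random_feature_map(X, W, b) @ beta
    print(f"training accuracy: {(pred.argmax(1) == labels).mean():.3f}")
```

Because the output weights are obtained in closed form rather than by back-propagation, training remains fast, which is the ELM property the paper builds on; stacking several such layers would correspond, loosely, to the multi-layer ML-CELM variant.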
Main Authors: Li, Cheng; Deng, Chenwei; Zhou, Shichao; Zhao, Baojun; Huang, Guang-Bin
Other Authors: School of Electrical and Electronic Engineering
Format: Article
Language: English
Published: 2020
Subjects: Engineering::Electrical and electronic engineering; Extreme Learning Machine; Conditional Random Feature Mapping
Online Access: https://hdl.handle.net/10356/141688
Institution: Nanyang Technological University
Record ID: sg-ntu-dr.10356-141688
Type: Journal Article
Citation: Li, C., Deng, C., Zhou, S., Zhao, B., & Huang, G.-B. (2018). Conditional random mapping for effective ELM feature representation. Cognitive Computation, 10(5), 827-847. doi:10.1007/s12559-018-9557-x
Journal: Cognitive Computation, vol. 10, issue 5, pp. 827-847
ISSN: 1866-9956
DOI: 10.1007/s12559-018-9557-x
Scopus ID: 2-s2.0-85046824283
Date Issued: 2018
Date Deposited: 2020-06-10
Rights: © 2018 Springer Science+Business Media, LLC, part of Springer Nature. All rights reserved.
Building: NTU Library
Country: Singapore
Collection: DR-NTU