Weighting and pruning based ensemble deep random vector functional link network for tabular data classification

In this paper, we first integrate normalization into the Ensemble Deep Random Vector Functional Link network (edRVFL). This re-normalization step helps the network avoid divergence of the hidden features. We then propose novel variants of the edRVFL network. Weighted edRVFL (WedRVFL) uses weighting methods to assign training samples different weights in different layers according to how confidently they were classified in the previous layer, thereby increasing the ensemble's diversity and accuracy. Furthermore, a pruning-based edRVFL (PedRVFL) is also proposed: inferior neurons are pruned based on their importance for classification before the next hidden layer is generated, ensuring that randomly generated inferior features do not propagate to deeper layers. Finally, the combination of weighting and pruning, called the Weighting and Pruning based Ensemble Deep Random Vector Functional Link Network (WPedRVFL), is proposed. We compare the performance of these variants with other state-of-the-art classification methods on 24 tabular UCI classification datasets. The experimental results illustrate the superior performance of our proposed methods.
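The pipeline sketched in the abstract (random hidden features with re-normalization, pruning of weak neurons before the next layer, and a layer-wise ensemble vote) can be illustrated roughly as follows. This is a minimal sketch under stated assumptions, not the authors' exact WPedRVFL implementation: the neuron-importance score, the pruning fraction, and all function and parameter names here are hypothetical, and the per-sample weighting of WedRVFL is omitted for brevity.

```python
import numpy as np

def edrvfl_train_predict(X_train, y_train, X_test, n_layers=3, n_hidden=32,
                         reg=1e-2, prune_frac=0.2, rng_seed=0):
    """Illustrative edRVFL-style ensemble (simplified; not the paper's exact method)."""
    rng = np.random.default_rng(rng_seed)
    n_classes = int(y_train.max()) + 1
    Y = np.eye(n_classes)[y_train]           # one-hot targets
    H_tr, H_te = X_train, X_test             # inputs to the current layer
    votes = np.zeros((X_test.shape[0], n_classes))
    for _ in range(n_layers):
        # random hidden features, then re-normalize to curb feature divergence
        W = rng.standard_normal((H_tr.shape[1], n_hidden))
        b = rng.standard_normal(n_hidden)
        A_tr = np.tanh(H_tr @ W + b)
        A_te = np.tanh(H_te @ W + b)
        mu, sd = A_tr.mean(0), A_tr.std(0) + 1e-8
        A_tr, A_te = (A_tr - mu) / sd, (A_te - mu) / sd
        # prune "inferior" neurons: here, those least correlated with the targets
        # (a hypothetical importance score standing in for the paper's criterion)
        score = np.abs(A_tr.T @ (Y - Y.mean(0))).sum(1)
        keep = np.argsort(score)[int(prune_frac * n_hidden):]
        A_tr, A_te = A_tr[:, keep], A_te[:, keep]
        # direct link: concatenate the original inputs with the hidden features
        D_tr = np.hstack([X_train, A_tr])
        D_te = np.hstack([X_test, A_te])
        # closed-form ridge solution for this layer's output weights
        beta = np.linalg.solve(D_tr.T @ D_tr + reg * np.eye(D_tr.shape[1]),
                               D_tr.T @ Y)
        votes += D_te @ beta                 # soft-vote across layers
        H_tr, H_te = A_tr, A_te              # pruned features feed the next layer
    return votes.argmax(1)
```

Each layer is trained independently with a closed-form ridge solve (no backpropagation), which is what makes RVFL-style networks fast; only the pruned, re-normalized features are passed on, so weak random features cannot propagate to deeper layers.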


Bibliographic Details
Main Authors: Shi, Qiushi, Hu, Minghui, Suganthan, Ponnuthurai Nagaratnam, Katuwal, Rakesh
Other Authors: School of Electrical and Electronic Engineering
Format: Article
Language:English
Published: 2023
Subjects: Engineering::Electrical and electronic engineering; Weighting Methods; Pruning
Online Access:https://hdl.handle.net/10356/164112
Institution: Nanyang Technological University
Record ID: sg-ntu-dr.10356-164112
Citation: Shi, Q., Hu, M., Suganthan, P. N. & Katuwal, R. (2022). Weighting and pruning based ensemble deep random vector functional link network for tabular data classification. Pattern Recognition, 132, 108879.
DOI: 10.1016/j.patcog.2022.108879 (https://dx.doi.org/10.1016/j.patcog.2022.108879)
ISSN: 0031-3203
Scopus: 2-s2.0-85135340847
Deposited: 2023-01-05
Rights: © 2022 Elsevier Ltd. All rights reserved.
Collection: DR-NTU (NTU Library, Nanyang Technological University, Singapore)