Improving deep RVFL neural networks on large datasets

Neural networks have become popular among researchers for classification problems such as handwritten character recognition. Gradient descent is the most common way to train such networks; however, iterative methods of this kind tend to make training slow and can overfit the training data. Researchers have therefore studied randomised neural networks, such as the Random Vector Functional Link (RVFL) network and the Extreme Learning Machine (ELM), which significantly reduce training time while still achieving good classification performance. The ELM is a simplified form of the RVFL network: unlike RVFL, it has no direct link (and no bias) between the input layer and the output layer. In this project, improved deep RVFL neural networks are applied to classification problems. The work covers mainly deep RVFL networks and the Convolutional RVFL (CRVFL) network, which combines the RVFL network with a CNN. The parameters of the convolution kernels are generated randomly within a fixed range and then kept fixed, and the input to the fully connected layer consists of both the original input data and the convolved features. The test results show that the deep RVFL network is better suited to tabular data, whereas CRVFL outperforms the traditional deep RVFL on image data.
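The abstract describes the standard RVFL construction in prose: the input-to-hidden weights are random and fixed, a direct link carries the raw inputs straight to the output layer, and only the output weights are learned, in closed form, rather than by gradient descent. The sketch below illustrates that idea for a single-hidden-layer RVFL classifier. It is a minimal illustration only, assuming a tanh activation, uniform random initialisation and a ridge-regression solution; the function names (rvfl_fit, rvfl_predict) and hyperparameters are illustrative and not taken from the thesis.

import numpy as np

def rvfl_fit(X, Y, n_hidden=256, reg=1e-3, seed=0):
    # X: (n_samples, n_features) inputs; Y: (n_samples, n_classes) one-hot targets.
    # Hidden weights and biases are random and stay fixed; only the output
    # weights are learned, in closed form via ridge regression.
    rng = np.random.default_rng(seed)
    W = rng.uniform(-1.0, 1.0, size=(X.shape[1], n_hidden))  # fixed random input-to-hidden weights
    b = rng.uniform(-1.0, 1.0, size=n_hidden)                # fixed random hidden biases
    H = np.tanh(X @ W + b)                                   # hidden-layer activations
    D = np.hstack([H, X])                                    # direct link: append raw inputs to hidden features
    # Closed-form ridge solution for the output weights (no iterative training)
    beta = np.linalg.solve(D.T @ D + reg * np.eye(D.shape[1]), D.T @ Y)
    return W, b, beta

def rvfl_predict(X, W, b, beta):
    H = np.tanh(X @ W + b)
    D = np.hstack([H, X])
    return np.argmax(D @ beta, axis=1)

Dropping the raw-input columns from D recovers the ELM variant mentioned above (no direct link). The CRVFL variant described in the abstract would instead build the random features by applying fixed, randomly initialised convolution kernels to image inputs and concatenating the flattened result with the original data before solving for the output weights.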

Bibliographic Details
Main Author: Li, Bing
Other Authors: Ponnuthurai Nagaratnam Suganthan
Format: Thesis-Master by Coursework
Language:English
Published: Nanyang Technological University 2021
Subjects: Engineering::Electrical and electronic engineering
Thesis Degree: Master of Science (Computer Control and Automation)
School: School of Electrical and Electronic Engineering
Online Access:https://hdl.handle.net/10356/152334
Institution: Nanyang Technological University