Automatic hyperparameter optimization for machine learning
This project focuses on the concept of hyperparameters in a machine learning classification problem. Hyperparameters are parameter values that directly control the behaviour of a machine learning classification model. Automatic hyperparameter optimization is an active area of research that aims to replace the exhaustive process of tuning them by hand. In recent years, the success of Bayesian optimization within the sequential model-based optimization approach has pointed the way for the future of hyperparameter optimization, and has led to the development of many automatic hyperparameter optimization frameworks such as Hyperopt, Scikit-optimize and Optuna, each differing in its underlying optimization algorithms and performance. However, considerable effort is required to understand the complex and rigid structure of each framework, and their workflows are not easily reproducible on other practical machine learning problems.

This project aims to address those problems through the development of a unified function combining Hyperopt, Scikit-optimize and Optuna, offering greater flexibility, improved usability and visual aids. Experiments demonstrated that the program developed consistently produced a model that outperforms an untuned classification model, and that Hyperopt is superior in terms of accuracy score and speed. However, the results also showed that the program had not accounted for several factors that could have impeded the performance of the random forest classifier. Hence, as future work, functional ANOVA can be incorporated into the program to address these limitations.
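The thesis itself is only abstracted here, so its actual program is not reproduced. As a minimal illustration of the core idea — automatically searching a hyperparameter space instead of hand-tuning, and comparing the result against an untuned default — the following sketch runs a random search over a toy stand-in for a classifier's validation error. The parameter names `n_estimators` and `max_depth` echo the random forest hyperparameters the abstract mentions, but the objective function and default values are illustrative assumptions, not taken from the thesis.

```python
import random

def validation_error(n_estimators, max_depth):
    # Toy stand-in for a classifier's validation error; its minimum
    # sits at n_estimators=200, max_depth=10 (purely illustrative).
    return ((n_estimators - 200) / 200) ** 2 + ((max_depth - 10) / 10) ** 2

def random_search(n_trials=200, seed=0):
    """Try n_trials random configurations; return (best_error, best_params)."""
    rng = random.Random(seed)
    best = None
    for _ in range(n_trials):
        params = {
            "n_estimators": rng.randint(10, 500),
            "max_depth": rng.randint(1, 30),
        }
        err = validation_error(**params)
        if best is None or err < best[0]:
            best = (err, params)
    return best

# An "untuned" default configuration, as a baseline for comparison.
default_err = validation_error(100, 5)
best_err, best_params = random_search()
print(best_err < default_err)  # tuning should beat the default here
```

Real frameworks such as Hyperopt and Optuna replace the uniform sampling above with model-based (e.g. Bayesian/TPE) sampling, which is what makes them sample-efficient on expensive objectives.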
Main Author: Tan, Xavier Jun Sheng
Other Authors: Mao Kezhi; School of Electrical and Electronic Engineering; A*STAR Institute of High Performance Computing
Format: Final Year Project (FYP)
Language: English
Published: Nanyang Technological University, 2020
Subjects: Engineering::Electrical and electronic engineering
Degree: Bachelor of Engineering (Electrical and Electronic Engineering)
Online Access: https://hdl.handle.net/10356/139152
Institution: Nanyang Technological University
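The abstract's second paragraph describes a single unified entry point that can drive several optimization backends and compare them on accuracy and speed. The thesis's actual function is not available here, so the sketch below is a hypothetical illustration of that pattern: one `tune()` function dispatching to interchangeable search strategies (a random sampler and a coarse grid as stand-ins for Hyperopt/Scikit-optimize/Optuna backends) over a toy scoring function. All names and the objective are assumptions made for the example.

```python
import itertools
import random
import time

def toy_score(params):
    # Toy stand-in for cross-validated accuracy; it peaks near
    # learning_rate=0.1, n_estimators=200 (purely illustrative).
    lr, n = params["learning_rate"], params["n_estimators"]
    return 1.0 - (lr - 0.1) ** 2 - ((n - 200) / 500) ** 2

def random_strategy(n_trials, rng):
    # Backend 1: sample configurations uniformly at random.
    for _ in range(n_trials):
        yield {"learning_rate": rng.uniform(0.01, 1.0),
               "n_estimators": rng.randint(50, 500)}

def grid_strategy(n_trials, rng):
    # Backend 2: walk a coarse grid (rng unused; kept for a uniform signature).
    lrs = [0.01, 0.1, 0.5, 1.0]
    ns = [50, 100, 200, 400]
    for lr, n in itertools.islice(itertools.product(lrs, ns), n_trials):
        yield {"learning_rate": lr, "n_estimators": n}

def tune(strategy, n_trials=50, seed=0):
    """Unified entry point: run one backend, return (best_score, best_params, seconds)."""
    rng = random.Random(seed)
    start = time.perf_counter()
    best_score, best_params = float("-inf"), None
    for params in strategy(n_trials, rng):
        s = toy_score(params)
        if s > best_score:
            best_score, best_params = s, params
    return best_score, best_params, time.perf_counter() - start

# Compare the backends on best score and wall-clock time, as the
# thesis does for Hyperopt, Scikit-optimize and Optuna.
for name, strat in [("random", random_strategy), ("grid", grid_strategy)]:
    score, params, secs = tune(strat)
    print(f"{name}: best score {score:.3f} in {secs:.4f}s")
```

The design point is that the caller supplies only a strategy and a trial budget; the loop, timing and bookkeeping live in one place, which is the flexibility and usability benefit the abstract attributes to the unified function.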