A transformer-based deep learning model for predicting residential load demand

In this dissertation, a transformer-based deep learning model is introduced to predict household energy consumption from time series data. As the demand for high-precision consumption forecasting in smart energy management systems grows, traditional methods remain limited in capturing complex dependencies and handling large-scale datasets. The proposed model harnesses the Transformer architecture, particularly its attention mechanism, to capture the long-term dependencies prevalent in energy consumption data, thereby improving the accuracy of load-demand prediction for residential users. The project optimized the Transformer encoder layer to support batched data processing and improve training efficiency; the MSE loss function and the Adam optimizer are employed to ensure fast and stable convergence. In comparative experiments against traditional LSTM models, the proposed model shows significant advantages in accuracy, prediction speed, and the ability to handle large-scale datasets, demonstrating the potential of deep learning techniques for complex time-series data. The model's results on the residential energy consumption prediction task demonstrate its application value in future smart energy management systems; future work is expected to explore the model in other energy management scenarios and its contribution to sustainable energy utilization.
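The thesis itself is not reproduced in this record, but the attention mechanism the abstract credits with capturing long-term dependencies can be sketched in a few lines. The following is a minimal single-head self-attention in NumPy, not the author's implementation; the 24-step, 8-dimensional toy input standing in for an embedded load history is hypothetical.

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    """Attention(Q, K, V) = softmax(Q K^T / sqrt(d_k)) V.
    Each output row is a weighted mix of ALL value rows, which is how
    attention relates distant time steps in one operation."""
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)               # (T, T) pairwise similarities
    scores -= scores.max(axis=-1, keepdims=True)  # numerical stability for softmax
    weights = np.exp(scores)
    weights /= weights.sum(axis=-1, keepdims=True)  # rows sum to 1
    return weights @ V, weights

# Toy example: 24 (hypothetical hourly) load readings embedded in 8 dims.
rng = np.random.default_rng(0)
T, d = 24, 8
x = rng.normal(size=(T, d))                 # stand-in for an embedded history
out, w = scaled_dot_product_attention(x, x, x)  # self-attention: Q = K = V = x
```

In the full encoder this operation is applied per head with learned projections of the input; the sketch only shows the core softmax-weighted mixing step.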

Bibliographic Details
Main Author: Zhang, Xijia
Other Authors: Xu Yan
Format: Thesis-Master by Coursework
Language: English
Published: Nanyang Technological University 2024
Subjects: Engineering; Transformer; Deep learning model; Residential energy consumption; Time series data; Simulation evaluation; Python
Online Access: https://hdl.handle.net/10356/175442
Dataset: https://www.kaggle.com/datasets/taranvee/smart-home-dataset-with-weather-information/data
Citation: Zhang, X. (2024). A transformer-based deep learning model for predicting residential load demand. Master's thesis, Nanyang Technological University, Singapore. https://hdl.handle.net/10356/175442
School: School of Electrical and Electronic Engineering
Supervisor: Xu Yan (xuyan@ntu.edu.sg)
Degree: Master's degree (Thesis-Master by Coursework)
Deposited: 2024-04-23
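The abstract names the MSE loss and the Adam optimizer as the training recipe. As an illustration only (not the thesis code), the Adam update rule applied to an MSE objective can be sketched in NumPy on a hypothetical toy linear forecaster; all sizes and constants below are assumptions.

```python
import numpy as np

# Hypothetical toy regression: 256 windows of 8 load features each.
rng = np.random.default_rng(1)
X = rng.normal(size=(256, 8))
true_w = rng.normal(size=8)
y = X @ true_w + 0.01 * rng.normal(size=256)   # noisy linear targets

w = np.zeros(8)                     # model parameters
m = np.zeros(8)                     # Adam first-moment estimate
v = np.zeros(8)                     # Adam second-moment estimate
lr, b1, b2, eps = 1e-2, 0.9, 0.999, 1e-8  # standard Adam hyperparameters

for t in range(1, 2001):
    err = X @ w - y
    grad = 2 * X.T @ err / len(y)   # gradient of MSE = mean((Xw - y)^2)
    m = b1 * m + (1 - b1) * grad
    v = b2 * v + (1 - b2) * grad**2
    m_hat = m / (1 - b1**t)         # bias-corrected moments
    v_hat = v / (1 - b2**t)
    w -= lr * m_hat / (np.sqrt(v_hat) + eps)   # Adam parameter update

mse = np.mean((X @ w - y) ** 2)
```

The per-coordinate step scaling by the second-moment estimate is what gives Adam the fast, stable convergence the abstract attributes to it, compared with plain gradient descent at a single global learning rate.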
Repository: DR-NTU (NTU Library, Nanyang Technological University, Singapore)