PENINGKATAN KINERJA MIKROGRID CERDAS DENGAN MENGOPTIMALKAN PENJADWALAN OPERASI SISTEM BATERAI PENYIMPAN ENERGI MENGGUNAKAN METODE DEEP Q-LEARNING
(Improving Smart Microgrid Performance by Optimizing the Operation Scheduling of a Battery Energy Storage System Using the Deep Q-Learning Method)
Saved in:
Main Author: | Herizal, Herviyandi |
Format: | Theses |
Language: | Indonesia |
Online Access: | https://digilib.itb.ac.id/gdl/view/70401 |
Institution: | Institut Teknologi Bandung |
id |
id-itb.:70401 |
spelling |
id-itb.:70401 2023-01-10T12:55:23Z PENINGKATAN KINERJA MIKROGRID CERDAS DENGAN MENGOPTIMALKAN PENJADWALAN OPERASI SISTEM BATERAI PENYIMPAN ENERGI MENGGUNAKAN METODE DEEP Q-LEARNING Herizal, Herviyandi Indonesia Theses Battery Energy Storage System, Smart Microgrid, Deep Q-Learning, Optimal Scheduling, State of Charge. INSTITUT TEKNOLOGI BANDUNG https://digilib.itb.ac.id/gdl/view/70401 text |
institution |
Institut Teknologi Bandung |
building |
Institut Teknologi Bandung Library |
continent |
Asia |
country |
Indonesia |
content_provider |
Institut Teknologi Bandung |
collection |
Digital ITB |
language |
Indonesia |
description |
The world currently faces an energy security crisis: fossil energy reserves are being depleted, which threatens energy availability. Awareness of the importance of Renewable Energy (RE), which is environmentally friendly and free of polluting carbon emissions, is therefore urgently needed. Optimizing the operation of a smart Microgrid (MG) and its Battery Energy Storage System (BESS) against the intermittent nature of RE generators is a way to integrate all components in handling fluctuations both in electricity consumption and in RE output, which affect the Renewable Fraction (RF), i.e., the percentage of RE use in the MG. To optimize BESS operation in the MG, an energy management algorithm is needed to schedule battery charging and discharging. To optimize its performance, hybrid modeling can be used: a combination of machine-learning and physics-based modeling of the BESS (SBPE, its Indonesian acronym), the solar power plant, and the electrical load.
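To make the physics-based side of the hybrid model concrete, here is a minimal, self-contained sketch of a clear-sky PV output estimate. It is not the thesis's actual PVLib pipeline: the half-sine irradiance shape, the 10 m² array area, and the 18% efficiency are all illustrative assumptions, and `pv_power_estimate` is a hypothetical name.

```python
import math

def pv_power_estimate(hour, area_m2=10.0, efficiency=0.18, peak_irradiance=1000.0):
    """Toy clear-sky PV model: irradiance follows a half-sine between
    06:00 and 18:00 and is zero at night. All parameters are assumptions."""
    if hour < 6 or hour > 18:
        return 0.0
    frac = (hour - 6) / 12.0                       # 0 at sunrise, 1 at sunset
    irradiance = peak_irradiance * math.sin(math.pi * frac)  # W/m^2
    return irradiance * area_m2 * efficiency       # W (AC/DC losses ignored)

# 24-hour generation profile that a scheduler could consume
profile = [pv_power_estimate(h) for h in range(24)]
```

In a full implementation this stand-in would be replaced by a PVLib model chain driven by real weather data, with a machine-learning correction layered on top, as the abstract describes.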
In this study, the BESS was modeled from battery state-of-charge (SOC) data, the solar power plant was modeled with a physics-based model built on PVLib combined with machine-learning methods, and the electrical load was modeled with a machine-learning prediction method, namely deep Q-Learning. An energy management algorithm was then developed to improve the RF of the MG by optimizing the scheduling of battery charging and discharging with the deep Q-Learning method. This algorithm can be implemented in the MG digital twin (MGDT) framework, which maps physical objects to digital models. In applying the deep Q-Learning energy management algorithm, several scenarios were run that vary the punishment term of the reward function for undesirable BESS operating conditions, divided into high- and low-punishment cases. The average RF over seven consecutive days was 43.71% with high punishment and 45.52% with low punishment, improvements over the rule-based baseline algorithm of 2.8% and 6.67%, respectively. In addition, the average RF tended to rise as the reward value and the number of simulation iterations increased.
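The scheduling idea above can be sketched with a tabular Q-learning loop — a simplified stand-in for the deep Q-network, since the thesis's actual network, state encoding, and reward weights are not given here. The state is a (bucketed SOC, daytime) pair, the actions are charge/discharge/idle, and the reward rewards charging on PV surplus and discharging otherwise, with an assumed punishment of 2.0 for leaving an assumed 20–90% SOC window.

```python
import random

ACTIONS = ['charge', 'discharge', 'idle']

def step(soc, action, pv_surplus):
    """Toy BESS transition: 10% SOC per step (assumed). Reward is an RF
    proxy minus a punishment when SOC leaves the desired window."""
    if action == 'charge':
        soc = min(1.0, soc + 0.1)
    elif action == 'discharge':
        soc = max(0.0, soc - 0.1)
    reward = 1.0 if (action == 'charge' and pv_surplus) or \
                    (action == 'discharge' and not pv_surplus) else 0.0
    if soc < 0.2 or soc > 0.9:        # outside desired operating range
        reward -= 2.0                 # punishment weight (assumption)
    return soc, reward

def train(episodes=500, alpha=0.1, gamma=0.9, eps=0.1):
    """Tabular Q-learning over 24-hour episodes; daytime hours are
    assumed to carry a PV surplus."""
    random.seed(0)
    q = {}
    for _ in range(episodes):
        soc = 0.5
        for hour in range(24):
            pv = 6 <= hour < 18
            s = (round(soc, 1), pv)
            if random.random() < eps:                    # epsilon-greedy exploration
                a = random.choice(ACTIONS)
            else:
                a = max(ACTIONS, key=lambda x: q.get((s, x), 0.0))
            soc, r = step(soc, a, pv)
            s2 = (round(soc, 1), pv)
            best_next = max(q.get((s2, x), 0.0) for x in ACTIONS)
            q[(s, a)] = q.get((s, a), 0.0) + alpha * (r + gamma * best_next - q.get((s, a), 0.0))
    return q
```

In the deep variant the dictionary `q` is replaced by a neural network that generalizes over continuous SOC, forecast PV output, and predicted load, but the update rule and the punishment-shaped reward are the same in spirit.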
The low-punishment scenario has the advantage of higher reward and a higher average RF, but the BESS experiences operating conditions outside the desired range, because those conditions are not punished as they are in the high-punishment scenario. The high-punishment scenario, although it yields a lower average RF, tends to keep the BESS within the operating range, which strongly influences battery life.
|
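The trade-off between the two scenarios comes down to the weight of the punishment term. A minimal sketch, with assumed weights and an assumed 20–90% SOC window (the thesis's actual values are not stated here):

```python
def reward(rf_gain, soc, punishment=2.0, soc_min=0.2, soc_max=0.9):
    """Reward = renewable-fraction gain minus a punishment applied only
    when SOC leaves the desired window. All numbers are assumptions."""
    penalty = punishment if (soc < soc_min or soc > soc_max) else 0.0
    return rf_gain - penalty

# Same out-of-range operating point (SOC = 95%), two punishment weights:
low  = reward(0.8, 0.95, punishment=0.5)   # still positive: excursions can pay off
high = reward(0.8, 0.95, punishment=5.0)   # strongly negative: agent avoids them
```

With a low weight the net reward of an excursion can remain positive, so the agent tolerates SOC values outside the window and harvests a higher RF; with a high weight the penalty dominates, so the agent sacrifices some RF to stay in the battery-friendly range — exactly the behavior the abstract reports.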
format |
Theses |
author |
Herizal, Herviyandi |
spellingShingle |
Herizal, Herviyandi PENINGKATAN KINERJA MIKROGRID CERDAS DENGAN MENGOPTIMALKAN PENJADWALAN OPERASI SISTEM BATERAI PENYIMPAN ENERGI MENGGUNAKAN METODE DEEP Q-LEARNING |
author_facet |
Herizal, Herviyandi |
author_sort |
Herizal, Herviyandi |
title |
PENINGKATAN KINERJA MIKROGRID CERDAS DENGAN MENGOPTIMALKAN PENJADWALAN OPERASI SISTEM BATERAI PENYIMPAN ENERGI MENGGUNAKAN METODE DEEP Q-LEARNING |
title_short |
PENINGKATAN KINERJA MIKROGRID CERDAS DENGAN MENGOPTIMALKAN PENJADWALAN OPERASI SISTEM BATERAI PENYIMPAN ENERGI MENGGUNAKAN METODE DEEP Q-LEARNING |
title_full |
PENINGKATAN KINERJA MIKROGRID CERDAS DENGAN MENGOPTIMALKAN PENJADWALAN OPERASI SISTEM BATERAI PENYIMPAN ENERGI MENGGUNAKAN METODE DEEP Q-LEARNING |
title_fullStr |
PENINGKATAN KINERJA MIKROGRID CERDAS DENGAN MENGOPTIMALKAN PENJADWALAN OPERASI SISTEM BATERAI PENYIMPAN ENERGI MENGGUNAKAN METODE DEEP Q-LEARNING |
title_full_unstemmed |
PENINGKATAN KINERJA MIKROGRID CERDAS DENGAN MENGOPTIMALKAN PENJADWALAN OPERASI SISTEM BATERAI PENYIMPAN ENERGI MENGGUNAKAN METODE DEEP Q-LEARNING |
title_sort |
peningkatan kinerja mikrogrid cerdas dengan mengoptimalkan penjadwalan operasi sistem baterai penyimpan energi menggunakan metode deep q-learning |
url |
https://digilib.itb.ac.id/gdl/view/70401 |
_version_ |
1822278751006752768 |