PREDICTION OF USER MOBILITY FOR HANDOVERS IN 5G NETWORKS IN JAKARTA USING THE LONG SHORT-TERM MEMORY METHOD
Main Author: | Timoteo, Adriel |
---|---|
Format: | Final Project |
Language: | Indonesian |
Keywords: | multi-step-ahead prediction, long short-term memory, handover, user movement prediction |
Online Access: | https://digilib.itb.ac.id/gdl/view/85100 |
Institution: | Institut Teknologi Bandung |

id | id-itb.:85100
institution | Institut Teknologi Bandung
building | Institut Teknologi Bandung Library
continent | Asia
country | Indonesia
content_provider | Institut Teknologi Bandung
collection | Digital ITB
language | Indonesian
description
Since the initial launch of 5G networks in 2018, the technology has rapidly
expanded and now covers about 40% of the world's population. 5G offers higher
bandwidth, faster connectivity, and lower latency compared to 4G. To achieve this
high bandwidth, 5G operates at higher frequencies, which, although they carry
more data, have smaller coverage areas and are easily obstructed. This results in
5G users experiencing more handovers compared to 4G, making handover
performance crucial in a 5G network. One way to improve handover performance
and efficiency is by predicting user movement using algorithms such as Markov
Chain, Hidden Markov Model, and machine learning methods like Support Vector
Machine (SVM), XGBoost, Deep Neural Network (DNN), and Long Short-Term
Memory (LSTM).
Research on LSTM networks has consistently shown that they are well suited to
predicting time-series data, which makes the LSTM a promising method for
predicting user movement for handover purposes. The goal of this
research is to create an LSTM model for multi-step-ahead prediction, which will
take 60 seconds of historical data and predict 10 seconds ahead (x = 60, y = 10).
This model aims to achieve an average distance between actual and predicted data
of less than 20 meters and latency of less than 100 ms.
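A minimal sketch of this windowing and of a single-layer Keras LSTM is given below; the helper name, the 16-unit layer size, and the 0.001 learning rate (read off the model_16_lr001_v2 naming) are assumptions for illustration, not the thesis's actual code.

```python
import numpy as np
import tensorflow as tf

def make_windows(track, x_steps=60, y_steps=10):
    """Slice a (time, 2) array of per-second (lat, lon) samples into
    60-step input windows and 10-step target windows (x = 60, y = 10)."""
    xs, ys = [], []
    for i in range(len(track) - x_steps - y_steps + 1):
        xs.append(track[i : i + x_steps])
        ys.append(track[i + x_steps : i + x_steps + y_steps])
    return np.asarray(xs, np.float32), np.asarray(ys, np.float32)

# Single-layer LSTM mapping 60 s of history to the next 10 positions
# (the 16-unit variant; the other variants change only the layer sizes).
model = tf.keras.Sequential([
    tf.keras.layers.LSTM(16, input_shape=(60, 2)),  # 60 s of (lat, lon) history
    tf.keras.layers.Dense(10 * 2),                  # 10 future positions, flattened
    tf.keras.layers.Reshape((10, 2)),               # -> 10 s of (lat, lon) ahead
])
model.compile(optimizer=tf.keras.optimizers.Adam(learning_rate=1e-3), loss="mse")
```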
To develop this model, subsystems such as Plotly and TensorBoard will be used to
display the map results and model training metrics. Meanwhile, the dataset used
for model training is the Grab-Posisi dataset, and the machine learning framework
used is TensorFlow, both of which are part of the prediction subsystem. This
combination of subsystems is chosen for its simplicity in implementation and
relatively high efficiency.
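As an illustration of how these subsystems fit together, the sketch below logs training metrics with a standard TensorBoard callback and overlays an actual and a predicted trajectory with Plotly's Scattermapbox; the log directory, function name, and array shapes are assumptions rather than the thesis's implementation.

```python
import plotly.graph_objects as go
import tensorflow as tf

# TensorBoard callback: per-epoch metrics are written to disk and inspected
# with `tensorboard --logdir logs`.
tensorboard_cb = tf.keras.callbacks.TensorBoard(log_dir="logs/model_16")
# model.fit(x_train, y_train, epochs=1000, callbacks=[tensorboard_cb])

def plot_trajectories(actual, predicted):
    """Overlay actual and predicted trajectories on an OpenStreetMap basemap.
    Both arguments are assumed to be NumPy arrays of shape (T, 2) = (lat, lon)."""
    fig = go.Figure()
    fig.add_trace(go.Scattermapbox(lat=actual[:, 0], lon=actual[:, 1],
                                   mode="lines+markers", name="actual"))
    fig.add_trace(go.Scattermapbox(lat=predicted[:, 0], lon=predicted[:, 1],
                                   mode="lines+markers", name="predicted"))
    fig.update_layout(mapbox_style="open-street-map",
                      mapbox_center={"lat": float(actual[0, 0]),
                                     "lon": float(actual[0, 1])},
                      mapbox_zoom=13)
    fig.show()
```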
Six model variations are built and trained: models with 16 units in 1 layer, 32 units
in 1 layer, 64 units in 1 layer, 128 units in 1 layer, 32 units in 2 layers (32 + 32),
and 64 units in 2 layers (64 + 64). All variations are trained for 1000 epochs, but
there is also a variation with 16 units trained for 3106 epochs. Metrics such as
MAE, MSE, RMSE, and average difference in meters (avg_m_diff) of each model
are measured to find the best-performing model. Additionally, the model outputs are
plotted on a map so that the predicted user trajectory can be compared with the
actual user trajectory. The tests revealed that the model with 16 units trained for
3106 epochs (model_16_lr001_v2) is the most accurate, achieving an average prediction
error (avg_m_diff) of 16.99 meters with an inference latency of 31 ms.
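The abstract does not spell out how avg_m_diff is computed; a common realisation, assumed here, is the mean haversine (great-circle) distance in meters between actual and predicted coordinates:

```python
import numpy as np

def avg_m_diff(y_true, y_pred, earth_radius_m=6_371_000.0):
    """Mean great-circle (haversine) distance in meters between actual and
    predicted (lat, lon) points, e.g. arrays of shape (n, 10, 2)."""
    lat1, lon1 = np.radians(y_true[..., 0]), np.radians(y_true[..., 1])
    lat2, lon2 = np.radians(y_pred[..., 0]), np.radians(y_pred[..., 1])
    dlat, dlon = lat2 - lat1, lon2 - lon1
    a = np.sin(dlat / 2) ** 2 + np.cos(lat1) * np.cos(lat2) * np.sin(dlon / 2) ** 2
    return float(np.mean(2 * earth_radius_m * np.arcsin(np.sqrt(a))))
```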
This research demonstrates that the LSTM method can accurately and quickly
predict user movement for handover management in a 5G network. In the future,
the accuracy of the model can be improved by increasing the training epochs for
each model variation. Models with 32 and 64 neurons are believed to have the same
or greater potential compared to the model with 16 neurons if trained for more
epochs.
format | Final Project
author | Timoteo, Adriel
title | PREDICTION OF USER MOBILITY FOR HANDOVERS IN 5G NETWORKS IN JAKARTA USING THE LONG SHORT-TERM MEMORY METHOD
url | https://digilib.itb.ac.id/gdl/view/85100
_version_ | 1822010606885011456