Log Anomaly Detection Using Adaptive Universal Transformer
Many problems involve parsing the logs produced by a software system. One of the most common cases is log anomaly detection. This thesis presents one approach suited to that case: DeepLog, a simple method with considerably good accuracy in log anomaly detection.
Saved in:
Main Author: | |
---|---|
Format: | Final Project |
Language: | Indonesia |
Online Access: | https://digilib.itb.ac.id/gdl/view/39635 |
Tags: |
|
Institution: | Institut Teknologi Bandung |
Language: | Indonesia |
id |
id-itb.:39635 |
---|---|
spelling |
id-itb.:396352019-06-27T11:46:25ZLog Anomaly Detection Using Adaptive Universal Transformer Ryan Wibisono, Sergio Indonesia Final Project LSTM, adaptive universal transformer, DeepLog, Spell, anomaly INSTITUT TEKNOLOGI BANDUNG https://digilib.itb.ac.id/gdl/view/39635 Many problems involve parsing the logs produced by a software system. One of the most common cases is log anomaly detection. This thesis presents one approach suited to that case: DeepLog, a simple method with considerably good accuracy in log anomaly detection. In DeepLog, the log stream is broken down into log sequences of a fixed length. These sequences are fed in streaming fashion into a log parser called Spell, which in turn feeds each parsed sequence into a deep learning model that predicts the next log entry following the sequence. DeepLog uses an LSTM model. In this thesis, the LSTM is replaced with an adaptive universal transformer, since LSTMs commonly struggle to process long sequences of logs. The adaptive universal transformer addresses this issue by replacing the LSTM's sequential processing with parallel computation based on self-attention, implemented with multi-head self-attention layers. This thesis investigates how the adaptive universal transformer can be applied to this case despite its original design for text problems such as language translation. The model has achieved state-of-the-art accuracy on language translation problems, although it is quite complex. To evaluate the model, experiments are conducted with varying configurable parameters. These experiments show that, with the right hyper-parameters, the model achieves sufficiently high accuracy for log anomaly detection across various log datasets. text |
institution |
Institut Teknologi Bandung |
building |
Institut Teknologi Bandung Library |
continent |
Asia |
country |
Indonesia |
content_provider |
Institut Teknologi Bandung |
collection |
Digital ITB |
language |
Indonesia |
description |
Many problems involve parsing the logs produced by a software system. One of the most common cases is log anomaly detection. This thesis presents one approach suited to that case: DeepLog, a simple method with considerably good accuracy in log anomaly detection. In DeepLog, the log stream is broken down into log sequences of a fixed length. These sequences are fed in streaming fashion into a log parser called Spell, which in turn feeds each parsed sequence into a deep learning model that predicts the next log entry following the sequence.
DeepLog uses an LSTM model. In this thesis, the LSTM is replaced with an adaptive universal transformer, since LSTMs commonly struggle to process long sequences of logs. The adaptive universal transformer addresses this issue by replacing the LSTM's sequential processing with parallel computation based on self-attention, implemented with multi-head self-attention layers.
This thesis investigates how the adaptive universal transformer can be applied to this case despite its original design for text problems such as language translation. The model has achieved state-of-the-art accuracy on language translation problems, although it is quite complex.
To evaluate the model, experiments are conducted with varying configurable parameters. These experiments show that, with the right hyper-parameters, the model achieves sufficiently high accuracy for log anomaly detection across various log datasets. |
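The next-log-key prediction scheme described in the abstract can be illustrated with a minimal sketch. Everything below is hypothetical and not from the thesis: a simple bigram frequency table stands in for the trained sequence model (the LSTM or adaptive universal transformer in the thesis), the history window is shortened to a single key for brevity (DeepLog uses a longer window), and Spell's parsing step is assumed to have already mapped raw log lines to log keys. An event is flagged as anomalous when the observed key is not among the model's top-k candidates for the next key.

```python
from collections import Counter, defaultdict


class NextKeyModel:
    """Toy stand-in for a DeepLog-style next-log-key predictor."""

    def __init__(self, k=2):
        self.k = k  # number of top candidates considered "normal"
        self.counts = defaultdict(Counter)  # prev key -> Counter of next keys

    def fit(self, sequences):
        # Count which log key follows each key in the training sequences.
        for seq in sequences:
            for i in range(len(seq) - 1):
                self.counts[seq[i]][seq[i + 1]] += 1

    def is_anomaly(self, prev_key, next_key):
        # Anomalous if next_key is not among the top-k predicted candidates.
        top = [key for key, _ in self.counts[prev_key].most_common(self.k)]
        return next_key not in top


model = NextKeyModel(k=1)
model.fit([["open", "read", "close"], ["open", "read", "close"]])
print(model.is_anomaly("open", "read"))    # seen transition -> False
print(model.is_anomaly("read", "delete"))  # unseen transition -> True
```

In the thesis, the frequency table is replaced by a learned model that outputs a probability distribution over log keys, but the top-k anomaly decision works the same way.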
format |
Final Project |
author |
Ryan Wibisono, Sergio |
spellingShingle |
Ryan Wibisono, Sergio Log Anomaly Detection Using Adaptive Universal Transformer |
author_facet |
Ryan Wibisono, Sergio |
author_sort |
Ryan Wibisono, Sergio |
title |
Log Anomaly Detection Using Adaptive Universal Transformer |
title_short |
Log Anomaly Detection Using Adaptive Universal Transformer |
title_full |
Log Anomaly Detection Using Adaptive Universal Transformer |
title_fullStr |
Log Anomaly Detection Using Adaptive Universal Transformer |
title_full_unstemmed |
Log Anomaly Detection Using Adaptive Universal Transformer |
title_sort |
log anomaly detection using adaptive universal transformer |
url |
https://digilib.itb.ac.id/gdl/view/39635 |
_version_ |
1822269314631204864 |