QUANTIZATION IMPLEMENTATION OF INDONESIAN BERT LANGUAGE MODEL
In recent years, pre-trained models have dominated computational research across many fields, including natural language processing. One prominent pre-trained model is Bidirectional Encoder Representations from Transformers (BERT). BERT has succeeded in becoming a state-of-the-art a...
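The abstract concerns quantizing a BERT model. As an illustration only, the sketch below shows symmetric per-tensor int8 quantization of a weight matrix, one common post-training scheme; this is an assumption for demonstration, not necessarily the exact scheme used in this thesis.

```python
import numpy as np

def quantize_int8(w):
    """Symmetric per-tensor int8 quantization: map floats onto [-127, 127]."""
    scale = np.max(np.abs(w)) / 127.0
    q = np.clip(np.round(w / scale), -127, 127).astype(np.int8)
    return q, scale

def dequantize(q, scale):
    """Recover an approximate float tensor from the int8 values."""
    return q.astype(np.float32) * scale

rng = np.random.default_rng(0)
w = rng.normal(size=(4, 4)).astype(np.float32)  # stand-in for a BERT weight matrix
q, scale = quantize_int8(w)
w_hat = dequantize(q, scale)
print("max reconstruction error:", np.max(np.abs(w - w_hat)))
```

Quantizing weights this way shrinks storage roughly 4x (float32 to int8) at the cost of a bounded rounding error of at most half a quantization step per element.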
Main Author: Ayyub Abdurrahman, Muhammad
Format: Final Project
Language: Indonesian
Online Access: https://digilib.itb.ac.id/gdl/view/69111
Institution: Institut Teknologi Bandung
Similar Items
- EXTRACTIVE SUMMARIZATION WITH SENTENCE-BERT TEXT ENCODER AND REINFORCEMENT LEARNING FOR INDONESIAN LANGUAGE TEXT, by: Denaya Rahadika Diana, Kadek
- Leveraging large language models and BERT for log parsing and anomaly detection, by: Zhou, Yihan, et al. Published: (2024)
- Using CodeBERT model for vulnerability detection, by: Zhou, ZhiWei. Published: (2022)
- APPLICATION OF QUANTIZED LOW RANK ADAPTATION METHOD IN AUTOMATIC SPEECH RECOGNITION MODEL TRAINING IN INDONESIAN LANGUAGE USING WHISPER AND MOZILLA COMMON VOICE OPEN SOURCE DATASET, by: Faiq Dhiya Ul Haq, Muhammad
- PERFORMANCE IMPROVEMENT OF HATE SPEECH DETECTION FOR HATEFUL STATEMENTS USING CONTEXTUAL PREPROCESSING AND FINE-TUNING STRATEGY ON PRE-TRAINED LANGUAGE MODEL (BERT), by: Donny Ericson, Muhammad