DEVELOPMENT AND IMPLEMENTATION OF COMMUNICATION MODIFICATION ON ADAM OPTIMIZER FOR DISTRIBUTED DEEP LEARNING

Bibliographic Details
Main Author: Febryan Suryawan, Fransiskus
Format: Final Project
Language: Indonesian
Online Access: https://digilib.itb.ac.id/gdl/view/74150
Abstract:
Deep learning is a technique used in many domains to solve various problems. One popular example at the time of writing is ChatGPT, which uses the GPT model made by OpenAI. A deep learning model consists of several layers, each with its own parameters. More complex models are developed by increasing the number of learned parameters. Increasing the parameter count causes communication overhead when training in a distributed architecture. Therefore, this final project studies existing techniques that reduce communication overhead through fewer communication rounds and through compression. It reviews CADA (T. Chen et al., 2021) and Efficient-Adam (C. Chen et al., 2022), then combines ideas from both, resulting in a technique that needs as little as 0.97 times the communication rounds of CADA while using only 0.29 times CADA's communication size.
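
The abstract gives no implementation details, but the two ideas it combines can be illustrated with a short Python sketch: a worker that skips an upload when its update is small (a simplified form of CADA's lazy-communication rule) and that compresses what it does send with top-k sparsification plus error feedback (in the spirit of Efficient-Adam), feeding a server-side Adam step. Everything below is an illustrative assumption rather than the thesis's actual algorithm; the skip threshold, the choice of top-k as the compressor, and names such as LazyCompressedWorker are hypothetical.

    import numpy as np

    def topk_compress(vec, k):
        """Keep the k largest-magnitude entries of vec; zero out the rest."""
        out = np.zeros_like(vec)
        idx = np.argsort(np.abs(vec))[-k:]
        out[idx] = vec[idx]
        return out

    class LazyCompressedWorker:
        def __init__(self, dim, k, skip_threshold):
            self.residual = np.zeros(dim)   # error-feedback buffer
            self.k = k
            self.skip_threshold = skip_threshold

        def maybe_send(self, grad):
            """Return a compressed gradient to upload, or None to skip the round."""
            corrected = grad + self.residual
            # Simplified CADA-style rule: skip communication when the
            # accumulated update is too small to matter.
            if np.linalg.norm(corrected) < self.skip_threshold:
                self.residual = corrected           # carry everything forward
                return None
            msg = topk_compress(corrected, self.k)  # compress before sending
            self.residual = corrected - msg         # keep what was not sent
            return msg

    def adam_step(params, m, v, grad, t, lr=1e-3, b1=0.9, b2=0.999, eps=1e-8):
        """Server-side Adam update on the (possibly sparse) received gradient."""
        m = b1 * m + (1 - b1) * grad
        v = b2 * v + (1 - b2) * grad**2
        m_hat = m / (1 - b1**t)                     # bias correction
        v_hat = v / (1 - b2**t)
        return params - lr * m_hat / (np.sqrt(v_hat) + eps), m, v

    # Toy usage on f(x) = ||x||^2 / 2, whose gradient is simply x.
    rng = np.random.default_rng(0)
    params = rng.normal(size=10)
    m, v = np.zeros(10), np.zeros(10)
    worker = LazyCompressedWorker(dim=10, k=3, skip_threshold=1e-3)
    t = 0
    for step in range(200):
        msg = worker.maybe_send(params)  # gradient of the toy objective
        if msg is None:
            continue                     # round skipped: nothing uploaded
        t += 1
        params, m, v = adam_step(params, m, v, msg, t)
    print("final loss:", 0.5 * np.sum(params**2))

The error-feedback buffer is what lets compression and round skipping coexist: whatever is not transmitted in a round is carried over to the next round instead of being lost, which is the standard way compressed distributed optimizers avoid accumulating bias.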