On-the-fly knowledge distillation model for sentence embedding
In this dissertation, we conduct an experimental study to investigate the performance of sentence embedding using an on-the-fly knowledge distillation model based on the DistillCSE framework. The model uses SimCSE as the initial teacher model. After a certain number of training steps, it caches an interm...
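The caching mechanism the abstract describes can be sketched in miniature. The code below is a hypothetical illustration, not the thesis's actual DistillCSE implementation: a toy linear "encoder" stands in for SimCSE, the task loss is a toy reconstruction objective, and all names (`TinyEncoder`, `cache_every`, `alpha`) are assumptions introduced for this sketch. The key idea it demonstrates is that every `cache_every` steps, the current student weights are snapshotted and become the new frozen teacher for the distillation term.

```python
import copy
import numpy as np

rng = np.random.default_rng(0)

class TinyEncoder:
    """Toy linear encoder standing in for a sentence-embedding model."""
    def __init__(self, dim):
        self.w = rng.normal(size=(dim, dim)) * 0.1
    def encode(self, x):
        return x @ self.w

def train(steps=30, cache_every=10, lr=0.05, alpha=0.5, dim=4):
    student = TinyEncoder(dim)
    teacher = copy.deepcopy(student)      # initial teacher (SimCSE in the thesis)
    snapshots = 0
    for step in range(1, steps + 1):
        x = rng.normal(size=(8, dim))     # a mini-batch of "sentence" features
        target = x                        # toy reconstruction target
        s_out = student.encode(x)
        t_out = teacher.encode(x)         # teacher stays frozen between snapshots
        # gradient of: MSE task loss + alpha * MSE distillation loss, w.r.t. w
        grad = x.T @ ((s_out - target) + alpha * (s_out - t_out)) / len(x)
        student.w -= lr * grad
        if step % cache_every == 0:
            # cache an intermediate checkpoint as the new on-the-fly teacher
            teacher = copy.deepcopy(student)
            snapshots += 1
    return student, snapshots

model, n_snapshots = train()
print(n_snapshots)  # 30 steps, snapshot every 10 -> 3 teacher refreshes
```

In a real DistillCSE-style setup the distillation term would compare embedding similarity distributions (e.g. with a contrastive or KL objective) rather than raw outputs with MSE; the snapshot-and-replace step is the part this sketch is meant to make concrete.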
Main Author: Zhu, Xuchun
Other Authors: Lihui Chen
Format: Thesis-Master by Coursework
Language: English
Published: Nanyang Technological University, 2024
Online Access: https://hdl.handle.net/10356/174236
Institution: Nanyang Technological University
Similar Items
- Large language model enhanced with prompt-based vanilla distillation for sentence embeddings
  by: Wang, Minghao
  Published: (2024)
- Composition distillation for semantic sentence embeddings
  by: Vaanavan, Sezhiyan
  Published: (2024)
- Solar Power Integration in Water (H2O) Distillation (SPIN-HD)
  by: Carlos, Charlize F., et al.
  Published: (2021)
- Edge-computing-based knowledge distillation and multitask learning for partial discharge recognition
  by: Ji, Jinsheng, et al.
  Published: (2024)
- Sekali, How And Lucky - Expressing Unexpectedness in Colloquial Singapore English
  by: Chen Liangcai
  Published: (2011)