Synthetic word embedding generation for downstream NLP task
Distributional word representations such as GloVe and BERT have garnered immense popularity and research interest in recent years due to their success in many downstream NLP applications. However, a major limitation of word embeddings is their inability to handle unknown words. To make sense...
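The out-of-vocabulary limitation noted in the abstract can be illustrated with a minimal sketch (not taken from the project itself; the words and vectors below are made up for illustration): a static embedding table, as in GloVe, maps each known word to a fixed vector, so any word absent from the vocabulary simply has no representation.

```python
# Hypothetical static embedding table: each known word maps to a fixed vector.
embeddings = {
    "king":  [0.5, 0.1, -0.3],
    "queen": [0.4, 0.2, -0.2],
}

def lookup(word):
    # A static table offers no vector for out-of-vocabulary (OOV) words.
    # A common stopgap is a shared <unk> vector, but that discards all
    # word-specific meaning; here we just return None for OOV words.
    return embeddings.get(word)

print(lookup("king"))    # known word: its stored vector
print(lookup("kingly"))  # OOV word: None -- the table cannot help
```

Synthetic embedding generation, as studied in this project, aims to produce a meaningful vector for such unseen words instead of falling back to a shared placeholder.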
Main Author: Hoang, Viet
Other Authors: Chng Eng Siong
Format: Final Year Project
Language: English
Published: Nanyang Technological University, 2021
Online Access: https://hdl.handle.net/10356/153201
Institution: Nanyang Technological University
Similar Items
- Data-driven and NLP for long document learning representation
  by: Ko, Seoyoon
  Published: (2021)
- Design and development of a NLP demo system
  by: Lynn Htet Aung
  Published: (2022)
- Dynamic knowledge graph embedding
  by: Teo, Eugene Yu-jie
  Published: (2021)
- Evolving type-2 neural fuzzy inference system with embedded deep learning in dynamic portfolio rebalancing
  by: Dinh Khoat Hoang Anh
  Published: (2021)
- Generalized AutoNLP model for name entity recognition task
  by: Wong, Yung Shen
  Published: (2022)