Leveraging linguistic knowledge to enhance low-resource NLP applications
Natural Language Processing (NLP) empowers computers to process and analyze vast amounts of text data. The introduction of pre-trained language models (PLMs) has significantly advanced NLP by incorporating deep learning algorithms, thereby enhancing the handling of natural language understanding (NLU)…
Saved in:
Main Author: Zhu, Zixiao
Other Authors: Mao, Kezhi
Format: Thesis-Doctor of Philosophy
Language: English
Published: Nanyang Technological University, 2025
Online Access: https://hdl.handle.net/10356/182513
Institution: Nanyang Technological University
Similar Items
- Knowledge enhanced stance detection
  by: Hu, Kairui
  Published: (2024)
- Empowering natural language processing in low-resource regimes
  by: Feng, Zijian
  Published: (2025)
- Event extraction and beyond: from conventional NLP to large language models
  by: Zhou, Hanzhang
  Published: (2025)
- On the transferability of pre-trained language models for low-resource programming languages
  by: CHEN, Fuxiang, et al.
  Published: (2022)
- Model-driven smart contract generation leveraging pretrained large language models
  by: Jiang, Qinbo
  Published: (2024)