Natural language processing as autoregressive generation
Advances in deep learning have led to great achievements in many Natural Language Processing (NLP) tasks. Given the sequential nature of language, most NLP tasks, such as text generation, can be framed within the sequence learning framework. As one of the most important foundations for m...
Saved in:
Main Author: Lin, Xiang
Other Authors: Joty, Shafiq Rayhan
Format: Thesis - Doctor of Philosophy
Language: English
Published: Nanyang Technological University, 2023
Online Access: https://hdl.handle.net/10356/168487
Institution: Nanyang Technological University
Similar Items
- Adaptation of language models via text augmentation
  by: Prachaseree, Chaiyasait
  Published: (2023)
- Structured pointing networks for natural language understanding
  by: Nguyen, Thanh Tung
  Published: (2021)
- Semantic representation learning for natural language understanding
  by: Zhang, Yong
  Published: (2018)
- Natural language translation with graph convolutional neural network
  by: Zhu, Yimin
  Published: (2018)
- Feature vector generation tool for sentiment classification of product reviews using SVM
  by: Chan, Saw Nyein Aung
  Published: (2008)