Transformers acceleration on autoNLP document classification
Unsupervised pre-training, in which a large network is trained on unsupervised prediction tasks, has been widely used in Natural Language Processing; a representative example is the BERT model. BERT has achieved great success in various NLP downstream tasks, reaching state-of-the-art result...
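As a brief illustration of the unsupervised prediction task the abstract refers to: BERT's masked-language-model objective hides a fraction of the input tokens and trains the network to recover them from context. The sketch below shows only the masking step in plain Python (the `[MASK]` token and the 15% default rate follow the original BERT setup; the function itself is illustrative and not taken from this project):

```python
import random

MASK = "[MASK]"

def mask_tokens(tokens, mask_rate=0.15, seed=1):
    """Randomly replace a fraction of tokens with [MASK].

    Returns the masked sequence and a dict mapping each masked
    position to the original token the model must predict.
    """
    rng = random.Random(seed)
    masked, targets = [], {}
    for i, tok in enumerate(tokens):
        if rng.random() < mask_rate:
            masked.append(MASK)
            targets[i] = tok  # prediction target for this position
        else:
            masked.append(tok)
    return masked, targets

tokens = "the model learns language from unlabeled text".split()
masked, targets = mask_tokens(tokens)
# every recorded target corresponds to a masked position
assert all(masked[i] == MASK for i in targets)
```

During pre-training, the loss is computed only at the masked positions, which is what makes the task "unsupervised": the targets come from the raw text itself rather than from human labels.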
Saved in:
Main Author: Cao, Hannan
Other Authors: Sinno Jialin Pan
Format: Final Year Project
Language: English
Published: Nanyang Technological University, 2020
Online Access: https://hdl.handle.net/10356/138506
Institution: Nanyang Technological University
Similar Items
- Generalized AutoNLP model for name entity recognition task
  by: Wong, Yung Shen
  Published: (2022)
- Data-driven and NLP for long document learning representation
  by: Ko, Seoyoon
  Published: (2021)
- DID: Auto censorship document
  by: Nichaboon Rattanabunsakul, et al.
  Published: (2018)
- Auto-documentation for stack overflow
  by: Tan, Ri Sheng
  Published: (2017)
- Web app backend engine powered by NLP techniques for flexible document summarization
  by: Tan, Laddie Ji Cheng
  Published: (2023)