Lightweight and efficient neural natural language processing with quaternion networks
Many state-of-the-art neural models for NLP are heavily parameterized and thus memory inefficient. This paper proposes a series of lightweight and memory efficient neural architectures for a potpourri of natural language processing (NLP) tasks. To this end, our models exploit computation using Quaternion...
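The abstract refers to computation with quaternion algebra as the source of the parameter savings. As a rough illustration only, not taken from this record or from the authors' released code, the sketch below shows how a quaternion feed-forward layer built on the Hamilton product shares weights across the four quaternion components and therefore stores about a quarter of the parameters of a real-valued layer of the same width. The function name `quaternion_linear`, the operand ordering (input times weight), and all dimensions are illustrative assumptions.

```python
# Minimal NumPy sketch of a quaternion feed-forward layer based on the
# Hamilton product. Illustrative only; conventions in the paper may differ.
import numpy as np

def quaternion_linear(x, W_r, W_i, W_j, W_k):
    """Quaternion linear transform of a batch of inputs.

    x: (batch, 4 * d_in) real matrix, read as concatenated quaternion
       components [r | i | j | k], each of width d_in.
    W_*: four (d_in, d_out) real matrices, the components of the
         quaternion weight. Only 4 * d_in * d_out parameters are stored,
         versus (4 * d_in) * (4 * d_out) for a real-valued layer of the
         same input/output width, i.e. roughly a four-fold reduction.
    """
    r, i, j, k = np.split(x, 4, axis=-1)
    # Hamilton product of the quaternion input with the quaternion weight.
    out_r = r @ W_r - i @ W_i - j @ W_j - k @ W_k
    out_i = r @ W_i + i @ W_r + j @ W_k - k @ W_j
    out_j = r @ W_j - i @ W_k + j @ W_r + k @ W_i
    out_k = r @ W_k + i @ W_j - j @ W_i + k @ W_r
    return np.concatenate([out_r, out_i, out_j, out_k], axis=-1)

# Toy usage: a 32-dim input mapped to a 32-dim output with
# 4 * 8 * 8 = 256 stored parameters instead of 32 * 32 = 1024.
rng = np.random.default_rng(0)
x = rng.standard_normal((2, 32))
W_r, W_i, W_j, W_k = (rng.standard_normal((8, 8)) * 0.1 for _ in range(4))
y = quaternion_linear(x, W_r, W_i, W_j, W_k)
print(y.shape)  # (2, 32)
```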
Main Authors: TAY, Yi; ZHANG, Aston; LUU, Anh Tuan; RAO, Jinfeng; ZHANG, Shuai; WANG, Shuohang; FU, Jie; HUI, Siu Cheung
Format: text
Language: English
Published: Institutional Knowledge at Singapore Management University, 2019
Online Access: https://ink.library.smu.edu.sg/scis_studentpub/2 https://ink.library.smu.edu.sg/context/scis_studentpub/article/1002/viewcontent/P19_1145.pdf
Similar Items
- Simple and effective curriculum pointer-generator networks for reading comprehension over long narratives
  by: TAY, Yi, et al.
  Published: (2019)
- Understanding the Genetic Makeup of Linux Device Drivers
  by: Tschudin, Peter Senna, et al.
  Published: (2013)
- Scaling human activity recognition via deep learning-based domain adaptation
  by: KHAN, Md Abdullah Hafiz, et al.
  Published: (2018)
- A Prolog-based definition of an entity-relationship language
  by: CHAN, H., et al.
  Published: (1993)
- Revisiting masked auto-encoders for ECG-language representation learning
  by: PHAM, Hung Manh, et al.
  Published: (2024)