Lightweight and efficient neural natural language processing with quaternion networks
Many state-of-the-art neural models for NLP are heavily parameterized and thus memory inefficient. This paper proposes a series of lightweight and memory-efficient neural architectures for a potpourri of natural language processing (NLP) tasks. To this end, our models exploit computation using Quaternion...
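As a rough sketch of the quaternion computation the abstract refers to (not the authors' released implementation), the NumPy snippet below applies a Hamilton-product linear transform. The function name quaternion_linear, the layer sizes, and the x ⊗ W operand ordering are illustrative assumptions; the point is that four (d/4) × (d/4) weight blocks shared across the quaternion components replace a single d × d matrix, using a quarter of the parameters.

```python
import numpy as np

def quaternion_linear(x, W_r, W_i, W_j, W_k):
    """Illustrative Hamilton-product linear transform.

    The input x is split into four quaternion components (r, i, j, k) of size
    d_in/4 each; the four real-valued weight blocks, each of shape
    (d_in/4, d_out/4), are reused across all components instead of a full
    (d_in, d_out) matrix.
    """
    r, i, j, k = np.split(x, 4, axis=-1)
    # Hamilton product of the quaternion input with the quaternion weight,
    # written out component by component.
    out_r = r @ W_r - i @ W_i - j @ W_j - k @ W_k
    out_i = r @ W_i + i @ W_r + j @ W_k - k @ W_j
    out_j = r @ W_j - i @ W_k + j @ W_r + k @ W_i
    out_k = r @ W_k + i @ W_j - j @ W_i + k @ W_r
    return np.concatenate([out_r, out_i, out_j, out_k], axis=-1)

# Hypothetical sizes: a 512 -> 512 quaternion layer stores 4 blocks of
# 128 x 128 weights (65,536 parameters) versus 512 x 512 (262,144).
d_in, d_out = 512, 512
blocks = [0.01 * np.random.randn(d_in // 4, d_out // 4) for _ in range(4)]
x = np.random.randn(2, d_in)
print(quaternion_linear(x, *blocks).shape)  # (2, 512)
```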
Main Authors: TAY, Yi; ZHANG, Aston; LUU, Anh Tuan; RAO, Jinfeng; ZHANG, Shuai; WANG, Shuohang; FU, Jie; HUI, Siu Cheung
Format: text
Language: English
Published: Institutional Knowledge at Singapore Management University, 2019
Online Access: https://ink.library.smu.edu.sg/scis_studentpub/2 https://ink.library.smu.edu.sg/context/scis_studentpub/article/1002/viewcontent/P19_1145.pdf
Institution: Singapore Management University
Similar Items
- Simple and effective curriculum pointer-generator networks for reading comprehension over long narratives
  by: TAY, Yi, et al.
  Published: (2019)
- Understanding the Genetic Makeup of Linux Device Drivers
  by: Tschudin, Peter Senna, et al.
  Published: (2013)
- Scaling human activity recognition via deep learning-based domain adaptation
  by: KHAN, Md Abdullah Hafiz, et al.
  Published: (2018)
- A Prolog-based definition of an entity-relationship language
  by: CHAN, H., et al.
  Published: (1993)
- Revisiting masked auto-encoders for ECG-language representation learning
  by: PHAM, Hung Manh, et al.
  Published: (2024)