NumGPT: Improving numeracy ability of generative pre-trained models
Existing generative pre-trained language models (e.g., GPT) focus on modeling the language structure and semantics of general texts. However, those models do not consider the numerical properties of numbers and cannot perform robustly on numerical reasoning tasks (e.g., math word problems and measur...
Main Authors: JIN, Zhihua; JIANG, Xin; WANG, Xiangbo; LIU, Qun; WANG, Yong; REN, Xiaozhe; QU, Huamin
Format: text
Language: English
Published: Institutional Knowledge at Singapore Management University, 2023
Online Access: https://ink.library.smu.edu.sg/sis_research/8599
https://ink.library.smu.edu.sg/context/sis_research/article/9602/viewcontent/2109.03137.pdf
Institution: Singapore Management University
Similar Items
- RecipeGPT: Generative pre-training based cooking recipe generation and evaluation system
  by: LEE, Helena Huey Chong, et al.
  Published: (2020)
- ChatGPT's impact
  by: Lim, Donald Patrick L.
  Published: (2023)
- Numeracy in children's nursing
  by: Parker, Arija
  Published: (2018)
- Plagiarism in the age of massive Generative Pre-trained Transformers (GPT-3)
  by: Nassim Dehouche
  Published: (2022)
- Re-evaluating natural intelligence in the face of ChatGPT
  by: LIM, Elvin T., et al.
  Published: (2023)