Taylor's theorem: a new perspective for neural tensor networks
Neural tensor networks have been widely used in a large number of natural language processing tasks, such as conversational sentiment analysis, named entity recognition, and knowledge base completion. However, the mathematical explanation of neural tensor networks remains a challenging problem due to the bilinear term.
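As a quick illustration of the two objects this record relates, in common notation (the paper's own definitions may differ): Taylor's theorem approximates a sufficiently differentiable function by a polynomial around a point, while a neural tensor layer extends a feedforward layer with a bilinear term.

```latex
% Taylor's theorem (informal): if f is k-times differentiable at a,
% then near a it is approximated by its kth-order Taylor polynomial.
f(x) \approx \sum_{n=0}^{k} \frac{f^{(n)}(a)}{n!}\,(x-a)^{n}

% A commonly used neural tensor layer: the bilinear term
% e_1^{\top} W^{[1:k]} e_2 is what distinguishes it from the
% feedforward part V[e_1; e_2] + b.
g(e_1, e_2) = u^{\top} \tanh\!\left(e_1^{\top} W^{[1:k]} e_2
    + V \begin{bmatrix} e_1 \\ e_2 \end{bmatrix} + b\right)
```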
Main Authors: Li, Wei; Zhu, Luyao; Cambria, Erik
Other Authors: School of Computer Science and Engineering
Format: Article
Language: English
Published: 2022
Subjects: Engineering::Computer science and engineering; Neural Tensor Networks; Natural Language Processing
Online Access: https://hdl.handle.net/10356/160695
Institution: Nanyang Technological University
Record ID: sg-ntu-dr.10356-160695
Citation: Li, W., Zhu, L. & Cambria, E. (2021). Taylor's theorem: a new perspective for neural tensor networks. Knowledge-Based Systems, 228, 107258. https://dx.doi.org/10.1016/j.knosys.2021.107258
Journal: Knowledge-Based Systems, volume 228, article 107258
ISSN: 0950-7051
DOI: 10.1016/j.knosys.2021.107258
Scopus ID: 2-s2.0-85110103450
Funding: Agency for Science, Technology and Research (A*STAR), Singapore, AME Programmatic Funding Scheme (Project #A18A2b0046)
Rights: © 2021 Elsevier B.V. All rights reserved.
Building: NTU Library
Continent: Asia
Country: Singapore
Content Provider: NTU Library
Collection: DR-NTU
Description: Neural tensor networks have been widely used in a large number of natural language processing tasks, such as conversational sentiment analysis, named entity recognition, and knowledge base completion. However, the mathematical explanation of neural tensor networks remains a challenging problem due to the bilinear term. According to Taylor's theorem, a function that is differentiable to order k can be approximated by its kth-order Taylor polynomial around a given point. From this perspective, we provide a mathematical explanation of neural tensor networks and reveal the inner link between them and feedforward neural networks. In addition, we unify two forms of neural tensor networks into a single framework and present factorization methods that make neural tensor networks parameter-efficient. Experimental results offer valuable insights into neural tensor networks.
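To make the bilinear term and the parameter-efficiency point concrete, below is a minimal NumPy sketch of a neural tensor layer and a low-rank variant. It follows the commonly used formulation rather than the specific models evaluated in the paper; the function names, shapes, and the slice factorization W_i ≈ P_i Q_i^T are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def ntn_score(e1, e2, W, V, b, u):
    """Neural tensor layer score for a pair of d-dimensional embeddings.

    W : (k, d, d) stack of bilinear slices
    V : (k, 2*d)  feedforward weights on the concatenated embeddings
    b : (k,)      bias
    u : (k,)      output weights
    """
    # Bilinear term: one scalar per slice, e1^T W_i e2.
    bilinear = np.einsum('i,kij,j->k', e1, W, e2)
    # Standard feedforward term on [e1; e2].
    linear = V @ np.concatenate([e1, e2]) + b
    return u @ np.tanh(bilinear + linear)

def ntn_score_lowrank(e1, e2, P, Q, V, b, u):
    """Same layer with each slice factorized as W_i ~= P_i @ Q_i.T (rank r),
    shrinking the bilinear parameters from k*d*d to 2*k*d*r."""
    # e1^T (P_i Q_i^T) e2 == (e1^T P_i)(Q_i^T e2), computed per slice.
    bilinear = np.einsum('kir,i,kjr,j->k', P, e1, Q, e2)
    linear = V @ np.concatenate([e1, e2]) + b
    return u @ np.tanh(bilinear + linear)

# Tiny usage example with random parameters (illustrative only).
rng = np.random.default_rng(0)
d, k, r = 8, 4, 2
e1, e2 = rng.standard_normal(d), rng.standard_normal(d)
W = rng.standard_normal((k, d, d))
V = rng.standard_normal((k, 2 * d))
b, u = rng.standard_normal(k), rng.standard_normal(k)
P, Q = rng.standard_normal((k, d, r)), rng.standard_normal((k, d, r))
print(ntn_score(e1, e2, W, V, b, u))
print(ntn_score_lowrank(e1, e2, P, Q, V, b, u))
```

The low-rank variant replaces each d-by-d slice with two d-by-r factors, which is the general flavor of parameter reduction the abstract's factorization methods refer to.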