Topic-Aware Deep Compositional Models for Sentence Classification
In recent years, deep compositional models have emerged as a popular technique for representation learning of sentences in computational linguistics and natural language processing. These models normally train various forms of neural networks on top of pretrained word embeddings using a task-specific...
Saved in:
Main Authors: | Zhao, Rui; Mao, Kezhi |
---|---|
Other Authors: | School of Electrical and Electronic Engineering |
Format: | Article |
Language: | English |
Published: | 2017 |
Subjects: | Machine learning; Natural language processing |
Online Access: | https://hdl.handle.net/10356/83235 http://hdl.handle.net/10220/42502 |
Institution: | Nanyang Technological University |
id |
sg-ntu-dr.10356-83235 |
---|---|
record_format |
dspace |
spelling |
sg-ntu-dr.10356-83235 2020-03-07T13:57:27Z. Topic-Aware Deep Compositional Models for Sentence Classification. Zhao, Rui; Mao, Kezhi. School of Electrical and Electronic Engineering. Subjects: Machine learning; Natural language processing. Accepted version. 2017-05-26T07:02:21Z 2019-12-06T15:18:03Z 2017. Journal Article. Zhao, R., & Mao, K. (2017). Topic-Aware Deep Compositional Models for Sentence Classification. IEEE/ACM Transactions on Audio, Speech, and Language Processing, 25(2), 248-260. ISSN 2329-9290. https://hdl.handle.net/10356/83235 http://hdl.handle.net/10220/42502 DOI: 10.1109/TASLP.2016.2632521. en. © 2016 IEEE. Personal use of this material is permitted. Permission from IEEE must be obtained for all other uses, in any current or future media, including reprinting/republishing this material for advertising or promotional purposes, creating new collective works, for resale or redistribution to servers or lists, or reuse of any copyrighted component of this work in other works. The published version is available at: http://dx.doi.org/10.1109/TASLP.2016.2632521. 13 p. application/pdf |
institution |
Nanyang Technological University |
building |
NTU Library |
country |
Singapore |
collection |
DR-NTU |
language |
English |
topic |
Machine learning; Natural language processing |
spellingShingle |
Machine learning; Natural language processing. Zhao, Rui; Mao, Kezhi. Topic-Aware Deep Compositional Models for Sentence Classification |
description |
In recent years, deep compositional models have emerged as a popular technique for representation learning of sentences in computational linguistics and natural language processing. These models normally train various forms of neural networks on top of pretrained word embeddings using a task-specific corpus. However, most of these works neglect the multisense nature of words in the pretrained word embeddings. In this paper, we introduce topic models to enrich the word embeddings so that they capture the multiple senses of words. Integrating the topic model with various semantic composition processes leads to topic-aware convolutional neural networks and topic-aware long short-term memory networks. Unlike previous multisense word embedding models, which assign multiple independent, sense-specific embeddings to each word, our proposed models are lightweight and flexible frameworks that regard word sense as the composition of two parts: a general sense derived from a large corpus and a topic-specific sense derived from a task-specific corpus. In addition, our proposed models focus on semantic composition rather than word understanding. With the help of topic models, we can integrate the topic-specific sense at the word level before composition and at the sentence level after composition. Comprehensive experiments on five public sentence classification datasets show that our proposed topic-aware deep compositional models achieve competitive or better performance than other text representation learning methods. |
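The core idea in the abstract — a word's representation composed from a general sense (large pretrained corpus) plus a topic-specific sense (topic model on the task corpus) — can be sketched as follows. This is a minimal illustration under assumed shapes and a concatenation-based composition; the function name, dimensions, and composition choice are hypothetical, not the authors' actual implementation.

```python
import numpy as np

def topic_aware_embedding(general_vec, topic_dist, topic_embeddings):
    """Compose a general word vector with a topic-specific vector.

    general_vec      : (d,)   pretrained embedding of the word (general sense)
    topic_dist       : (k,)   the word's distribution over k topics
                              (e.g. from an LDA model on the task corpus)
    topic_embeddings : (k, d) one learned embedding per topic
    """
    # Topic-specific sense: expectation of the topic embeddings under
    # the word's topic distribution.
    topic_vec = topic_dist @ topic_embeddings
    # Compose the two senses; concatenation is one simple option.
    return np.concatenate([general_vec, topic_vec])

# Toy usage with random vectors (d = 50, k = 3).
rng = np.random.default_rng(0)
vec = topic_aware_embedding(rng.normal(size=50),
                            np.array([0.7, 0.2, 0.1]),
                            rng.normal(size=(3, 50)))
print(vec.shape)  # (100,)
```

In a full pipeline, such topic-enriched word vectors would feed a CNN or LSTM encoder, with a sentence-level topic vector optionally composed in after encoding.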
author2 |
School of Electrical and Electronic Engineering |
author_facet |
School of Electrical and Electronic Engineering; Zhao, Rui; Mao, Kezhi |
format |
Article |
author |
Zhao, Rui; Mao, Kezhi |
author_sort |
Zhao, Rui |
title |
Topic-Aware Deep Compositional Models for Sentence Classification |
title_short |
Topic-Aware Deep Compositional Models for Sentence Classification |
title_full |
Topic-Aware Deep Compositional Models for Sentence Classification |
title_fullStr |
Topic-Aware Deep Compositional Models for Sentence Classification |
title_full_unstemmed |
Topic-Aware Deep Compositional Models for Sentence Classification |
title_sort |
topic-aware deep compositional models for sentence classification |
publishDate |
2017 |
url |
https://hdl.handle.net/10356/83235 http://hdl.handle.net/10220/42502 |
_version_ |
1681040048826351616 |