Knowledge-aware representation learning for natural language processing applications

Natural Language Processing (NLP) is a vital subfield of artificial intelligence that enables computers to interpret and understand human language. NLP applications now pervade daily life, spanning sentiment analysis, text classification, and dialogue systems. Success in these applications rests on text representation: robust representations capture critical, discriminative, and meaningful information from raw text. Deep learning models have transformed NLP, outperforming conventional rule-based systems; they typically comprise a sequence of stages including raw-data preprocessing, feature extraction, and classification. Challenges nonetheless persist: overfitting, stemming largely from the limited diversity and formatting of raw data, and the limited interpretability of data-driven approaches, which hinders real-world deployment. A promising remedy is the infusion of external knowledge into NLP models. This thesis explores knowledge-enriched solutions across three NLP applications. Sentiment analysis: external knowledge from sentiment-related lexicons such as WordNet is incorporated into conventional deep learning models via Siamese networks, enabling a finer-grained treatment of sentiment. Text classification: a novel multi-scaled topic embedding method merges external knowledge sources with deep neural networks, yielding a significant boost in classification accuracy through domain-specific insights. Answer selection: four network architectures leveraging topic embeddings produce superior text representations that improve answer selection. By tailoring knowledge-enriched representation learning to the characteristics of each application, this thesis shows empirically that such approaches improve upon baseline systems, paving the way for future research in the field.

Bibliographic Details
Main Author: Zhang, Jiaheng
Other Authors: Mao Kezhi
Format: Thesis-Doctor of Philosophy
Language: English
Published: Nanyang Technological University 2023
Subjects:
Online Access:https://hdl.handle.net/10356/171068
id sg-ntu-dr.10356-171068
record_format dspace
spelling sg-ntu-dr.10356-171068 2023-11-02T02:20:48Z
school School of Electrical and Electronic Engineering
supervisor Mao Kezhi (EKZMao@ntu.edu.sg)
subject Engineering::Electrical and electronic engineering
degree Doctor of Philosophy
date_available 2023-10-11T06:18:22Z
date_issued 2023
citation Zhang, J. (2023). Knowledge-aware representation learning for natural language processing applications. Doctoral thesis, Nanyang Technological University, Singapore. https://hdl.handle.net/10356/171068
doi 10.32657/10356/171068
rights This work is licensed under a Creative Commons Attribution-NonCommercial 4.0 International License (CC BY-NC 4.0).
format application/pdf
publisher Nanyang Technological University
institution Nanyang Technological University
building NTU Library
continent Asia
country Singapore
content_provider NTU Library
collection DR-NTU
language English
topic Engineering::Electrical and electronic engineering
description Natural Language Processing (NLP) is a vital subfield of artificial intelligence that enables computers to interpret and understand human language. NLP applications now pervade daily life, spanning sentiment analysis, text classification, and dialogue systems. Success in these applications rests on text representation: robust representations capture critical, discriminative, and meaningful information from raw text. Deep learning models have transformed NLP, outperforming conventional rule-based systems; they typically comprise a sequence of stages including raw-data preprocessing, feature extraction, and classification. Challenges nonetheless persist: overfitting, stemming largely from the limited diversity and formatting of raw data, and the limited interpretability of data-driven approaches, which hinders real-world deployment. A promising remedy is the infusion of external knowledge into NLP models. This thesis explores knowledge-enriched solutions across three NLP applications. Sentiment analysis: external knowledge from sentiment-related lexicons such as WordNet is incorporated into conventional deep learning models via Siamese networks, enabling a finer-grained treatment of sentiment. Text classification: a novel multi-scaled topic embedding method merges external knowledge sources with deep neural networks, yielding a significant boost in classification accuracy through domain-specific insights. Answer selection: four network architectures leveraging topic embeddings produce superior text representations that improve answer selection. By tailoring knowledge-enriched representation learning to the characteristics of each application, this thesis shows empirically that such approaches improve upon baseline systems, paving the way for future research in the field.
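The knowledge-infusion idea summarized in the description — fusing signals from an external resource (a lexicon, a topic model) with a learned text representation — can be illustrated with a deliberately simple sketch. The snippet below is not the thesis's architecture: the toy lexicon, vocabulary, and concatenation-based fusion are invented for illustration, standing in for the Siamese-network and topic-embedding machinery the abstract describes.

```python
import numpy as np

# Toy external knowledge: a tiny sentiment lexicon (invented for illustration).
LEXICON = {"good": 1.0, "great": 1.0, "bad": -1.0, "awful": -1.0}

# Toy vocabulary for a bag-of-words text representation.
VOCAB = ["the", "movie", "was", "good", "bad", "great", "awful"]

def text_features(tokens):
    """Data-driven view: bag-of-words counts over VOCAB."""
    vec = np.zeros(len(VOCAB))
    for t in tokens:
        if t in VOCAB:
            vec[VOCAB.index(t)] += 1.0
    return vec

def knowledge_features(tokens):
    """Knowledge-driven view: mean lexicon polarity plus lexicon coverage."""
    scores = [LEXICON[t] for t in tokens if t in LEXICON]
    mean = sum(scores) / len(scores) if scores else 0.0
    return np.array([mean, float(len(scores))])

def knowledge_aware_representation(sentence):
    """Fuse the two views by concatenation (one simple fusion scheme)."""
    tokens = sentence.lower().split()
    return np.concatenate([text_features(tokens), knowledge_features(tokens)])

rep = knowledge_aware_representation("The movie was good")
print(rep.shape)  # (9,) — 7 bag-of-words dims + 2 knowledge dims
```

The fused vector can then feed any downstream classifier; richer schemes (gating, attention, Siamese similarity over knowledge-augmented views) replace plain concatenation in practice.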
author2 Mao Kezhi
format Thesis-Doctor of Philosophy
author Zhang, Jiaheng
title Knowledge-aware representation learning for natural language processing applications
publisher Nanyang Technological University
publishDate 2023
url https://hdl.handle.net/10356/171068
_version_ 1781793731555360768