Granular syntax processing with multi-task and curriculum learning

Syntactic processing techniques are the foundation of natural language processing (NLP), supporting many downstream NLP tasks. In this paper, we conduct pair-wise multi-task learning (MTL) on syntactic tasks of different granularity, namely Sentence Boundary Detection (SBD), text chunking, and Part-of-Speech (PoS) tagging, so as to investigate the extent to which they complement each other. We propose a novel soft parameter-sharing mechanism to share local and global dependency information learned from both target tasks. We also propose a curriculum learning (CL) mechanism to improve MTL with non-parallel labeled data. Using non-parallel labeled data in MTL is common practice, yet it has not received much attention before; for example, our PoS tagging data do not have text chunking labels. When learning PoS tagging and text chunking together, the proposed CL mechanism selects complementary samples from the two tasks to update the parameters of the MTL model in the same training batch, yielding better performance and learning stability. We conclude that fine-grained tasks can provide complementary features to coarse-grained ones, while the most coarse-grained task, SBD, provides useful information for the most fine-grained one, PoS tagging. Additionally, the text chunking task achieves state-of-the-art performance when jointly learned with PoS tagging. Our analytical experiments also show the effectiveness of the proposed soft parameter-sharing and CL mechanisms.
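
As a rough, hypothetical illustration of the pair-wise MTL setup described in the abstract, the sketch below trains a shared encoder with two task-specific heads on non-parallel PoS-tagging and text-chunking batches, summing the two task losses in one update. The BiLSTM encoder, the layer and tag-set sizes, and the hard-shared encoder with a plain summed loss are assumptions made for this sketch only; they stand in for, and do not reproduce, the paper's soft parameter-sharing and curriculum learning mechanisms.

```python
# Hypothetical sketch only: pair-wise multi-task learning over two
# non-parallel sequence-labeling corpora (PoS tagging and text chunking).
# Architecture and hyperparameters are illustrative assumptions, not the
# model proposed in the paper.
import torch
import torch.nn as nn


class PairwiseMTLTagger(nn.Module):
    def __init__(self, vocab_size=10000, emb_dim=128, hidden=256,
                 n_pos_tags=45, n_chunk_tags=23):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, emb_dim, padding_idx=0)
        # Shared BiLSTM encoder: both tasks read and update these weights
        # (a hard-sharing stand-in for the paper's soft parameter sharing).
        self.encoder = nn.LSTM(emb_dim, hidden, batch_first=True,
                               bidirectional=True)
        # Task-specific classification heads over per-token states.
        self.pos_head = nn.Linear(2 * hidden, n_pos_tags)
        self.chunk_head = nn.Linear(2 * hidden, n_chunk_tags)

    def forward(self, tokens, task):
        states, _ = self.encoder(self.embed(tokens))
        head = self.pos_head if task == "pos" else self.chunk_head
        return head(states)  # (batch, seq_len, n_tags)


def joint_step(model, optimizer, pos_batch, chunk_batch, loss_fn):
    """One update using non-parallel batches from both tasks.

    Each batch is (tokens, labels); the two batches come from different
    corpora, so no sentence needs labels for both tasks.
    """
    optimizer.zero_grad()
    pos_logits = model(pos_batch[0], task="pos")
    chunk_logits = model(chunk_batch[0], task="chunk")
    loss = (loss_fn(pos_logits.flatten(0, 1), pos_batch[1].flatten())
            + loss_fn(chunk_logits.flatten(0, 1), chunk_batch[1].flatten()))
    loss.backward()
    optimizer.step()
    return loss.item()


if __name__ == "__main__":
    # Dummy batches in place of real non-parallel corpora.
    model = PairwiseMTLTagger()
    opt = torch.optim.Adam(model.parameters(), lr=1e-3)
    loss_fn = nn.CrossEntropyLoss()
    pos_batch = (torch.randint(1, 10000, (8, 20)), torch.randint(0, 45, (8, 20)))
    chunk_batch = (torch.randint(1, 10000, (8, 20)), torch.randint(0, 23, (8, 20)))
    print(joint_step(model, opt, pos_batch, chunk_batch, loss_fn))
```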

Bibliographic Details
Main Authors: Zhang, Xulang; Mao, Rui; Cambria, Erik
Other Authors: College of Computing and Data Science
Format: Journal Article
Language: English
Published: 2024
Subjects: Computer and Information Science; Text chunking; Curriculum learning
Online Access:https://hdl.handle.net/10356/180666
Institution: Nanyang Technological University
DOI: 10.1007/s12559-024-10320-1
ISSN: 1866-9956
Citation: Zhang, X., Mao, R. & Cambria, E. (2024). Granular syntax processing with multi-task and curriculum learning. Cognitive Computation. https://dx.doi.org/10.1007/s12559-024-10320-1
Rights: © 2024 The Author(s), under exclusive licence to Springer Science+Business Media, LLC, part of Springer Nature. All rights reserved.
Collection: DR-NTU, NTU Library, Nanyang Technological University, Singapore