Granular syntax processing with multi-task and curriculum learning

Bibliographic Details
Main Authors: Zhang, Xulang; Mao, Rui; Cambria, Erik
Other Authors: College of Computing and Data Science
Format: Article
Language: English
Published: 2024
Online Access: https://hdl.handle.net/10356/180666
Institution: Nanyang Technological University
Description
Summary: Syntactic processing techniques are the foundation of natural language processing (NLP), supporting many downstream NLP tasks. In this paper, we conduct pair-wise multi-task learning (MTL) on syntactic tasks of different granularity, namely Sentence Boundary Detection (SBD), text chunking, and Part-of-Speech (PoS) tagging, to investigate the extent to which they complement each other. We propose a novel soft parameter-sharing mechanism to share local and global dependency information learned from both target tasks. We also propose a curriculum learning (CL) mechanism to improve MTL with non-parallel labeled data. Using non-parallel labeled data in MTL is common practice, yet it has received little attention; for example, our PoS tagging data do not have text chunking labels. When learning PoS tagging and text chunking together, the proposed CL mechanism selects complementary samples from the two tasks to update the parameters of the MTL model in the same training batch, which yields better performance and learning stability. We conclude that the fine-grained tasks can provide complementary features to coarse-grained ones, while the most coarse-grained task, SBD, provides useful information for the most fine-grained one, PoS tagging. Additionally, the text chunking task achieves state-of-the-art performance when jointly learned with PoS tagging. Our analytical experiments also show the effectiveness of the proposed soft parameter-sharing and CL mechanisms.
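
To make the described setup concrete, the following is a minimal, hypothetical PyTorch sketch of pair-wise MTL over non-parallel PoS tagging and text chunking data, using a simple L2 soft parameter-sharing penalty between the two task encoders. It is not the authors' implementation: the module names, the L2-based sharing term, and all hyper-parameters are illustrative assumptions, and the paper's curriculum-based sample selection is not reproduced here.

# Hypothetical sketch: pair-wise MTL with soft parameter sharing over
# non-parallel PoS tagging and text chunking batches. Illustrative only;
# not the authors' architecture or hyper-parameters.
import torch
import torch.nn as nn
import torch.nn.functional as F

class TaskTagger(nn.Module):
    """Per-task token tagger: embedding -> BiLSTM -> linear tag scores."""
    def __init__(self, vocab_size, num_tags, emb_dim=64, hidden=64):
        super().__init__()
        self.emb = nn.Embedding(vocab_size, emb_dim)
        self.lstm = nn.LSTM(emb_dim, hidden, batch_first=True, bidirectional=True)
        self.out = nn.Linear(2 * hidden, num_tags)

    def forward(self, tokens):                      # tokens: (batch, seq)
        h, _ = self.lstm(self.emb(tokens))          # (batch, seq, 2*hidden)
        return self.out(h)                          # (batch, seq, num_tags)

def soft_sharing_penalty(model_a, model_b):
    """L2 distance between the two task encoders' parameters (soft sharing)."""
    return sum(F.mse_loss(pa, pb, reduction="sum")
               for pa, pb in zip(model_a.lstm.parameters(),
                                 model_b.lstm.parameters()))

# Two tasks with non-parallel labeled data: each batch carries labels for
# only one task, but both losses are combined in the same optimizer step.
vocab, pos_tags, chunk_tags = 1000, 17, 9
pos_model = TaskTagger(vocab, pos_tags)
chunk_model = TaskTagger(vocab, chunk_tags)
opt = torch.optim.Adam(list(pos_model.parameters()) +
                       list(chunk_model.parameters()), lr=1e-3)

# Toy non-parallel batches (different sentences for each task).
pos_x = torch.randint(0, vocab, (4, 12)); pos_y = torch.randint(0, pos_tags, (4, 12))
chk_x = torch.randint(0, vocab, (4, 15)); chk_y = torch.randint(0, chunk_tags, (4, 15))

for step in range(3):
    opt.zero_grad()
    loss_pos = F.cross_entropy(pos_model(pos_x).flatten(0, 1), pos_y.flatten())
    loss_chk = F.cross_entropy(chunk_model(chk_x).flatten(0, 1), chk_y.flatten())
    loss = loss_pos + loss_chk + 0.1 * soft_sharing_penalty(pos_model, chunk_model)
    loss.backward()
    opt.step()

The point of the sketch is that each batch is labeled for only one task, yet both task losses plus the soft-sharing penalty contribute to a single optimizer step, mirroring the mixed-batch update over non-parallel data described in the summary.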