Fan, Y.; Li, B.; Sataer, Y.; Gao, M.; Shi, C.; Gao, Z. Addressing Long-Distance Dependencies in AMR Parsing with Hierarchical Clause Annotation. Electronics 2023, 12, 3908.
Abstract
Most natural language processing (NLP) tasks treat an input sentence as a sequence with token-level embeddings and features, ignoring its clausal structure. Taking Abstract Meaning Representation (AMR) parsing as an example, recent parsers are empowered by Transformers and pre-trained language models, but the long-distance dependencies (LDDs) introduced by long sequences remain an open problem. We argue that LDDs are not merely a matter of sequence length but are essentially related to the internal clause hierarchy: typically, non-verb words in a clause cannot depend on words outside it, and verbs from different but related clauses have much longer dependencies than those within the same clause. Guided by this intuition, we introduce a type of clausal feature, hierarchical clause annotation (HCA), into AMR parsing and propose two HCA-based approaches, HCA-based self-attention (HCA-SA) and HCA-based curriculum learning (HCA-CL), that integrate the HCA trees of complex sentences to address LDDs. We conduct extensive experiments on two in-distribution (ID) AMR datasets (AMR 2.0 and AMR 3.0) and three out-of-distribution (OOD) ones (TLP, New3, and Bio). Experimental results show that our HCA-based approaches achieve significant and explainable improvements over the baseline model and outperform the state-of-the-art (SOTA) model on sentences with complex clausal structures, which introduce most LDD cases.
Keywords
hierarchical clause annotation; long-distance dependencies; AMR parsing; self-attention; curriculum learning
Subject
Computer Science and Mathematics, Artificial Intelligence and Machine Learning
Copyright:
This is an open access article distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.