Natural language processing has long been concerned with the intricacies of sentence semantics. Context-aware word representations from pre-trained models such as ELMo and BERT have advanced a range of semantic tasks, including question answering, text categorization, and sentiment analysis. Despite these advances, the integration of supplementary knowledge, particularly syntactic knowledge, to strengthen semantic comprehension remains underexplored. This paper introduces the Syntactic Enhancement Network (SEN), a method for integrating syntactic features into established pre-trained language models. Our evaluation proceeds in two phases: first, we study the syntactic augmentation of both RNN-based and Transformer-based language models; second, we assess SEN on two downstream tasks, sentence completion and biological relation extraction. SEN achieves 91.2% accuracy on sentence completion, surpassing baseline models by 37.8%, and delivers a competitive 75.1% F1 score on biological relation extraction.
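The abstract does not specify how syntactic features are combined with the pre-trained encoder. A common pattern for this kind of integration is to concatenate per-token syntactic embeddings (e.g. for dependency labels) with the contextual token states and project the result back to the model dimension. The sketch below illustrates only that general pattern under stated assumptions; the function name `fuse_syntactic`, the dimensions, and the random inputs are illustrative, not the authors' SEN implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

def fuse_syntactic(contextual, syntactic, w):
    """Fuse syntactic token vectors into contextual ones.

    contextual: (T, d_c) contextual token states (e.g. from BERT)
    syntactic:  (T, d_s) syntactic embeddings (e.g. dependency labels)
    w:          (d_c + d_s, d_out) learned projection matrix

    Returns (T, d_out) fused token representations.
    """
    # Concatenate along the feature axis, then project back down.
    return np.concatenate([contextual, syntactic], axis=-1) @ w

# Toy dimensions for illustration only.
T, d_c, d_s, d_out = 5, 8, 4, 8
contextual = rng.standard_normal((T, d_c))
syntactic = rng.standard_normal((T, d_s))
w = rng.standard_normal((d_c + d_s, d_out))

fused = fuse_syntactic(contextual, syntactic, w)
print(fused.shape)  # (5, 8)
```

In practice the projection would be trained jointly with (or on top of) the frozen or fine-tuned language model; concatenation-plus-projection is only one of several fusion choices (others include gating or attention over syntactic neighbors).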
Keywords
sentence semantics; syntactic enhancement network; language model integration
Subject
Computer Science and Mathematics, Artificial Intelligence and Machine Learning
Copyright:
This is an open access article distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.