Preprint · Article · Version 1 · Preserved in Portico · This version is not peer-reviewed

Linguistic-aided Sentence Representation Learning

Version 1 : Received: 30 November 2023 / Approved: 30 November 2023 / Online: 30 November 2023 (15:38:21 CET)

How to cite: Nalin, G.; Dante, V.; Ali, W. Linguistic-aided Sentence Representation Learning. Preprints 2023, 2023112002. https://doi.org/10.20944/preprints202311.2002.v1

Abstract

The field of natural language processing has long grappled with the intricacies of sentence semantics. The emergence of context-aware word representations, especially from pre-trained models such as ELMo and BERT, has advanced a range of semantic tasks, including question answering, text categorization, and sentiment analysis. Despite these advances, the integration of supplementary knowledge, particularly syntactic knowledge, to strengthen semantic comprehension remains underexplored. This paper introduces the Syntactic Enhancement Network (SEN), a method for fusing syntactic information with established pre-trained language models. Our evaluation proceeds in two phases: first, we study the syntactic augmentation of both RNN-based and Transformer-based language models; second, we test SEN on two tasks: sentence completion and biological relation extraction. SEN achieves 91.2% accuracy on sentence completion, surpassing baseline models by 37.8 percentage points, and delivers a competitive 75.1% F1 score on biological relation extraction.
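The abstract describes fusing syntactic information with contextual embeddings from a pre-trained model. The paper does not specify the fusion mechanism here, so the following is only a hypothetical sketch of one common approach: concatenating each token's contextual vector with a learned part-of-speech embedding and projecting the result through a linear layer. All names, dimensions, and the random-placeholder weights are illustrative assumptions, not the authors' actual SEN architecture.

```python
# Hypothetical sketch of syntactic fusion (illustrative only; the actual
# SEN architecture may differ). Pure-Python for self-containment.
import random

random.seed(0)

D_CTX, D_SYN, D_OUT = 8, 4, 6   # toy dimensions (assumed, not from the paper)

POS_TAGS = ["NOUN", "VERB", "DET", "ADP"]
# Syntactic embedding table; in a real model these would be trained parameters.
pos_embed = {t: [random.uniform(-1, 1) for _ in range(D_SYN)] for t in POS_TAGS}
# Projection fusing the concatenated [contextual ; syntactic] vector to D_OUT.
W = [[random.uniform(-1, 1) for _ in range(D_CTX + D_SYN)] for _ in range(D_OUT)]

def fuse(contextual, tags):
    """Concatenate each token's contextual vector with its POS embedding,
    then apply a linear projection down to D_OUT dimensions."""
    fused = []
    for vec, tag in zip(contextual, tags):
        combined = vec + pos_embed[tag]  # list concatenation = vector concat
        fused.append([sum(w * x for w, x in zip(row, combined)) for row in W])
    return fused

# Toy "contextual" vectors for a 3-token sentence (stand-ins for BERT outputs).
sentence = [[random.uniform(-1, 1) for _ in range(D_CTX)] for _ in range(3)]
tags = ["DET", "NOUN", "VERB"]
out = fuse(sentence, tags)
print(len(out), len(out[0]))  # one fused D_OUT vector per token
```

In practice the contextual vectors would come from a pre-trained encoder and the POS tags from a parser; the fused representations would then feed a task head such as a sentence-completion or relation-classification layer.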

Keywords

sentence semantics; syntactic enhancement network; language model integration

Subject

Computer Science and Mathematics, Artificial Intelligence and Machine Learning
