Preprint Article · Version 1 · Preserved in Portico · This version is not peer-reviewed

Empowering Sentence Representation with Semantic-Enhanced Tree-LSTMs

Version 1: Received: 27 November 2023 / Approved: 28 November 2023 / Online: 28 November 2023 (07:23:54 CET)

How to cite: H. Chris, A.; Lee, F.; Ali, W. Empowering Sentence Representation with Semantic-Enhanced Tree-LSTMs. Preprints 2023, 2023111766. https://doi.org/10.20944/preprints202311.1766.v1

Abstract

The Semantic-Enhanced Tree-LSTM (SeT-LSTM) network is a novel advancement in linguistic modeling and a significant step forward from traditional Tree-based Long Short-Term Memory (LSTM) networks. By incorporating typed grammatical dependencies, SeT-LSTMs capture a more fine-grained view of sentence semantics. Traditional models often overlook the semantic shifts caused by variations in word or phrase roles, a gap this paper addresses by focusing on the types of grammatical connections, or typed dependencies, within sentences. Our proposed architecture, the Semantic Relationship-Guided LSTM (SRG-LSTM), leverages a control mechanism to model the interplay between sequence elements. Additionally, we present a novel Tree-LSTM variant, the Semantic Dependency Tree-LSTM (SDT-LSTM), which integrates dependency parse structures with dependency types for more robust sentence embeddings. The SDT-LSTM demonstrates superior performance in semantic relatedness scoring and sentiment analysis compared to its predecessors. Qualitatively, it shows resilience to changes in sentence voice and heightened sensitivity to nominal alterations, aligning well with human intuition. This research underscores the pivotal role of grammatical relationships in sentence understanding and paves the way for further exploration in this domain.
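The abstract does not spell out how typed dependencies enter the Tree-LSTM equations, so the sketch below is only an illustrative guess rather than the authors' SDT-LSTM: a Child-Sum Tree-LSTM cell (in PyTorch) in which each child's hidden state is gated by an embedding of its dependency relation before being summed into the parent. All names (TypedChildSumTreeLSTMCell, dep_embed, and the dimensions used) are hypothetical and not taken from the paper.

# Hypothetical sketch: a Child-Sum Tree-LSTM cell whose child states are
# gated by embeddings of their dependency relations (e.g. nsubj, dobj).
# This is NOT the paper's SDT-LSTM; it only illustrates one way typed
# dependencies could be folded into a tree-structured LSTM.
import torch
import torch.nn as nn


class TypedChildSumTreeLSTMCell(nn.Module):
    def __init__(self, input_dim, hidden_dim, num_dep_types):
        super().__init__()
        self.hidden_dim = hidden_dim
        # One embedding per dependency type, used to gate each child edge.
        self.dep_embed = nn.Embedding(num_dep_types, hidden_dim)
        self.W_iou = nn.Linear(input_dim, 3 * hidden_dim)
        self.U_iou = nn.Linear(hidden_dim, 3 * hidden_dim, bias=False)
        self.W_f = nn.Linear(input_dim, hidden_dim)
        self.U_f = nn.Linear(hidden_dim, hidden_dim, bias=False)

    def forward(self, x, child_h, child_c, child_dep_ids):
        """x: (input_dim,) word vector of the head node.
        child_h, child_c: (num_children, hidden_dim) child states.
        child_dep_ids: (num_children,) dependency-type index of each child edge."""
        # Gate each child's hidden state by its dependency-type embedding.
        typed_h = child_h * torch.sigmoid(self.dep_embed(child_dep_ids))
        h_sum = typed_h.sum(dim=0)

        # Input, output, and update gates from the head word and typed child sum.
        iou = self.W_iou(x) + self.U_iou(h_sum)
        i, o, u = iou.chunk(3, dim=-1)
        i, o, u = torch.sigmoid(i), torch.sigmoid(o), torch.tanh(u)

        # One forget gate per child, also conditioned on the typed hidden state.
        f = torch.sigmoid(self.W_f(x).unsqueeze(0) + self.U_f(typed_h))
        c = i * u + (f * child_c).sum(dim=0)
        h = o * torch.tanh(c)
        return h, c


# Usage with a leaf node (no children): pass empty child tensors.
cell = TypedChildSumTreeLSTMCell(input_dim=300, hidden_dim=150, num_dep_types=45)
x = torch.randn(300)
no_children = torch.zeros(0, 150)
h, c = cell(x, no_children, no_children, torch.zeros(0, dtype=torch.long))

Gating children through a sigmoid of the relation embedding is just one plausible design: it lets, say, an nsubj child influence its head differently from a det child, which is in the spirit of the "control mechanism" the abstract attributes to the SRG-LSTM, though the actual formulation in the paper may differ.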

Keywords

Natural Language Processing; Semantic Parsing; Syntactic Dependencies; Language Understanding

Subject

Computer Science and Mathematics, Artificial Intelligence and Machine Learning
