Preprint Article · Version 1 · Preserved in Portico · This version is not peer-reviewed

Enhancing Sentence Representation with Syntactically Structural Transformer

Version 1: Received: 8 December 2023 / Approved: 8 December 2023 / Online: 11 December 2023 (04:37:19 CET)

How to cite: Elizabeth, P.; Patel, R.; Jose, F. Enhancing Sentence Representation with Syntactically Structural Transformer. Preprints 2023, 2023120600. https://doi.org/10.20944/preprints202312.0600.v1

Abstract

Sentence representation learning, particularly with parse tree encoders, has shown remarkable progress. Traditional approaches predominantly rely on recursive encoding of tree structures, which impedes parallelization. Moreover, these methods often overlook the arc labels of dependency trees. To overcome these limitations, we introduce the Syntax-Enhanced Transformer (SET), which incorporates a novel dual-attention mechanism integrating relation-focused attention alongside traditional self-attention. This design encodes both the dependency relations and the spatial positional relationships within sentence dependency trees. Our approach incorporates syntactic information into the Transformer framework without compromising its inherent parallelizability. The SET achieves performance superior or comparable to contemporary methods across various sentence representation tasks while significantly improving computational efficiency.
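To make the dual-attention idea concrete, the following is a minimal sketch in PyTorch, not the paper's actual implementation: standard content-based self-attention is combined with a relation-focused term in which each query also attends to embeddings of dependency arc labels. All names, tensor shapes, and the label-encoding scheme (a label id per token pair, 0 for "no arc") are assumptions made for illustration.

import torch
import torch.nn as nn
import torch.nn.functional as F

class DualAttention(nn.Module):
    """Hypothetical sketch of dual attention: content-based
    self-attention plus a relation-focused term driven by
    dependency arc labels."""

    def __init__(self, d_model: int, n_labels: int):
        super().__init__()
        self.q = nn.Linear(d_model, d_model)
        self.k = nn.Linear(d_model, d_model)
        self.v = nn.Linear(d_model, d_model)
        # One embedding per dependency arc label (e.g. nsubj, dobj, ...),
        # with index 0 reserved for "no arc" in this sketch.
        self.rel_emb = nn.Embedding(n_labels, d_model)
        self.scale = d_model ** -0.5

    def forward(self, x, rel_ids):
        # x:       (batch, seq, d_model) token representations
        # rel_ids: (batch, seq, seq) arc-label id for each token pair
        q, k, v = self.q(x), self.k(x), self.v(x)
        # Content-based self-attention scores.
        content = torch.einsum("bid,bjd->bij", q, k)
        # Relation-focused scores: each query attends to the
        # embedding of the arc label linking the token pair.
        r = self.rel_emb(rel_ids)                       # (b, s, s, d)
        relation = torch.einsum("bid,bijd->bij", q, r)
        attn = F.softmax((content + relation) * self.scale, dim=-1)
        return attn @ v

Because the relation term is just an additive bias on the attention score matrix, syntactically linked token pairs are emphasized without leaving the parallel matrix form of self-attention, which is consistent with the abstract's claim that parallelizability is preserved.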

Keywords

Syntactic tree; Transformer; Self-attention

Subject

Computer Science and Mathematics, Artificial Intelligence and Machine Learning
