MDPI and ACS Style
Patricia, R.; Sadok, J.; Patel, R. Advanced Machine Translation with Linguistic-Enhanced Transformer. Preprints 2023, 2023120186. https://doi.org/10.20944/preprints202312.0186.v1
APA Style
Patricia, R., Sadok, J., & Patel, R. (2023). Advanced Machine Translation with Linguistic-Enhanced Transformer. Preprints. https://doi.org/10.20944/preprints202312.0186.v1
Chicago/Turabian Style
Patricia, R., Judith Sadok, and Rodolfo Patel. 2023. "Advanced Machine Translation with Linguistic-Enhanced Transformer." Preprints. https://doi.org/10.20944/preprints202312.0186.v1
Abstract
Recent advancements in neural language models, particularly attention-based architectures such as the Transformer, have substantially surpassed traditional methods across a range of natural language processing tasks. These models generate nuanced token representations by attending to contextual relations within a sequence. However, augmenting them with explicit syntactic knowledge, such as part-of-speech tags, has been shown to further improve their effectiveness, especially in constrained-data scenarios. This study introduces the Linguistic-Enhanced Transformer (LET), which integrates multiple syntactic features and yields a notable increase in translation accuracy, with improvements of up to 1.99 BLEU points on subsets of the WMT '14 English-German dataset. Furthermore, this paper demonstrates that enriching BERT models with syntax-aware embeddings enhances their performance on several GLUE benchmark tasks.
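One common way to inject part-of-speech information into a Transformer, as the abstract describes, is to learn an embedding table for POS tags and sum it with the token and position embeddings. The sketch below illustrates this general pattern in PyTorch; the dimensions, tag-set size, and class names are hypothetical assumptions, not the paper's actual LET architecture.

```python
import torch
import torch.nn as nn

class SyntaxAwareEmbedding(nn.Module):
    """Illustrative sketch: sum token, POS-tag, and position embeddings.

    All sizes below are placeholder assumptions; the LET paper's exact
    feature set and dimensions are not reproduced here.
    """

    def __init__(self, vocab_size=32000, num_pos_tags=20,
                 d_model=512, max_len=512):
        super().__init__()
        self.token = nn.Embedding(vocab_size, d_model)
        self.pos_tag = nn.Embedding(num_pos_tags, d_model)  # syntactic feature
        self.position = nn.Embedding(max_len, d_model)

    def forward(self, token_ids, pos_tag_ids):
        # token_ids and pos_tag_ids: (batch, seq_len), aligned per token,
        # e.g. produced by an external POS tagger over the source sentence.
        positions = torch.arange(token_ids.size(1), device=token_ids.device)
        return (self.token(token_ids)
                + self.pos_tag(pos_tag_ids)
                + self.position(positions))

emb = SyntaxAwareEmbedding()
tokens = torch.randint(0, 32000, (2, 10))   # dummy token ids
tags = torch.randint(0, 20, (2, 10))        # dummy POS-tag ids
out = emb(tokens, tags)
print(out.shape)
```

The combined embedding then feeds the standard Transformer encoder stack unchanged, which is why this style of augmentation is attractive under constrained data: it adds only one small embedding table of parameters.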
Keywords
enhanced machine translation; linguistic syntax; transformer model
Subject
Computer Science and Mathematics, Artificial Intelligence and Machine Learning
Copyright:
This is an open access article distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.