Preprint Article · Version 1 · This version is not peer-reviewed

Machine Reading Comprehension Model Based on Fusion of Mixed Attention

Version 1 : Received: 13 May 2024 / Approved: 14 May 2024 / Online: 14 May 2024 (11:35:07 CEST)

How to cite: Wang, Y.; Ma, N.; Guo, Z. Machine Reading Comprehension Model Based on Fusion of Mixed Attention. Preprints 2024, 2024050966. https://doi.org/10.20944/preprints202405.0966.v1

Abstract

To address the problems of insufficient semantic fusion between text and questions and the neglect of global semantic information in machine reading comprehension models, we propose BERT_hybrid, a machine reading comprehension model based on BERT and a hybrid attention mechanism. In this model, BERT maps the text and the questions separately into the feature space. By integrating a Bi-LSTM, an attention mechanism, and a self-attention mechanism, the model achieves comprehensive semantic fusion between text and questions. The probability distribution over answers is computed using Softmax. Experimental results on the public DuReader dataset demonstrate that the proposed model improves BLEU-4 and ROUGE-L scores over existing models. Furthermore, to validate the effectiveness of the model design, we analyze the factors influencing its performance.
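For concreteness, the pipeline described above (BERT encoding, Bi-LSTM re-encoding, passage-question attention, self-attention, Softmax output) could be realized roughly as follows. This is a minimal PyTorch sketch under stated assumptions, not the authors' implementation: the checkpoint name (bert-base-chinese), the hidden size, the number of attention heads, the span-extraction output head, and all module and variable names are illustrative assumptions not taken from the paper.

```python
# Minimal sketch of a BERT_hybrid-style model. All hyperparameters and the
# span-extraction head are assumptions for illustration only.
import torch
import torch.nn as nn
import torch.nn.functional as F
from transformers import BertModel


class BertHybrid(nn.Module):
    def __init__(self, bert_name: str = "bert-base-chinese", hidden: int = 768):
        super().__init__()
        self.bert = BertModel.from_pretrained(bert_name)  # shared encoder
        # Bi-LSTM re-encodes the BERT features of passage and question.
        self.bilstm = nn.LSTM(hidden, hidden // 2, bidirectional=True,
                              batch_first=True)
        # Self-attention over the fused passage representation
        # (captures global semantic information).
        self.self_attn = nn.MultiheadAttention(hidden, num_heads=8,
                                               batch_first=True)
        # Assumed output head: start/end logits for extractive answer spans.
        self.span_head = nn.Linear(hidden, 2)

    def encode(self, ids, mask):
        out = self.bert(input_ids=ids, attention_mask=mask).last_hidden_state
        out, _ = self.bilstm(out)
        return out

    def forward(self, p_ids, p_mask, q_ids, q_mask):
        # 1. Map passage and question into the feature space separately.
        p = self.encode(p_ids, p_mask)          # (B, Lp, H)
        q = self.encode(q_ids, q_mask)          # (B, Lq, H)
        # 2. Passage-to-question attention: fuse question semantics into
        #    each passage position.
        sim = torch.bmm(p, q.transpose(1, 2))   # (B, Lp, Lq)
        sim = sim.masked_fill(q_mask[:, None, :] == 0, float("-inf"))
        fused = p + torch.bmm(F.softmax(sim, dim=-1), q)
        # 3. Self-attention over the fused passage representation.
        key_pad = p_mask == 0
        fused, _ = self.self_attn(fused, fused, fused,
                                  key_padding_mask=key_pad)
        # 4. Softmax over positions gives the answer-span distributions.
        logits = self.span_head(fused)          # (B, Lp, 2)
        start, end = logits.split(1, dim=-1)
        start = start.squeeze(-1).masked_fill(key_pad, float("-inf"))
        end = end.squeeze(-1).masked_fill(key_pad, float("-inf"))
        return F.softmax(start, dim=-1), F.softmax(end, dim=-1)
```

Note that DuReader also contains free-form answers; a generative decoder could replace the span head above, which is chosen here only to make the Softmax output step concrete.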

Keywords

Machine reading comprehension; hybrid attention mechanism; DuReader; BERT

Subject

Computer Science and Mathematics, Artificial Intelligence and Machine Learning
