Preprint Article · Version 1 · Preserved in Portico · This version is not peer-reviewed

Abstractive Summarization Model for Summarizing Scientific Article

Version 1 : Received: 16 May 2024 / Approved: 16 May 2024 / Online: 16 May 2024 (18:21:24 CEST)

How to cite: Ülker, M.; Özer, A. B. Abstractive Summarization Model for Summarizing Scientific Article. Preprints 2024, 2024051123. https://doi.org/10.20944/preprints202405.1123.v1

Abstract

The rapid growth in scientific publishing makes it necessary to access the content of articles as quickly as possible; therefore, automatic text summarization (ATS) systems are widely preferred. In most studies, the entire source document is expected to be summarized, just as a human would summarize it. Summarizing long documents such as scientific articles is particularly difficult because of token-length constraints and the need to capture scientific terminology. To address this problem, a novel Graph-Based Abstractive Summarization (GBAS) model is proposed: a scientific text summarization model based on SciBERT and the graph transformer network (GTN). The integrity of the document is preserved because the SciIE system is used to build a terminology-based graph representation of the document, which also allows long documents to be summarized. The proposed model is compared against baseline models and assessed through human evaluation. The human evaluation shows that the summaries produced by the proposed model are informative, fluent, and consistent with the ground-truth summaries. The experimental results indicate that the proposed model outperforms the baseline models, achieving ROUGE-L scores of 37.10 and 34.96.
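
To make the components named above more concrete, the sketch below is a minimal, illustrative example and not the authors' GBAS implementation: it encodes a scientific sentence with the publicly available SciBERT checkpoint and computes a ROUGE-L F-score between a placeholder generated summary and a reference, using the Hugging Face transformers and rouge-score libraries. The SciIE entity-graph extraction and the graph transformer network are not shown, and the input strings are assumptions chosen only for illustration.

    # Minimal sketch: SciBERT encoding + ROUGE-L scoring.
    # Illustrative only; it does not reproduce the GBAS model, and the
    # SciIE graph construction and graph transformer stages are omitted.
    import torch
    from transformers import AutoModel, AutoTokenizer
    from rouge_score import rouge_scorer

    # Encode a scientific sentence with SciBERT (the encoder named in the abstract).
    tokenizer = AutoTokenizer.from_pretrained("allenai/scibert_scivocab_uncased")
    encoder = AutoModel.from_pretrained("allenai/scibert_scivocab_uncased")
    inputs = tokenizer(
        "Graph transformer networks aggregate information over entity graphs.",
        return_tensors="pt", truncation=True, max_length=512,
    )
    with torch.no_grad():
        token_embeddings = encoder(**inputs).last_hidden_state  # shape: (1, seq_len, 768)

    # Score a placeholder candidate summary against a reference with ROUGE-L,
    # the metric reported in the abstract.
    reference = "The proposed model summarizes long scientific articles."   # placeholder
    candidate = "The model produces summaries of long scientific papers."   # placeholder
    scorer = rouge_scorer.RougeScorer(["rougeL"], use_stemmer=True)
    print(scorer.score(reference, candidate)["rougeL"].fmeasure)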

Keywords

Text summarization; Abstractive method; SciBERT; SciIE; Graph transformer

Subject

Computer Science and Mathematics, Computer Science
