Preprint Article | Version 1 | Preserved in Portico | This version is not peer-reviewed

RB-GAT: A Text Classification Model Based on RoBERTa-BiGRU with Graph ATtention Network

Version 1 : Received: 23 April 2024 / Approved: 23 April 2024 / Online: 25 April 2024 (15:11:24 CEST)

How to cite: Lv, S.; Dong, J.; Wang, C.; Wang, X.; Bao, Z. RB-GAT: A Text Classification Model Based on RoBERTa-BiGRU with Graph ATtention Network. Preprints 2024, 2024041579. https://doi.org/10.20944/preprints202404.1579.v1

Abstract

With the development of deep learning, Graph Neural Network (GNN)-based approaches have been widely applied to text classification. However, GNNs struggle to capture the contextual information carried by the word order within a document. To address this, a novel text classification model, RB-GAT, is proposed, combining RoBERTa-BiGRU embedding with a multi-head Graph ATtention Network (GAT). First, the pre-trained RoBERTa model is used to learn context-dependent word and text embeddings. Second, a Bidirectional Gated Recurrent Unit (BiGRU) captures long-term dependencies and bidirectional sentence information from the text. Next, the resulting representations serve as node features of the document graph and are processed by the multi-head graph attention network. Finally, the classification results are produced by a Softmax layer. Experiments on three benchmark datasets show that our method achieves accuracies of 71.48%, 98.45%, and 80.32% on Ohsumed, R8, and MR, respectively, outperforming nine existing text classification approaches.
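To make the four stages of the pipeline concrete, the following is a minimal, illustrative sketch (not the authors' implementation) of how such an architecture could be assembled in PyTorch. It assumes the Hugging Face "roberta-base" checkpoint, torch.nn.GRU for the BiGRU, and PyTorch Geometric's GATConv for the multi-head graph attention layers; the document-graph construction (edge_index), the mean-pooling step, and all hyperparameters are placeholders, since the abstract does not specify them.

    import torch
    import torch.nn as nn
    import torch.nn.functional as F
    from transformers import AutoModel
    from torch_geometric.nn import GATConv

    class RBGATSketch(nn.Module):
        """Illustrative RB-GAT-style pipeline: RoBERTa -> BiGRU -> multi-head GAT -> softmax."""

        def __init__(self, num_classes, gru_hidden=256, gat_hidden=128, heads=8):
            super().__init__()
            # 1) Pre-trained RoBERTa supplies contextual word embeddings.
            self.roberta = AutoModel.from_pretrained("roberta-base")
            # 2) BiGRU captures long-term, bidirectional dependencies over the token sequence.
            self.bigru = nn.GRU(self.roberta.config.hidden_size, gru_hidden,
                                batch_first=True, bidirectional=True)
            # 3) Two multi-head graph attention layers over the document graph.
            self.gat1 = GATConv(2 * gru_hidden, gat_hidden, heads=heads, concat=True)
            self.gat2 = GATConv(gat_hidden * heads, num_classes, heads=1, concat=False)

        def forward(self, input_ids, attention_mask, edge_index):
            # Token-level contextual embeddings (one token sequence per document node).
            tokens = self.roberta(input_ids=input_ids,
                                  attention_mask=attention_mask).last_hidden_state
            seq, _ = self.bigru(tokens)
            # Masked mean pooling yields one feature vector per document node
            # (the pooling strategy is an assumption; the abstract does not specify it).
            mask = attention_mask.unsqueeze(-1).float()
            node_feats = (seq * mask).sum(dim=1) / mask.sum(dim=1).clamp(min=1.0)
            # Graph attention over document nodes; edge_index encodes the document
            # graph (e.g. similarity or co-occurrence links), built outside this sketch.
            h = F.elu(self.gat1(node_feats, edge_index))
            logits = self.gat2(h, edge_index)
            # 4) Softmax layer produces the class distribution.
            return F.log_softmax(logits, dim=-1)

In practice the engineering effort lies in building edge_index for the document graph and batching the node texts; the sketch above only fixes the tensor flow through the four stages named in the abstract.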

Keywords

word embedding; RoBERTa; BiGRU; text classification; multi-head GAT

Subject

Computer Science and Mathematics, Artificial Intelligence and Machine Learning
