Preprint Article | Version 1 | Preserved in Portico | This version is not peer-reviewed

RNN-DBSVM: Optimal Recurrent Neural Network Density Based Support Vector Machine

Version 1 : Received: 18 July 2023 / Approved: 18 July 2023 / Online: 19 July 2023 (08:06:13 CEST)

How to cite: El Moutaouakil, K.; Elouissari, A.; Olaru, A.; Palade, V.; Ciorei, M. RNN-DBSVM: Optimal Recurrent Neural Network Density Based Support Vector Machine. Preprints 2023, 2023071306. https://doi.org/10.20944/preprints202307.1306.v1

Abstract

When implementing SVMs, two major problems are encountered: (a) the number of local minima increases exponentially with the number of samples, and (b) the amount of computer storage required by a regular quadratic programming solver grows exponentially as the problem size expands. The Kernel-Adatron family of algorithms has been gaining attention lately because it can handle very large classification and regression problems. However, these methods treat different types of samples (noise, border, and core) in the same manner, which causes searches in unpromising areas and increases the number of iterations. In this work, we introduce a hybrid method to overcome these shortcomings, namely the Optimal Recurrent Neural Network Density Based Support Vector Machine (Opt-RNN-DBSVM). This method consists of four steps: (a) characterization of the different samples, (b) elimination of samples with a low probability of being support vectors, (c) construction of an appropriate recurrent neural network based on an original energy function, and (d) solution of the system of differential equations governing the dynamics of the RNN, using the Euler-Cauchy method with an optimal time step. Thanks to its recurrent architecture, the RNN remembers the regions explored during the search process. We demonstrate that RNN-SVM converges to feasible support vectors and that Opt-RNN-DBSVM has a very low time complexity compared to RNN-SVM with a constant time step and to KA-SVMs. Several experiments were performed on academic data sets. We used several classification performance measures to compare Opt-RNN-DBSVM with different classification methods, and the results obtained show the good performance of the proposed method.
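To make the four steps above more concrete, the following Python sketch mimics the pipeline under simplifying assumptions: sample characterization is reduced to a DBSCAN-style eps-neighborhood count, and the energy-function RNN with its optimal time step is replaced by a projected Euler iteration on the standard SVM dual with a fixed surrogate step. All function names, thresholds, and parameter values are illustrative and are not the authors' implementation.

# Minimal sketch of an Opt-RNN-DBSVM-style pipeline (illustrative assumptions only).
import numpy as np

def characterize_samples(X, eps=0.5, min_pts=5):
    """Tag each sample as core, border, or noise from its eps-neighborhood size
    (a DBSCAN-style density rule, assumed here for illustration)."""
    d = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=-1)
    counts = (d <= eps).sum(axis=1) - 1  # neighbors, excluding the point itself
    return np.where(counts >= min_pts, "core",
           np.where(counts > 0, "border", "noise"))

def rnn_dual_svm(X, y, C=1.0, gamma=0.5, n_iter=500):
    """Integrate assumed gradient dynamics for the SVM dual with an Euler step;
    a crude fixed step stands in for the paper's optimal time step."""
    K = np.exp(-gamma * np.linalg.norm(X[:, None] - X[None, :], axis=-1) ** 2)
    Q = (y[:, None] * y[None, :]) * K
    alpha = np.zeros(len(y))
    step = 1.0 / (np.linalg.norm(Q) + 1e-12)      # surrogate step size
    for _ in range(n_iter):
        grad = Q @ alpha - 1.0                    # gradient of the dual objective
        alpha = np.clip(alpha - step * grad, 0.0, C)  # Euler update + box projection
    return alpha

# Usage: drop 'noise' samples (unlikely support vectors), then solve on the rest.
rng = np.random.default_rng(0)
X = rng.normal(size=(60, 2))
y = np.where(X[:, 0] + X[:, 1] > 0, 1.0, -1.0)
keep = characterize_samples(X) != "noise"
alpha = rnn_dual_svm(X[keep], y[keep])
print("support vectors:", int((alpha > 1e-6).sum()))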

Keywords

Recurrent Neural Network (RNN); Support Vector Machine (SVM); Kernel-Adatron algorithm (KA); Euler-Cauchy algorithm

Subject

Computer Science and Mathematics, Artificial Intelligence and Machine Learning
