Preprint Article, Version 2. Preserved in Portico. This version is not peer-reviewed.

A Two-Step Rule for Backpropagation

Version 1 : Received: 27 February 2023 / Approved: 1 March 2023 / Online: 1 March 2023 (01:57:59 CET)
Version 2 : Received: 2 March 2023 / Approved: 3 March 2023 / Online: 3 March 2023 (01:24:54 CET)
Version 3 : Received: 8 March 2023 / Approved: 9 March 2023 / Online: 9 March 2023 (02:04:21 CET)

A peer-reviewed article of this Preprint also exists.

Boughammoura, A. A Two-Step Rule for Backpropagation. International Journal of Informatics and Applied Mathematics 2023, doi:10.53508/ijiam.1265832.

Abstract

We present a simplified computational rule for the back-propagation formulas for artificial neural networks. In this work, we provide a generic two-step rule for the back-propagation algorithm in matrix notation. Moreover, this rule incorporates both the forward and backward phases of the computations involved in the learning process. Specifically, this recursive rule permits the efficient, layer-by-layer propagation of the changes to all synaptic weights in the network. In particular, we use this rule to compute both the up and down partial derivatives of the cost function for all the connections feeding into the output layer.
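As a point of reference, the forward and backward phases the abstract describes can be sketched in matrix notation as standard backpropagation for a small two-layer network. This is a minimal illustrative sketch, not the paper's specific two-step rule; the network shapes, sigmoid activation, and squared-error loss are assumptions chosen for the example.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def forward(x, W1, W2):
    """Forward phase: propagate activations layer by layer."""
    a1 = sigmoid(W1 @ x)   # hidden-layer activation
    a2 = sigmoid(W2 @ a1)  # output-layer activation
    return a1, a2

def backward(x, y, W2, a1, a2):
    """Backward phase: propagate deltas layer by layer and
    form the weight gradients of L = 0.5 * ||a2 - y||^2."""
    delta2 = (a2 - y) * a2 * (1 - a2)          # output-layer delta
    dW2 = np.outer(delta2, a1)                 # gradient w.r.t. W2
    delta1 = (W2.T @ delta2) * a1 * (1 - a1)   # delta pushed back one layer
    dW1 = np.outer(delta1, x)                  # gradient w.r.t. W1
    return dW1, dW2
```

A quick way to check such a sketch is to compare each analytic gradient entry against a central finite difference of the loss.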

Keywords

Artificial neural networks; back-propagation

Subject

Computer Science and Mathematics, Computational Mathematics

Comments (1)

Comment 1
Received: 3 March 2023
Commenter: Ahmed Boughammoura
Commenter's Conflict of Interests: Author
Comment: Minor changes have been made in the text: we have added two specific formulas, (13) and (22) (pages 3 and 5, respectively), to distinguish some practical cases.


