ARTICLE | doi:10.20944/preprints202304.1046.v1
Subject: Computer Science and Mathematics, Applied Mathematics
Keywords: Artificial neural networks; Backpropagation; Two-step rule; F-propagation; F-adjoint
Online: 27 April 2023 (05:01:36 CEST)
This paper presents a concise mathematical framework for investigating both the feed-forward and backward processes of an artificial neural network (ANN) during training, when the model weights are learned. Inspired by the idea of the two-step rule for backpropagation, we define a notion of F-adjoint aimed at a better description of the backpropagation algorithm. In particular, by introducing the notions of F-propagation and F-adjoint through a deep neural network architecture, the backpropagation associated with a cost/loss function is proven to be completely characterized by the F-adjoint of the corresponding F-propagation relative to the partial derivative, with respect to the inputs, of the cost function.
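The pairing of a forward pass with its adjoint backward pass can be illustrated with standard backpropagation. The sketch below is a minimal interpretation, not the paper's exact F-adjoint formalism: `forward` records every layer's activation (the analogue of an F-propagation), and `backward` propagates the partial derivative of the loss with respect to the output back through the same layers; the function names and the choice of sigmoid activation are illustrative assumptions.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def forward(x, weights):
    """Forward pass (F-propagation analogue): record each layer's activation."""
    acts = [x]
    for W in weights:
        acts.append(sigmoid(W @ acts[-1]))
    return acts

def backward(acts, weights, dL_dy):
    """Adjoint pass: turn dL/dy (gradient at the output) into per-layer
    weight gradients, propagating deltas layer by layer."""
    grads = [None] * len(weights)
    # sigmoid'(z) expressed through the stored activation: s * (1 - s)
    delta = dL_dy * acts[-1] * (1 - acts[-1])
    for l in range(len(weights) - 1, -1, -1):
        grads[l] = np.outer(delta, acts[l])
        if l > 0:
            delta = (weights[l].T @ delta) * acts[l] * (1 - acts[l])
    return grads
```

Note that the backward pass consumes only quantities already stored by the forward pass plus the derivative of the cost with respect to the network output, mirroring the abstract's claim that the backward process is fully determined by the forward one together with that partial derivative.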
ARTICLE | doi:10.20944/preprints202303.0001.v3
Subject: Computer Science and Mathematics, Computational Mathematics
Keywords: Artificial neural networks; Back-propagation
Online: 9 March 2023 (02:04:21 CET)
We present a simplified computational rule for the back-propagation formulas for artificial neural networks. In this work, we provide a generic two-step rule for the back-propagation algorithm in matrix notation. Moreover, this rule incorporates both the forward and backward phases of the computations involved in the learning process. Specifically, this recursive computing rule efficiently propagates the changes to all synaptic weights in the network, layer by layer. In particular, we use this rule to compute both the up and down partial derivatives of the cost function for all the connections feeding into the output layer.
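In standard matrix notation, a two-step scheme of this kind can be written as a forward recursion followed by a backward one; the formulas below are the usual back-propagation equations, shown here as a plausible reading of the rule rather than the authors' exact statement. With weights $W^{\ell}$, pre-activations $z^{\ell} = W^{\ell} a^{\ell-1}$, activations $a^{\ell} = \sigma(z^{\ell})$, and cost $C$:

$$
\delta^{L} = \nabla_{a^{L}} C \odot \sigma'(z^{L}),
\qquad
\delta^{\ell} = \bigl(W^{\ell+1}\bigr)^{\top} \delta^{\ell+1} \odot \sigma'(z^{\ell}),
\qquad
\frac{\partial C}{\partial W^{\ell}} = \delta^{\ell} \bigl(a^{\ell-1}\bigr)^{\top}.
$$

The first step (forward) stores the $z^{\ell}$ and $a^{\ell}$; the second step (backward) applies the $\delta^{\ell}$ recursion once per layer, so every weight gradient is obtained from a single matrix product per layer.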