Preprint · Article · Version 1 · Preserved in Portico · This version is not peer-reviewed

Backpropagation and F-adjoint

Version 1 : Received: 26 April 2023 / Approved: 27 April 2023 / Online: 27 April 2023 (05:01:36 CEST)

How to cite: Boughammoura, A. Backpropagation and F-adjoint. Preprints 2023, 2023041046. https://doi.org/10.20944/preprints202304.1046.v1

Abstract

This paper presents a concise mathematical framework for investigating both the feed-forward and backward processes involved in training an artificial neural network (ANN) to learn its model weights. Inspired by the idea of the two-step rule for backpropagation, we define a notion of F-adjoint aimed at a better description of the backpropagation algorithm. In particular, by introducing the notions of F-propagation and F-adjoint through a deep neural network architecture, the backpropagation associated with a cost/loss function is proven to be completely characterized by the F-adjoint of the corresponding F-propagation relative to the partial derivative of the cost function with respect to the inputs.
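The abstract's pairing of a forward pass (F-propagation) with a backward adjoint pass (F-adjoint) can be illustrated with a minimal sketch. The following toy example assumes a standard fully connected network with pre-activations Y^k = W^k X^{k-1} and activations X^k = sigma(Y^k); the backward sequence is seeded with the partial derivative of the loss with respect to the final output, and weight gradients are recovered via the two-step rule dJ/dW^k = Y_*^k (X^{k-1})^T. All names (sigma, f_propagation, f_adjoint) and the tanh/squared-error choices are illustrative assumptions, not the paper's exact notation.

```python
import numpy as np

def sigma(y):
    # Illustrative activation choice (assumption, not from the paper)
    return np.tanh(y)

def sigma_prime(y):
    # Derivative of tanh
    return 1.0 - np.tanh(y) ** 2

def f_propagation(x0, weights):
    """Forward pass: store pre-activations Y^k and activations X^k."""
    X, Y = [x0], []
    for W in weights:
        y = W @ X[-1]
        Y.append(y)
        X.append(sigma(y))
    return X, Y

def f_adjoint(x_star_L, X, Y, weights):
    """Backward (adjoint) pass seeded with X_*^L = dJ/dX^L.
    Returns weight gradients dJ/dW^k = Y_*^k (X^{k-1})^T."""
    x_star = x_star_L
    grads = [None] * len(weights)
    for k in reversed(range(len(weights))):
        y_star = x_star * sigma_prime(Y[k])   # Y_*^k = X_*^k ⊙ sigma'(Y^k)
        grads[k] = np.outer(y_star, X[k])     # two-step rule with X^{k-1}
        x_star = weights[k].T @ y_star        # X_*^{k-1} = (W^k)^T Y_*^k
    return grads

# Toy data: a 4 -> 3 -> 2 network with a squared-error loss
rng = np.random.default_rng(0)
W1 = rng.standard_normal((3, 4)) * 0.5
W2 = rng.standard_normal((2, 3)) * 0.5
x0 = rng.standard_normal(4)
t = rng.standard_normal(2)

def loss(weights):
    X, _ = f_propagation(x0, weights)
    return 0.5 * np.sum((X[-1] - t) ** 2)

X, Y = f_propagation(x0, [W1, W2])
grads = f_adjoint(X[-1] - t, X, Y, [W1, W2])   # seed: dJ/dX^L = X^L - t

# Sanity check against central finite differences on W1
eps = 1e-6
num = np.zeros_like(W1)
for i in range(W1.shape[0]):
    for j in range(W1.shape[1]):
        Wp = W1.copy(); Wp[i, j] += eps
        Wm = W1.copy(); Wm[i, j] -= eps
        num[i, j] = (loss([Wp, W2]) - loss([Wm, W2])) / (2 * eps)
print(np.allclose(grads[0], num, atol=1e-5))
```

The finite-difference comparison at the end confirms that the adjoint pass reproduces the ordinary backpropagation gradient for this toy configuration.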

Keywords

Artificial neural networks; Backpropagation; Two-step rule; F-propagation; F-adjoint

Subject

Computer Science and Mathematics, Applied Mathematics
