Preprint
Article

A Formalism of the General Mathematical Expression of Multilayer Perceptron Neural Networks

Submitted: 17 May 2021
Posted: 18 May 2021

Abstract
Neural network models are mostly represented by oriented graphs in which only the components, the constitutive elements of the graph, are transcribed into a mathematical expression. Yet accurate knowledge of the full expression of the model is required in certain situations, such as selecting, among several reference models, the one that best fits the available data, or comparing the explanatory and predictive performance of an established model with respect to some reference models. In this paper, we establish a formalism of the mathematical expression for multilayer perceptron neural networks in a general framework, MLP-p-n-q, with p, n and q natural integers, and show its restriction to the cases of a single hidden layer with multivariate outputs (MLP-p-1-q) and then a single output (MLP-p-1-1). We then give some specific cases of the most commonly used models. An application case is presented in the context of solving a nonlinear regression problem.
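To illustrate the kind of closed-form expression the formalism targets, a minimal sketch of an MLP-p-1-1 (p inputs, one hidden layer of m neurons, one output) can be written as follows; the symbols used here (input-to-hidden weights w_{ji}, hidden-to-output weights \beta_j, biases w_{j0} and \beta_0, activations \varphi_h and \varphi_o) are illustrative and are not necessarily the paper's own notation:

\hat{y}(x_1, \ldots, x_p) = \varphi_o\left( \beta_0 + \sum_{j=1}^{m} \beta_j \, \varphi_h\left( w_{j0} + \sum_{i=1}^{p} w_{ji}\, x_i \right) \right)

For a nonlinear regression problem, \varphi_o is typically the identity, so the model reduces to a linear combination of nonlinearly transformed inputs.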
Keywords: 
Copyright: This open access article is published under a Creative Commons CC BY 4.0 license, which permits free downloading, distribution, and reuse, provided that the author and preprint are cited in any reuse.
