Version 1: Received: 20 January 2021 / Approved: 22 January 2021 / Online: 22 January 2021 (08:37:03 CET)
Version 2: Received: 13 March 2021 / Approved: 22 March 2021 / Online: 22 March 2021 (16:09:04 CET)
Michelucci, U.; Venturini, F. Estimating Neural Network’s Performance with Bootstrap: A Tutorial. Mach. Learn. Knowl. Extr. 2021, 3, 357-373.
Abstract
Neural networks have the characteristic that their results depend strongly on the training data, the weight initialisation, and the chosen hyper-parameters. Determining the distribution of a statistical estimator, such as the Mean Squared Error (MSE) or the accuracy, is fundamental to evaluating the performance of a neural network model (NNM). For many machine learning models, such as linear regression, quantities such as the variance or confidence intervals on the results can be obtained analytically. Neural networks, by contrast, are not analytically tractable due to their complexity, so the distributions of statistical estimators cannot be easily derived. When estimating the global performance of an NNM, for example via the MSE in a regression problem, it is important to know the variance of the MSE. Bootstrap is one of the most important resampling techniques for estimating averages and variances, among other properties, of statistical estimators. In this tutorial, the application of two resampling techniques (including bootstrap) to the evaluation of neural networks’ performance is explained from both a theoretical and a practical point of view. Pseudo-code of the algorithms is provided to facilitate their implementation. Computational aspects, such as the training time, are discussed, since resampling techniques require running simulations many thousands of times and are therefore computationally intensive. A specific version of the bootstrap algorithm is presented that allows the distribution of a statistical estimator to be estimated for an NNM in a computationally effective way. Finally, the algorithms are compared on synthetically generated data to demonstrate their performance.
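To illustrate the idea summarised in the abstract, the following is a minimal sketch of bootstrapping the MSE of a small neural network on synthetic regression data. It is not the specific algorithm presented in the paper: the MLPRegressor model, its hyper-parameters, the synthetic data, and the number of resamples are illustrative assumptions (in practice the number of resamples is typically in the thousands, which is why the computational cost matters).

```python
# Hedged sketch: estimate mean and variance of the MSE of a neural network
# model (NNM) with bootstrap. Model, hyper-parameters, and data are
# illustrative assumptions, not the paper's specific algorithm.
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.metrics import mean_squared_error

rng = np.random.default_rng(0)

# Synthetic data: y = sin(x) + Gaussian noise
x = rng.uniform(-3, 3, size=(500, 1))
y = np.sin(x).ravel() + rng.normal(scale=0.1, size=500)

n_boot = 50  # kept small here; real studies use thousands of resamples
mse_samples = []

for b in range(n_boot):
    # Draw a bootstrap sample of the same size, with replacement
    idx = rng.integers(0, len(x), size=len(x))
    oob = np.setdiff1d(np.arange(len(x)), idx)  # out-of-bag points for evaluation

    model = MLPRegressor(hidden_layer_sizes=(20,), max_iter=1000, random_state=b)
    model.fit(x[idx], y[idx])

    # Evaluate on the observations not drawn in this resample
    mse_samples.append(mean_squared_error(y[oob], model.predict(x[oob])))

mse_samples = np.array(mse_samples)
print(f"MSE estimate: {mse_samples.mean():.4f} +/- {mse_samples.std(ddof=1):.4f}")
```

Each bootstrap iteration retrains the network from scratch, which is what makes the procedure computationally intensive and motivates the computationally effective variant discussed in the tutorial.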
Copyright: This is an open access article distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.