Submitted: 02 December 2025
Posted: 03 December 2025
Abstract
Keywords:
MSC: 68T07, 68Q32, 41A25
1. Introduction
2. Preliminaries
2.1. Deep Convolutional Neural Networks with Downsampling
2.2. Spherical Harmonics and Sobolev Space on the Sphere
2.3. Near-Optimal Approximation and Cubature Formula
3. Approximating Functions with Multiple Features on the Sphere
3.1. Proof of Theorem 1
3.2. Approximating Functions with Polynomial Features
3.3. Approximating Functions with Symmetric Polynomial Features
4. Conclusions
Funding
Data Availability Statement
Conflicts of Interest
References
© 2025 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (http://creativecommons.org/licenses/by/4.0/).
