Submitted: 01 April 2026
Posted: 02 April 2026
Abstract
Keywords:
1. Introduction
- A measure-theoretic reformulation of MOST that provides a unified mathematical framework for region-based optimization.
- A rigorous construction of discrete MOST using counting measures and weighted measures.
- A theoretical formulation of bisection strategies for both even and odd cardinalities in discrete domains.
- A unified theory showing that continuous and discrete MOST are special cases of a single measure-based optimization framework.
- A clarification of the theoretical limitations of MOST, particularly in the presence of highly localized optima.
- Numerical validation through benchmark functions (e.g., Ackley function), comparing theoretical solutions and discrete MOST solutions under various discretization resolutions.
2.1. Problem Setting
2.2. Core Idea: Region-Based Optimization via Integral Comparison
2.3. Binary Partitioning of the Search Domain
2.4. Monte Carlo Approximation of Regional Integrals
2.5. Region Selection Rule
2.6. Deterministic Shrinking of the Search Region
2.7. Integral Averaging Effect
- Narrow local minima contribute negligibly to the integral
- Regions containing the global minimum dominate asymptotically
2.8. Comparison with Classical Optimization Methods
- Does not require gradients
- Does not rely on surrogate models
- Provides deterministic region shrinking
- Uses integral averaging to mitigate local irregularities
2.9. Summary
- Region-based evaluation using average integrals (2)
- Recursive binary partitioning of the search domain (4)–(5)
- Monte Carlo approximation of regional integrals (7)–(9)
- Deterministic region selection based on integral comparison (11)
- Geometric shrinking of the search domain (12)–(13)
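The loop summarized above can be sketched in one dimension. The following is a minimal illustration only, not the paper's reference implementation: the sample count, tolerance, and tie-breaking rule are arbitrary choices made here.

```python
import random

def most_continuous_1d(f, lo, hi, n_samples=300, tol=1e-3, seed=0):
    """One-dimensional MOST sketch: bisect [lo, hi], compare Monte Carlo
    averages of f over the two halves, keep the half with the smaller
    average, and repeat until the interval is shorter than tol."""
    rng = random.Random(seed)
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        avg_left = sum(f(rng.uniform(lo, mid)) for _ in range(n_samples)) / n_samples
        avg_right = sum(f(rng.uniform(mid, hi)) for _ in range(n_samples)) / n_samples
        if avg_left <= avg_right:
            hi = mid  # keep the left half
        else:
            lo = mid  # keep the right half
    return 0.5 * (lo + hi)
```

Note the deterministic geometric shrinking: each comparison halves the search interval regardless of the sampling outcome, so the number of iterations is logarithmic in the initial width.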
3.1. Measure Space Formulation
- Continuous domains: the underlying measure is the Lebesgue measure
- Discrete domains: the underlying measure is the counting measure
3.2. Measure-Based Evaluation Functional
3.3. Correspondence to Probability Measures
3.4. Monte Carlo Approximation
3.5. Fundamental Theoretical Properties
3.6. Interpretation and Structural Insights
1. Measure-Based Optimization
2. Expectation-Based Evaluation
3. Intrinsic Smoothing Effect
4. Unified Framework
   - Continuous spaces (Lebesgue measure)
   - Discrete spaces (counting measure)
   - Hybrid spaces (product measures)
3.7. Summary
- The definition of the MOST evaluation functional as a normalized integral (17).
- The equivalence between regional evaluation and expectation under a probability measure (19).
- The Monte Carlo approximation framework and its almost sure convergence (21)–(24).
- The explicit dependence of MOST on the underlying measure (25).
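The evaluation functional and its Monte Carlo approximation summarized above can be illustrated numerically. The sketch below assumes a uniform probability measure on the region, so the normalized integral reduces to an expectation, which the sample mean approximates almost surely by the strong law of large numbers; the region, integrand, and sample size are illustrative choices.

```python
import random

def j_hat(f, sampler, n):
    """Monte Carlo estimate of the normalized evaluation functional
    J(A) = (1/mu(A)) * integral_A f dmu, i.e. the mean of f under the
    uniform probability measure on A; sampler() draws one point from A."""
    return sum(f(sampler()) for _ in range(n)) / n

rng = random.Random(1)
# J([0,1]) for f(x) = x^2 under the Lebesgue measure equals 1/3;
# the estimate converges to that value almost surely as n grows.
est = j_hat(lambda x: x * x, lambda: rng.random(), 100_000)
```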
4.1. Discrete Search Space and Counting Measure
4.2. Bisection Strategy: Even and Odd Cardinalities
4.2.1. Even Cardinality
4.2.2. Odd Cardinality
- Symmetry of partitioning
- Equal effective measure
- Consistency with continuous domain splitting
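One way to realize the even/odd bisection is sketched below. The odd-cardinality handling is an assumption based on one reading of the weighted-measure construction: the middle point is shared by both halves with weight 1/2, so each half has the same effective measure.

```python
def bisect_discrete(points):
    """Split a sorted discrete set into two halves of equal effective measure.
    Even cardinality: a clean half/half split with unit weights.
    Odd cardinality (assumed weighted reading): the middle point belongs to
    both halves with weight 1/2, giving each half effective measure n/2.
    Returns two (subset, per-point weights) pairs."""
    n = len(points)
    h = n // 2
    if n % 2 == 0:
        left = (points[:h], [1.0] * h)
        right = (points[h:], [1.0] * h)
    else:
        left = (points[:h + 1], [1.0] * h + [0.5])
        right = (points[h:], [0.5] + [1.0] * h)
    return left, right
```

Sharing the middle point preserves the symmetry of the partition, which is what makes the discrete split consistent with bisection of a continuous interval.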
4.3. Monte Carlo Sampling in Discrete MOST
4.3.1. Continuous vs. Discrete Sampling
4.3.2. Uniform Discrete Sampling
4.3.3. Weighted Sampling for Odd Partition
4.3.4. Monte Carlo Estimation
4.4. Discrete MOST Algorithm
1. Partition the current discrete set into two subsets of equal effective measure
2. Estimate the regional average of each subset by Monte Carlo sampling
3. Select the subset with the smaller estimated average
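These three steps can be sketched as a single recursive loop. This is a simplified illustration under assumptions not fixed by the outline: uniform sampling with replacement, and the middle point of an odd set assigned wholly to the left half rather than weighted.

```python
import random

def discrete_most(f, points, n_samples=256, seed=0):
    """Discrete MOST sketch: repeatedly bisect a sorted finite set,
    estimate the average of f over each half by uniform sampling with
    replacement, keep the half with the smaller estimate, and stop
    when a single point remains."""
    rng = random.Random(seed)
    while len(points) > 1:
        h = (len(points) + 1) // 2          # odd sets: middle goes left
        left, right = points[:h], points[h:]
        est_l = sum(f(rng.choice(left)) for _ in range(n_samples)) / n_samples
        est_r = sum(f(rng.choice(right)) for _ in range(n_samples)) / n_samples
        points = left if est_l <= est_r else right
    return points[0]
```

Each iteration halves the cardinality, so the recursion terminates after logarithmically many comparisons.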
4.5. Fundamental Properties
4.6. Summary
- Definition of discrete MOST via counting measure (29)
- Symmetric bisection strategies for even and odd cases (30)–(34)
- Discrete Monte Carlo sampling consistent with measure theory (35)–(37)
- A complete recursive optimization algorithm (38)–(40)
- Fundamental convergence properties (41)–(44)
5.1. Main Unification Theorem: Lebesgue versus Counting Measure
1. Continuous MOST is obtained when the domain is a continuum and the underlying measure is the Lebesgue measure.
2. Discrete MOST is obtained when the domain is a finite set and the underlying measure is the counting measure.
5.2. Odd Partitioning as a Weighted Measure
5.3. Interpretation: Different Realizations of the Same Algorithm
5.4. Summary
- A single measure-based evaluation functional, defined by normalized integration, generates both continuous and discrete MOST (Theorem 5.1).
- Odd-cardinality partitioning in discrete domains is rigorously justified through weighted counting measures, thereby preserving symmetry and measure-theoretic consistency (Theorem 5.2).
- Continuous and discrete MOST are structurally identical algorithms whose apparent differences arise solely from the underlying measure (Theorem 5.3).
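The structural identity claimed above can be made concrete: the same evaluation routine serves both regimes, and only the sampler, i.e. the underlying measure, changes. The regions, integrand, and sample size below are illustrative choices.

```python
import random

def regional_average(f, sample, n=50_000, seed=0):
    """One evaluation functional for both regimes: the normalized integral
    (1/mu(A)) * integral_A f dmu equals the expectation of f under uniform
    sampling from A, whether mu is the Lebesgue or the counting measure."""
    rng = random.Random(seed)
    return sum(f(sample(rng)) for _ in range(n)) / n

# Continuous region [0, 2] under the Lebesgue measure:
cont = regional_average(lambda x: x, lambda rng: rng.uniform(0.0, 2.0))
# Discrete region {0, 1, 2} under the counting measure:
disc = regional_average(lambda x: x, lambda rng: rng.choice([0, 1, 2]))
# Both estimates approximate the same regional mean, 1.
```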
6.1. Conditions for Effectiveness
6.1.1. Unimodality
6.1.2. Low-Value Regions with Positive Measure
6.2. Fundamental Limitations
6.2.1. Isolated Minima (Needle Problem)
6.2.2. Multimodality
6.3. Comparison with Other Optimization Methods
6.3.1. Gradient-Based Methods
- does not require differentiability
- is inherently global due to region-based evaluation
- is less sensitive to local irregularities
6.3.2. Evolutionary Algorithms (GA, PSO)
- it provides deterministic domain reduction
- it does not require population-based search
- it leverages averaging to stabilize evaluation
6.3.3. Structural Distinction
6.4. Summary
- MOST performs effectively when the objective function exhibits unimodality or when near-optimal regions have positive measure.
- The method is inherently limited in problems with isolated global minima, where the contribution of the optimum to regional averages is negligible.
- In multimodal landscapes, convergence may be influenced by sampling variability.
- MOST fundamentally differs from conventional methods by optimizing regions rather than individual points.
7.1. Experimental Setup
7.1.1. Search Domain
7.1.2. Discretization
7.1.3. Monte Carlo Sampling
7.2. Test Functions
7.3. Application of Discrete MOST
- The current discrete set is bisected
- Monte Carlo estimation (37) is performed
- The region with the smaller average value is selected
- The process continues until a single point remains
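The procedure above can be reproduced in miniature on a one-dimensional Ackley function. The paper's experiments are 10-dimensional; this sketch uses an illustrative grid and sample count, and assigns the middle point of an odd set to the left half for simplicity.

```python
import math
import random

def ackley_1d(x):
    """One-dimensional Ackley function; global minimum f(0) = 0,
    with many local minima superimposed on a rising envelope."""
    return (-20.0 * math.exp(-0.2 * abs(x))
            - math.exp(math.cos(2.0 * math.pi * x))
            + 20.0 + math.e)

rng = random.Random(42)
points = [i / 10.0 for i in range(-50, 51)]  # uniform grid, step 0.1, contains 0
while len(points) > 1:
    h = (len(points) + 1) // 2
    left, right = points[:h], points[h:]
    est_l = sum(ackley_1d(rng.choice(left)) for _ in range(256)) / 256
    est_r = sum(ackley_1d(rng.choice(right)) for _ in range(256)) / 256
    points = left if est_l <= est_r else right
# points[0] is the surviving grid point, close to the global minimum at 0
```

The averaging effect is what carries this through the multimodality: the narrow local minima near the integers contribute little to each regional average, while the envelope term steers the selection toward the region containing the global minimum.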
7.4. Results: Ackley Function (Revised)
7.4.1. Discrete Optimum
7.4.2. MOST Result
7.4.3. Error Analysis
- Discretization error is zero because the optimal point is included in the grid
- Algorithmic error is zero because MOST correctly identifies the optimal region
7.4.4. Interpretation
- The solution coincides with the true optimum
- The error is bounded by the discretization step size
- The method successfully avoids local minima despite strong multimodality
7.5. Results: Sphere Function (Revised)
7.5.1. Discrete Optimum
7.5.2. MOST Result
7.5.3. Estimation of Error
7.5.4. Interpretation
- Exact recovery of the optimum
- Confirms correctness under unimodality
7.6. Comparative Summary
7.7. Discussion
- Discretization Error
- Robustness to Multimodality
- Consistency with Theory
- Works well when low-value regions have positive measure
- Accuracy improves with finer discretization
7.8. Summary
- Discrete MOST successfully approximates global optima under uniform discretization
- The error is bounded by the discretization step size
- The method is robust to multimodal landscapes
- The theoretical framework developed in Chapters 3–6 is validated numerically
8.1. Physical and Geometric Interpretation
8.2. Robustness and Noise Tolerance
- stochastic noise
- measurement errors
- high-frequency oscillations
8.3. Future Extensions
8.3.1. Mixed Continuous–Discrete Optimization
8.3.2. High-Dimensional Optimization
- adaptive partitioning
- dimension-wise decomposition
- importance sampling
8.3.3. Adaptive and Anisotropic Partitioning
8.4. Central Insight
8.5. Concluding Remarks of Discussion
Appendix A. Discrete MOST Algorithm for Numerical Experiments
- The evaluation functional corresponds to (29)
- The Monte Carlo estimator corresponds to (37)
- The selection rule corresponds to (39)
- The algorithm performs dimension-wise recursive reduction
- The total number of iterations is bounded by:
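The bound itself is truncated in this outline. For halving-based dimension-wise reduction a plausible form is d·⌈log₂ m⌉, where d is the dimension and m the number of grid points per coordinate; the sketch below states that form as an assumption, not as the paper's result.

```python
import math

def iteration_bound(n_dims, points_per_dim):
    """Assumed iteration bound for dimension-wise recursive bisection:
    each coordinate with m grid points needs at most ceil(log2(m))
    halvings, so the total across d coordinates is d * ceil(log2(m))."""
    return n_dims * math.ceil(math.log2(points_per_dim))
```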
References
- Nocedal, J.; Wright, S. Numerical Optimization; Springer, 2006.
- Boyd, S.; Vandenberghe, L. Convex Optimization; Cambridge Univ. Press, 2004.
- Bertsekas, D. Nonlinear Programming; Athena Scientific, 1999.
- Conn, A.; Gould, N.; Toint, P. Trust Region Methods; SIAM, 2000.
- Horst, R.; Pardalos, P. Handbook of Global Optimization; Springer, 1995.
- Floudas, C. Deterministic Global Optimization; Springer, 2000.
- Spall, J. Introduction to Stochastic Search and Optimization; Wiley, 2003.
- Holland, J. Adaptation in Natural and Artificial Systems; MIT Press, 1975.
- Storn, R.; Price, K. Differential Evolution. J. Global Optim. 1997, 11, 341–359.
- Kennedy, J.; Eberhart, R. Particle Swarm Optimization. Proc. IEEE ICNN, 1995.
- Hansen, N. CMA-ES Evolution Strategy. Evolutionary Computation, 2003.
- Bäck, T. Evolutionary Algorithms in Theory and Practice; Oxford Univ. Press, 1996.
- Robbins, H.; Monro, S. Stochastic Approximation. Ann. Math. Stat. 1951.
- Kushner, H.; Yin, G. Stochastic Approximation; Springer, 2003.
- Jones, D.; Schonlau, M.; Welch, W. Efficient Global Optimization. J. Global Optim. 1998.
- Shahriari, B. Bayesian Optimization Review. Proc. IEEE, 2016.
- Snoek, J. Practical Bayesian Optimization. NIPS, 2012.
- Lawler, E.; Wood, D. Branch-and-Bound Methods. Operations Research, 1966.
- Jones, D. DIRECT Algorithm. J. Optim. Theory Appl. 1993.
- Piyavskii, S. Lipschitz Optimization. USSR Comput. Math. 1972.
- Deb, K. NSGA-II. IEEE Trans. Evolutionary Computation, 2002.
- Zitzler, E.; Thiele, L. SPEA2. TIK Report, 2001.
- Karush, W. Minima of Functions; 1939.
- Kuhn, H.; Tucker, A. Nonlinear Programming. Proc. Berkeley Symp., 1951.
- Miettinen, K. Nonlinear Multiobjective Optimization; Springer, 1999.
- Inage, S.; Hebishima, H. Monte Carlo Stochastic Optimization Technique (MOST). Mathematics and Computers in Simulation 2022, 199, 257–271.
- Inage, S.; Ohgi, S.; Takahashi, Y. Multi-objective MOST. Mathematics and Computers in Simulation 2024, 215, 146–157.
- Inage, S.; Ajito, K. A Deterministic Global Optimization Framework via Monte Carlo Region Integration. Preprint, 2024.
- Hoeffding, W. Probability Inequalities for Sums of Bounded Random Variables. J. Am. Stat. Assoc. 1963, 58(301), 13–30.
| Function | True Optimum | MOST Solution | Error |
|---|---|---|---|
| Ackley (10D) | (0,…,0) | (0,…,0) | 0 |
| Sphere (10D) | (0,…,0) | (0,…,0) | 0 |
Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content.
© 2026 by the author. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license.