Submitted: 26 April 2024. Posted: 26 April 2024.
Abstract
Keywords:
1. Introduction
- Application of termination rules based on asymptotic considerations, as defined in the recent literature. This addition allows early termination of the method, so that no computational time is wasted on iterations that do not yield a better estimate of the global minimum of the objective function.
- Periodic application of a local search procedure. Local optimization locates the local minima of the objective function more efficiently, which in turn leads to faster discovery of the global minimum.
2. The proposed method
1. Initialization step
   - Set $m$ as the number of armadillos in the population.
   - Set $\mathrm{iter}_{\max}$ as the maximum number of allowed generations.
   - Initialize randomly the armadillos $x_i \in S$, $i = 1, \dots, m$.
   - Set iter = 0.
   - Set $p_l \in [0,1]$ as the local search rate.
2. Evaluation step
   - For $i = 1, \dots, m$ do set $f_i = f\left(x_i\right)$.
   - endfor
3. Computation step
   - For $i = 1, \dots, m$ do
     (a) Phase 1: Attack on termite mounds
        - Construct the termite mounds set $T = \left\{ x_k : f\left(x_k\right) < f\left(x_i\right) \right\}$, i.e. the members of the population with better function values than armadillo $i$.
        - Select randomly a termite mound $t \in T$ for armadillo $i$.
        - Create a new position $z_i$ for the armadillo according to the formula: $z_{ij} = x_{ij} + r_{ij}\left(t_j - I_{ij} x_{ij}\right), \quad j = 1, \dots, n$ where $r_{ij}$ are random numbers in $[0,1]$ and $I_{ij}$ are random numbers in $\{1, 2\}$.
        - Update the position of the armadillo $i$ according to: $x_i = z_i$ if $f\left(z_i\right) < f\left(x_i\right)$; otherwise $x_i$ remains unchanged.
     (b) Phase 2: Digging in termite mounds
        - Calculate a new trial position: $z_{ij} = x_{ij} + \left(1 - 2 r_{ij}\right) \frac{b_j - a_j}{\mathrm{iter} + 1}, \quad j = 1, \dots, n$ where $r_{ij}$ are random numbers in $[0,1]$ and $\left[a_j, b_j\right]$ are the bounds of the search space in dimension $j$.
        - Update the position of the armadillo $i$ according to: $x_i = z_i$ if $f\left(z_i\right) < f\left(x_i\right)$; otherwise $x_i$ remains unchanged.
     (c) Local search. Draw a random number $r \in [0,1]$. If $r \le p_l$, then a local optimization algorithm is applied to $x_i$. Some local search procedures found in the optimization literature are the BFGS method [51], the Steepest Descent method [52], and the L-BFGS method [53] for large-scale optimization. A BFGS variant of Powell [54] was used as the local search optimizer in the current work. A sketch of the full computation step, including this stochastic local search, is given after the algorithm below.
   - endfor
4. Termination check step
   - Set iter = iter + 1.
   - For the valid termination of the method, two termination rules that have recently appeared in the literature are proposed here; both are based on asymptotic considerations. The first stopping rule, called DoubleBox in the conducted experiments, was introduced in the work of Tsoulos in 2008 [55]. It relies on the variance of the best function value discovered by the optimization method at each iteration. The second termination rule, called Similarity in the experiments, was introduced in the work of Charilogis et al. [56]. In this technique, the difference between the current best value and the previous best value is computed at every iteration, and the algorithm terminates when this difference remains zero for a predefined number of iterations. Sketches of both rules are given below.
   - If the termination criteria do not hold, go to step 3.
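To make the computation step concrete, the following is a minimal Python sketch of one generation under stated assumptions: the population X is a NumPy array of shape (m, n), a and b are the per-coordinate bound vectors, and SciPy's bounded L-BFGS-B optimizer stands in for Powell's tolerant BFGS variant [54], which is not available in standard libraries. The name gao_generation and its signature are illustrative, not part of the original method.

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng()

def gao_generation(X, fx, f, a, b, t, p_l):
    """One generation of the modified Giant Armadillo Optimization (sketch).
    X: (m, n) population, fx: current function values, f: objective,
    a, b: lower/upper bound vectors, t: generation counter (t >= 1),
    p_l: local search rate."""
    m, n = X.shape
    for i in range(m):
        # Phase 1: attack on termite mounds -- move toward a better member.
        better = np.where(fx < fx[i])[0]
        if better.size > 0:
            tm = X[rng.choice(better)]          # selected termite mound
            r = rng.random(n)                   # r_ij in [0, 1]
            I = rng.integers(1, 3, size=n)      # I_ij in {1, 2}
            z = np.clip(X[i] + r * (tm - I * X[i]), a, b)
            fz = f(z)
            if fz < fx[i]:                      # greedy acceptance
                X[i], fx[i] = z, fz
        # Phase 2: digging in termite mounds -- step size shrinks with t.
        r = rng.random(n)
        z = np.clip(X[i] + (1.0 - 2.0 * r) * (b - a) / t, a, b)
        fz = f(z)
        if fz < fx[i]:
            X[i], fx[i] = z, fz
        # Stochastic local search with rate p_l (L-BFGS-B as a stand-in).
        if rng.random() <= p_l:
            res = minimize(f, X[i], method="L-BFGS-B", bounds=list(zip(a, b)))
            if res.fun < fx[i]:
                X[i], fx[i] = res.x, res.fun
    return X, fx
```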
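The two stopping rules of step 4 can likewise be sketched as small stateful checks, consulted once per generation with the best value found so far. The minimum history length and the count of unchanged generations used below are illustrative defaults, not values prescribed in [55,56].

```python
import numpy as np

class DoubleBox:
    """DoubleBox rule [55] (sketch): track the variance of the best value per
    generation and stop once it falls to half of its value at the last
    generation in which the best value improved."""
    def __init__(self, min_generations=5):
        self.best_values = []
        self.limit = None
        self.min_generations = min_generations

    def should_stop(self, best):
        improved = not self.best_values or best < self.best_values[-1]
        self.best_values.append(best)
        variance = float(np.var(self.best_values))
        if improved or self.limit is None:
            self.limit = variance / 2.0     # refresh threshold on improvement
            return False
        return (len(self.best_values) >= self.min_generations
                and variance <= self.limit)

class Similarity:
    """Similarity rule [56] (sketch): stop when the best value is unchanged
    for max_same consecutive generations."""
    def __init__(self, max_same=8):
        self.previous = None
        self.count = 0
        self.max_same = max_same

    def should_stop(self, best):
        self.count = self.count + 1 if best == self.previous else 0
        self.previous = best
        return self.count >= self.max_same
```

In a run, one of the two rules is selected and the main loop exits as soon as rule.should_stop(best) returns True.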
3. Experiments
3.1. Experimental Functions
The following benchmark functions, taken from the relevant literature [57,58], were used in the conducted experiments:
- Bf1 function. The Bohachevsky 1 function is defined as: $f(x) = x_1^2 + 2x_2^2 - \frac{3}{10}\cos\left(3\pi x_1\right) - \frac{4}{10}\cos\left(4\pi x_2\right) + \frac{7}{10}$ with $x \in [-100, 100]^2$.
- Bf2 function. The Bohachevsky 2 function is defined as: $f(x) = x_1^2 + 2x_2^2 - \frac{3}{10}\cos\left(3\pi x_1\right)\cos\left(4\pi x_2\right) + \frac{3}{10}$ with $x \in [-50, 50]^2$.
- Branin function with the following definition: $f(x) = \left(x_2 - \frac{5.1}{4\pi^2}x_1^2 + \frac{5}{\pi}x_1 - 6\right)^2 + 10\left(1 - \frac{1}{8\pi}\right)\cos\left(x_1\right) + 10$ with $-5 \le x_1 \le 10$, $0 \le x_2 \le 15$.
- Camel function, defined as: $f(x) = 4x_1^2 - 2.1x_1^4 + \frac{1}{3}x_1^6 + x_1 x_2 - 4x_2^2 + 4x_2^4$ with $x \in [-5, 5]^2$.
- Easom function, defined as: $f(x) = -\cos\left(x_1\right)\cos\left(x_2\right)\exp\left(-\left(x_1 - \pi\right)^2 - \left(x_2 - \pi\right)^2\right)$ with $x \in [-100, 100]^2$.
- Exponential function, defined as: $f(x) = -\exp\left(-\frac{1}{2}\sum_{i=1}^{n} x_i^2\right)$, $x \in [-1, 1]^n$. The global minimum is located at $x^* = (0, \dots, 0)$ with value $-1$. The cases $n = 4, 8, 16, 32$ were used in the conducted experiments.
- Gkls function. $\mathrm{Gkls}(x, n, w)$ is a function with $w$ local minima and dimension $n$. This function is provided in [59] with $x \in [-1, 1]^n$. The values $n = 2, 3$ and $w = 50$ were used in the conducted experiments.
- Goldstein and Price function: $f(x) = \left[1 + \left(x_1 + x_2 + 1\right)^2\left(19 - 14x_1 + 3x_1^2 - 14x_2 + 6x_1 x_2 + 3x_2^2\right)\right] \times \left[30 + \left(2x_1 - 3x_2\right)^2\left(18 - 32x_1 + 12x_1^2 + 48x_2 - 36x_1 x_2 + 27x_2^2\right)\right]$ with $x \in [-2, 2]^2$.
- Griewank2 function. The function is given by $f(x) = 1 + \frac{1}{200}\sum_{i=1}^{2} x_i^2 - \prod_{i=1}^{2}\cos\left(\frac{x_i}{\sqrt{i}}\right)$ with $x \in [-100, 100]^2$. The global minimum is located at $(0, 0)$ with value 0.
- Griewank10 function. The same function extended to ten dimensions, defined as: $f(x) = 1 + \frac{1}{4000}\sum_{i=1}^{10} x_i^2 - \prod_{i=1}^{10}\cos\left(\frac{x_i}{\sqrt{i}}\right)$ with $x \in [-100, 100]^{10}$.
- Hansen function. $f(x) = \sum_{i=1}^{5} i\cos\left[(i - 1)x_1 + i\right] \sum_{j=1}^{5} j\cos\left[(j + 1)x_2 + j\right]$, $x \in [-10, 10]^2$.
- Hartman 3 function, defined as: $f(x) = -\sum_{i=1}^{4} c_i \exp\left(-\sum_{j=1}^{3} a_{ij}\left(x_j - p_{ij}\right)^2\right)$ with $x \in [0, 1]^3$ and $a = \begin{pmatrix} 3 & 10 & 30 \\ 0.1 & 10 & 35 \\ 3 & 10 & 30 \\ 0.1 & 10 & 35 \end{pmatrix}$, $c = \begin{pmatrix} 1 \\ 1.2 \\ 3 \\ 3.2 \end{pmatrix}$ and $p = \begin{pmatrix} 0.3689 & 0.117 & 0.2673 \\ 0.4699 & 0.4387 & 0.747 \\ 0.1091 & 0.8732 & 0.5547 \\ 0.03815 & 0.5743 & 0.8828 \end{pmatrix}$.
- Hartman 6 function, given by: $f(x) = -\sum_{i=1}^{4} c_i \exp\left(-\sum_{j=1}^{6} a_{ij}\left(x_j - p_{ij}\right)^2\right)$ with $x \in [0, 1]^6$ and $a = \begin{pmatrix} 10 & 3 & 17 & 3.5 & 1.7 & 8 \\ 0.05 & 10 & 17 & 0.1 & 8 & 14 \\ 3 & 3.5 & 1.7 & 10 & 17 & 8 \\ 17 & 8 & 0.05 & 10 & 0.1 & 14 \end{pmatrix}$, $c = \begin{pmatrix} 1 \\ 1.2 \\ 3 \\ 3.2 \end{pmatrix}$ and $p = \begin{pmatrix} 0.1312 & 0.1696 & 0.5569 & 0.0124 & 0.8283 & 0.5886 \\ 0.2329 & 0.4135 & 0.8307 & 0.3736 & 0.1004 & 0.9991 \\ 0.2348 & 0.1451 & 0.3522 & 0.2883 & 0.3047 & 0.6650 \\ 0.4047 & 0.8828 & 0.8732 & 0.5743 & 0.1091 & 0.0381 \end{pmatrix}$.
- Potential function. The well-known Lennard-Jones potential [60] is used as a test function here; for a cluster of $N$ atoms it is defined as: $V_{LJ}(r) = 4\epsilon\left[\left(\frac{\sigma}{r}\right)^{12} - \left(\frac{\sigma}{r}\right)^{6}\right]$. The values $N = 3, 5$ were adopted in the conducted experiments.
- Rastrigin function, defined as: $f(x) = x_1^2 + x_2^2 - \cos\left(18x_1\right) - \cos\left(18x_2\right)$, $x \in [-1, 1]^2$.
- Rosenbrock function. $f(x) = \sum_{i=1}^{n-1}\left[100\left(x_{i+1} - x_i^2\right)^2 + \left(x_i - 1\right)^2\right]$, $x \in [-30, 30]^n$. The values $n = 4, 8, 16$ were used in the provided experiments.
- Shekel 7 function. $f(x) = -\sum_{i=1}^{7}\frac{1}{\left(x - a_i\right)\left(x - a_i\right)^T + c_i}$ with $x \in [0, 10]^4$, $a = \begin{pmatrix} 4 & 4 & 4 & 4 \\ 1 & 1 & 1 & 1 \\ 8 & 8 & 8 & 8 \\ 6 & 6 & 6 & 6 \\ 3 & 7 & 3 & 7 \\ 2 & 9 & 2 & 9 \\ 5 & 5 & 3 & 3 \end{pmatrix}$ and $c = \begin{pmatrix} 0.1 \\ 0.2 \\ 0.2 \\ 0.4 \\ 0.4 \\ 0.6 \\ 0.3 \end{pmatrix}$.
- Shekel 5 function. $f(x) = -\sum_{i=1}^{5}\frac{1}{\left(x - a_i\right)\left(x - a_i\right)^T + c_i}$ with $x \in [0, 10]^4$, $a = \begin{pmatrix} 4 & 4 & 4 & 4 \\ 1 & 1 & 1 & 1 \\ 8 & 8 & 8 & 8 \\ 6 & 6 & 6 & 6 \\ 3 & 7 & 3 & 7 \end{pmatrix}$ and $c = \begin{pmatrix} 0.1 \\ 0.2 \\ 0.2 \\ 0.4 \\ 0.4 \end{pmatrix}$.
- Shekel 10 function. $f(x) = -\sum_{i=1}^{10}\frac{1}{\left(x - a_i\right)\left(x - a_i\right)^T + c_i}$ with $x \in [0, 10]^4$, $a = \begin{pmatrix} 4 & 4 & 4 & 4 \\ 1 & 1 & 1 & 1 \\ 8 & 8 & 8 & 8 \\ 6 & 6 & 6 & 6 \\ 3 & 7 & 3 & 7 \\ 2 & 9 & 2 & 9 \\ 5 & 5 & 3 & 3 \\ 8 & 1 & 8 & 1 \\ 6 & 2 & 6 & 2 \\ 7 & 3.6 & 7 & 3.6 \end{pmatrix}$ and $c = \begin{pmatrix} 0.1 \\ 0.2 \\ 0.2 \\ 0.4 \\ 0.4 \\ 0.6 \\ 0.3 \\ 0.7 \\ 0.5 \\ 0.5 \end{pmatrix}$.
- Sinusoidal function, defined as: $f(x) = -\left(2.5\prod_{i=1}^{n}\sin\left(x_i - z\right) + \prod_{i=1}^{n}\sin\left(5\left(x_i - z\right)\right)\right)$, $0 \le x_i \le \pi$. The values $n = 4, 8, 16$ and $z = \frac{\pi}{6}$ were examined in the conducted experiments.
- Test2N function, defined as: $f(x) = \frac{1}{2}\sum_{i=1}^{n}\left(x_i^4 - 16x_i^2 + 5x_i\right)$, $x \in [-5, 5]^n$. The function has $2^n$ local minima and the values $n = 4, 5, 6, 7$ were used in the conducted experiments.
- Test30N function, defined as: $f(x) = \frac{1}{10}\sin^2\left(3\pi x_1\right)\sum_{i=2}^{n-1}\left[\left(x_i - 1\right)^2\left(1 + \sin^2\left(3\pi x_{i+1}\right)\right)\right] + \left(x_n - 1\right)^2\left(1 + \sin^2\left(2\pi x_n\right)\right)$ with $x \in [-10, 10]^n$. The function has $30^n$ local minima and the values $n = 3, 4$ were used in the conducted experiments.
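As a quick sanity check of the definitions above, a few of the benchmarks translate directly into Python; the function names below are illustrative, and the remaining benchmarks follow the same pattern:

```python
import numpy as np

def bf1(x):
    """Bohachevsky 1, x in [-100, 100]^2; global minimum 0 at (0, 0)."""
    return (x[0] ** 2 + 2.0 * x[1] ** 2
            - 0.3 * np.cos(3.0 * np.pi * x[0])
            - 0.4 * np.cos(4.0 * np.pi * x[1]) + 0.7)

def exponential(x):
    """Exponential function, x in [-1, 1]^n; global minimum -1 at the origin."""
    x = np.asarray(x)
    return -np.exp(-0.5 * np.sum(x ** 2))

def rosenbrock(x):
    """Rosenbrock, x in [-30, 30]^n; global minimum 0 at (1, ..., 1)."""
    x = np.asarray(x)
    return np.sum(100.0 * (x[1:] - x[:-1] ** 2) ** 2 + (x[:-1] - 1.0) ** 2)
```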
3.2. Experimental Results
The following notation is used in the tables of experimental results:
1. The column PROBLEM denotes the objective problem.
2. The column GENETIC denotes the average number of function calls required by the Genetic Algorithm. The same number of armadillos, chromosomes, and particles was used in the conducted experiments in order to obtain a fair comparison between the algorithms. Likewise, the same maximum number of generations and the same stopping criteria were used across the different optimization methods.
3. The column PSO stands for the application of a Particle Swarm Optimization method to the objective problem. The number of particles and the stopping rule of the PSO method are the same as in the proposed method.
4. The column PROPOSED represents the experimental results for the Giant Armadillo Optimization method with the suggested modifications.
5. The final row, denoted AVERAGE, stands for the average results over all the objective functions used. A number in parentheses next to an entry denotes the fraction of independent runs in which the global minimum was located; where no fraction appears, the global minimum was found in every run.
4. Conclusions
Author Contributions
Funding
Institutional Review Board Statement
Informed Consent Statement
Data Availability Statement
References
1. Rothlauf, F. Optimization problems. In Design of Modern Heuristics: Principles and Application; Springer, 2011; pp. 7–44.
2. Horst, R.; Pardalos, P.M.; Van Thoai, N. Introduction to Global Optimization; Springer Science & Business Media, 2000.
3. Weise, T. Global Optimization Algorithms: Theory and Application; self-published, 2009.
4. Oyelade, O.N.; Ezugwu, A.E. Ebola Optimization Search Algorithm: A new nature-inspired metaheuristic algorithm for global optimization problems. In Proceedings of the 2021 International Conference on Electrical, Computer and Energy Technologies (ICECET); IEEE, 2021; pp. 1–10.
5. Deb, K.; Sindhya, K.; Hakanen, J. Multi-objective optimization. In Decision Sciences; CRC Press, 2016; pp. 161–200.
6. Liberti, L.; Kucherenko, S. Comparison of deterministic and stochastic approaches to global optimization. International Transactions in Operational Research 2005, 12, 263–285.
7. Casado, L.G.; García, I.; Csendes, T. A new multisection technique in interval methods for global optimization. Computing 2000, 65, 263–269.
8. Zhang, X.; Liu, S. Interval algorithm for global numerical optimization. Engineering Optimization 2008, 40, 849–868.
9. Price, W. Global optimization by controlled random search. Journal of Optimization Theory and Applications 1983, 40, 333–348.
10. Křivý, I.; Tvrdík, J. The controlled random search algorithm in optimizing regression models. Computational Statistics & Data Analysis 1995, 20, 229–234.
11. Ali, M.M.; Törn, A.; Viitanen, S. A numerical comparison of some modified controlled random search algorithms. Journal of Global Optimization 1997, 11, 377–385.
12. Aarts, E.; Korst, J.; Michiels, W. Simulated annealing. In Search Methodologies: Introductory Tutorials in Optimization and Decision Support Techniques; Springer, 2005; pp. 187–210.
13. Nikolaev, A.G.; Jacobson, S.H. Simulated annealing. In Handbook of Metaheuristics; Springer, 2010; pp. 1–39.
14. Rinnooy Kan, A.; Timmer, G. Stochastic global optimization methods part II: Multi level methods. Mathematical Programming 1987, 39, 57–78.
15. Ali, M.M.; Storey, C. Topographical multilevel single linkage. Journal of Global Optimization 1994, 5, 349–358.
16. Tsoulos, I.G.; Lagaris, I.E. MinFinder: Locating all the local minima of a function. Computer Physics Communications 2006, 174, 166–179.
17. Pardalos, P.M.; Romeijn, H.E.; Tuy, H. Recent developments and trends in global optimization. Journal of Computational and Applied Mathematics 2000, 124, 209–228.
18. Fouskakis, D.; Draper, D. Stochastic optimization: a review. International Statistical Review 2002, 70, 315–349.
19. Rocki, K.; Suda, R. An efficient GPU implementation of a multi-start TSP solver for large problem instances. In Proceedings of the 14th Annual Conference Companion on Genetic and Evolutionary Computation, 2012; pp. 1441–1442.
20. Van Luong, T.; Melab, N.; Talbi, E.G. GPU-based multi-start local search algorithms. In Learning and Intelligent Optimization: 5th International Conference, LION 5, Rome, Italy, 17–21 January 2011, Selected Papers; Springer, 2011; pp. 321–335.
21. Bartz-Beielstein, T.; Branke, J.; Mehnen, J.; Mersmann, O. Evolutionary algorithms. Wiley Interdisciplinary Reviews: Data Mining and Knowledge Discovery 2014, 4, 178–195.
22. Simon, D. Evolutionary Optimization Algorithms; John Wiley & Sons, 2013.
23. Blum, C. Ant colony optimization: Introduction and recent trends. Physics of Life Reviews 2005, 2, 353–373.
24. Dorigo, M.; Blum, C. Ant colony optimization theory: A survey. Theoretical Computer Science 2005, 344, 243–278.
25. Haldurai, L.; Madhubala, T.; Rajalakshmi, R. A study on genetic algorithm and its applications. International Journal of Computer Sciences and Engineering 2016, 4, 139.
26. Jamwal, P.K.; Abdikenov, B.; Hussain, S. Evolutionary optimization using equitable fuzzy sorting genetic algorithm (EFSGA). IEEE Access 2019, 7, 8111–8126.
27. Wang, Z.; Sobey, A. A comparative review between Genetic Algorithm use in composite optimisation and the state-of-the-art in evolutionary computation. Composite Structures 2020, 233, 111739.
28. Eberhart, R.; Kennedy, J. Particle swarm optimization. In Proceedings of the IEEE International Conference on Neural Networks, 1995; Vol. 4, pp. 1942–1948.
29. Wang, D.; Tan, D.; Liu, L. Particle swarm optimization algorithm: an overview. Soft Computing 2018, 22, 387–408.
30. Price, K.V. Differential evolution. In Handbook of Optimization: From Classical to Modern Approach; Springer, 2013; pp. 187–214.
31. Pant, M.; Zaheer, H.; Garcia-Hernandez, L.; Abraham, A.; et al. Differential Evolution: A review of more than two decades of research. Engineering Applications of Artificial Intelligence 2020, 90, 103479.
32. Asselmeyer, T.; Ebeling, W.; Rosé, H. Evolutionary strategies of optimization. Physical Review E 1997, 56, 1171.
33. Arnold, D.V. Noisy Optimization with Evolution Strategies; Vol. 8; Springer Science & Business Media, 2002.
34. Yao, X.; Liu, Y.; Lin, G. Evolutionary programming made faster. IEEE Transactions on Evolutionary Computation 1999, 3, 82–102.
35. Stephenson, M.; O'Reilly, U.M.; Martin, M.C.; Amarasinghe, S. Genetic programming applied to compiler heuristic optimization. In Proceedings of the European Conference on Genetic Programming; Springer, 2003; pp. 238–253.
36. Banga, J.R. Optimization in computational systems biology. BMC Systems Biology 2008, 2, 1–7.
37. Beites, T.; Mendes, M.V. Chassis optimization as a cornerstone for the application of synthetic biology based strategies in microbial secondary metabolism. Frontiers in Microbiology 2015, 6, 159095.
38. Hartmann, A.K.; Rieger, H. Optimization Algorithms in Physics; Wiley-VCH, 2002.
39. Hanuka, A.; Huang, X.; Shtalenkova, J.; Kennedy, D.; Edelen, A.; Zhang, Z.; Lalchand, V.; Ratner, D.; Duris, J. Physics model-informed Gaussian process for online optimization of particle accelerators. Physical Review Accelerators and Beams 2021, 24, 072802.
40. Ferreira, S.L.; Lemos, V.A.; de Carvalho, V.S.; da Silva, E.G.; Queiroz, A.F.; Felix, C.S.; da Silva, D.L.; Dourado, G.B.; Oliveira, R.V. Multivariate optimization techniques in analytical chemistry: an overview. Microchemical Journal 2018, 140, 176–182.
41. Bechikh, S.; Chaabani, A.; Said, L.B. An efficient chemical reaction optimization algorithm for multiobjective optimization. IEEE Transactions on Cybernetics 2014, 45, 2051–2064.
42. Filip, M.; Zoubek, T.; Bumbalek, R.; Cerny, P.; Batista, C.E.; Olsan, P.; Bartos, P.; Kriz, P.; Xiao, M.; Dolan, A.; et al. Advanced computational methods for agriculture machinery movement optimization with applications in sugarcane production. Agriculture 2020, 10, 434.
43. Zhang, D.; Guo, P. Integrated agriculture water management optimization model for water saving potential analysis. Agricultural Water Management 2016, 170, 5–19.
44. Intriligator, M.D. Mathematical Optimization and Economic Theory; SIAM, 2002.
45. Dixit, A.K. Optimization in Economic Theory; Oxford University Press, USA, 1990.
46. Alsayyed, O.; Hamadneh, T.; Al-Tarawneh, H.; Alqudah, M.; Gochhait, S.; Leonova, I.; Malik, O.P.; Dehghani, M. Giant Armadillo Optimization: A new bio-inspired metaheuristic algorithm for solving optimization problems. Biomimetics 2023, 8, 619.
47. Desbiez, A.; Kluyber, D.; Massocato, G.; Attias, N. Methods for the characterization of activity patterns in elusive species: the giant armadillo in the Brazilian Pantanal. Journal of Zoology 2021, 315, 301–312.
48. Owaid, S.R.; Zhuravskyi, Y.; Lytvynenko, O.; Veretnov, A.; Sokolovskyi, D.; Plekhova, G.; Hrinkov, V.; Pluhina, T.; Neronov, S.; Dovbenko, O. Development of a method of increasing the efficiency of decision-making in organizational and technical systems. Eastern-European Journal of Enterprise Technologies 2024.
49. Basheer, I.A.; Hajmeer, M. Artificial neural networks: fundamentals, computing, design, and application. Journal of Microbiological Methods 2000, 43, 3–31.
50. Zou, J.; Han, Y.; So, S.S. Overview of artificial neural networks. In Artificial Neural Networks: Methods and Applications; 2009; pp. 14–22.
51. Fletcher, R. A new approach to variable metric algorithms. The Computer Journal 1970, 13, 317–322.
52. Yuan, Y.X. A new stepsize for the steepest descent method. Journal of Computational Mathematics 2006, 24, 149–156.
53. Zhu, C.; Byrd, R.H.; Lu, P.; Nocedal, J. Algorithm 778: L-BFGS-B: Fortran subroutines for large-scale bound-constrained optimization. ACM Transactions on Mathematical Software (TOMS) 1997, 23, 550–560.
54. Powell, M. A tolerant algorithm for linearly constrained optimization calculations. Mathematical Programming 1989, 45, 547–566.
55. Tsoulos, I.G. Modifications of real code genetic algorithm for global optimization. Applied Mathematics and Computation 2008, 203, 598–607.
56. Charilogis, V.; Tsoulos, I.G. Toward an ideal particle swarm optimizer for multidimensional functions. Information 2022, 13.
57. Ali, M.M.; Khompatraporn, C.; Zabinsky, Z.B. A numerical evaluation of several stochastic algorithms on selected continuous global optimization test problems. Journal of Global Optimization 2005, 31, 635–672.
58. Floudas, C.A.; Pardalos, P.M.; Adjiman, C.; Esposito, W.R.; Gümüs, Z.H.; Harding, S.T.; Klepeis, J.L.; Meyer, C.A.; Schweiger, C.A. Handbook of Test Problems in Local and Global Optimization; Vol. 33; Springer Science & Business Media, 2013.
59. Gaviano, M.; Kvasov, D.E.; Lera, D.; Sergeyev, Y.D. Algorithm 829: Software for generation of classes of test functions with known local and global minima for global optimization. ACM Transactions on Mathematical Software (TOMS) 2003, 29, 469–480.
60. Jones, J.E. On the determination of molecular fields. II. From the equation of state of a gas. Proceedings of the Royal Society of London, Series A 1924, 106, 463–477.
61. Gropp, W.; Lusk, E.; Doss, N.; Skjellum, A. A high-performance, portable implementation of the MPI message passing interface standard. Parallel Computing 1996, 22, 789–828.
62. Chandra, R. Parallel Programming in OpenMP; Morgan Kaufmann, 2001.
63. Li, C.C.; Lin, C.H.; Liu, J.C. Parallel genetic algorithms on the graphics processing units using island model and simulated annealing. Advances in Mechanical Engineering 2017, 9, 1687814017707413.
64. da Silveira, L.A.; Soncco-Álvarez, J.L.; de Lima, T.A.; Ayala-Rincón, M. Parallel island model genetic algorithms applied in NP-hard problems. In Proceedings of the 2019 IEEE Congress on Evolutionary Computation (CEC); IEEE, 2019; pp. 3262–3269.


| PARAMETER | MEANING | VALUE |
|---|---|---|
| $m$ | Number of armadillos or chromosomes | 100 |
| $\mathrm{iter}_{\max}$ | Maximum number of allowed generations | 200 |
| $p_l$ | Local search rate | 0.05 |
| $p_s$ | Selection rate in genetic algorithm | 0.10 |
| $p_m$ | Mutation rate in genetic algorithm | 0.05 |
| PROBLEM | Genetic | PSO | PROPOSED |
|---|---|---|---|
| BF1 | 2179 | 2364(0.97) | 2239 |
| BF2 | 1944 | 2269(0.90) | 1864 |
| BRANIN | 1177 | 2088 | 1179 |
| CAMEL | 1401 | 2278 | 1450 |
| EASOM | 979 | 2172 | 886 |
| EXP4 | 1474 | 2231 | 1499 |
| EXP8 | 1551 | 2256 | 1539 |
| EXP16 | 1638 | 2165 | 1581 |
| EXP32 | 1704 | 2106 | 1567 |
| GKLS250 | 1195 | 2113 | 1292 |
| GKLS350 | 1396 (0.87) | 1968 | 1510 |
| GOLDSTEIN | 1878 | 2497 | 1953 |
| GRIEWANK2 | 2360 (0.87) | 3027(0.97) | 2657 |
| GRIEWANK10 | 3474(0.87) | 3117(0.87) | 4064 (0.97) |
| HANSEN | 1761 (0.97) | 2780 | 1885 |
| HARTMAN3 | 1404 | 2086 | 1448 |
| HARTMAN6 | 1632 | 2213(0.87) | 1815 |
| POTENTIAL3 | 2127 | 3557 | 1942 |
| POTENTIAL5 | 3919 | 7132 | 3722 |
| RASTRIGIN | 2438(0.97) | 2754 | 2411 |
| ROSENBROCK4 | 1841 | 2909 | 2690 |
| ROSENBROCK8 | 2570 | 3382 | 3573 |
| ROSENBROCK16 | 4331 | 3780 | 5085 |
| SHEKEL5 | 1669(0.97) | 2700 | 1911 |
| SHEKEL7 | 1696 | 2612 | 1930 |
| SHEKEL10 | 1758 | 2594 | 1952 |
| TEST2N4 | 1787(0.97) | 2285 | 1840(0.83) |
| TEST2N5 | 2052(0.93) | 2368(0.97) | 2029(0.63) |
| TEST2N6 | 2216(0.73) | 2330(0.73) | 2438(0.80) |
| TEST2N7 | 2520 (0.73) | 2378(0.63) | 2567(0.60) |
| SINU4 | 1514 | 2577 | 1712 |
| SINU8 | 1697 | 2527 | 1992 |
| SINU16 | 2279 (0.97) | 2657 | 2557 |
| TEST30N3 | 1495 | 3302 | 1749 |
| TEST30N4 | 1897 | 3817 | 2344 |
| AVERAGE | 68953(0.97) | 95391(0.97) | 74982(0.97) |
| PROBLEM | Similarity | Doublebox |
|---|---|---|
| BF1 | 2239 | 2604 |
| BF2 | 1974 | 1864 |
| BRANIN | 1179 | 1179 |
| CAMEL | 1450 | 1245 |
| EASOM | 886 | 775 |
| EXP4 | 1499 | 1332 |
| EXP8 | 1539 | 1371 |
| EXP16 | 1581 | 1388 |
| EXP32 | 1567 | 1384 |
| GKLS250 | 1292 | 1483 |
| GKLS350 | 1510 | 2429 |
| GOLDSTEIN | 1953 | 2019 |
| GRIEWANK2 | 2657 | 5426 |
| GRIEWANK10 | 4064(0.97) | 4940 (0.97) |
| HANSEN | 1885 | 4482 |
| HARTMAN3 | 1448 | 1458 |
| HARTMAN6 | 1815 | 1625 |
| POTENTIAL3 | 1942 | 1700 |
| POTENTIAL5 | 3722 | 3395 |
| RASTRIGIN | 2411 | 4591 |
| ROSENBROCK4 | 2690 | 2371 |
| ROSENBROCK8 | 3573 | 3166 |
| ROSENBROCK16 | 5085 | 4386 |
| SHEKEL5 | 1911 | 1712 |
| SHEKEL7 | 1930 | 1722 |
| SHEKEL10 | 1952 | 1956 |
| TEST2N4 | 1840(0.83) | 3103(0.83) |
| TEST2N5 | 2029(0.63) | 3375(0.67) |
| TEST2N6 | 2438(0.80) | 4458(0.83) |
| TEST2N7 | 2567(0.60) | 4425(0.63) |
| SINU4 | 1712 | 1657 |
| SINU8 | 1992 | 1874 |
| SINU16 | 2557 | 2612 |
| TEST30N3 | 1749 | 1483 |
| TEST30N4 | 2344 | 2737 |
| AVERAGE | 74982(0.97) | 87727(0.97) |
| PROBLEM |  |  |  |
|---|---|---|---|
| BF1 | 1531 (0.97) | 1559 | 2239 |
| BF2 | 1457 (0.97) | 1319 | 1864 |
| BRANIN | 921 | 913 | 1179 |
| CAMEL | 1037 | 1022 | 1450 |
| EASOM | 871 | 850 | 886 |
| EXP4 | 942 | 926 | 1499 |
| EXP8 | 930 | 936 | 1539 |
| EXP16 | 1020 | 961 | 1581 |
| EXP32 | 1005 | 982 | 1567 |
| GKLS250 | 1197 | 1106 | 1292 |
| GKLS350 | 1256 | 1221 | 1510 |
| GOLDSTEIN | 1124 | 1146 | 1953 |
| GRIEWANK2 | 1900 (0.93) | 1976(0.97) | 2657 |
| GRIEWANK10 | 1444(0.40) | 1963(0.70) | 4064 (0.97) |
| HANSEN | 1872 | 1726(0.93) | 1885 |
| HARTMAN3 | 1005 | 967 | 1448 |
| HARTMAN6 | 976(0.87) | 1052(0.97) | 1815 |
| POTENTIAL3 | 1018 | 1081 | 1942 |
| POTENTIAL5 | 1313 | 1439 | 3722 |
| RASTRIGIN | 1614(0.97) | 1687(0.97) | 2411 |
| ROSENBROCK4 | 1097 | 1203 | 2690 |
| ROSENBROCK8 | 1179 | 1403 | 3573 |
| ROSENBROCK16 | 1437 | 1801 | 5085 |
| SHEKEL5 | 1070(0.97) | 1073 | 1911 |
| SHEKEL7 | 1076(0.93) | 1124 | 1930 |
| SHEKEL10 | 1152(0.97) | 1170(0.97) | 1952 |
| TEST2N4 | 1409(0.80) | 1285(0.87) | 1840(0.83) |
| TEST2N5 | 1451(0.53) | 1350(0.63) | 2029(0.63) |
| TEST2N6 | 1417(0.60) | 1529(0.67) | 2438(0.80) |
| TEST2N7 | 1500 (0.47) | 1451(0.33) | 2567(0.60) |
| SINU4 | 1210 | 1199 | 1712 |
| SINU8 | 1163 | 1145 | 1992 |
| SINU16 | 1377 | 1296 | 2557 |
| TEST30N3 | 1057 | 1189 | 1749 |
| TEST30N4 | 1897 | 3817 | 2344 |
| AVERAGE | 43213(0.92) | 44331(0.94) | 74982(0.97) |
