Submitted: 10 July 2025
Posted: 11 July 2025
Abstract
Keywords:
1. Introduction
2. Brief Review of the Mathematics of Coincidences
2.1. Index of Coincidence
2.2. The Birthday Problem
3. Kolmogorov Complexity and Algorithmic Probability
3.1. Kolmogorov Complexity
3.2. Algorithmic Probability
4. Algorithmic Probability and Index of Coincidence
4.1. IC for Algorithmic Probability
4.2. Curious Coincidences and Uncomputability
4.3. Real-World Data Examples
5. Simplicity Bias
5.1. Upper Bound
- (3) Tooth shapes from intricate and realistic computational models of dental morphology [51].
- (4) Plant shapes in computational simulations of plant growth [48].
- (5) Output binary strings from finite state transducers [52].
- (8) Natural time series data [56], in particular digitised up-down series patterns.
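The maps listed above share the simplicity-bias signature: outputs that occur with high probability have low descriptive complexity. As an illustration only (not from the source), the sketch below samples digitised up-down patterns from random logistic-map trajectories, in the spirit of item (8) and the random logistic map study of Hamzi and Dingle, using zlib compressed length as a crude, computable stand-in for Kolmogorov complexity. The parameter ranges and sample sizes are arbitrary choices for the demonstration.

```python
import random
import zlib
from collections import Counter

def updown_pattern(n=12, r_low=3.7, r_high=4.0):
    """Digitise one random logistic-map trajectory into an up-down (1/0) string.

    A growth rate r is drawn uniformly from a chaotic band of the logistic
    map x -> r*x*(1-x); each step is coded '1' if the trajectory went up,
    '0' otherwise. (Illustrative setup, not the source's exact protocol.)
    """
    r = random.uniform(r_low, r_high)
    x = random.random()
    series = []
    for _ in range(n + 1):
        x = r * x * (1 - x)
        series.append(x)
    return ''.join('1' if b > a else '0' for a, b in zip(series, series[1:]))

def complexity(s):
    """Crude descriptive-complexity proxy: zlib compressed length in bytes."""
    return len(zlib.compress(s.encode()))

random.seed(0)
counts = Counter(updown_pattern() for _ in range(20000))

# Simplicity bias predicts the frequent patterns compress well,
# while the rarest patterns tend to be irregular.
most_common, freq = counts.most_common(1)[0]
rare = min(counts, key=counts.get)
print(f"most frequent:  {most_common}  count={freq}  zlib={complexity(most_common)}")
print(f"least frequent: {rare}  count={counts[rare]}  zlib={complexity(rare)}")
```

Compressed length is only a rough proxy at these short string lengths (the zlib header and checksum add a constant overhead), but the qualitative picture, periodic-looking up-down patterns occurring far more often than irregular ones of the same length, is the phenomenon the section's examples document.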
5.2. IC and Simplicity Bias
5.3. Convergent Evolution
6. Coincidences in Time Series
6.1. Time Series Patterns
6.2. What’s Next?
6.3. Solomonoff Induction
6.4. Coincidence in Time Series Anomalies
6.5. Near Matches
7. Discussion
Acknowledgments
References
- Jung, C.G.; Hull, R. Synchronicity: An Acausal Connecting Principle (from Vol. 8 of the Collected Works of C. G. Jung), revised ed.; Princeton University Press, 1960.
- Diaconis, P.; Mosteller, F. Methods for studying coincidences. Journal of the American Statistical Association 1989, 84, 853–861.
- Friedman, W.F. The index of coincidence and its applications in cryptography; Aegean Park Press, 1922.
- Hofert, M. Random number generators produce collisions: Why, how many and more. The American Statistician 2021, 75, 394–402.
- Pollanen, M. A Double Birthday Paradox in the Study of Coincidences. Mathematics 2024, 12, 3882.
- Johansen, M.K.; Osman, M. Coincidences: A fundamental consequence of rational cognition. New Ideas in Psychology 2015, 39, 34–44.
- Hand, D.J. The Improbability Principle: Why Coincidences, Miracles, and Rare Events Happen Every Day; Macmillan, 2014.
- Fisher, R. The design of experiments; Springer, 1971.
- Graham, R.L.; Rothschild, B.L.; Spencer, J.H. Ramsey theory; John Wiley & Sons, 1991.
- Li, M.; Vitanyi, P. An introduction to Kolmogorov complexity and its applications; Springer-Verlag New York Inc, 2008.
- Cover, T.; Thomas, J. Elements of information theory; John Wiley and Sons, 2006.
- MacKay, D.J. Information theory, inference and learning algorithms; Cambridge University Press, 2003.
- Holst, L. The general birthday problem. Random Structures & Algorithms 1995, 6, 201–208.
- Henze, N. A Poisson limit law for a generalized birthday problem. Statistics & Probability Letters 1998, 39, 333–336.
- Camarri, M.; Pitman, J. Limit distributions and random trees derived from the birthday problem with unequal probabilities. Electronic Journal of Probability 2000, 5.
- Zhou, Q. Birth, Death, Coincidences and Occupancies: Solutions and Applications of Generalized Birthday and Occupancy Problems. Methodology and Computing in Applied Probability 2023, 25, 53.
- Mase, S. Approximations to the birthday problem with unequal occurrence probabilities and their application to the surname problem in Japan. Annals of the Institute of Statistical Mathematics 1992, 44, 479–499.
- Mezard, M.; Montanari, A. Information, physics, and computation; Oxford University Press, USA, 2009.
- Solomonoff, R.J. A Preliminary Report on a General Theory of Inductive Inference (Revision of Report V-131); Report ZTB-138, Contract AF 49(639)-376; Zator Company, 1960.
- Kolmogorov, A. Three approaches to the quantitative definition of information. Problems of Information Transmission 1965, 1, 1–7.
- Chaitin, G.J. A theory of program size formally identical to information theory. Journal of the ACM (JACM) 1975, 22, 329–340.
- Bédard, C.A. Lecture Notes on Algorithmic Information Theory. arXiv preprint arXiv:2504.18568 2025.
- Turing, A.M. On computable numbers, with an application to the Entscheidungsproblem. Proceedings of the London Mathematical Society 1936, s2-42, 230–265.
- Vitányi, P.M. Similarity and denoising. Philosophical Transactions of the Royal Society A: Mathematical, Physical and Engineering Sciences 2013, 371, 20120091.
- Vitányi, P. How incomputable is Kolmogorov complexity? Entropy 2020, 22, 408.
- Veness, J.; Ng, K.S.; Hutter, M.; Uther, W.; Silver, D. A Monte-Carlo AIXI approximation. Journal of Artificial Intelligence Research 2011, 40, 95–142.
- Delahaye, J.; Zenil, H. Numerical evaluation of algorithmic complexity for short strings: A glance into the innermost structure of algorithmic randomness. Applied Mathematics and Computation 2012, 219, 63–77.
- Soler-Toscano, F.; Zenil, H.; Delahaye, J.P.; Gauvrit, N. Calculating Kolmogorov complexity from the output frequency distributions of small Turing machines. PLoS ONE 2014, 9, e96223.
- Bennett, C. The thermodynamics of computation – a review. International Journal of Theoretical Physics 1982, 21, 905–940.
- Kolchinsky, A.; Wolpert, D.H. Thermodynamic costs of Turing machines. Physical Review Research 2020, 2, 033312.
- Zurek, W. Algorithmic randomness and physical entropy. Physical Review A 1989, 40, 4731.
- Avinery, R.; Kornreich, M.; Beck, R. Universal and accessible entropy estimation using a compression algorithm. Physical Review Letters 2019, 123, 178102.
- Martiniani, S.; Chaikin, P.M.; Levine, D. Quantifying hidden order out of equilibrium. Physical Review X 2019, 9, 011031.
- Ferragina, P.; Giancarlo, R.; Greco, V.; Manzini, G.; Valiente, G. Compression-based classification of biological sequences and structures via the Universal Similarity Metric: experimental assessment. BMC Bioinformatics 2007, 8, 252.
- Johnston, I.G.; Dingle, K.; Greenbury, S.F.; Camargo, C.Q.; Doye, J.P.; Ahnert, S.E.; Louis, A.A. Symmetry and simplicity spontaneously emerge from the algorithmic nature of evolution. Proceedings of the National Academy of Sciences 2022, 119, e2113883119.
- Adams, A.; Zenil, H.; Davies, P.C.; Walker, S.I. Formal definitions of unbounded evolution and innovation reveal universal mechanisms for open-ended evolution in dynamical systems. Scientific reports 2017, 7, 1–15.
- Devine, S.D. Algorithmic Information Theory for Physicists and Natural Scientists; IOP Publishing, 2020.
- Cilibrasi, R.; Vitányi, P. Clustering by compression. IEEE Transactions on Information Theory 2005, 51, 1523–1545.
- Hutter, M. On universal prediction and Bayesian confirmation. Theoretical Computer Science 2007, 384, 33–48.
- Levin, L. Laws of information conservation (nongrowth) and aspects of the foundation of probability theory. Problemy Peredachi Informatsii 1974, 10, 30–35.
- Zenil, H.; Badillo, L.; Hernández-Orozco, S.; Hernández-Quiroz, F. Coding-theorem like behaviour and emergence of the universal distribution from resource-bounded algorithmic probability. International Journal of Parallel, Emergent and Distributed Systems 2019, 34, 161–180.
- Kirchherr, W.; Li, M.; Vitányi, P. The miraculous universal distribution. The Mathematical Intelligencer 1997, 19, 7–15.
- Cilibrasi, R.; Vitanyi, P. Automatic meaning discovery using Google. Schloss Dagstuhl–Leibniz-Zentrum für Informatik, 2006.
- Fink, T.M. Recursively divisible numbers. Journal of Number Theory 2024, 256, 37–54.
- Berger, A.; Hill, T.P. An introduction to Benford’s law; Princeton University Press, 2015.
- Wolfram, S. A New Kind of Science; Wolfram Media, 2002.
- Alaskandarani, M.; Dingle, K. Low complexity, low probability patterns and consequences for algorithmic probability applications. Complexity 2023, 2023, 9696075.
- Dingle, K.; Camargo, C.Q.; Louis, A.A. Input–output maps are strongly biased towards simple outputs. Nature Communications 2018, 9, 761.
- Lempel, A.; Ziv, J. On the complexity of finite sequences. IEEE Transactions on Information Theory 1976, 22, 75–81.
- Dingle, K.; Batlle, P.; Owhadi, H. Multiclass classification utilising an estimated algorithmic probability prior. Physica D: Nonlinear Phenomena 2023, 448, 133713.
- Dingle, K.; Hagolani, P.; Zimm, R.; Umar, M.; O’Sullivan, S.; Louis, A.A. Bounding phenotype transition probabilities via conditional complexity. bioRxiv 2024.
- Dingle, K.; Pérez, G.V.; Louis, A.A. Generic predictions of output probability based on complexities of inputs and outputs. Scientific reports 2020, 10, 1–9.
- Valle-Pérez, G.; Camargo, C.Q.; Louis, A.A. Deep learning generalizes because the parameter-function map is biased towards simple functions. arXiv preprint arXiv:1805.08522 2018.
- Mingard, C.; Skalse, J.; Valle-Pérez, G.; Martínez-Rubio, D.; Mikulik, V.; Louis, A.A. Neural networks are a priori biased towards Boolean functions with low entropy. arXiv preprint arXiv:1909.11522 2019.
- Mingard, C.; Rees, H.; Valle-Pérez, G.; Louis, A.A. Deep neural networks have an inbuilt Occam’s razor. Nature Communications 2025, 16, 220.
- Dingle, K.; Kamal, R.; Hamzi, B. A note on a priori forecasting and simplicity bias in time series. Physica A: Statistical Mechanics and its Applications 2023, 609, 128339.
- Dingle, K.; Alaskandarani, M.; Hamzi, B.; Louis, A.A. Exploring simplicity bias in 1D dynamical systems. Entropy 2024, 26, 426.
- Hamzi, B.; Dingle, K. Simplicity bias, algorithmic probability, and the random logistic map. Physica D: Nonlinear Phenomena 2024, 463, 134160.
- Conway Morris, S. Evolution: like any other science it is predictable. Philosophical Transactions of the Royal Society B: Biological Sciences 2010, 365, 133.
- McGhee, G.R. Convergent evolution: limited forms most beautiful; MIT Press, 2011.
- Conway Morris, S. Life’s solution: inevitable humans in a lonely universe; Cambridge University Press, 2003.
- Conway Morris, S. The deep structure of biology: is convergence sufficiently ubiquitous to give a directional signal; Number 45, Templeton Foundation Press, 2008.
- Dingle, K.; Schaper, S.; Louis, A.A. The structure of the genotype–phenotype map strongly constrains the evolution of non-coding RNA. Interface focus 2015, 5, 20150053.
- Louis, A.A. Contingency, convergence and hyper-astronomical numbers in biological evolution. Studies in History and Philosophy of Science Part C: Studies in History and Philosophy of Biological and Biomedical Sciences 2016, 58, 107–116.
- Wright, E.S. Tandem repeats provide evidence for convergent evolution to similar protein structures. Genome Biology and Evolution 2025, 17, evaf013.
- Chatfield, C. The analysis of time series: theory and practice; Springer, 2013.
- Zenil, H.; Delahaye, J. An algorithmic information theoretic approach to the behaviour of financial markets. Journal of Economic Surveys 2011, 25, 431–463.
- Rathmanner, S.; Hutter, M. A philosophical treatise of universal induction. Entropy 2011, 13, 1076–1136.
- Merhav, N.; Feder, M. Universal prediction. IEEE Transactions on Information Theory 1998, 44, 2124–2147.
- Willems, F.M.; Shtarkov, Y.M.; Tjalkens, T.J. The context-tree weighting method: Basic properties. IEEE Transactions on Information Theory 1995, 41, 653–664.
- Hutter, M.; Legg, S.; Vitanyi, P.M. Algorithmic probability. Scholarpedia 2007, 2, 2572.
- Solomonoff, R.J. A formal theory of inductive inference. Part I. Information and control 1964, 7, 1–22.
- Hutter, M. Universal artificial intelligence: Sequential decisions based on algorithmic probability; Springer Science & Business Media, 2004.
- Ryabko, B.; Astola, J.; Malyutov, M. Compression-based methods of statistical analysis and prediction of time series; Springer, 2016.
- Blázquez-García, A.; Conde, A.; Mori, U.; Lozano, J.A. A review on outlier/anomaly detection in time series data. ACM Computing Surveys 2021, 54, 1–33.
- Dasgupta, A.; Li, B. Detection and analysis of spikes in a random sequence. Methodology and Computing in Applied Probability 2018, 20, 1429–1451.
- Horváth, L.; Rice, G. Change Point Analysis for Time Series; Springer, 2024.
- Ishkuvatov, R.; Musatov, D. On approximate uncomputability of the Kolmogorov complexity function. In Proceedings of the Computing with Foresight and Industry: 15th Conference on Computability in Europe, CiE 2019, Durham, UK, July 15–19, 2019, Proceedings 15. Springer, 2019, pp. 230–239.
- Ishkuvatov, R.; Musatov, D.; Shen, A. Approximating Kolmogorov complexity. Computability 2023, 12, 283–297.
- Dingle, K. Optima and simplicity in nature. arXiv preprint arXiv:2210.02564 2022.
- Dingle, K. Fitness, optima, and simplicity. Preprints 2022, p. 2022080402.
- Wyeth, C.; Bu, D.; Yu, Q.; Gao, W.; Liu, X.; Li, M. Lossless data compression by large models. Nature Machine Intelligence 2025.
Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content.
© 2025 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (http://creativecommons.org/licenses/by/4.0/).
