Introduction
Most academics have heard the claim that the works of Shakespeare could be replicated by randomly typing monkeys provided sufficient time [1]. The idea that even the most unlikely events become inevitable with infinite time gives credence to this claim. But if one wants to deal with statistical realities, how much of infinity would it likely require?
According to the calculations of James Goetz, at a constant rate of 70 words per minute it would require approximately 10^942 years of random typing to generate a single sonnet (560 letter-spaces) [2], much less the entire works of Shakespeare. Notably, the expected age of the universe at heat death is 10^106 years. This means that even if we were able to employ random typists from another 10^836 multiverses of similar lifespans, not a single sonnet would be produced. Moreover, as noted by Goetz, the signal-to-noise ratio would make it impossible to find the sonnet. Plus, even if "10^100 letter spaces were typed on every neutron and proton in the universe, M [the monkey] would fall short of typing 10^950 letter-spaces by 770 orders of magnitude" [2].
These observations point to the fundamental problem raised by appeals to infinity as a tenet for claiming that even the most unlikely of events is "possible," given enough time and/or universes. In my view, these abstract "possibilities" are so untethered from reality that they undermine, rather than enhance, productive discourse and evidence-based judgments.
The reality is, we live on a planet in a universe, both of which have limited time and resources. So, my thesis is simply this:
To better inform rational judgments, we need a statistically validated and evidence-based convention for describing computed odds, one that indicates whether an event is likely, possible, improbable, impossible, or absurdly impossible.
My recommendation is that, just as we have a convention for describing findings as "statistically significant" when the P-value is less than a pre-determined significance level (alpha), we should have a similar convention for describing the odds of an occurrence as "statistically impossible within the domain" of either:
A human lifetime
The history of all possible human events
The square of the history of all possible human events
All possible earthly events
The square of all possible earthly events
The universe of all quantum events
The square of the universe of all quantum events
Or some special case domain carefully defined by the investigator.
With the exception of the latter, each of these domains is so conservatively defined that the conclusion that an event is "impossible" or even "absurdly impossible" relative to these domains can be accepted with confidence as an evidence-based fact. Therefore, the observation of such an "impossible" event should be interpreted as evidence that the event was not due to random chance.
Definitions
The Universe of All Quantum Events
The total number of all quantum events in the history of the universe is on the order of 10^120 [3]. This value is supported by the calculations of Lloyd, which examined the maximum count of elementary bits of information the universe can have stored and/or processed during its existence.
It is also supported by converting all the mass of the universe to energy and calculating the total number of cycles (vibrations) over the course of time. Specifically, assume a Big Bang model of the universe with a Hubble time of 19 billion years (which assumes a constant rate of expansion without deceleration); such a universe would have a critical density of 5×10^-30 g/cm^3. A universe more dense than this would experience deceleration and eventual contraction and collapse. Assuming expansion at the speed of light, the maximum volume of a 19-billion-year-old spherical universe would be 2.4×10^79 cubic meters. Thus, at critical density, the mass of the universe would be 1.2×10^53 kilograms. Using the formula E = mc^2 = hν and solving for frequency, ν = mc^2/h, we find that the frequency of all quantum vibrations of all the mass in our model universe is 1.6×10^103 hertz. Thus, the age of the universe times the frequency of all mass in the universe yields 9.6×10^121 vibrations (cycles) since the Big Bang. This is within two orders of magnitude of Lloyd's limit, 10^120. As will be seen below, an error in these estimates on the order of one to ten orders of magnitude (10^121 to 10^131) would have very little impact on the calculations that follow.
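As a sanity check, this arithmetic can be reproduced in a few lines of Python. This is a minimal sketch using rounded physical constants; given the rounding, it should be expected to agree with the figures above only to within about an order of magnitude.

```python
from math import pi

# Rounded physical constants (sketch values, not high-precision CODATA figures)
h = 6.626e-34          # Planck constant, J*s
c = 2.998e8            # speed of light, m/s
year = 3.156e7         # seconds per year
hubble_time = 19e9 * year              # 19 billion years, in seconds

radius = c * hubble_time               # expansion at the speed of light
volume = (4.0 / 3.0) * pi * radius**3  # ~2.4e79 m^3
critical_density = 5e-27               # 5e-30 g/cm^3 expressed in kg/m^3
mass = critical_density * volume       # ~1.2e53 kg

freq = mass * c**2 / h                 # nu = m c^2 / h, ~1.6e103 Hz
total_vibrations = freq * hubble_time  # total cycles since the Big Bang

print(f"{total_vibrations:.1e}")       # on the order of 10^121
```

Within rounding, the result lands within an order of magnitude of both the text's figure and Lloyd's 10^120 limit, which is all the argument requires.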
The Square of the Universe of All Quantum Events
The square of 10^120 is 10^240. This represents the total number of all quantum events if each quantum event created a new universe of equal magnitude to the first. In other words, it is the total number of quantum events in 10^120 multiverses.
All Possible Earthly Events
Applying the quantum vibration calculations to the mass of the Earth (6×10^27 g) to solve for the frequency (ν = mc^2/h), the frequency of all quantum vibrations of all the mass of Earth is approximately 8.1×10^74 hertz. Multiplying this value by an estimated age of 4.6 billion years yields 10^92 as the maximum number of quantum events since Earth began to form.
The Square of All Possible Earthly Events
The square of 10^92 is 10^184.
The History of All Possible Human Events
Homo sapiens have walked the earth for approximately 300,000 years. For the sake of an extremely conservative estimate, assume that the present-day population, 8.2 billion people, was constant throughout the entirety of human history. That yields a total of 2.5×10^15 human-years. Finally, assuming that, on average, distinct human actions take one minute, the maximum total number of human events in the history of Earth is 1.3×10^21. For the convenience of dealing with whole powers of 10 in this paper, this will be rounded to 10^21.
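The human-events bound can be reproduced directly. A minimal sketch of the arithmetic, under the text's assumptions:

```python
# Conservative bound on the number of distinct human events, assuming the
# present-day population held constant over all of human history and one
# distinct action per person per minute.
years_of_humans = 300_000
population = 8.2e9
minutes_per_year = 365.25 * 24 * 60                 # ~5.3e5

human_years = years_of_humans * population          # ~2.5e15 human-years
human_events = human_years * minutes_per_year       # ~1.3e21 events

print(f"{human_events:.1e}")                        # rounded to 10^21 in the text
```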
The Square of the History of All Possible Human Events
The square of 10^21 is 10^42.
A Single Lifetime
Assuming a lifetime of 100 years and one event every 30 seconds, a single human's lifetime of events cannot reasonably exceed 10^7.
The Proposition
It is self-evident that any event of interest, such as the result of a coin flip, cannot have occurred more often than the total of all sub-atomic vibrations since the universe began. Indeed, all events of interest clearly occur many orders of magnitude less frequently since the number of different events occurring in the universe is vast.
Therefore, it is herein proposed that any event that is statistically less likely to occur than all the events of the universe can and should be properly described as “statistically impossible relative to the domain of all quantum events in the history of the universe.”
Similarly, any event that is less likely than the square of that standard can and should be described as "absurdly impossible relative to the domain of the square of all quantum events in the history of the universe." It is "absurdly" impossible since this calculation even allows for the consideration of 10^120 multiverses.
Each of these descriptions can also be applied to smaller domains, including the domain of all earthly events, all human events, and other domains which may be carefully defined for specific analyses.
Example Application
Having established these definitions, we can now apply them to concrete examples which demonstrate their effectiveness at separating ludicrous assertions of the "possible" from the impossible.
For example, let us first take another look at the assertion that if a monkey were to type at a keyboard for a long enough period of time, sooner or later it would be able to type out one of Shakespeare's sonnets.
Assume that the monkey types randomly on a computer keyboard with x keys. Then the chance that the next keystroke will be the correct next letter is 1/x. Therefore, the chance that the first n letters will all be correct is (1/x)^n.
Therefore, to determine the number of correct letters which can be typed in a row before crossing the boundary of impossibility, we use the formula:

(1/x)^n = 1/10^120

which reduces to:

n = 120 / log10(x)

or, more generally,

n = log10(D) / log10(x)

where D is the total number of events in the chosen reference domain.
If the monkey uses a computer keyboard on which there are 50 keys, then x = 50 and n = 70.6. This tells us that the odds of an immortal monkey typing even the first 71 characters of a Shakespearian sonnet are "statistically impossible relative to the domain of all quantum events in the history of the universe."
To identify the limit for an absurd impossibility, 240 is substituted for the log of the domain of events. This leads us to conclude that any run of more than 141 randomly typed characters corresponding to a Shakespearian sonnet is "absurdly impossible relative to the domain of the square of all quantum events in the history of the universe." This result shows that even across 10^120 multiverses, each generating a keystroke for literally every quantum fluctuation, it is absurdly impossible to generate more than 141 correct characters of a Shakespearian sonnet, or any other work.
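The threshold formula is easy to verify numerically. A minimal sketch (the function name `impossibility_threshold` is illustrative, not the paper's):

```python
from math import log10

def impossibility_threshold(domain_log10: float, outcomes: int) -> float:
    """Longest run n of correct outcomes, each with probability 1/outcomes,
    that is still no rarer than one chance in a domain of 10**domain_log10
    total events: n = log10(D) / log10(x)."""
    return domain_log10 / log10(outcomes)

# 50-key keyboard against the universe of all quantum events (10^120)...
print(round(impossibility_threshold(120, 50), 1))  # 70.6
# ...and against its square (10^240)
print(round(impossibility_threshold(240, 50), 1))  # 141.3
```

The same function reproduces every cell of Table 1 by varying the domain exponent and the number of possible outcomes.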
As shown in Table 1, if we instead use the domain of all human events (log = 21) as our reference, the maximum number of keystrokes prior to declaring any higher number impossible is only 12. In short, if we ever observe more than 12 sensible keystrokes from a monkey, we should discard the notion that this is just the result of chance. Instead, it is far more likely that the monkey knows more than we previously assumed. This example also highlights why one might choose to define a different domain, such as a domain describing the entire history of the population of all species of monkeys [1].
Notice that this task remains truly impossible no matter how many monkeys are assigned to typing duty. This is because we already allowed our god-like monkey to encompass the entire universe of all events. This universe of events, 10^120 in all, does not change just because we add more monkeys. If two immortal monkeys are used, they could share the task, but their combined efforts could not exceed the total of all events in the universe.
Instead, it is obvious that we have already been exceedingly generous in our assumptions about this god-like monkey's ability to consume the totality of all the universe's atomic vibrations for achieving its task. Obviously, in real life literally billions of atomic vibrations are consumed in the single macro-event of the monkey's finger striking the keyboard, the mechanical recoil of the key, and the electronic processing of the resulting keystroke.
Let us look at another common occurrence in statistical theory, the flipping of a coin. If the coin is evenly balanced, it has one chance in two of coming up in a predicted way; therefore, x = 2.
Table 1 shows that more than 70 tosses following a predetermined sequence, such as all heads, is impossible in the domain of all human events, though 797 consecutive heads cannot be ruled out in the square of the universe of all quantum events. But here, reason suggests that a domain related to human events, or even a single human lifetime of events, is more suitable for testing the threshold of the impossible, since coin tossing is specifically a human event.
But to illustrate the dangers of narrowing a reference domain too far, we can go to another extreme. For example, what if an investigator were to define the domain of all events as 100, since she promised her colleagues that she was going to toss the coin exactly and only 100 times? Using 2 as the number of possible outcomes and 100 as the domain of possible events would yield a border for the impossible of 6.6 (log(100)/log(2) = 6.64). But that result is clearly not reasonable, since a run of seven heads will actually occur in approximately 55.3% of sets of 100 coin tosses [4].
This example demonstrates the problem of an improperly defined domain of all possible events. Clearly, the above experiment can be repeated many times. The investigator's proposed domain of 100 coin tosses is not appropriate for calculating the threshold of impossibility because it is not overwhelmingly larger than the event being investigated (seven consecutive heads). Thus, we are reminded that the universe of all events must be carefully selected to fully encompass all possible opportunities for an event to occur.
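The cited ~55% figure can be checked by simulation. Below is a minimal Monte Carlo sketch; it assumes, as in Schilling's longest-run analysis, that a qualifying run may be of either face (heads or tails), which is what brings the probability above one half:

```python
import random

def longest_same_face_run(n_tosses: int, rng: random.Random) -> int:
    """Length of the longest run of identical outcomes in n fair tosses."""
    longest = current = 1
    prev = rng.random() < 0.5
    for _ in range(n_tosses - 1):
        toss = rng.random() < 0.5
        current = current + 1 if toss == prev else 1
        longest = max(longest, current)
        prev = toss
    return longest

rng = random.Random(0)
trials = 20_000
hits = sum(longest_same_face_run(100, rng) >= 7 for _ in range(trials))
print(hits / trials)  # just over one half, near the cited ~55% figure
```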
Are there any practical uses for a statistical definition of the impossible?
An important scientific application will be discussed in a later paper. For now, since we are on the topic of tossing coins, it is sufficient to note that the above definition could be used to assess claims of psychic foreknowledge or mind-over-matter. Unless a professed psychic can predict or control the outcome of a fairly tossed, evenly balanced coin 24 consecutive times, for example, the claim of achieving the impossible is not statistically proven within the domain of a human lifetime.
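To see why 24 is the threshold for the single-lifetime domain (log = 7): a run of n correct calls on a fair coin has probability (1/2)^n, and 24 is the smallest n for which that probability falls below one chance in 10^7 events. A quick check:

```python
from math import log10

lifetime_events = 10 ** 7    # single-lifetime domain from the text

# Threshold from n = log10(D) / log10(x) with x = 2 outcomes
n_max = 7 / log10(2)         # ~23.3, so a 24th consecutive call crosses the line
print(round(n_max, 1))

# Equivalent check on the raw probabilities: 24 calls is rarer than one
# chance in a lifetime of events, while 23 calls is not.
assert 0.5 ** 24 < 1 / lifetime_events < 0.5 ** 23
```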
And what if a professed psychic did exceed 24 consecutive tosses of heads? Then we would be forced to conclude one of two points, either:
- a) The psychic's achievement is the result of a non-random event, due either to a gimmicked coin or telekinesis; or,
- b) The sub-universe of events upon which we are evaluating this achievement was defined too narrowly; it may be necessary to expand the domain to include all human events.
In short, if a sequence of events exceeds the threshold for impossible events within a specified domain, then either the events are non-random, or our perception of the universe must be wrong. Therefore, this definition of the impossible provides us with a tool to address new questions, and to question our assumptions.
Conclusions
This paper provides a conservative and objective benchmark for identifying when statistically computed odds are sufficient to determine that a sequence of events cannot possibly be due to random chance. Examples of this method are applied to the universe of all quantum events, all events in the history of Earth, of humankind, and of a single (long) lifetime.
This investigation is limited to defining what is "impossible" or even "absurdly impossible." It does not provide any standards for verbal descriptions such as "certain," "likely," "possible," or "improbable" relative to any specific domain of reference.
In another field of statistics, Cohen's d effect sizes are interpreted within the guidelines that 0.20 is a small effect, 0.50 a medium effect, and 0.80 a large effect. It would be helpful to build on the present paper by computing a ratio between the maximum number of events prior to becoming impossible and some other metric of possibility, such that the resulting ratio would be useful for describing the likelihood of a series of events within a given domain as "certain," "likely," "possible," "improbable," or "impossible."
Funding
This research received no external funding.
Conflicts of Interest
The author declares no conflict of interest.
References
- Woodcock, S.; Falletta, J. A Numerical Evaluation of the Finite Monkeys Theorem. Franklin Open 2024, 9, 100171.
- Goetz, J. Classical Probability, Shakespearean Sonnets, and Multiverse Hypotheses. International Society for Complexity, Information, and Design Archive, https://philpapers.org/rec/GOECPS (2006, accessed 9 May 2025).
- Lloyd, S. Computational Capacity of the Universe. Phys. Rev. Lett. 2002, 88, 237901.
- Schilling, M.F. The Longest Run of Heads. The College Mathematics Journal 1990, 21, 196–207.
Table 1. The maximum number of events prior to becoming impossible, relative to the number of possible outcomes for the event of interest. Column headers give each reference domain with the log10 of its total number of events in parentheses.
| Possible Outcomes | lifetime (7) | human (21) | human^2 (42) | earth (92) | earth^2 (184) | universe (120) | universe^2 (240) |
|---|---|---|---|---|---|---|---|
| 2 | 23.3 | 69.8 | 139.5 | 305.6 | 611.2 | 398.6 | 797.3 |
| 3 | 14.7 | 44.0 | 88.0 | 192.8 | 385.6 | 251.5 | 503.0 |
| 4 | 11.6 | 34.9 | 69.8 | 152.8 | 305.6 | 199.3 | 398.6 |
| 5 | 10.0 | 30.0 | 60.1 | 131.6 | 263.2 | 171.7 | 343.4 |
| 10 | 7.0 | 21.0 | 42.0 | 92.0 | 184.0 | 120.0 | 240.0 |
| 20 | 5.4 | 16.1 | 32.3 | 70.7 | 141.4 | 92.2 | 184.5 |
| 30 | 4.7 | 14.2 | 28.4 | 62.3 | 124.6 | 81.2 | 162.5 |
| 40 | 4.4 | 13.1 | 26.2 | 57.4 | 114.9 | 74.9 | 149.8 |
| 50 | 4.1 | 12.4 | 24.7 | 54.2 | 108.3 | 70.6 | 141.3 |
| 100 | 3.5 | 10.5 | 21.0 | 46.0 | 92.0 | 60.0 | 120.0 |
| 200 | 3.0 | 9.1 | 18.3 | 40.0 | 80.0 | 52.2 | 104.3 |
| 300 | 2.8 | 8.5 | 17.0 | 37.1 | 74.3 | 48.4 | 96.9 |
| 400 | 2.7 | 8.1 | 16.1 | 35.4 | 70.7 | 46.1 | 92.2 |
| 500 | 2.6 | 7.8 | 15.6 | 34.1 | 68.2 | 44.5 | 88.9 |
| 10,000 | 1.8 | 5.3 | 10.5 | 23.0 | 46.0 | 30.0 | 60.0 |
| 100,000 | 1.4 | 4.2 | 8.4 | 18.4 | 36.8 | 24.0 | 48.0 |
| 1,000,000 | 1.2 | 3.5 | 7.0 | 15.3 | 30.7 | 20.0 | 40.0 |
© 2025 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (http://creativecommons.org/licenses/by/4.0/).