1. Introduction
The Church-Turing thesis, formulated independently by Alonzo Church and Alan Turing in the 1930s, stands as one of the foundational principles of computer science. In its classical form, the thesis states that any function that can be effectively calculated can be computed by a Turing machine. This principle has guided the development of computation theory for nearly a century and underlies our understanding of what is computable.
However, the classical formulation contains a crucial gap: it treats computation as an abstract mathematical process while ignoring the physical substrate required for any actual computation. Real computers are physical systems that must obey the laws of thermodynamics, quantum mechanics, and relativity. These physical constraints impose fundamental limits on what can actually be computed in our universe, regardless of mathematical computability.
1.1. The Gap Between Mathematical and Physical Computation
The distinction between mathematical and physical computation has become increasingly important as we approach fundamental physical limits in computing technology. Consider the following physical constraints:
Thermodynamic Constraints: Landauer’s principle establishes that erasing one bit of information requires at least $k_B T \ln 2$ energy, where $k_B$ is Boltzmann’s constant and $T$ is temperature. This provides a fundamental lower bound on the energy cost of irreversible computation.
Quantum Speed Limits: The Margolus-Levitin theorem proves that a quantum system with energy E can perform at most $2E/(\pi\hbar)$ orthogonalizing operations per second, where $\hbar$ is the reduced Planck constant. This provides an absolute speed limit for any physical computation.
Information Density Bounds: The Bekenstein bound limits the amount of information that can be stored in a finite region of space with finite energy. For a spherical region of radius R containing energy E, the maximum information content is $I \le 2\pi E R/(\hbar c \ln 2)$ bits.
Relativistic Constraints: Information cannot travel faster than light, imposing fundamental limits on communication in distributed computational systems.
These physical constraints suggest that not all Turing-computable functions can be physically realized with finite resources.
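To give a sense of scale, the short Python sketch below evaluates the Landauer and Margolus-Levitin bounds above for an illustrative room-temperature, 1 kg device (the Bekenstein bound is evaluated numerically in Section 7). The scenario and constant values are our illustration, not part of the paper's results; only the two formulas quoted above are used.

```python
import math

k_B = 1.380649e-23      # Boltzmann constant, J/K
hbar = 1.054571817e-34  # reduced Planck constant, J*s
c = 2.99792458e8        # speed of light, m/s

T = 300.0               # ambient temperature, K (assumed)
E = 1.0 * c**2          # rest energy of a 1 kg device, J (assumed)

landauer = k_B * T * math.log(2)     # minimum energy per erased bit
ml_rate = 2 * E / (math.pi * hbar)   # max orthogonalizing ops per second

print(f"Landauer cost at 300 K: {landauer:.3e} J/bit")  # ~2.87e-21 J
print(f"Margolus-Levitin rate:  {ml_rate:.3e} ops/s")   # ~5.4e50 ops/s
```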
1.2. Toward a Physical Church-Turing Thesis
We propose a fundamental revision of the Church-Turing thesis that explicitly acknowledges the physical nature of computation:
Physical Church-Turing Thesis: Any effectively calculable function that can be physically computed must respect the fundamental constraints imposed by physical law, including thermodynamics, quantum mechanics, and relativity.
This leads to the recognition that physical computability, when properly defined with asymptotic resource scaling, forms a proper subset of Turing computability under the explicit resource constraints we formalize below.
1.3. Our Contributions
This paper develops the theoretical foundations of the Physical Church-Turing Thesis through:
Rigorous Framework: We develop a mathematically precise framework based on erasure complexity that connects physical constraints to computational limits.
Provable Energy Bounds: We provide concrete theorems showing how time-space-coherence trade-offs force unavoidable bit erasures, yielding quantitative energy lower bounds.
Reversible Computation Integration: We properly account for Bennett’s reversible computation results to distinguish between operations and erasures.
Physical Complexity Classes: We define new complexity classes that account for multiple physical resources simultaneously.
Practical Applications: We derive implications for quantum computing, artificial intelligence, and energy-efficient algorithm design.
2. Mathematical Framework for Physical Computation
2.1. Asymptotic Physical Computability
We begin by addressing the fundamental issue of defining physical computability in a way that is neither trivial nor vacuous.
Definition 1
(Asymptotic Physical Computability). Fix a device model $\mathcal{M}$ with resource counters $(E, T, S, B, C)$ representing energy, time, space, bandwidth, and coherence, respectively. A total function $f\colon \{0,1\}^* \to \{0,1\}^*$ is $\mathcal{M}$-physically computable under resource budget $R(n) = (E(n), T(n), S(n), B(n), C(n))$ if there exists a family of devices $\{D_n\}_{n \ge 1}$ and an algorithm $A$ such that on all inputs $x$ with $|x| \le n$, $D_n$ halts with output $f(x)$ and uses resources at most $R(n)$ component-wise.
We say $f$ is physically scalable if $R(n)$ is polynomial in $n$ component-wise.
Model Assumptions: Throughout this work, we assume: (i) the ambient temperature $T$ is fixed and finite, and $Q$ counts dissipated heat (free energy lost to the bath); (ii) a logical erasure is any many-to-one map on the computational degrees of freedom, i.e., the irreversible discarding of computationally relevant information; (iii) $E$ denotes the average free energy available to computation, as used in quantum speed limit bounds; (iv) $S$ is peak working memory and $B$ counts communicated bits; (v) coherence $C$ bounds the number of quantum degrees of freedom that can be maintained in superposition simultaneously; (vi) all bounds apply in the weak-gravity regime with well-defined system boundaries, and all asymptotics are in the input size $n$.
This definition avoids the triviality of requiring finite resources for all inputs while maintaining meaningful constraints on resource scaling.
2.2. Erasure Complexity and Landauer’s Principle
The key insight for connecting physical constraints to computational complexity is to focus on erasure complexity rather than operation count.
Definition 2
(Erasure Complexity). For a computational problem $f$ and input size $n$, define $\mathrm{Er}_f(n, T, S)$ as the minimum number of logically irreversible bit erasures needed by any algorithm that computes $f$ on inputs of length $n$ within time $T(n)$ and workspace $S(n)$.
Lemma 1
(Landauer Erasure Cost).
Let a computation at ambient temperature $T$ perform $k$ logical bit erasures. Then the dissipated heat satisfies
$$Q \;\ge\; k\, k_B T \ln 2.$$
Proof: This follows directly from Landauer’s principle, which has been experimentally verified by Bérut et al. (2012) and others.
2.3. Reversible Computation and Bennett’s Results
A crucial component of our framework is properly accounting for reversible computation, which can dramatically reduce erasure requirements.
Theorem 1
(Bennett Reversible Simulation). For any $\epsilon > 0$, a multitape Turing machine running in time $T$ and space $S$ can be simulated by a logically reversible machine in time $O(T^{1+\epsilon})$ and space $O(S \log T)$.
Implication: The minimum energy depends on the number of erasures, not the total operation count. Via reversible computation, the number of erasures can be made sublinear in the step count (up to output and cleanup costs).
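As a scaling illustration of Theorem 1, the calculator below (a sketch; constant factors are suppressed and the sample budgets are arbitrary) compares the reversible simulation's $O(T^{1+\epsilon})$ time and $O(S \log T)$ space for several choices of $\epsilon$.

```python
import math

def bennett_overhead(T, S, eps):
    """Asymptotic reversible-simulation budgets, constants suppressed."""
    return T ** (1 + eps), S * math.log2(T)

# An irreversible computation with 10^9 steps and 10^3 cells of workspace.
for eps in (1.0, 0.5, 0.1):
    t_rev, s_rev = bennett_overhead(T=1e9, S=1e3, eps=eps)
    print(f"eps={eps}: time ~{t_rev:.1e} steps, space ~{s_rev:.1e} cells")
```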
However, when both time and space are simultaneously constrained, reversible strategies may be forced to perform erasures:
Definition 3
(Computation DAG and Reversible Pebbling Price). Let $\mathcal{C}$ be an algorithm class (e.g., circuits of fan-in 2 and bounded depth, or tree-like resolution refutations). For input length $n$, any $A \in \mathcal{C}$ induces a DAG $G_A(n)$ whose sinks are the output/accept nodes. Let $\mathrm{RPeb}(G_A(n), S, T)$ be the minimum number of irreversible discards (erasures) required by any reversible pebbling schedule that uses at most $S$ pebbles and completes within deadline $T$.
Theorem 2
(Time-Space-Erasure Tradeoff, Precise).
Fix an algorithm class $\mathcal{C}$ and budgets $(S, T)$. For every $A \in \mathcal{C}$ on inputs of length $n$, any implementation that computes all sinks of $G_A(n)$ within workspace $S$ and deadline $T$ must perform at least
$$\mathrm{RPeb}(G_A(n), S, T)$$
logically irreversible bit erasures. Consequently, by Lemma 1, $Q \ge \mathrm{RPeb}(G_A(n), S, T)\, k_B T_{\mathrm{amb}} \ln 2$, where $T_{\mathrm{amb}}$ is the ambient temperature.
Proof Sketch: The proof uses reversible pebble game lower bounds. When both space (pebbles) and time are constrained below the reversible pebbling threshold, any pebbling strategy must perform "forced unpebblings" without available prerequisites, corresponding to irreversible information discards and thus erasures.
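The brute-force sketch below makes Definition 3 and this proof sketch concrete on a toy DAG. It searches all pebbling schedules for the minimum number of forced discards within pebble budget $S$ and deadline $T$; the function name `rpeb`, the state encoding, and the example path graph are our illustrations, and the exhaustive search is only feasible for very small graphs.

```python
from heapq import heappush, heappop
from itertools import count

def rpeb(preds, sinks, S, T):
    """Minimum forced discards to pebble all sinks using at most S pebbles
    and at most T moves; None if impossible within the budgets."""
    tie = count()                           # tie-breaker so heap never compares sets
    heap = [(0, 0, next(tie), frozenset())]  # (erasures, moves, tie, configuration)
    best = {(frozenset(), 0): 0}
    while heap:
        erasures, moves, _, conf = heappop(heap)
        if all(s in conf for s in sinks):
            return erasures                 # popped in erasure order: minimal
        if moves == T:
            continue
        for v in preds:
            ready = all(p in conf for p in preds[v])
            if v not in conf:
                if not (ready and len(conf) < S):
                    continue                # placement needs pebbled preds + room
                nxt, cost = conf | {v}, erasures
            else:
                # removal is reversible iff preds are pebbled; else forced discard
                nxt, cost = conf - {v}, (erasures if ready else erasures + 1)
            key = (nxt, moves + 1)
            if cost < best.get(key, cost + 1):
                best[key] = cost
                heappush(heap, (cost, moves + 1, next(tie), nxt))
    return None

path = {1: (), 2: (1,), 3: (2,), 4: (3,)}   # a chain 1 -> 2 -> 3 -> 4
print(rpeb(path, sinks=[4], S=3, T=10))     # 0: three pebbles avoid discards
print(rpeb(path, sinks=[4], S=2, T=10))     # 1: two pebbles force one discard
```

On a four-node path, three pebbles admit an erasure-free schedule, while squeezing the budget to two pebbles forces at least one discard, exactly the "forced unpebbling" phenomenon the proof sketch describes.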
3. Quantum Speed Limits and Universal Computational Bounds
3.1. Margolus-Levitin Bound
The quantum speed limit provides a fundamental constraint on the rate of computation.
Theorem 3
(Universe Operations Bound).
Any device of average energy $E$ operating for time $T$ performs at most
$$N \;\le\; \frac{2 E T}{\pi \hbar}$$
elementary orthogonalizing operations (Margolus-Levitin bound).
For the observable universe, with $E \approx 10^{70}$ J and age $T \approx 4 \times 10^{17}$ s, this gives on the order of $10^{120}$–$10^{121}$ total possible operations.
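A minimal check of this estimate, using the rough inputs quoted above (the precise energy and age values are illustrative):

```python
import math

hbar = 1.054571817e-34  # J*s
E = 1e70                # approximate energy of the observable universe, J
t = 4.35e17             # approximate age of the universe, s

N = 2 * E * t / (math.pi * hbar)
print(f"N <= {N:.1e} operations")  # ~2.6e121; cf. Lloyd's (2000) ~10^120
```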
3.2. Cosmic-Uniform Unrealizability
We can now formulate a precise notion of physical unrealizability:
Lemma 2
(Cosmic-Uniform Unrealizability). Suppose the total free energy available to computation in our universe is upper bounded by $Q_{\max}$. If computing $f$ on inputs of length $n$ requires, in any implementation within model $\mathcal{M}$, at least $\mathrm{Er}_f(n)$ logically necessary bit erasures, with $\mathrm{Er}_f(n)\, k_B T \ln 2 > Q_{\max}$ for all sufficiently large $n$, then $f$ is not uniformly realizable within a single such universe.
Remark: The burden is to lower-bound necessary erasures, not step count. This requires problem-specific analysis using reversible pebbling techniques.
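As a companion calculation to Lemma 2 (a sketch under assumed inputs: a free-energy budget of $10^{70}$ J and the CMB temperature as the coldest available bath), the total erasure budget of the universe is:

```python
import math

k_B = 1.380649e-23   # J/K
Q_max = 1e70         # assumed free-energy budget of the universe, J
T_cmb = 2.7          # K, cosmic microwave background temperature

max_erasures = Q_max / (k_B * T_cmb * math.log(2))
print(f"Erasure budget: {max_erasures:.1e} bits")  # ~4e92
```

Any problem family whose necessary erasure count exceeds roughly $10^{93}$ on large inputs would then be cosmically unrealizable in the sense of Lemma 2, under these assumed inputs.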
4. Physical Complexity Classes
4.1. Definitions and Basic Properties
We define complexity classes that account for multiple physical resources simultaneously:
Definition 4
(Physical Complexity Classes). Fix resource budgets $R(n) = (E(n), T(n), S(n), B(n), C(n))$.
$\mathrm{PhysP}$ consists of decision problems solvable by devices respecting $R(n) = \mathrm{poly}(n)$ component-wise.
$\mathrm{PhysNP}$ consists of problems whose certificates are verifiable under the same resource constraints.
$\mathrm{PhysPSPACE}$ consists of problems solvable in polynomial space with polynomial energy.
Proposition 1
(Physical Complexity Hierarchy). $\mathrm{PhysP} \subseteq \mathrm{PhysNP} \subseteq \mathrm{PhysPSPACE}$.
Note: These inclusions assume that the verifier’s energy consumption is also polynomially bounded, not just the length of the prover’s certificate.
Open Problem: Does $\mathrm{PhysP} = \mathrm{PhysNP}$? This question depends on whether polynomial-time verification can be achieved with significantly fewer erasures than polynomial-time solution.
4.2. Energy-Based Separations
Rather than claiming unconditional collapses, we can prove conditional separations based on erasure complexity:
Theorem 4
(Conditional Energy Separation). If there exist problem families in NP requiring superpolynomially many erasures for any polynomial-time, polynomial-space algorithm, then $\mathrm{PhysP} \neq \mathrm{PhysNP}$ under polynomial energy constraints.
5. Concrete Applications and Lower Bounds
5.1. SAT and Reversible Pebbling
We can derive concrete energy lower bounds for specific problem families using reversible pebbling techniques:
Theorem 5
(Energy Lower Bound for Pebbling CNFs under Tree-like Resolution). Let $\{F_{G_n}\}$ be CNF families obtained from pebbling graphs $\{G_n\}$. If the workspace budget $S(n)$ is below the reversible pebbling number of $G_n$, then any tree-like resolution refutation of $F_{G_n}$ that uses workspace $S(n)$ and completes within deadline $T(n)$ must perform at least $\mathrm{RPeb}(G_n, S, T)$ erasures. Hence $Q \ge \mathrm{RPeb}(G_n, S, T)\, k_B T_{\mathrm{amb}} \ln 2$.
Proof Sketch: The proof uses the correspondence between tree-like resolution space and reversible pebbling numbers. When space is constrained below the reversible pebbling threshold, the resolution algorithm must discard clauses irreversibly to make progress within the time bound.
5.2. Quantum Computing and Physical Constraints
For quantum algorithms, we can derive rigorous bounds based on the quantum speed limit:
Theorem 6
(Per-gate Quantum Speed Limit).
Let the Hamiltonian $H(t)$ generate the gate $U$ on the computational subspace. Then the time $\tau$ to implement $U$ satisfies
$$\int_0^{\tau} \lVert H(t) \rVert \, dt \;\ge\; \hbar\, d(\mathbb{1}, U),$$
where $\lVert \cdot \rVert$ is any submultiplicative operator norm and $d(\cdot,\cdot)$ is a unitary distance (e.g., Fubini-Study-induced or geodesic on $\mathrm{SU}(d)$).
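Plugging representative numbers into Theorem 6 (a sketch; the resonantly driven qubit, its 50 MHz Rabi frequency, and the $\pi/2$ geodesic distance for a full bit flip are our assumptions, not values from the paper):

```python
import math

hbar = 1.054571817e-34
Omega = 2 * math.pi * 50e6   # Rabi frequency, rad/s (assumed)
H_norm = hbar * Omega / 2    # drive Hamiltonian norm, J
distance = math.pi / 2       # geodesic distance for a full bit flip

tau_min = hbar * distance / H_norm   # = pi / Omega
print(f"tau >= {tau_min*1e9:.1f} ns")  # 10.0 ns, comparable to real gate times
```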
Proposition 2
(Power-limited QEC Throughput).
For a surface code with distance $d$, let $E_{\mathrm{cyc}}(d)$ be the control-plus-measurement energy per cycle and $f_{\mathrm{cyc}}$ the cycle rate needed for the target logical error rate. If $P_{\mathrm{cool}}$ is the available cooling power at the relevant stages, then the steady-state logical-gate rate satisfies
$$R_{\mathrm{logical}} \;\le\; \frac{P_{\mathrm{cool}}}{N_{\mathrm{cyc}}\, E_{\mathrm{cyc}}(d)},$$
where $N_{\mathrm{cyc}}$ counts the number of cycles per logical gate (including magic-state distillation).
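An illustrative evaluation of Proposition 2; every input below (cooling power, per-cycle energy, cycles per logical gate) is an assumed placeholder chosen only to show how the bound is applied, not a measured value:

```python
P_cool = 20e-6   # W, cooling power at the coldest stage (assumed)
E_cyc = 1e-11    # J dissipated per syndrome cycle (assumed)
N_cyc = 1e3      # cycles per logical gate, incl. distillation (assumed)

R_max = P_cool / (N_cyc * E_cyc)
print(f"Logical-gate rate <= {R_max:.1e} gates/s")  # 2.0e3 with these inputs
```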
6. Information-Theoretic Bounds on Intelligence
6.1. Task-Based Capability Bounds
Rather than defining a vague "intelligence capacity," we provide rigorous bounds on specific computational tasks:
Lemma 3
(Landauer-Fano Bound).
For any $M$-ary decision task with error probability at most $\delta$, any device at temperature $T$ must dissipate
$$Q_{\mathrm{dec}} \;\ge\; k_B T \ln 2 \,\bigl[\log_2 M - h(\delta) - \delta \log_2 (M-1)\bigr],$$
where $h(\cdot)$ is the binary entropy function.
Proof: Fano’s inequality gives $H(X \mid \hat{X}) \le h(\delta) + \delta \log_2 (M-1)$, so for $X$ uniform over the $M$ alternatives, $I(X; \hat{X}) \ge \log_2 M - h(\delta) - \delta \log_2 (M-1)$ bits must be irreversibly acquired. Each bit of irreversible information gain costs $k_B T \ln 2$ by Landauer’s principle.
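A numeric instance of Lemma 3 (the 20-bit task size and 1% error rate are illustrative choices):

```python
import math

def h2(d):
    """Binary entropy in bits."""
    return 0.0 if d in (0.0, 1.0) else -d*math.log2(d) - (1-d)*math.log2(1-d)

k_B, T, M, delta = 1.380649e-23, 300.0, 2**20, 0.01

bits = math.log2(M) - h2(delta) - delta * math.log2(M - 1)
Q_min = bits * k_B * T * math.log(2)
print(f"{bits:.2f} bits -> Q >= {Q_min:.2e} J")  # ~19.72 bits, ~5.7e-20 J
```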
6.2. Decision Rate Bounds
Combining quantum speed limits with information-theoretic requirements:
Theorem 7
(Physical Decision Bounds). For an M-ary decision with error probability at most δ:
- (i) Energy-limited: Over energy budget $Q$, the number of decisions satisfies
$$N \;\le\; \frac{Q}{k_B T \ln 2 \,\bigl[\log_2 M - h(\delta) - \delta \log_2 (M-1)\bigr]}.$$
- (ii) Speed-limited: If each decision requires $s$ orthogonalizing steps, then the decision rate satisfies
$$\frac{N}{t} \;\le\; \frac{2E}{\pi \hbar\, s}.$$
Proof: Part (i) follows from Lemma 3 by dividing the total energy budget by the per-decision energy requirement. Part (ii) follows from the Margolus-Levitin bound by dividing the maximum orthogonalizing-operation rate by the operations required per decision.
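Both parts of Theorem 7 evaluated for an illustrative device (the 1 J budget, 1 nJ average energy, and $10^6$ steps per decision are assumptions; the per-decision energy reuses the Lemma 3 example above):

```python
import math

k_B, hbar, T = 1.380649e-23, 1.054571817e-34, 300.0

E_dec = 19.72 * k_B * T * math.log(2)   # J per 20-bit decision (Lemma 3)
N_max = 1.0 / E_dec                     # part (i): decisions per 1 J budget
rate_max = 2 * 1e-9 / (math.pi * hbar * 1e6)  # part (ii): 1 nJ, 10^6 steps

print(f"N <= {N_max:.1e} decisions; rate <= {rate_max:.1e} decisions/s")
```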
7. Information Storage and the Bekenstein Bound
7.1. Storage Density Limits
Proposition 3
(Bekenstein Storage Bound).
For a weak-gravity, well-isolated system of energy $E$ and radius $R$, the total storable information (bits) is bounded by
$$I \;\le\; \frac{2\pi E R}{\hbar c \ln 2}.$$
Scope and Limitations: This bound applies in the weak gravity regime with well-defined system boundaries. The bound assumes the system can be treated as isolated and that quantum field theory corrections are negligible. Edge cases involving black hole formation, strong gravitational fields, or systems with poorly defined boundaries may require modified analysis.
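For scale, Proposition 3 evaluated for an illustrative 1 kg, 10 cm system, comfortably inside the weak-gravity regime where the bound applies:

```python
import math

hbar, c = 1.054571817e-34, 2.99792458e8
E = 1.0 * c**2   # rest energy of 1 kg, J
R = 0.1          # m

I_max = 2 * math.pi * E * R / (hbar * c * math.log(2))
print(f"I <= {I_max:.1e} bits")  # ~2.6e42 bits
```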
8. Implications and Future Directions
8.1. Algorithm Design Principles
The Physical Church-Turing Thesis suggests several principles for optimal algorithm design:
Minimize Erasures: Design algorithms to minimize irreversible operations through reversible computation techniques.
Time-Space-Energy Trade-offs: Optimize across multiple resource dimensions simultaneously rather than focusing on single metrics.
Coherence Management: For quantum algorithms, balance coherence requirements against decoherence timescales.
Energy-Aware Complexity: Consider energy complexity alongside traditional time and space complexity.
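The Minimize Erasures principle above is, concretely, the compute-copy-uncompute pattern. The toy sketch below (our illustration, using standard CNOT and Toffoli gates on a plain bit list) computes an AND, copies out the answer, and reversibly returns the ancilla to zero instead of erasing it.

```python
def cnot(state, c, t):
    state[t] ^= state[c]                 # reversible: self-inverse

def toffoli(state, c1, c2, t):
    state[t] ^= state[c1] & state[c2]    # reversible AND into target

def and_without_erasure(a, b):
    # bit layout (assumed): [a, b, ancilla, output]; ancilla starts at 0
    s = [a, b, 0, 0]
    toffoli(s, 0, 1, 2)   # compute a AND b into the ancilla
    cnot(s, 2, 3)         # copy the result to the output bit
    toffoli(s, 0, 1, 2)   # uncompute: ancilla returns to 0, no erasure
    assert s[2] == 0      # no garbage left behind
    return s[3]

print([and_without_erasure(a, b) for a in (0, 1) for b in (0, 1)])  # [0, 0, 0, 1]
```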
8.2. Open Problems
Several important questions remain open:
Erasure Complexity Characterization: Develop general techniques for lower-bounding necessary erasures for broad problem classes.
Physical vs Classical Separations: Determine which classical complexity separations persist under physical constraints.
Quantum Coherence Complexity: Develop a rigorous theory of coherence as a computational resource.
Biological Computation: Extend the framework to understand how biological systems achieve computational efficiency.
9. Conclusions
We have established a rigorous mathematical framework for the Physical Church-Turing Thesis that recognizes computation as an inherently physical process. Our main contributions include:
Rigorous Definitions: We provided mathematically precise definitions of physical computability that avoid triviality while maintaining meaningful constraints.
Erasure-Based Energy Bounds: We developed a framework based on erasure complexity that yields provable energy lower bounds via Landauer’s principle.
Reversible Computation Integration: We properly accounted for Bennett’s reversible computation results to distinguish between operations and erasures.
Concrete Lower Bounds: We provided specific theorems connecting time-space-coherence trade-offs to unavoidable erasures for explicit problem families.
Quantum Applications: We derived rigorous bounds for quantum computing based on quantum speed limits and error correction requirements.
Information-Theoretic Foundations: We established precise bounds on decision-making capabilities using Fano’s inequality and Landauer’s principle.
The Physical Church-Turing Thesis represents a paradigm shift from abstract mathematical computation to physically grounded computational theory. By recognizing that computation is fundamentally constrained by physical law, we gain both theoretical insights into the nature of computation and practical guidance for designing optimal computational systems.
Future work should focus on developing general techniques for characterizing erasure complexity, exploring the implications for specific computational problems, and investigating how biological and artificial systems can be designed to operate efficiently within fundamental physical constraints.
The framework developed here provides a foundation for understanding the ultimate capabilities and limitations of any computational system that can exist in our physical universe, bridging computer science and fundamental physics in a mathematically rigorous way.
AI Assistance Statement
Language and editorial suggestions were supported by AI tools; the author takes full responsibility for the content.
Data Availability Statement
No new data were generated or analyzed in this study.
Acknowledgments
The author thanks the Octonion Group research team for valuable discussions and computational resources. Special recognition goes to the broader computational complexity and quantum computing communities whose foundational work made this synthesis possible. The author particularly acknowledges the detailed technical feedback that helped strengthen the mathematical rigor of this work.
Conflicts of Interest
The author declares no conflicts of interest.
References
- Church, A. An unsolvable problem of elementary number theory. American Journal of Mathematics 1936, 58(2), 345–363. [Google Scholar] [CrossRef]
- Turing, A. M. On computable numbers, with an application to the Entscheidungsproblem. Proceedings of the London Mathematical Society 1936, 42(2), 230–265. [Google Scholar]
- Landauer, R. Irreversibility and heat generation in the computing process. IBM Journal of Research and Development 1961, 5(3), 183–191. [Google Scholar] [CrossRef]
- Bérut, A.; Arakelyan, A.; Petrosyan, A.; Ciliberto, S.; Dillenschneider, R.; Lutz, E. Experimental verification of Landauer’s principle linking information and thermodynamics. Nature 2012, 483(7388), 187–189. [Google Scholar] [CrossRef] [PubMed]
- Bennett, C. H. Logical reversibility of computation. IBM Journal of Research and Development 1973, 17(6), 525–532. [Google Scholar] [CrossRef]
- Bennett, C. H. Time/space trade-offs for reversible computation. SIAM Journal on Computing 1989, 18(4), 766–776. [Google Scholar] [CrossRef]
- Margolus, N.; Levitin, L. B. The maximum speed of dynamical evolution. Physica D: Nonlinear Phenomena 1998, 120(1-2), 188–195. [Google Scholar] [CrossRef]
- Bekenstein, J. D. Universal upper bound on the entropy-to-energy ratio for bounded systems. Physical Review D 1981, 23(2), 287–298. [Google Scholar] [CrossRef]
- Lloyd, S. Ultimate physical limits to computation. Nature 2000, 406(6799), 1047–1054. [Google Scholar] [CrossRef] [PubMed]
- Vitányi, P. M. B. Locality, communication, and interconnect length in multicomputers. SIAM Journal on Computing 1990, 19(4), 683–707. [Google Scholar] [CrossRef]
- Lange, K.-J.; McKenzie, P.; Tapp, A. Reversible space equals deterministic space. Journal of Computer and System Sciences 2000, 60(2), 354–367. [Google Scholar] [CrossRef]
- Chan, S. M. Just a pebble game. In Proceedings of the 2013 IEEE Conference on Computational Complexity; 2013; pp. 133–143. [Google Scholar]
- Torán, J.; Wörz, F. Reversible pebble games and the relation between tree-like and general resolution space. Computational Complexity 2021, 30(1), 1–32. [Google Scholar] [CrossRef]
- Knill, E. An analysis of Bennett’s pebble game. arXiv 1995, arXiv:math/9508218. [Google Scholar]
- Cover, T. M.; Thomas, J. A. Elements of Information Theory, 2nd ed.; Wiley, 2006. [Google Scholar]
- Fano, R. M. Transmission of Information: A Statistical Theory of Communications; MIT Press, 1961. [Google Scholar]
- Shor, P. W. Algorithms for quantum computation: discrete logarithms and factoring. In Proceedings of the 35th Annual Symposium on Foundations of Computer Science; 1994; pp. 124–134. [Google Scholar]
- Preskill, J. Quantum computing in the NISQ era and beyond. Quantum 2018, 2, 79. [Google Scholar] [CrossRef]
- Fowler, A. G.; Mariantoni, M.; Martinis, J. M.; Cleland, A. N. Surface codes: Towards practical large-scale quantum computation. Physical Review A 2012, 86(3), 032324. [Google Scholar] [CrossRef]
- Deutsch, D. Quantum theory, the Church-Turing principle and the universal quantum computer. Proceedings of the Royal Society of London A 1985, 400(1818), 97–117. [Google Scholar]