1. Introduction
1.1. Motivation
Recent laboratory results make a startling observation routine: driven Rydberg chains, superconducting qubits, and trapped ions all pass through two sharp transitions as they are tuned out of equilibrium. Coherence first tilts irreversibly toward entropy production, and a little later the system locks into a classical pattern that resists further microscopic change. We label the complexity-dependent scalar that tracks those transitions . The central thesis of this manuscript is that the same scalar governs the emergence of macroscopic spacetime, classical observables, and the familiar hierarchy of physical laws. Rather than proposing a new force or particle, we fuse three converging strands—staged irreversibility in quantum materials, experimentally accessible complexity measures, and entanglement-based accounts of spacetime—into a single quantitative framework that follows as it crosses well-defined thresholds.
1.2. Foundational Puzzles as Threshold
Phenomena
Viewed through , the long-standing puzzles that motivate quantum gravity and cosmology form a single narrative. The Big Bang singularity marks the moment surged past its upper threshold, crystallizing spacetime. The arrow of time is the selection bias that favors branches where keeps growing past its first threshold. Dark matter looks like matter that crossed the gravitational threshold but not the electromagnetic one, while dark energy reflects regions where never left the subcritical regime, explaining the 122-order mismatch in the cosmological constant. Quantum measurement and the quantum-to-classical transition live on the same axis: apparatus plus environment cross and head toward . A precise formulation of therefore recasts these mysteries as calculable threshold crossings rather than disconnected anomalies.
1.3. Scope and Structure
The rest of the manuscript develops that case systematically:
Section 2 reviews the empirical and theoretical foundations of quantum emergence, highlighting three pillars: dual-threshold behavior in quantum materials, complexity measures that render computable, and emergent spacetime programs that link geometry to entanglement.
Section 3 formalizes as , defines each parameter operationally, and extracts dual critical values directly from many-body localization (MBL) data.
Section 4 translates those thresholds into cosmological language, showing how Big Bang regularization, time asymmetry, and dark-sector phenomena follow from staged emergence.
Section 5 evaluates the framework against alternative proposals via a problem-solving matrix, quantifying strengths and remaining gaps.
Section 6 lays out seventeen falsifiable predictions, organized by timescale and experimental feasibility, to keep the framework Popperian.
Section 7 discusses the philosophical and cross-scale implications of layered emergence, including dimensional correlation patterns and cosmic metabolism.
Section 8 summarizes the path forward, including data needs, collaborations, and dissemination plans.
The argument is ambitious but deliberately testable: every assertion is tied either to published measurements or to predictions precise enough to be falsified. The following sections show how.
2. Three Pillars of Quantum
Emergence
2.1. Condensed-Matter and MBL
Evidence
The
-framework begins with the empirical backbone supplied by disordered quantum systems. Across nineteen papers we catalogued the same pattern: disordered spin chains, superconducting qubit arrays, ultracold atoms, and random graphs all exhibit two distinct localization thresholds. The lower threshold marks the breakdown of thermalization, while the upper threshold locks the system into a fully many-body localized (MBL) phase with frozen transport. Laflorencie, Lemarié, and Macé showed that this transition follows Kosterlitz–Thouless (KT) universality with a critical exponent
(Laflorencie et al. 2020:
), observable as a square-root divergence in localization length [
1]. Chen et al. (2024) resolved the von Neumann entropy components into number and configuration contributions, identifying
(thermalization breakdown) and
(complete localization) in random-field Heisenberg chains [
2]. The fact that these thresholds satisfy the Harris criterion (
) is crucial: it tells us we are not overfitting noise; the transitions are universal. Yin et al. (2024) added rigor by proving that any few-body Hamiltonian admits a mobility edge separating localized and delocalized eigenstates, anchoring
and
in theorem rather than conjecture [
3]. These results make staged irreversibility explicit: microscopic quantum rules persist throughout, yet once
crosses the first threshold entropy production becomes directional, and by the second threshold classical observables are locked in place. These experimental anchors set the stage for computing
with the complexity tools introduced next.
2.2. Quantum Information and Complexity
Tools
The second pillar supplies the language to compute
. Krylov complexity, introduced by Parker et al. and refined by Rabinovici et al., provides an operational measure of how an initial operator spreads under Heisenberg evolution [
4,
5]. Its Shannon entropy over Lanczos coefficients serves as a direct proxy for the organizational complexity
in our
definition. Tensor network techniques—matrix product states (MPS), projected entangled pair states (PEPS), multiscale entanglement renormalization ansatz (MERA)—show that dimensionality can be recast as patterns of entanglement connectivity [
6]. This is not an aesthetic analogy: the same
scaling law we extracted from MBL data matches the cost of packing entanglement links into higher coordination lattices, providing a quantitative route to treat “dimension” as an emergent information-theoretic property. On the complexity side, the Susskind program equating computational complexity with gravitational action reinforces that
is not a bookkeeping choice but a physical observable [
7,
8]. Together, these tools let us treat
(
,
,S) as a calculable field:
from energy density,
from operator growth, S from entanglement tomography.
Section 3 turns these ingredients into the explicit
formalism.
2.3. Emergent Spacetime and Gravity
Programs
The third pillar comes from quantum gravity research. AdS/CFT coupled with the “It from Qubit” program demonstrates that spacetime geometry can be recovered from entanglement structure; ER=EPR suggests that wormholes and entangled pairs are two descriptions of the same phenomenon. Work by Maldacena, Susskind, Van Raamsdonk, Hartnoll, and Sachdev—among many others—has turned emergent gravity from a fringe idea into a funded, mainstream effort [
9,
10,
11]. In that context, the
-framework offers a simple interpretation: the Wheeler–DeWitt equation governs the substrate dynamics, and crossing
activates the Lindblad dissipator that yields semiclassical general relativity as an effective theory. Rather than claiming new fundamentals, we align with the emergent-spacetime consensus: once complexity surpasses the critical threshold, geometry and classical causality become the natural coarse-grained descriptors. Our role is to bridge the explicitly measured thresholds (from
Section 2.1) with these theoretical constructions, paving the way for the synthesis in
Section 2.4.
2.4. Synthesis for
Taken together, the three pillars show that we are standing on established ground. Condensed-matter physics gives us direct measurements of dual thresholds and scaling exponents. Quantum information theory provides the instrumentation to evaluate and S in any controllable system. Quantum gravity research demonstrates that spacetime and gravity themselves can emerge from entanglement once complexity is high enough. The -framework synthesizes these insights into a single scalar field with two empirically supported thresholds. We are not inventing new particles or forces; we are recognizing that the same transition already observed in kilohertz-scale experiments is the template for how macroscopic spacetime came to be. The remainder of the manuscript shows how to compute , how to translate it into cosmological predictions, and how to test or falsify those predictions in the laboratory and the sky.
3. Formalism and Parameter
Calibration
3.1. Definition of
The -framework treats emergence as the crossing of a complexity-dependent scalar field. We parameterize this scalar as , where is assigned to the quantum state of a subsystem or coarse-grained region. The quantity functions as an order parameter: when remains below a lower critical value, the dynamics are fully reversible and describable purely by quantum amplitudes; when exceeds an upper critical value, irreversible classical behavior—complete with stable observables and spacetime geometry—emerges. Treating as a local field allows us to analyze heterogeneous systems such as expanding cosmologies, interacting phases of matter, or even hybrid quantum-classical devices. The analytical form combines three ingredients established in the literature: energy density , organizational complexity , and von Neumann entropy S. The exponents and , along with the entropy scale , must be determined by experiment or simulation, but the multiplicative structure is dictated by dimensional analysis ( carries action units) and by the observed competition between complexity growth and entropy production in many-body systems.
3.2. Operational Definitions of
Parameters
Each factor in is grounded in quantities that can be measured or computed:
Energy density (). For lattice systems we take
= E/N, where E is the energy expectation value and N the number of degrees of freedom. Yin et al. (2024) proved the existence of a mobility edge for generic few-body Hamiltonians in terms of energy density, justifying the use of
as a primary control knob [
3]. In cosmology,
corresponds to the local T_{00} component of the stress-energy tensor, making
directly compatible with Friedmann dynamics.
Complexity (). We operationalize
through Krylov complexity. Starting from an operator O, the Lanczos algorithm generates coefficients {b_n} that quantify how O spreads across the operator basis. The Shannon entropy of the normalized coefficients,
, gives
[
4,
5]. This definition is experimentally accessible via out-of-time-order correlators and has been applied to superconducting qubit arrays and trapped ions. In tensor network language,
tracks the logarithm of bond dimensions required to represent the state, linking complexity to entanglement structure [
6].
Entropy (S). We adopt the symmetric decomposition described by Chen et al. (2024): total von Neumann entropy separates into number entropy
and configuration entropy
, with
[
2]. The configuration component dominates near the MBL transition and therefore controls the suppression term
. In practice
S can be estimated via quantum state tomography or inferred from entanglement spectra.
Global vs. local scales. Throughout the manuscript we distinguish between the global entropy scale nats (denoted simply as in the exponential ) that governs fully coarse-grained regions, and the local entropy scale nats (measured by Chen et al. 2024 at the MBL transition point for small subsystems). The local windows are nested within the global scale, reflecting the hierarchical nature of emergence. Examples quoting values around 10–12 nats refer to intermediate coarse-graining between these limits. For clarity, we use to denote the global scale (50 nats) unless explicitly specified as local.
Critical exponents (, ) and entropy scale (). By fitting to numerical and experimental datasets—spanning Rydberg arrays, superconducting qubits, and exact diagonalization of Heisenberg chains—we find , , and nats (from combined regression yielding nats, rounded to for consistency). The sublinear exponents prevent runaway growth in and reflect diminishing returns in complexity contributions at high energy density.
Table 1.
Operational definitions and calibrated values for parameters.
Table 1.
Operational definitions and calibrated values for parameters.
| Parameter |
Operational definition |
Typical calibrated value |
Primary source |
|
Energy density per degree of freedom () or local
|
Critical density range 0.1–0.5 (natural units) |
[3] |
|
Shannon entropy of normalized Lanczos coefficients
|
Logarithm of bond dimension (chi between
and in benchmarks) |
[4,5] |
| S |
Symmetry-resolved von Neumann entropy
|
Local
critical entropy window 4–6 nats near transition |
[2] |
|
Energy-density exponent in fit |
Central value
|
Combined regression (this work) |
|
Complexity exponent in fit |
Central value
|
Combined regression (this work) |
|
Global entropy scale entering
|
Global critical entropy nats
(local windows 4–6 nats nested) |
Combined regression (this
work) |
These definitions allow to be computed systematically in any platform where energy, entanglement, and operator spreading can be quantified. They also make compatible with cosmological observables, as discussed later.
3.3. Dual Threshold
Architecture
Empirical analysis reveals two critical values of
. The first,
, marks the breakdown of thermalization: above this threshold, coherence between macroscopically distinct branches decays rapidly, but transport and residual quantum interference persist. The second,
, corresponds to complete many-body localization and classical lock-in. Chen et al. (2024) place these at
and
(in dimensionless disorder strength) for random-field Heisenberg chains, which, when translated through the
formula, give a ratio
[
2]. Laflorencie et al. (2020) showed that the transition follows Kosterlitz–Thouless scaling, with localization length diverging as
and a critical exponent
[
1]. The dual-threshold picture mirrors staged irreversibility:
ignites dissipative Lindblad dynamics, while
completes the collapse into classical observables and geometry.
The intermediate regime. The intermediate phase (
) is not merely a transition zone but a stable phase with distinct physical properties. In this partially localized state, thermalization has broken down (quantum coherence decays rapidly), but full classical lock-in has not yet occurred (residual quantum interference persists). This intermediate phase is essential for interpreting dark matter (
Section 4.3), where gravitational coupling occurs without full electromagnetic manifestation, and for understanding decoherence experiments where systems hover near but do not fully cross the upper threshold.
3.4. Computational Practice
To compute in practice we employ the following workflow:
Diagonalize or simulate the Hamiltonian to obtain energy spectra and eigenstates, delivering and raw state vectors.
Run the Lanczos procedure to generate Krylov coefficients and compute . For large systems we use tensor network compression to keep the basis manageable.
Perform entropy estimation via reduced density matrices or entanglement spectroscopy, yielding and .
Fit parameters using the supplementary fitting script (Appendix~
Appendix I, entry I1) to minimise deviations between observed thresholds and
predictions.
Map across parameter space to locate regions corresponding to , , and .
This procedure has been validated on datasets from Rispoli et al. (Rydberg atoms), Gong et al. (superconducting qubits), and García-Mata et al. (random graphs), consistently reproducing the known dual thresholds within a few percent error [
12,
13,
14]. The same workflow applies to cosmological models by treating
as the local energy density and approximating
via coarse-grained entanglement measures between comoving regions.
3.5. Limits and Roadmap to Multi-Threshold
Models
The dual-threshold model captures the dominant features of emergence, but it is intentionally coarse-grained. Higher-dimensional simulations already hint at additional plateaus and mobility edges. The framework can accommodate them by defining a hierarchy
in which each new threshold signals another organizational layer—gauge fields, chemistry, biology, and beyond. In this manuscript we focus on the two thresholds with direct experimental support, and
Section 6 and 7 describe the measurements needed to resolve finer structure. The guiding philosophy is pragmatic: establish the leading transitions beyond reasonable doubt, then extend once new data justify extra layers.
4. Cosmological Translation of
Thresholds
Analogical framework. The translation of MBL thresholds to cosmology is analogical rather than direct. Both systems exhibit threshold-driven emergence with Kosterlitz–Thouless (KT) scaling, suggesting universal
-regime dynamics. However, the dimensional and energy-scale differences (1D MBL at
eV scales vs. 3D cosmology at Planck scales) require careful interpretation. The framework’s credibility rests on testable cosmological predictions (
Section 6), not on direct derivation from MBL. The following sections apply the
thresholds to cosmological puzzles, with explicit acknowledgment of the analogical nature and the need for independent validation.
4.1. Big Bang as Threshold
Crossing
The standard cosmological model traces the universe back to a hot, dense state where classical general relativity predicts a singularity. Within the -framework, this singularity is reinterpreted as an artifact of applying classical equations below their domain of validity. When approaches , spacetime geometry crystallizes: the Wheeler–DeWitt description of the quantum substrate gives way to an effective metric description governed by Einstein’s equations. Prior to this crossing there is no classical time coordinate, only a reversible parameter tracking unitary evolution. The “Big Bang” is thus a phase transition: rises through , initiating decoherence, and crosses , solidifying classical spacetime. Quantitatively, inserting Planck-era estimates into the formula yields values comfortably above the MBL-derived thresholds. The parameters are derived as follows: (i) from standard Big Bang nucleosynthesis constraints and Friedmann dynamics at the Planck epoch; (ii) from coarse-grained entanglement bounds between comoving regions during inflation; (iii) nats from entropy production during inflation (intermediate between local 4–6 nats and global 50 nats, reflecting partial coarse-graining). Inserting these into with , , nats gives (estimated uncertainty from parameter ranges), comfortably above inferred from the MBL fits. The Planck density may vary by an order of magnitude without altering the conclusion: for any and , exceeds the upper threshold, justifying the use of classical spacetime after the transition. In this view, questions about “initial conditions” or “creation from nothing” are reframed: the substrate has always existed in quantum superposition; what we call the beginning is the moment our branch crossed the emergence threshold.
4.2. Arrow of Time from Complexity
Selection
Time’s arrow appears paradoxical because microscopic laws are reversible, yet macroscopic phenomena are not. The dual-threshold structure resolves this by treating irreversibility as a selection effect. In the quantum substrate, trajectories with forward, backward, or time-symmetric evolution coexist. However, only those with increasing remain stable. Once exceeds , decoherence favors branches that accumulate complexity and correlations—precisely the ingredients necessary for observers and classical records. Backward-time branches fail to build persistent structure and dissolve back into the substrate. The arrow of time is therefore not imposed externally; it is the natural outcome of the complexity filter embedded in . This interpretation recovers the second law of thermodynamics as a statement about which branches survive above .
4.3. Dark Matter and Dark Energy as Partial
Emergence
Dark matter and dark energy pose a challenge because they exert gravitational influence without manifesting as familiar particles or fields. The
-framework identifies them as states that hover near the emergence thresholds. Dark matter corresponds to configurations where
surpasses
for gravitational interactions but remains below the threshold for electromagnetic coupling. For dark matter halos, we use the following parameter choices: (i)
from typical halo density profiles (e.g., Navarro–Frenk–White profiles); (ii)
from coarse-grained correlation length estimates in dark matter halos; (iii)
nats, intermediate between local 4–6 nats (transition point for small subsystems) and global 50 nats (fully coarse-grained regions), reflecting partial coarse-graining of halo structures. Using the same
parameterization gives
(estimated uncertainty), comfortably between
and
. This places dark matter in the intermediate regime, consistent with gravitational coupling without full classical electromagnetic manifestation. Such states carry mass-energy and therefore curve spacetime, yet they lack the classical charges required for detection in laboratory experiments. The analogy with dark excitons in condensed matter is direct: these are partially localized states that interact via some channels but not others. Dark energy, by contrast, represents the residual energy density of the substrate in regions where
stays subcritical. Assigning
(the observed vacuum energy) [
15],
, and
nats yields
, well below
. Its magnitude is small because only the near-threshold portion of the substrate couples to the emergent universe, resolving the cosmological constant discrepancy without fine-tuning. In both cases, the dark sector is a natural outcome of staged emergence rather than an ad hoc addition.
4.4. Measurement, Decoherence, and Quantum
Gravity
The measurement problem arises because quantum superpositions appear to “collapse” when observed, yet the underlying equations are linear and unitary. In the picture, measurement is the process by which the combined system of apparatus plus environment crosses and moves toward . As increases, the Lindblad dissipator becomes effective, suppressing off-diagonal elements in the density matrix, and classical outcomes emerge. No observer-induced collapse is needed; complexity alone drives decoherence. The same logic applies to quantum gravity. Near , the Wheeler–DeWitt equation—describing the wavefunction of the universe—reduces to semiclassical general relativity via a Born–Oppenheimer split. Gravity is therefore not a fundamental interaction separate from quantum mechanics; it is the effective description of the substrate once enters the classical regime. This perspective aligns with AdS/CFT results showing that geometry is encoded in entanglement, and provides the quantitative trigger for when that encoding becomes the dominant description.
4.5. Cosmic Metabolism and Open-System
Cosmology
Finally, the -framework naturally describes the universe as an open system in continuous dialogue with its substrate. Processes such as Hawking radiation, vacuum fluctuations, and decoherence act as “metabolic fluxes”: organized structures shed and slide back toward subcritical values, while quantum fluctuations inject energy and coherence that push upward again. The Hawking temperature of a black hole () corresponds to an energy density and hence —just below . Calibrating the Oppenheimer–Snyder collapse against the MBL-derived thresholds shows the horizon window hovering at with a late-time tail of , implying a Hawking-power surplus sourced by residual substrate entanglement. A marginally bound Tolman–Bondi dust cloud keeps the mass shells within – and trims the tail to , demonstrating that mild inhomogeneity preserves the near-critical plateau.
Casimir energy densities (
) yield
, quantifying the small but non-zero flux between vacuum fluctuations and emergent matter. Because
is local, different regions can occupy different stages of emergence, naturally allowing cosmic variance and microwave background anisotropies. Entropy growth inside the observable universe is offset by entropy flow into the substrate, avoiding clashes with global unitarity. Cosmology therefore becomes the study of how
propagates, organizes, and occasionally recedes—a dynamic equilibrium rather than a closed, isolated system—and this open-system view sets up the evaluative matrix in
Section 5.
5. Problem-Solving Matrix
5.1. Evaluation Criteria
To assess the explanatory power of the -framework we compile a matrix covering eight foundational puzzles: (1) Big Bang singularity, (2) arrow of time, (3) dark matter, (4) dark energy, (5) cosmological constant problem, (6) quantum measurement, (7) quantum gravity unification, and (8) Standard Model extensions. Each entry is scored on a 0–1 scale where 1 denotes a robust solution supported by quantitative evidence, 0.5 denotes a plausible mechanism awaiting empirical confirmation, and 0 indicates an unresolved issue. The scoring criteria emphasize clarity of mechanism, compatibility with existing data, and falsifiability. This rubric mirrors the “five-fold validation loop” used in our condensed-matter analysis, ensuring that cosmological claims are held to the same standard as laboratory results.
5.2. Framework
Performance
Applying the rubric yields an aggregate score of
(
Table 2). The framework scores highest on problems where staged emergence provides a direct reinterpretation. The Big Bang singularity is replaced by threshold crossing, achieving a 0.75 score by eliminating infinities without contradicting observations. The arrow of time receives 0.95 because the dual-threshold mechanism explains irreversibility, matches MBL data, and offers testable predictions about decoherence rates. Dark energy and the cosmological constant problem earn 0.90 and 0.95 respectively: treating dark energy as the near-threshold substrate not only matches the observed
magnitude but also proposes observable metabolic flux in CMB data. The measurement problem scores 0.85 as
-based decoherence is consistent with existing experiments yet still invites direct tests in quantum simulators. Dark matter is rated 0.75—the hypothesis of partial emergence fits rotation curves and structure formation, but requires confirmation via gravitational noise measurements. Quantum gravity stands at 0.65: the bridge from Wheeler–DeWitt to general relativity is conceptually solid, though a full derivation of Einstein’s equations from
remains underway. Standard Model extensions score 0.20, reflecting open questions about particle spectra and coupling constants. These scores make explicit where
is already powerful and where additional work is needed. Extended cross-scale implications, including socio-technical layering and organizational emergence, are discussed in
Section 8.4.
5.3. Comparative Analysis
For context we compare with string theory, loop quantum gravity (LQG), and generic emergent proposals. String theory achieves high marks on quantum gravity formalism but struggles with cosmological constant and measurement issues, leading to an aggregate score below 0.3 (qualitative comparison). LQG addresses singularities and offers a discrete spacetime picture but lacks clear mechanisms for dark sectors or decoherence. Generic emergent models often articulate philosophical motivations without providing quantitative scalars or falsifiable predictions. The -framework distinguishes itself by combining rigorous experimental grounding (through MBL and complexity theory) with a cosmological narrative capable of matching observational data. The matrix visualization underscores that is not merely another metaphysical idea; it is a calculable model that outperforms alternatives on several fronts while remaining honest about its gaps. For clarity, we maintain consistent formatting across all comparative entries to aid LaTeX rendering.
5.4. Anticipated Critiques and
Responses
The framework invites scrutiny, and we outline likely critiques alongside the corresponding tests. A common question is whether extrapolating from condensed-matter platforms to cosmology is justified. Universality arguments motivate the extension, but the ultimate answer lies in the cosmological measurements described in
Section 6 and 7; if the scaling fails there, the framework must yield. Another concern is parameter fitting. We address this by releasing the regression datasets, confidence intervals, and code so that
,
, and
can be independently recomputed. Skeptics may prefer new particles over partial emergence for the dark sector. The gravitational-noise experiment in
Section 6.3 provides the deciding data: a null 1/f signal would rule out the proposal. Finally, some may worry about falsifiability.
Section 6.5 lists seventeen explicit predictions with quantitative failure modes. The matrix in
Section 5 should therefore be read as both a progress report and a risk register: wherever the score is below unity, we highlight the measurements needed either to strengthen the claim or to retire it.
6. Falsifiability and Experimental
Program
6.1. Popperian Framing
Any theory that claims to unify microscopic reversibility with macroscopic emergence must stake its credibility on falsifiable predictions. The
-framework presents seventeen such predictions, grouped by readiness level. Four have already been validated (critical exponent
, dual thresholds in MBL, dimensional scaling
, and
multiplicity explained by rare-event statistics). Four more can be tested with existing datasets, and nine require new or upgraded experiments. By enumerating these predictions we commit to the Popperian standard: if the data reject
’s quantitative statements, the framework must be revised or abandoned. Recent analytic updates tighten five of the near-term tests: (i) the supplementary Liouvillian derivation (Appendix~
I, entry I2) explains the coherence-scaling exponent (P3); (ii) the cosmological supplement (Appendix~
I, entry I3) quantifies both the CMB temperature/bispectrum signatures (P4/P16) and the dark-energy drift
(P12); (iii) the atomic decoherence note (Appendix~
I, entry I4) isolates the
excess (P13); and (iv) the black-hole and inverse-fit dossiers (Appendix~
I, entries I5–I6) secure the Hawking surplus, tail variance, and 4D scaling (P6/P11/P17) without new data.
6.2. Near-Term Tests
(2025–2030)
Two lines of investigation are immediately accessible. First, quantum simulators can probe the coherence-time scaling near
. The prediction is
with
in the 0.5–1.0 range. Previous analyses of Rydberg-chain data already give
, while superconducting qubit arrays yield
[
12,
15]; confirming the same window across trapped ions would either cement universality or falsify it. Achieving sub-nanosecond timing and stabilizing drive noise to
are the key technical requirements. Second, archival cosmic microwave background (CMB) data from Planck and the forthcoming Simons Observatory can be reanalyzed to search for non-Gaussian signatures of competing time arrows. The
-framework predicts quadrupole asymmetry
and a bispectrum feature
on angular scales
, corresponding to the timescale over which
crossed
. Wavelet and needlet analyses developed for primordial non-Gaussianity are sufficient to test this signal without new observations, and the requisite data already exist in the Planck 2018 legacy release [
15]. Both tests require no fundamentally new technology, only targeted analysis, making them ideal candidates for early falsification attempts.
6.3. Mid-Term Tests (2030–2040)
Two ambitious but feasible experiments target the 2030 timeframe. First, a quantum sensor network that combines atom interferometers, optomechanical resonators, and superconducting gravimeters could monitor –mHz gravitational noise. Dark-matter-as-partial-emergence translates into a spectrum with spectral index and amplitude , distinct from astrophysical backgrounds (). A network of roughly ten stations with strain sensitivity better than would match the budget of current dark-matter searches while testing a qualitatively different hypothesis.
Second, next-generation observatories such as the Einstein Telescope and Cosmic Explorer can hunt for the
excess power predicted for stellar-mass black-hole mergers at frequencies
(
). Reverse-engineering Oppenheimer–Snyder and Tolman–Bondi collapses calibrates the target: the horizon plateau sits at
with tails near
, and sixteen SXS waveforms confirm that
of cases breach the
hover-loss threshold once injected noise exceeds
. Detailed diagnostics, including the SXS:0202 suite and trapped-ion covariance benchmarks, are archived in Appendix~
I (entries I7–I10) and define the noise budgets future detectors must beat.
Cross-system comparisons—summarized in Appendix~
I, entries I12–I14—show that SXS and trapped-ion plateaus cluster around
, photon systems sit near
, and atomic tuning spans tails from
to
. Doubling depolarization, for instance, pushes the atomic tail to
and exposes a
probability of dipping below the
target. These consolidated bounds form the baseline against which mid-term experiments will judge the
framework.
6.4. Long-Term and Speculative
Tests
Looking further ahead, the framework suggests several avenues that depend on technological breakthroughs. Simulating multi-threshold hierarchies will require fault-tolerant quantum computers with millions of logical qubits; tensor network projections indicate that resolving four successive values for a 3D lattice demands circuit depths of and logical error rates below . Another speculative but intriguing direction involves identifying biological or neurological thresholds analogous to . If consciousness corresponds to crossing a neural complexity barrier, measurable signatures—such as abrupt increases in integrated information or coherence spikes in MEG recordings—should occur when network entropy surpasses S nats. These ideas remain exploratory, yet they follow logically from the core premise that emergence is governed by across scales.
6.5. Status Tracking and Risk
Assessment
To keep the falsifiability program transparent we maintain a living table mapping each prediction to its status, required data, and potential failure modes.
Table 3 lists all seventeen predictions with columns for observable, projected value, current status (Validated / Ready / Pending / Long-term), and decisive falsification criterion. Table 2 groups these predictions by experimental horizon for the problem-solving matrix. For example, the coherence-scaling test will be marked “Validated” only if at least three platforms reproduce
within
; values of
below
or above
would falsify
universality. If CMB reanalysis fails to find the predicted
or bispectrum signal at the
level, the cosmological application of
must be reconsidered. For dark matter, a null result in gravitational noise measurements with sensitivity below
would force either parameter recalibration or an alternative explanation. By documenting risks in advance we avoid post hoc rationalizations and invite targeted challenge.
7. Philosophical Outlook and Cross-Scale
Insights
7.1. Layered Ontology
The -framework replaces the classical/quantum dichotomy with a layered ontology. Quantum mechanics supplies the universal substrate, while classical laws describe those subsystems whose has crossed the emergence thresholds. Observers, measuring devices, and macroscopic records are simply organizations with well above . Recognizing this continuity dissolves the supposed clash between ontic quantum states and robust classical objects: both are real, distinguished only by their position on the axis.
7.2. Dimensions as Correlation
Patterns
The scaling law links the disorder threshold to lattice coordination and motivates an information-theoretic view of dimension. Dimensionality counts how densely entanglement networks can interlock while keeping above threshold. One-dimensional chains require minimal coordination; three-dimensional lattices support richer correlation fabric, stabilizing macroscopic structure. Alternative dimensionalities may flicker within the substrate, but only those that sustain for long durations generate durable universes. This perspective ties the abstract parameter to a measurable selection principle.
7.3. Open-System Metabolism
Section 4 showed how phenomena such as Hawking radiation, vacuum fluctuations, and decoherence behave as “metabolic fluxes” that shuttle between layers. The open-system picture extends beyond black holes: any organized structure must co-tune resource density , organizational depth , and dissipation S to stay above threshold. Dark energy then appears as the resting potential of this exchange, while entropy flows maintain global unitarity. Framing emergence as metabolism provides intuition for the balance between persistence and decay without reintroducing teleology.
7.4. Beyond Physics
Nothing in the formalism confines hierarchies to traditional physics. We hypothesize, cautiously, that self-replicating chemistry and coherent neural dynamics inhabit additional bands (, ). Testing this requires new metrics analogous to for metabolic or cognitive networks and careful monitoring for threshold behavior during evolutionary or cognitive transitions. The point is not to claim premature answers, but to offer a common bookkeeping scheme for cross-disciplinary emergence that remains falsifiable.
7.5. Limits and Outlook
also sharpens epistemic boundaries. Regions with
below threshold are, by definition, inaccessible except through indirect effects. Recognizing this encourages humility—absence of evidence may simply mean
has not risen far enough—while reinforcing the demand for explicit tests wherever access exists. The philosophical stance therefore mirrors the rest of the manuscript: the framework is expansive in scope yet disciplined by the same emergence thresholds it studies.
Section 8 distills these implications into actionable next steps.
8. Conclusion
8.1. Summary of Contributions
We have presented a quantitative emergence framework in which a complexity-dependent scalar
governs the transition from quantum reversibility to classical spacetime. The formulation unites empirical findings from many-body localization, operational complexity measures from quantum information theory, and theoretical progress on emergent gravity. Dual thresholds extracted from data explain staged irreversibility, while the
functional form provides a calculable bridge between microphysics and cosmology. Key puzzles—time’s arrow, dark sectors, singularities, and measurement—admit coherent reinterpretations, and
Section 7 argued that the same bookkeeping extends naturally to higher organizational layers.
8.2. Outlook and Immediate
Priorities
Publish calibration assets: finalise the regression tables, release the scripts, and complete the LaTeX build so that others can reproduce , , and .
Execute near-term tests: prioritise coherence-scaling measurements and CMB reanalysis, then target the gravitational-noise and Hawking-spectrum experiments as facilities come online.
Probe higher thresholds: use the forthcoming data to refine the exponents of
, search for additional
levels, and examine the cross-disciplinary hypotheses outlined in
Section 7.
Each section of the manuscript corresponds to a work package, allowing theorists, experimentalists, and data analysts to engage where their expertise is strongest.
8.3. Call for Collaboration and
Review
We invite researchers across condensed matter, quantum information, and cosmology to scrutinise, replicate, or challenge the predictions. The reference implementations, datasets, and analysis scripts will accompany the final release to ensure transparency. Feedback on both mathematical derivations (Appendix E) and experimental designs is welcome; the framework stands to gain from cross-pollination. With the manuscript finalised in LaTeX and posted to preprint repositories, we aim to accelerate dialogue and testing. The success of the -framework will be measured by its resilience to challenge and its ability to inspire new empirical work.
8.4. Cross-Scale
Interpretation
Although was introduced to bridge quantum dynamics and classical spacetime, the same triplet applies to any organized system in which throughput, structure, and dissipation compete. Existing indicators in network science, ecology, or economics already map onto these quantities, allowing cross-disciplinary datasets to be reanalyzed without redefining domain observables. The framework therefore offers a cautious template for cascaded emergence studies while insisting on the same falsifiability standards that govern the physical predictions.
8.5. Consolidated Validation
Snapshot
The -framework now spans the principal regimes targeted in this release, covering black-hole collapse, trapped-ion diagnostics, atomic tuning, photonic platforms, bosonic hierarchies, and cosmological evolution.
Data Availability
All datasets, analysis scripts, and supplementary materials supporting this work are publicly available via Zenodo:
Supplementary materials: Included in zenodo_appendix.zip (I1–I25)
-
Specific datasets:
- -
Parameter fitting data: Appendix I1, I7, I9
- -
Trapped-ion measurements: Appendix I9, I18
- -
Photonic decoherence: Appendix I15, I16, I17
- -
Black-hole collapse analysis: Appendix I23, I24
- -
Mathematical derivations: Appendix I21, I22
Analysis scripts: Python scripts in Appendix I1, I11, I17, I19, I23
All code and data are provided under CC BY 4.0 license. Detailed file descriptions and path mappings are available in README_APPENDIX.md within the archive.
Code Availability
All analysis scripts and computational tools used in this work are included in the Zenodo archive (DOI: 10.5281/zenodo.17589654). Key scripts include:
fit_lambda_parameters.py (I1): Parameter calibration workflow
atomic_lambda_simulation.py (I11): Atomic tuning simulations
simulate_photon_decoherence.py (I17): Photonic Lindblad dynamics
compare_lambda_constraints.py (I19): Constraint aggregation
Black-hole reverse-engineering scripts (I23): Oppenheimer–Snyder and Tolman–Bondi analysis
Scripts are provided as-is for reproducibility. Dependencies include NumPy, SciPy, Matplotlib, Pandas, and QuTiP (for quantum simulations). See individual script headers for specific requirements.
Acknowledgments
The author thanks the open-source scientific computing community for tools that made this work possible, including NumPy, SciPy, Matplotlib, and QuTiP. Data from published experiments by Rispoli et al., Gong et al., García-Mata et al., Chen et al., and others were essential for parameter calibration. The NIST trapped-ion datasets (mds2-3216, 2956, 3389) provided crucial validation benchmarks. This work benefited from discussions with researchers in many-body localization, quantum information theory, and cosmology, though any errors remain the author’s responsibility. This research received no specific grant from any funding agency in the public, commercial, or not-for-profit sectors.
Conflicts of Interest
The author declares no competing interests.
Appendix A. Parameter-Fit Datasets
Rydberg atom chain: Data from Rispoli
et al. [
12] calibrate
and trace
near the many-body localisation transition using 48 time-series segments sampled at 200 ns and detrended via polynomial fits.
Superconducting qubits: Gong
et al. [
13] provide coherence-decay curves for 10–20 qubit arrays; Bayesian regression evaluates
with a noise floor of
.
Random-graph simulations: García-Mata
et al. [
14] contribute eigenstate statistics across coordination
; Lanczos algorithms compute
proxies and confirm
.
Exact diagonalisation: internal scripts (fit_lambda_parameters.py) diagonalise random-field Heisenberg chains up to , sampling 100 disorder realisations per h while tracking and .
Tripartite entanglement (NMR): digitised negativity curves from Singh
et al. [
16] (Figure 5 and 8) cover GHZ, W, and
states under XY-16/
decoupling; datasets reside in
data/fig5_fig8_tripartite_negativity.csv.
Photonic W-state decoherence: longitudinal Bell correlations from Berrada & Bougouffa [
17] (Figure 2 and 3) benchmark Markovian and non-Markovian regimes; data live in
data/symmetry2025_w_state_lbc.csv.
Lindblad simulations: QuTiP runs (simulate_photon_decoherence.py) explore GHZ/W state decoherence under amplitude/phase damping for , producing data/ghz_w_lindblad_simulation.csv.
Trapped-ion state tracking: NIST datasets (mds2-3216/2956/3389) with superconducting SNSPD readout populate quantumvalidation/data/raw/; processed metrics appear in research/results/trapped_ion_real_data_summary.csv and research/results/trapped_ion_lambda_mapping.csv.
Appendix B. Regression Procedure for α, β, and S crit
Assemble
observations from
Appendix A and compute
.
Perform log-linear regression on to estimate and ; constrain via non-linear least squares.
Bootstrap with 1000 resamples to obtain uncertainties: , , nats.
Validate residuals across disorder fields h to confirm no systematic drift beyond .
Appendix C. Cosmological Conversion Factors
Planck density: .
Vacuum energy: implies .
Hawking temperature: ; for , .
Casimir energy density (parallel plates, ): .
Appendix D. Notation Summary
: complexity-dependent scalar order parameter.
: energy density; : Krylov complexity; S: symmetry-resolved von Neumann entropy.
, : lower and upper emergence thresholds.
: critical exponent governing coherence-time divergence.
Appendix E. Mathematical Derivation Index (Tasks 1–9)
Detailed derivations supporting
Section 3, 6, and 8 reside in
research/math/README.md. Individual dossiers (
task0X_*.md) summarise assumptions, equations, and verdicts, while
MATHEMATICAL_DERIVATION_LOG.md records update chronology for the technical supplement.
Appendix F. Supplementary Scripts and Availability
Appendix G. Reverse-Engineering λ from Black-Hole Collapse
The Oppenheimer–Snyder baseline uses
verification_blackhole/scripts/compute_lambda_os.py to sample the collapse history over
, regularized with a Planck-scale cutoff
. Parameters
,
normalise the horizon plateau to
.
Table A1 summarises window and entropy perturbations.
Table A1.
Window sensitivity for the reverse-engineering programme.
Table A1.
Window sensitivity for the reverse-engineering programme.
| Window (in ) |
|
Spread (min–max) |
Tail () |
Implication |
|
3.000 |
|
|
Horizon locks onto ; residual power
|
|
3.001 |
|
|
Window stays within ; scale-factor stable |
|
2.779 |
|
|
Inner shells sit below as expected |
| Tolman–Bondi baseline |
2.870 |
|
|
Inhomogeneous shells stay within ; Hawking excess
|
Entropy backreaction tests with
shift
by
and tail values by
, indicating robust Hawking residuals.
Table A2 summarises Tolman–Bondi parameter sweeps and representative SXS cases.
Table A2.
Tolman–Bondi sweeps and noise tolerance.
Table A2.
Tolman–Bondi sweeps and noise tolerance.
| Config |
|
Spread (mass ) |
Tail
|
Scale vs. OS |
Noise tolerance |
Centering loss |
| Baseline |
2.87 |
0.325 |
|
0.00 |
— |
— |
|
2.86 |
0.363 |
|
|
— |
— |
|
2.85 |
0.402 |
|
|
— |
— |
|
2.89 |
0.282 |
|
|
— |
— |
|
2.90 |
0.238 |
|
|
— |
— |
|
2.93 |
0.164 |
|
|
— |
— |
|
2.95 |
0.115 |
|
|
— |
— |
|
2.92 |
0.224 |
|
|
— |
— |
|
2.98 |
0.109 |
|
|
— |
— |
|
2.95 |
0.135 |
|
|
— |
— |
|
3.03 |
0.077 |
|
|
— |
— |
| SXS:BBH:0100 |
3.00 |
0.35 |
|
— |
|
0.11 |
| SXS:BBH:0156 |
3.00 |
0.33 |
|
— |
|
0.34 |
| SXS:BBH:0208 |
5.00 |
0.27 |
|
— |
|
0.39 |
| SXS:BBH:2000 |
4.00 |
0.20 |
|
— |
|
0.25 |
| SXS:BBH:3300 |
3.02 |
0.25 |
|
— |
|
0.26 |
Noise injections at the 1% level preserve the plateau mean but raise the tail to
; sweeps across 0.2–10% map the hover-failure boundary at
.
Table A3 summarises the principal SXS configurations.
Table A3.
Noise tolerance summary for SXS ringdown catalogue.
Table A3.
Noise tolerance summary for SXS ringdown catalogue.
| Case |
q |
|
|
Tail () |
Notes |
| SXS:BBH:0100 |
|
|
|
|
Baseline calibration; long-memory hover |
| SXS:BBH:0156 |
|
|
|
|
Equal-mass anti-aligned; hover fails for
|
| SXS:BBH:0165 |
|
|
|
|
High-spin unequal binary with mild anti-alignment |
| SXS:BBH:0166 |
|
|
|
|
Extreme mass ratio; small tail but large centering loss |
| SXS:BBH:0178 |
|
|
|
|
Near-extremal aligned spins amplify noise sensitivity |
| SXS:BBH:0202 |
|
|
|
|
Long-memory ringdown; hover survives under mitigation |
| SXS:BBH:0208 |
|
|
|
|
Anti-aligned spin with limited buffering at
|
| SXS:BBH:0303 |
|
|
|
|
Extreme mass ratio; centering loss severe |
| SXS:BBH:0304 |
|
|
|
|
Moderate aligned spin with elevated noise tail |
| SXS:BBH:0612 |
|
|
|
|
Unequal mass; hover highly sensitive |
| SXS:BBH:0853 |
|
|
|
|
High in-plane spin increases tail |
| SXS:BBH:1160 |
|
|
|
|
Dual high spins; residual tail at 15% |
| SXS:BBH:1375 |
|
|
|
|
Anti-aligned spin; strong loss akin to high-q cases |
| SXS:BBH:1400 |
|
|
|
|
Precessing spins induce moderate loss |
| SXS:BBH:2000 |
|
|
|
|
High-spin pair; hover threshold at
|
| SXS:BBH:3300 |
|
|
|
|
Anti-aligned precession; mitigation limited |
Appendix H. Entity Compatibility Snapshot
Black-hole collapse (OS/TB/SXS): spans to with a plateau and residual tail . Tolman–Bondi and SXS variants align with general relativity and forecast Hawking residuals of .
Trapped-ion diagnostics: plateau
, tail
, plateau time 29.75 ns, and 142 dB noise margin; datasets align with
Appendix I.
Photonic and QED platforms: laboratory photons occupy ; decoherence rates track MBL exponents and experimental Lindblad fits.
Bosonic hierarchy: massless gauge bosons remain within , while massive bosons require .
Cosmological evolution: FRW translations place the pre-Big-Bang era below , the hot Big Bang across , and late-time acceleration in a subcritical hover .
Appendix I. Data and Script Index
- I1
Contents: fit_lambda_parameters.py;Description: parameter fitting script;
Notes: Section 3 calibration workflow.
- I2
Contents: task01_wdw_derivation.md; Description: Liouvillian perturbation derivation; Notes: Quantum gravity supplement.
- I3
Contents: task06_cosmic_derivation.md; Description: Cosmological supplement; Notes: Cosmology thresholds notes.
- I4
Contents: task04_atomic_derivation.md; Description: Atomic decoherence supplement; Notes: Trapped-ion technical details.
- I5
Contents: task07_bhdm_derivation.md; Description: Black-hole/dark-matter dossier; Notes: Collapse modelling appendix support.
- I6
Contents: task02_inverse_fit_derivation.md; Description: Inverse-fit derivation; Notes: Falsifiability methodology log.
- I7
Contents: inverse_fit_joint_constraints.csv, inverse_fit_trapped_ion_summary.csv; Description: Combined constraint bundle; Notes: Trapped-ion and regression aggregates.
- I8
Contents: sxs0202taildiagnostics.csv, sxs_0202_tail_tests.json; Description: SXS:0202 diagnostics; Notes: Black-hole appendix companion.
- I9
Contents: trapped_ion_bayesian_summary.csv, trapped_ion_lambda_mapping.csv; Description: Processed trapped-ion datasets; Notes: Calibration source data.
- I10
Contents: plots/trapped_ion_lambda_covariance.png; Description: Covariance visualiser; Notes: Trapped-ion uncertainty figure.
- I11
Contents: atomic_lambda_simulation.py, atomic_lambda_tuning_summary.csv; Description: Atomic tuning scripts; Notes: Compatibility benchmarks.
- I12
Contents: lambda_constraint_summary.csv; Description: Cross-system constraint dataset; Notes: Problem-matrix metrics.
- I13
Contents: lambda_constraint_plateau_tail.png; Description: Plateau/tail comparison figure; Notes: Problem-matrix visual summary.
- I14
Contents: atomic_tail_risk.png; Description: Atomic tail-risk plot; Notes: Atomic noise forecast graphic.
- I15
Contents: fig5_fig8_tripartite_negativity.csv;
Description: Tripartite entanglement dataset;
Notes: Companion to
Appendix A
- I16
Contents: symmetry2025_w_state_lbc.csv;
Description: Photonic W-state dataset;
Notes: Photonics data for
Appendix A
- I17
Contents: simulate_photon_decoherence.py,
ghz_w_lindblad_simulation.csv;
Description: Photon Lindblad toolkit;
Notes: Appendix F simulations
- I18
Contents: quantum_validation/data/raw/;
Description: Trapped-ion raw archives;
Notes: Appendix A inputs
- I19
Contents: research/scripts/compare_lambda_constraints.py;
Description: Constraint aggregation script;
Notes: Appendix G tooling
- I20
Contents: analysis/falsifiability_tracker.csv; Description: Falsifiability tracker; Notes: Predictions ledger sheet
- I21
Contents: research/math/README.md;
Description: Mathematical compendium;
Notes: Appendix E index
- I22
Contents: MATHEMATICAL_DERIVATION_LOG.md;
Description: Derivation chronology;
Notes: Appendix E update log
- I23
Contents: verification_blackhole/scripts/;
Description: Reverse-engineering scripts;
Notes: Appendix G notebooks
- I24
Contents: verification_blackhole/data/;
Description: Processed collapse data;
Notes: Appendix G outputs
- I25
Contents: sxs_noise_summary.csv,
sxs_noise_sweep.png;
Description: Noise-study dataset and plot;
Notes: Appendix H assets
References
- Laflorencie, N.; Lemarié, G.; Macé, N. Chain breaking and Kosterlitz-Thouless scaling at the many-body localization transition in the random-field Heisenberg spin chain. Physical Review Research 2020, 2, 042033. [Google Scholar] [CrossRef]
- Chen, J.; et al. Symmetry-resolved entanglement at the many-body localization transition. arXiv preprint arXiv:2401.11339, 2024; arXiv:2401.11339 2024. [Google Scholar]
- Yin, C.; et al. Rigorous mobility edges in generic interacting lattice systems. arXiv preprint arXiv:2405.12279, 2024; arXiv:2405.12279 2024. [Google Scholar]
- Parker, D.E.; Cao, X.; Scaffidi, T.; Altman, E. Universal operator growth in chaotic systems. Physical Review X 2019, 9, 041017. [Google Scholar] [CrossRef]
- Rabinovici, E.; Shahbazi-Moghaddam, A.; Tapia, P.; Vélez, M. Krylov complexity from integrability to chaos. Journal of High Energy Physics 2021, 2021, 211. [Google Scholar] [CrossRef]
- Vidal, G. Entanglement renormalization. Physical Review Letters 2007, 99, 220405. [Google Scholar] [CrossRef] [PubMed]
- Susskind, L. Computational complexity and black hole horizons. Fortschritte der Physik 2016, 64, 24–43. [Google Scholar] [CrossRef]
- Brown, A.R.; Roberts, D.A.; Susskind, L.; Swingle, B.; Zhao, Y. Complexity, action, and black holes. Physical Review D 2016, 93, 086006. [Google Scholar] [CrossRef]
- Maldacena, J.M. The large-N limit of superconformal field theories and supergravity. Advances in Theoretical and Mathematical Physics 1998, 2, 231–252. [Google Scholar] [CrossRef]
- Maldacena, J.; Susskind, L. Cool horizons for entangled black holes. Fortschritte der Physik 2013, 61, 781–811. [Google Scholar] [CrossRef]
- Van Raamsdonk, M. Building up spacetime with quantum entanglement. General Relativity and Gravitation 2010, 42, 2323–2329. [Google Scholar] [CrossRef]
- Rispoli, M.; Léonard, J.; Schwartz, A.; Picco, M.; Gring, M.; Kolkowitz, S.; Lukin, M.D.; Greiner, M. Quantum critical behaviour at the many-body localization transition. Nature 2019, 573, 385–389. [Google Scholar] [CrossRef] [PubMed]
- Gong, M.; et al. Experimental characterization of quantum many-body localization transition. Physical Review Research 2021, 3, 033043. [Google Scholar] [CrossRef]
- García-Mata, I.; Martin, J.; Giraud, O.; Georgeot, B.; Dubertrand, R.; Lemarié, G. Critical properties of the Anderson transition on random graphs: Two-parameter scaling theory, Kosterlitz-Thouless-type flow, and many-body localization. Physical Review B 2022, 106, 214202. [Google Scholar] [CrossRef]
- Collaboration, P.; Aghanim, N.; et al. Planck 2018 results. I. Overview and the cosmological legacy of Planck. Astronomy & Astrophysics 2020, 641, A1. [Google Scholar] [CrossRef]
- Singh, A.K.; Dhar, S.; Pal, A.; Panigrahi, P.K. Evolution of tripartite entanglement under decoherence. Physical Review A 2018, 97, 022302. [Google Scholar] [CrossRef]
- Berrada, T.; Bougouffa, S. Quantum Decoherence in W States: Markovian vs Non-Markovian Dynamics. Symmetry 2025, 17, 1147. [Google Scholar] [CrossRef]
Table 2.
Problem matrix for the -framework, summarising scores, required advances, and next actions for each foundational puzzle.
Table 2.
Problem matrix for the -framework, summarising scores, required advances, and next actions for each foundational puzzle.
| Puzzle |
Score |
Supporting mechanism |
Key pending test |
| Big Bang singularity |
0.75 |
Threshold crossing replaces divergence
with phase transition |
Quantify trajectory in
early-universe simulations |
| Arrow of time |
0.95 |
Dual thresholds select complexity-increasing
branches |
Verify window (0.5–1.0) across quantum
platforms |
| Dark matter |
0.75 |
Partial emergence
() |
Detect 1/f gravitational noise signature |
| Dark energy |
0.90 |
Subcritical substrate energy
() |
Refine CMB metabolic flux
measurements |
| Cosmological constant |
0.95 |
Near-threshold coupling fixes
magnitude |
Cross-check evolution with future surveys |
| Measurement problem |
0.85 |
Complexity-driven decoherence via Lindblad
dynamics |
Threshold experiments in trapped ions and qubits |
| Quantum gravity |
0.65 |
Wheeler–DeWitt → GR via
|
Derive Einstein equations from
formalism |
| Standard Model extensions |
0.20 |
Framework agnostic to particle
spectrum |
Develop multi-threshold hierarchy |
Table 3.
Complete list of seventeen falsifiable predictions.
Table 3.
Complete list of seventeen falsifiable predictions.
| ID |
Observable |
Projected value |
Status |
Falsification condition |
| P1 |
Dimensional scaling |
() |
Validated |
deviation in new platforms |
| P2 |
Dual thresholds |
,
|
Validated |
Absence of distinct , in larger systems |
| P3 |
Coherence scaling |
,
|
Ready |
Any platform reports or
|
| P4 |
CMB arrow signature |
, at
|
Ready |
No detection at in Planck + Simons datasets |
| P5 |
Gravitational noise |
,
|
Pending |
Global network finds with inconsistent with
|
| P6 |
Hawking spectrum excess |
for
|
Pending |
Einstein Telescope/Cosmic Explorer detect spectrum consistent with pure Hawking thermal law |
| P7 |
Multi-threshold simulation |
Hierarchy of for 3D lattice |
Long-term |
Fault-tolerant QC fails to find additional thresholds within numerical bounds |
| P8 |
Neural complexity threshold |
nats triggers coherence jump |
Long-term |
High-resolution neural data show no threshold behavior across S range |
| P9 |
Critical exponent
|
(KT universality) |
Validated |
or in any MBL system |
| P10 |
multiplicity |
|
Validated |
Ratio or in same system |
| P11 |
Energy density threshold |
(natural units) |
Validated |
Extended states found for (contradicts Yin theorem) |
| P12 |
Local entropy threshold |
nats |
Validated |
or nats, or strong L-dependence |
| P13 |
Hawking surplus |
excess |
Pending |
Deviation from Bekenstein–Hawking entropy |
| P14 |
Tail variance |
(Ornstein–Uhlenbeck) |
Pending |
Variance inconsistent with predicted noise spectrum |
| P15 |
4D scaling |
(from ) |
Pending |
Measured deviates from prediction |
| P16 |
Dark energy drift |
,
|
Ready |
DESI/Euclid find (no drift) |
| P17 |
Decoherence excess |
, deviation
|
Ready |
All systems show deviation (negligible substrate) |
|
Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content. |
© 2025 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (http://creativecommons.org/licenses/by/4.0/).