Physical Sciences

Article
Physical Sciences
Theoretical Physics

Jesper Lyng Jensen

Abstract: This work develops a Lorentz-invariant variational framework in which Fisher-information geometry appears as an intrinsic structural contribution to quantum dynamics. Motivated by longstanding attempts to connect quantum mechanics with information-theoretic principles, we introduce an action functional depending on the density and phase fields in the Madelung representation. Variation of this action yields a modified Klein–Gordon equation containing a single nonlinear term proportional to the four-dimensional Fisher-information curvature of the probability density. The standard Klein–Gordon equation is recovered when the structural parameter vanishes, ensuring full compatibility with established relativistic dynamics. Taking the nonrelativistic limit, we obtain a uniquely determined nonlinear Schrödinger equation in which the correction term is the functional derivative of the Fisher information. The resulting dynamics preserve probability, maintain the Hamilton–Jacobi correspondence, and contain the linear Schrödinger equation as a special case. Analytical expressions for Gaussian and superposed states demonstrate how the structural modification scales with spatial localization and interference structure, providing clear qualitative signatures that distinguish the model from previous nonlinear extensions and offer a theoretical basis for future experimental verification. The results establish a mathematically transparent link between information geometry and quantum dynamics and provide a foundation for future extensions to fermionic, gauge, and many-body systems.
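The reported scaling of the structural correction with spatial localization can be checked against the simplest case: for a one-dimensional Gaussian density the classical Fisher information is 1/σ², so tighter localization means a larger correction. A minimal numerical sketch (our own illustration, not the paper's four-dimensional Lorentz-invariant functional):

```python
import numpy as np

def fisher_information(rho, x):
    """Classical Fisher information I[rho] = ∫ (rho')^2 / rho dx,
    via finite differences and the trapezoidal rule."""
    drho = np.gradient(rho, x)
    integrand = drho**2 / rho
    return float(np.sum(0.5 * (integrand[1:] + integrand[:-1]) * np.diff(x)))

sigma = 0.5
x = np.linspace(-8.0, 8.0, 20001)
rho = np.exp(-x**2 / (2 * sigma**2)) / (sigma * np.sqrt(2 * np.pi))

I = fisher_information(rho, x)
print(I)  # analytically I = 1/sigma^2 = 4: tighter localization, larger correction
```

Halving σ quadruples I, which is the qualitative localization signature the abstract describes.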
Article
Physical Sciences
Theoretical Physics

Paulo Jorge Adriano

Abstract: The MMA–DMF framework connects cosmological “dark sector” phenomenology with quantum-foundational phenomena by treating a single screened scalar field as both a mediator of large-scale modified gravity and a stochastic vacuum bath responsible for gravitational decoherence. This paper consolidates the full, dated MMA–DMF validation record contained in the project materials (with an audited, frozen parameter set) and reports the complete test suite relevant to uncertainty and decoherence: (i) a strict Fluctuation–Dissipation Theorem (FDT) stability test for the Generalized Langevin Equation (GLE) memory kernel, which passes an energy-drift criterion of |slope| < 10⁻⁵ in long integrations; (ii) a dynamic contextuality roll-off test in which the CHSH Bell parameter transitions from the Tsirelson value S ≈ 2.828 at quasi-static settings to the classical bound S → 2 under fast modulation, quantified by explicit frequency-dependent suppression formulas; and (iii) a T-MAGIS atom-interferometry campaign prediction in which a density-modulated environment produces a detectable contrast loss ∆V ≈ 3.4 × 10⁻³ to 4 × 10⁻³ under representative configurations, with a tabulated scaling versus distance and interrogation time and a shot-noise sensitivity forecast yielding high signal-to-noise for hour-scale integration. We also summarize MMA–DMF-linked phenomenology across scales, including a joint cosmological likelihood structure with cross-covariance correction and representative reported values (H0, S8) ≈ (72.1 km s⁻¹ Mpc⁻¹, 0.761), plus a gravitational-wave echo delay estimate of ∆t_echo ≈ 32 ms for stellar-mass systems. The combined record constrains MMA–DMF by demanding simultaneous thermodynamic consistency of the stochastic sector, a controlled transition from contextual to classical correlations under finite response time, and a falsifiable laboratory decoherence signature under controlled density modulation.
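The two endpoints of the contextuality roll-off quoted above are standard quantum-mechanical quantities: the Tsirelson value S = 2√2 ≈ 2.828 follows from the singlet-state correlator E(a, b) = −cos(a − b) at the optimal CHSH analyzer settings. A minimal check (the paper's frequency-dependent suppression formulas are not reproduced here):

```python
import math

def E(a, b):
    """Singlet-state correlator for analyzer angles a, b (radians)."""
    return -math.cos(a - b)

# Optimal CHSH settings: a = 0, a' = pi/2, b = pi/4, b' = -pi/4.
a, ap, b, bp = 0.0, math.pi / 2, math.pi / 4, -math.pi / 4
S = abs(E(a, b) + E(a, bp) + E(ap, b) - E(ap, bp))
print(S)  # reaches the Tsirelson bound 2*sqrt(2) ≈ 2.828
```

Any local-realistic model is instead bounded by |S| ≤ 2, which is the classical limit the roll-off test approaches under fast modulation.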
Article
Physical Sciences
Theoretical Physics

Melih Gümüş, Bilgehan Barış Öner

Abstract: In this study, it is shown that inserting geometric drift vectors into the definition of the tetrads has a direct effect on the torsion tensor. This contribution offers an original point of view on gravitation while remaining compatible with standard approaches and recent cosmological observations. Theoretical calculations of galactic velocities and critical radii can be made without the need for dark matter. The differences between the geometric drift vector coefficients are strong candidates to explain dark energy.
Article
Physical Sciences
Theoretical Physics

G. Furne Gouveia

Abstract: We explore a theoretical framework addressing the supernova shock revival problem through matter phase transitions during core collapse. Extending Lockyer's geometric model, we investigate how extreme compression might induce reversible energy storage in modified nucleonic states. The proposed mechanism exhibits 2:1 energy release upon decompression, naturally yielding $\sim$1\% mass-to-energy conversion, matching observed explosion energies without fine-tuning. Analytical estimates suggest that the collapse energies could reach thresholds for such transitions, potentially explaining both shock revival and the supernova/black-hole dichotomy.
Article
Physical Sciences
Theoretical Physics

Filip Dambi

Abstract: The study of the emission, propagation, and reflection of balls leads to the ballistic law, which applies to balls with mass and to hypothetical massless balls. A natural extension of the ballistic law is to encompass massless entities such as light. The phenomenon that a ball or light emitted by a source inherits the velocity of its source has the mathematical expression given by the ballistic law: In the absolute frame, the propagation velocity of a ball or light is the vector sum of the velocity of the ball or light emitted by the source and the source's velocity. The emission of balls is a discrete phenomenon, whereas the emission of light is a continuous phenomenon. The ballistic law explains and proves, for each inertial frame in which a source of light and a mirror are at rest, why the speed of light is the universal constant c of electromagnetic nature, why each law of physics has the same form, and why no experiment in such a frame can prove its motion. By the end of the 19th century and afterward, there was insufficient knowledge of light behavior to explain the Michelson-Morley experiment. The ballistic law presents the emission, propagation, reflection, and observation of light, without which the study of light cannot be achieved correctly. Thus, the ballistic law exposes the irrationality of the Lorentz transformation and of special relativity, revealing unacceptable conclusions that do not even respect the principle of relativity as understood at the time. For example, the theory of special relativity incorrectly applies the symmetry seen in certain phenomena to two inertial frames. Thus, it duplicates a spherical wavefront from one inertial frame, considered stationary, to another inertial frame. But the duplication looks like an ellipsoid with one of its foci at the origin of the inertial frame; therefore, it requires another equation different from a sphere, which contradicts Einstein’s first postulate.
The Lorentz transformation mixes different times in the relativistic inertial frame, which is unacceptable. It makes the speed of each wavefront along its new radius/path equal to c, but the wavefronts travel along their radii at different times. Thus, in the relativistic inertial frame, each wave requires its own clock synchronization, which is unacceptable. All the lengths in the relativistic inertial frame are absolute, including the radii ending on the fictive ellipsoid. There are no length contractions as special relativity claims. These conclusions from the Lorentz transformation and special relativity refute them. Instead, the constancy of time passage in the universe and the variability of the propagation speed of light are confirmed. Reproduced from Physics Essays, Vol. 38, Page No. 222, Year 2025, with permission from Physics Essays Publication.
Article
Physical Sciences
Theoretical Physics

Mohamed Khorwat

Abstract: This paper develops the Entropic Resonance Principle (ERP) as a unified informational framework for understanding how organized systems persist across physical, biological, cognitive, and engineered domains. ERP proposes that stability arises not from resisting entropy but from a regulated co-variation between coherence (R) and entropy (H), expressed by the proportionality dR/dH ≈ λ, where the resonance parameter λ = ln φ ≈ 0.4812 is derived from a minimal self-similar renewal model. This proportionality admits both a flux form and a variational form, δ(R − λH) = 0, which together define persistent trajectories in an informational state space. ERP does not modify microphysical laws; rather, it functions as a meta-theoretical constraint that may emerge under appropriate coarse-graining. The paper clarifies the mathematical structure of ERP, analyzes its conceptual implications, and outlines empirical predictions that render the framework testable and falsifiable. Applications are explored in quantum decoherence, non-equilibrium chemistry, neural dynamics, adaptive computation, and complex engineered systems. A methodological protocol is proposed for estimating effective slopes dR/dH in real data using sliding-window regression, bootstrap uncertainty quantification, and model comparison. ERP is ultimately positioned as the nucleus of a research programme whose validity hinges on whether λ-like proportionalities recur across systems and scales. If supported, ERP may reveal a previously unrecognized informational invariant governing the persistence of structure; if not, it offers a precise template for evaluating how coherence and entropy jointly shape organized behavior.
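The quoted value of the resonance parameter and the proposed sliding-window slope estimator are straightforward to sketch. The synthetic R(H) trajectory below is our own toy data built to satisfy dR/dH = λ, not output from the paper:

```python
import numpy as np

phi = (1 + np.sqrt(5)) / 2   # golden ratio
lam = np.log(phi)            # ln(phi) ≈ 0.4812, the claimed resonance parameter

# Toy trajectory obeying dR/dH = lam, plus small measurement noise.
rng = np.random.default_rng(0)
H = np.linspace(0.0, 10.0, 500)
R = lam * H + 0.01 * rng.standard_normal(H.size)

# Sliding-window linear regression for the local effective slope dR/dH.
window = 50
slopes = [np.polyfit(H[i:i + window], R[i:i + window], 1)[0]
          for i in range(H.size - window)]
print(lam, float(np.mean(slopes)))  # the mean local slope recovers lam
```

In the protocol described by the abstract, the same sliding-window fit would be applied to empirical (H, R) series, with bootstrap resampling of windows to quantify uncertainty on the recovered slope.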
Article
Physical Sciences
Theoretical Physics

Haiqiang Wu

Abstract: This paper presents dual-route derivations of Hubble's Law within the standard relativistic and Newtonian frameworks, grounded in the intrinsic perfect symmetry of the local inertial frame. This reveals that the fundamental aim of a local inertial frame's motion is to restore Poincaré symmetry, from which it follows that the flow velocity of a strong gravitational field equals its gravitational acceleration multiplied by one second. Thus the standard theories inherently contain the seed of dynamic field theory. Mass Complex Space Theory (MCST) guides the push of the standard theories toward full dynamization and higher-dimensionalization. MCST introduces the “Quadruple State of the Planck Quantum ☯(h, Φ_ih, Φ_h, ih)” as the elementary complex-space generating element, whose hyper-cycle dynamically unifies matter and spacetime. A key finding is that gravitational field space is constituted by a diverging negative-energy “positron state ( )”. Consequently, dark matter and dark energy are unified as field effects of hyper-cycling mass complex space. Emergent from this framework is a gravitational circulation field ( ). This field makes the apparent Keplerian mass exhibit a counter-intuitive monotonic decrease (mass inversion) in the outer halo of the Milky Way (22.5–26.5 kpc). Moreover, owing to its intrinsic “negative-energy positron state” property, the field naturally generates huge magnetic-field toroids whose structure aligns remarkably with the observed galactic magnetic field, thereby achieving a preliminary structural unification of gravitation and electromagnetism on galactic scales without additional assumptions, and further providing a theoretical foundation for quantization research on unification at the subatomic scale.
Article
Physical Sciences
Theoretical Physics

Piotr Ogonowski

Abstract: Alena Tensor is a recently discovered class of energy-momentum tensors that proposes a general equivalence of the curved path and the geodesic for the analyzed spacetimes, which allows the analysis of physical systems in curvilinear (GR), classical, and quantum descriptions. This paper demonstrates that extending the existing dust description to a form that provides a full matter energy-momentum tensor in GR naturally leads to the development of a halo effect for continuum media. This result provides a good approximation of the galaxy rotation curve for approximately 100 analyzed objects from the SPARC catalog and allows for further adjustments dependent on anisotropy and energy flux. The same equations in flat spacetime allow for the inclusion of rotation-related effects in the quantum description, model quantum vortices, and reproduce the Mashhoon effect. This provides a physical interpretation of mass generation as an emergent property of the phase-spin equilibrium and enables a reconstruction of the Yukawa and Higgs mechanisms as a consequence of the stability conditions of quantum vortices.
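As a generic illustration of the rotation-curve problem the abstract addresses: a Keplerian curve computed from the baryonic mass alone falls off as r^(−1/2), while an additional halo-like term flattens it, as observed for SPARC galaxies. The pseudo-isothermal term and all parameter values below are textbook stand-ins of our own choosing, not the Alena Tensor halo effect:

```python
import numpy as np

G = 4.30091e-6  # Newton's constant in kpc (km/s)^2 / Msun

def v_baryonic(r, M=6e10):
    """Keplerian circular speed for a point-like baryonic mass M (Msun)."""
    return np.sqrt(G * M / r)

def v_with_halo(r, M=6e10, v0=180.0, rc=4.0):
    """Adds an illustrative pseudo-isothermal halo contribution
    (a stand-in, not the paper's correction): v^2 = GM/r + v0^2 r/(r + rc)."""
    return np.sqrt(G * M / r + v0**2 * r / (r + rc))

r = np.linspace(2.0, 30.0, 8)        # galactocentric radii in kpc
print(v_baryonic(r).round(1))        # declines as r^(-1/2)
print(v_with_halo(r).round(1))       # approaches ~v0: a flat rotation curve
```

The point of the comparison is only that some additional term with v² → const at large r is required; the paper derives such a term from the extended dust description rather than from a fitted halo profile.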
Article
Physical Sciences
Theoretical Physics

Mário Sérgio Guilherme Junior

Abstract: This work introduces the theoretical framework of Momentary Quantum Tunneling (MQT), proposing that the final state of a rotating black hole (Kerr geometry) is not a classical singularity, but rather a \emph{quantum bounce} of finite curvature, described by Loop Quantum Gravity (LQG). The classical metric function $\Delta(r)$ is regularized through \textbf{effective coupled functions of mass ($M$) and angular momentum ($a$)}, expressed as $\Delta_{q}(r) = r^2 - 2m_{\mathrm{eff}}(r)\,r + a_{\mathrm{eff}}^{2}(r)$, producing a nonsingular core. The resulting dynamics, derived from the effective Hamiltonian constraints of LQG, reveal a transient contraction–expansion cycle, in which the collapsing region undergoes a momentary tunneling into an expanding white-hole domain. Although this transition is ultrafast in internal proper time ($\tau$), it appears cosmologically long for an external observer due to extreme gravitational time dilation. This model provides a continuous gravitational evolution (collapse, bounce, and expansion), offering a semiclassical bridge between General Relativity and Quantum Mechanics. Potential astrophysical signatures and connections to cosmological bounces are discussed, suggesting a new route for resolving the black-hole information paradox.
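The abstract gives the regularized metric function Δ_q(r) = r² − 2 m_eff(r) r + a_eff²(r) without specifying the effective coupled functions. As an illustration only, one can assume a Hayward-style effective mass m_eff(r) = M r³/(r³ + l³) with constant a_eff = a (both forms are our assumptions, not the paper's) and locate the outer horizon as a root of Δ_q by bisection:

```python
M, a, l = 1.0, 0.6, 0.3   # units G = c = 1; l is a hypothetical quantum-gravity scale

def m_eff(r):
    """Hayward-style effective mass (assumed form for illustration)."""
    return M * r**3 / (r**3 + l**3)

def delta_q(r):
    """Regularized metric function from the abstract, with a_eff(r) = a."""
    return r**2 - 2 * m_eff(r) * r + a**2

def bisect(f, lo, hi, tol=1e-12):
    """Root of f on [lo, hi] by bisection, assuming a sign change."""
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        lo, hi = (mid, hi) if f(lo) * f(mid) > 0 else (lo, mid)
    return 0.5 * (lo + hi)

r_outer = bisect(delta_q, 1.5, 1.9)  # near the classical Kerr value M + sqrt(M^2 - a^2) = 1.8
print(delta_q(0.0), r_outer)         # Delta_q(0) = a^2 > 0: the core stays nonsingular in Delta_q
```

With this assumed m_eff the outer horizon survives slightly inside its classical Kerr location, while m_eff → 0 as r → 0 removes the 2Mr divergence at the center, which is the qualitative "nonsingular core" behavior the abstract describes.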
Article
Physical Sciences
Theoretical Physics

Anatolie Croitoru

Abstract: This paper proposes a fractal model of the universe that also includes additional dimensions. The universe's fractal is distinct from known fractals; it is adapted to the proposed theory. The cells of each fractal level are the self-similar components of the fractal, and they are analogous to galaxies. A direct similarity exists between hadrons and galaxies, as they represent the self-similar cells of the fractal structure. These cells coexist in the universe only within groups, known as atoms on the microscopic scale and galaxy groups/clusters on the cosmic scale. At each fractal level, the cells are initiated by nodes inspired by the stars. Electrons are the same hadrons, and the hadrons' nodes are formed from undetectable lower fractal levels, which replaces the currently assumed extra dimensions. In this model, quantum energy arises at any fractal level from nodes; they appear as excitations of equilibrium within fractal cells. In the universe, there exists both a fractal structure of matter accumulation and a parallel fractal of quanta. The unification of quanta with matter results in the emergence of material substance, thus requiring multiple frames of reference. Once the self-similar part of the fractal universe is correctly identified, a coherent model of the universe is obtained, one that also applies to undetectable subatomic levels. Through the rigorous definition of reference frames and by dividing the universe into multiple relative surfaces, a Theory of Everything can be derived. At every level of the fractal structure of the universe, there exist nodes (similar to stars) that emit their own sub-particles, which are absorbed by “nests” (molecular clouds). For this reason, the universe is dynamic and regenerative at each fractal level. But the simple nature of the dynamics is accompanied by a large number of different adjacent natures.
Article
Physical Sciences
Theoretical Physics

Mohamed Sacha

Abstract: We develop a quantitative framework linking quantum information copy time (QICT), gauge-coded quantum cellular automata (QCA), asymptotically safe gravity, and singlet-scalar dark matter. On the microscopic side, we consider an effectively one-dimensional diffusive channel embedded in a gauge-coded QCA with an emergent SU(3)$\times$SU(2)$\times$U(1) structure. For a conserved charge $Q$, we define an operational copy time $\tcopy(Q)$ and show, under explicit locality and hydrodynamic assumptions, that $\tcopy(Q)\;\propto\;\bigl(\chisqmicro\bigr)^{-1/2}$, where $\chisqmicro$ is an information-theoretic susceptibility built from the Kubo--Mori metric and the inverse Liouvillian squared. A conditional theorem establishing this scaling, together with numerical tests on stabiliser-code models up to linear size $L=96$, is formulated below and proved in a Supplemental Material. Within a gauge-coded QCA that realises a single Standard-Model-like generation, we identify hypercharge $Y$ as the unique non-trivial anomaly-free Abelian direction that couples to both quark and lepton sectors, and we exhibit explicitly how, in the $(B,L,Y)$ charge space, anomaly cancellation singles out the hypercharge direction. We further show that, on the anomaly-free subspace, a quadratic susceptibility functional is extremised along the hypercharge direction. We then match the microscopic QICT parameters to a thermal Standard Model plasma at a benchmark temperature $T_\star = 3.1~\text{GeV}$, using ideal-gas expressions for susceptibilities, and adopt an asymptotically safe functional renormalisation group (FRG) benchmark for gravity + SM + neutrinos + a real singlet scalar $S$, summarised in a dimensionless mass parameter $\kappaeff$. Here $\kappaeff$ is treated as a phenomenological parameter, computed in a concrete truncation and then propagated as a prior with quantified uncertainty.
Combining these ingredients yields a Golden Relation $m_S = \CLambda \sqrt{\kappaeff\,\chisqY}$, which connects the physical mass $m_S$ of the singlet scalar to a QICT constant $\CLambda$, the hypercharge susceptibility $\chisqY$ at $T_\star$, and the FRG parameter $\kappaeff$. Using explicit numerical benchmarks $a = 0.197~\text{GeV}^{-1}$, $D_Y \simeq 0.10~\text{GeV}^{-1}$, $\frac{\chisqY}{T_\star^2} = 0.145 \pm 0.010$, $\kappaeff = 0.136 \pm 0.019$, $\CLambda = 1.6 \pm 0.2~\text{GeV}^{-1}$, we obtain a mass band $m_S = 58.1 \pm 1.5~\text{GeV}$, with a conservative interval $m_S \in [56.6,59.6]~\text{GeV}$. We then perform a minimal but complete phenomenological scan of the $Z_2$ singlet-scalar Higgs-portal model in the $(m_S,\lambda_{HS})$ plane, solving the Boltzmann equation for the relic density and applying current direct-detection and Higgs-invisible constraints. A set of representative viable points lies in the immediate vicinity of the Golden-Relation band near the Higgs resonance.
Article
Physical Sciences
Theoretical Physics

Yong Bao

Abstract: In this paper we study the quantization of the cosmic critical density. Applying the generalized relational expression, we derive a quantized formula for the cosmic critical density and subsequently prove it. Comparing graphs of the three components of the formula reveals that the gravitational quantization term dominates during the very early universe and near the Planck time, suggesting it may be a consequence of a complete theory of quantum gravity. Finally, we find a primitive function whose Taylor expansion is this quantized formula. Our discussion is intriguing and heuristically valuable.
Article
Physical Sciences
Theoretical Physics

Henry Arellano-Peña

Abstract: The Timeless Counterspace & Shadow Gravity (TCGS) framework postulates that the observable three-dimensional (3-D) universe is a shadow manifold Σ embedded in a four-dimensional (4-D) Counterspace (C, G_AB, Ψ) that contains the full content of all apparent “time stages”. In previous work, this ontology was applied to cosmology and biological evolution (SEQUENTION), and later extended to a geometric description of atomic orbitals as “electronic filaments” anchored to a single singular origin p0 ∈ C. The present manuscript consolidates that atomic programme in light of three recent classes of empirical evidence: (i) the experimental demonstration of the Pusey–Barrett–Rudolph (PBR) theorem on superconducting processors, (ii) the first observation of solar neutrino charged-current interactions on ¹³C in SNO+, and (iii) the ALICE Collaboration’s tomographic reconstruction of deuteron formation from short-lived Δ resonances in high-energy proton–proton collisions. We show that these results jointly provide an “ontological license” to abandon purely probabilistic atoms. The PBR test rules out ψ-epistemic models and—when reinterpreted cartographically—forces hidden variables to reside in the 4-D bulk rather than in the 3-D shadow. Within TCGS, the quantum state is redefined as a tomographic map of a rigid 4-D filament, not a standalone 3-D object. The SNO+ data clarifies the role of neutrinos: rather than passive agents in a collapse process, they act as topological torsion operators that perform geometric surgery on nuclear knots, pushing stable isotopes into metastable corridors in Counterspace. The ALICE analysis then reveals that stable light nuclei (deuterons) are crystallized products of specific resonance ancestries, confirming that nuclear stability is a property of projection geometry and knot ancestry, not of pointlike constituents.
On this basis we construct a unified “Crystallography of the Atom” in which: (1) the wavefunction is a 3-D tomogram of a 4-D isopotential filament; (2) the nucleus is a geometric knot with isotope-dependent docking admissibility; (3) neutrinos carry quantized torsion that re-anchors the singular set S ⊂ C; and (4) half-lives and delays are reinterpreted as arc lengths along metastable corridors in the foliation parameter. The framework preserves all standard quantum predictions but replaces ontic probabilities with geometric rigidity and corridor depth. We outline empirical tests—from modified PBR experiments to neutrino interactions on additional isotopes—that can discriminate this geometric atom from conventional probabilistic interpretations.
Article
Physical Sciences
Theoretical Physics

Raoul Bianchetti

Abstract: We develop a variational principle in which spacetime curvature emerges from the preservation of informational identity along dynamical trajectories. The approach is motivated by the Viscous Time Theory (VTT) framework, where finite informational latency replaces an assumed geometric background. Instead of postulating a metric structure a priori, informational geodesics are defined as the paths that minimize a latency functional representing the local cost of identity reorganization in viscous time. The second-order structure of this action induces a symmetric bilinear form that behaves as an emergent metric tensor. Classical geodesic motion and the Einstein field equation are recovered in the limit of uniform latency density, showing that General Relativity arises as a special case of the more general informational action. The framework predicts curvature-like effects in regimes with negligible mass–energy but strong identity constraints, including coherent condensed-matter phases and entangled quantum systems. These predictions outline a testable research program connecting differential geometry with informational dynamics.
Essay
Physical Sciences
Theoretical Physics

Emilio Elizalde

Abstract: What does “Big Bang” actually mean? What was the origin of these two words? It has often been said that the expression “Big Bang” began as an insult. Even if this were true, it would be just an irrelevant part of the whole issue. There are many more aspects hidden under this name, which are seldom explained. They will be discussed in this work. In order to frame the analysis, help will be sought from the highly authoritative voices of two exceptional writers: William Shakespeare and Umberto Eco. Both Shakespeare and Eco have explored the tension existing between words and the realities they name. With the conclusion that names are, in general, just labels, simple stickers applied to identify things. And this includes those given to great theorems or spectacular discoveries. Not even “Pythagoras’ theorem” was discovered by Pythagoras, as is now well-known. Stigler's law of eponymy is recalled to further substantiate those statements. These points will be at the heart of the investigation carried out here, concerning the very important concept of “Big Bang”. Everybody thinks they know what “the Big Bang” is, but in fact only very few do. When Fred Hoyle first pronounced these two words together, on a BBC radio program, listeners were actually left with the false image that Hoyle was trying to destroy. That is, the tremendous explosion of Lemaître’s primeval atom (or cosmic egg), which scattered all its enormous matter and energy content throughout the rest of the Universe. This image is absolutely wrong! As will be concluded, today the label “Big Bang” is used in several different contexts: (a) the Big Bang Singularity; (b) as the equivalent of cosmic inflation; (c) speaking of the Big Bang cosmological model; (d) to name a very popular TV program; and more.
Article
Physical Sciences
Theoretical Physics

Henry Arellano-Peña

Abstract: The standard relativistic ontology treats time as an additional coordinate in a four-dimensional space-time manifold. Since Minkowski's 1908 formulation, ``dimension'' has been tacitly identified with ``vector direction in a manifold'', and the temporal coordinate has been assimilated into that vectorial catalogue. This move proved mathematically powerful but ontologically misleading. In this article I argue, within the Timeless Counterspace \& Shadow Gravity---SEQUENTION (TCGS--SEQUENTION) framework, that identifying time with a geometric dimension is a \emph{category error}. ``Time'' is a foliation parameter, a gauge label on a family of admissible projections of a single four-dimensional counterspace; it cannot be a dimension on the same footing as the geometric directions of that counterspace. Conversely, the fourth dimension in TCGS is not temporal but \emph{counter-spatial}: a geometric structure of informational content, populated by singularities and extrinsic relations, whose projections generate the three-dimensional (3-D) shadow we call the physical world. I first analyse the ``Minkowski trap'': the historical path by which the success of tensor calculus turned the coordinate index $x^0$ into a surrogate for ontic time, and ``dimension'' into a purely algebraic notion. I show how this trap is reproduced, rather than avoided, in more recent multi-dimensional proposals, including $(1+3)$-dimensional ``three-dimensional time'' models. I then develop the TCGS--SEQUENTION alternative: a static four-dimensional counterspace $(\Csp,\gbulk,\PsiField)$ containing the full content of all so-called ``time stages'', and an embedded shadow manifold $(\Sshadow,\gshadow)$ obtained via an immersion $\Xmap:\Sshadow\to\Csp$, with observables given by pullbacks $(\gshadow,\psi)=\Xmap^*(\gbulk,\PsiField)$.
Within this ontology, time is a foliation artifact---a parameter labelling a one-parameter family of embeddings $\Xmap_\lambda$---and all genuine dynamics are recast as consistency conditions between slices. Using the Baierlein--Sharp--Wheeler (BSW) action and subsequent constraint analyses, I demonstrate how General Relativity (GR) can be reconstructed without ontic time, thereby disentangling its empirical success from the Minkowskian ontology. I then show how the same projection geometry, equipped with a single extrinsic constitutive law, accounts for dark-matter phenomenology, cosmological anisotropies, and the biological homology encapsulated in SEQUENTION, without invoking dark sectors or stochastic deep time. Finally, I contrast counter-spatial dimensionality with ``3-D time'' and argue that any vectorial treatment of time---even with multiple temporal axes---remains trapped in the same categorical mistake: it re-labels the coordinates instead of changing the ontology. In TCGS--SEQUENTION, there is no temporal dimension at all; the only fundamental dimension beyond the familiar three is geometric and informational, not temporal.
Article
Physical Sciences
Theoretical Physics

Hou Jianchao

Abstract: This paper proposes a novel unified physical theory, the Xuan-Liang theory, which resolves three major challenges in modern physics through geometric-topological unification [3][5]: (1) Dark matter effects originate from velocity-curvature topological coupling; (2) Cosmic inflation and late-time accelerated expansion are unified via dynamic Euler characteristic evolution; (3) The black hole information paradox is resolved through holographic Xuan-Liang flux quantization. Compared to string theory (28+ parameters) and loop quantum gravity (complex discrete geometry), this theory requires only three fundamental constants to achieve mathematical simplicity (∼1/10 complexity) and experimental verifiability (explicit predictions for gravitational wave polarization modifications), providing a potential framework for next-generation physical paradigms.
Article
Physical Sciences
Theoretical Physics

Henry Arellano-Peña

Abstract: The double slit experiment is usually presented as a paradoxical manifestation of “wave–particle duality”: a single physical system appears to display mutually exclusive properties, depending on the measurement context. In this article I argue that, once one adopts the TCGS–SEQUENTION ontology — a static four-dimensional (4-D) counterspace C whose three-dimensional (3-D) shadows Σ are generated by an immersion X — the double slit is not a paradox but a geometric theorem. Complementarity becomes a necessary consequence of projection geometry rather than a mysterious axiom of quantum theory. Within this framework, “wave” and “particle” descriptions are incompatible 3-D silhouettes of a single 4-D structure anchored on a singular set S ⊂ C; they cannot coexist on any one shadow, but they coexist without tension in the counterspace. Building on the TCGS axioms for gravity and biology, and on the analysis of time as a foliation gauge rather than a dimension, I formulate a Cartographic Exclusion Principle: whenever a physical system admits two fully consistent but mutually exclusive descriptions in the same 3-D manifold, the data signal an embedding into a higher-dimensional content space. I then apply this principle to quantum interference. Using two recent experiments as empirical anchors — a tunable Einstein–Bohr recoiling-slit realization at the quantum limit, and measurements of coherent vs. incoherent light scattering by single-atom wavepackets — I show that the observed visibility–which-path trade-offs are best interpreted as changes in the rigidity of the projection X, not as a system that “sometimes is a wave and sometimes is a particle”. The analysis closes a logical loop in the TCGS–SEQUENTION program. Earlier work demonstrated that dark matter and Darwinian chance can both be reinterpreted as projection artifacts of a single 4-D counterspace. 
Here I argue that quantum complementarity belongs to the same family: it is the quantum-scale expression of the same geometric constraint that shapes cosmological cartography and biological evolution. Under mild assumptions, the double slit experiment thus functions as a topological proof that our 3-D world is a shadow of a 4-D counterspace, and that time is a foliation parameter rather than a fundamental dimension.
Article
Physical Sciences
Theoretical Physics

Henry Arellano-Peña

Abstract: Contemporary theories of consciousness are fractured between three incompatible ontologies. Quantum proposals such as Penrose--Hameroff's Orchestrated Objective Reduction (Orch–OR) treat conscious episodes as gravitationally induced state reductions in tubulin superpositions. Harmonic-field models describe consciousness as a macroscopic interference pattern in a continuous field over the cortex. Dynamical-systems neuroscience, in turn, locates affective ``internal states'' in low-dimensional attractor manifolds embedded in high-dimensional hypothalamic activity. None of these programmes, however, resolves the deeper ontological tension: all three are written as temporal dynamics on a 3-D brain, whereas their own mathematics quietly presupposes a static, higher-dimensional content. In this paper I embed these three families of models inside the Timeless Counterspace & Shadow Gravity—SEQUENTION (TCGS–SEQUENTION) framework. Time is treated as a foliation artefact: there exists a single four-dimensional counterspace (C, \( G_{AB},\Psi \)) containing the full content of all so-called ``time stages'', while the observable neurobiological world is a 3-D shadow manifold obtained by an immersion X: Σbio → C. Apparent evolution of brain states is not ontic dynamics but the comparison between admissible projections of one static whole. The main result is a 4-layer vertical architecture that harmonizes quantum, neurogeometric, and harmonic-field descriptions without relinquishing the identity-of-source axiom of TCGS--SEQUENTION. At Layer 1, Penrose's E_G is reinterpreted as a static geometric functional on C, not a time-dependent collapse trigger. At Layer 2, a symplectic resolution limit transforms E_G into an effective action for q-desics---quantum generalizations of geodesics---in the sense of Koch et al., now embedded in the same counterspace. 
At Layer 3, a Diophantine filter selects a discrete, φ-scaled spectrum of microtubular eigenmodes constrained by the TCGS constitutive law. At Layer 4, the macroscopic harmonic field and cortical gamma synchrony are recognized as Moiré interference patterns in the projection geometry, not ontic wave-objects in the brain. By eliminating ontic time and enforcing a single 4-D source manifold, we show that: (i) the Orch-OR threshold becomes a gauge choice of foliation rather than a physical "moment"; (ii) the Harmonic Field Model is a 3-D shadow of a static interference structure on C; and (iii) hypothalamic line attractors are geometric corridors in the same projection class that governs galaxy-scale anomalies in TCGS cosmology. Consciousness, in this framework, is the registration of slice-invariant content on a 3-D shadow whose degrees of freedom are geometrically constrained, not locally generated. The hard problem is therefore not "how matter generates experience" but how a single timeless content field projects as both physics and biology under the same extrinsic law.
Article
Physical Sciences
Theoretical Physics

Yake Li, Wei Chen

Abstract: Based on the principle of the constancy of the speed of light and the principle of minimum energy, this paper constructs a theoretical system of vacuum dynamics, revealing a complete physical causal chain: "light-speed gradient → rest-energy difference → spontaneous force → spontaneous motion". The core dynamic equation is derived as [equation not reproduced in this listing]. Applying this theory to the study of gravitational mechanisms, we deduce Newton's gravitational expression from the light-speed distribution function, achieving, for the first time, a unification of the mathematical form and physical mechanism of gravity at the level of an energy-conversion mechanism. This research provides a novel dynamical perspective on the physical origin of gravity and argues, in principle, for the feasibility of propellantless propulsion in vacuum.



Preprints.org is a free preprint server supported by MDPI in Basel, Switzerland.

© 2025 MDPI (Basel, Switzerland) unless otherwise stated