Physical Sciences

Article
Physical Sciences
Particle and Field Physics

Andrew Michael Brilliant

Abstract: Peer review of empirical patterns in high-precision, low-dimensionality parameter spaces relies on implicit evaluation standards. When N = 3 parameters at 2% precision permit thousands of statistically significant formulas, reviewers must distinguish structure from coincidence, but the criteria for doing so remain unarticulated. We found no published record of community debate establishing explicit standards, despite decades of informal application. This paper proposes one such articulation: seven criteria emphasizing temporal convergence through timestamped predictions. We offer specific thresholds not because we believe them correct, but because explicit proposals can be calibrated while implicit standards cannot. The need for explicit standards is timely. Lattice QCD has only recently achieved the precision necessary for discriminatory tests of quark mass relations. Historical precedents from lepton phenomenology (Koide, Gell-Mann–Okubo) provide limited guidance: leptons offer ∼35,000× greater discriminatory power than light quarks, involve no RG running, and constitute a fundamentally different measurement regime. The historical record is further compromised by survivorship bias: patterns that diverged are largely unrecorded. Historical cases motivate the problem by illustrating why implicit evaluation proved adequate for leptons but may prove inadequate for quarks. They cannot validate the proposed solution. Validation is prospective by design: starting from this publication, patterns evaluated under this framework will be tracked publicly. The framework succeeds if it proves predictively useful; it fails if it requires constant post-hoc adjustment, judged by its own temporal convergence criterion. If this proposal provokes disagreement that leads to better criteria, it will have served its purpose. If it is ignored, the current system of implicit evaluation continues unchanged. We consider both engagement and refinement to be success.
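Editorial note: a back-of-envelope estimate (not taken from the abstract; it assumes log-uniform candidate values and a candidate-formula count of order 10^5) illustrates why a few parameters at 2% precision admit thousands of coincidental matches:

```latex
% Illustrative estimate only; the candidate-formula count N_f ~ 10^5 is an assumption.
p \;\approx\; \frac{2\epsilon}{\ln 10} \;=\; \frac{0.04}{\ln 10} \;\approx\; 1.7\%,
\qquad
\langle N_{\mathrm{match}} \rangle \;\approx\; p\,N_f \;\sim\; 0.017 \times 10^{5} \;\approx\; 2\times 10^{3}
\quad \text{per target parameter.}
```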
Article
Physical Sciences
Astronomy and Astrophysics

Hai Huang

Abstract: We propose a non-perturbative quantum gravity framework using quantum vortices (statistical average topological structures of microscopic particles) embedded in AdS/CFT holographic duality, resolving black hole singularities without renormalization. Thus, this constitutes a singularity-resolution mechanism grounded in physical processes rather than mathematical techniques. The quantum vortex field generates a repulsive potential within the critical radius r* ≈ 8.792 × 10⁻¹¹ m, dynamically preventing matter from reaching r = 0 and avoiding curvature divergence. The derived Huang metric (Schwarzschild metric with quantum corrections) enables parameter-free prediction of black hole shadow angular diameters, without post-observation fitting of Kerr black hole spin. Observational verification shows: the theoretical shadow of Sgr A* is 53.3 μas (EHT: 51.8 ± 2.3 μas), and that of M87* is 46.2 μas (EHT: 42 ± 3 μas), resolving contradictions of the Kerr model. This framework unifies singularity elimination, information conservation, and shadow prediction, providing a testable quantum gravity paradigm.
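Editorial note: for orientation, the uncorrected non-rotating (Schwarzschild) shadow diameter, θ ≈ 2·√27·GM/(c²D), can be checked against the EHT figures quoted above. The sketch below is a baseline-only estimate and does not include the quantum corrections of the Huang metric; the mass and distance inputs are standard literature values assumed here, not taken from the abstract.

```python
import math

G, c = 6.674e-11, 2.998e8              # SI units
M_sun, pc = 1.989e30, 3.086e16         # kg, m
MUAS_PER_RAD = 180 / math.pi * 3600e6  # microarcseconds per radian

def schwarzschild_shadow_muas(mass_msun, distance_pc):
    """Angular diameter of a non-rotating black hole shadow: 2*sqrt(27)*GM/(c^2 D)."""
    r_g = G * mass_msun * M_sun / c**2
    return 2 * math.sqrt(27) * r_g / (distance_pc * pc) * MUAS_PER_RAD

# Assumed standard mass/distance values (not from the abstract):
print(f"Sgr A*: {schwarzschild_shadow_muas(4.3e6, 8.3e3):.1f} uas  (EHT: 51.8 +/- 2.3)")
print(f"M87* : {schwarzschild_shadow_muas(6.5e9, 16.8e6):.1f} uas  (EHT: 42 +/- 3)")
```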
Article
Physical Sciences
Quantum Science and Technology

Mohamed Haj Yousef

Abstract: We formulate a geometric framework in which observable spatial geometry and temporal directionality emerge from the intersection of two orthogonal Lorentzian temporal domains, identified as objective (physical) and subjective (informational). Each domain carries a dual-time structure consisting of a generative temporal coordinate and a manifest temporal coordinate, and is modeled using split-complex geometry that encodes conjugate Lorentzian temporal orientations. Observation is described as an intersection process in which the two Lorentzian domains meet at a Euclidean interface: oppositely oriented manifest temporal components cancel, while generative components combine into an effective temporal magnitude. This intersection yields a three-dimensional Euclidean spatial geometry accompanied by a scalar temporal parameter. The interaction between the domains is formulated using a bi-fibered temporal bundle equipped with independent temporal gauge connections. The associated gauge curvatures encode generative desynchronization, geometric phases, and topological sectors. A discrete temporal interchange symmetry exchanging the two domains is spontaneously broken by a composite temporal order parameter, resulting in an emergent arrow of time. Variation of the action yields effective gravitational field equations in which spacetime curvature receives contributions from the temporal gauge and phase fields. This construction provides a consistent geometric setting in which Euclidean space arises as an observational intersection of conjugate Lorentzian temporal structures, while temporal asymmetry, gauge curvature, and topological quantization emerge from the underlying bi-temporal geometry.
Article
Physical Sciences
Theoretical Physics

Sergiu Vasili Lazarev

Abstract: We present New Subquantum Informational Mechanics (NMSI), a comprehensive theoretical framework proposing that information—not matter or energy—constitutes the fundamental substrate of physical reality. The framework introduces the Riemann Oscillatory Network (RON), comprising N ≈ 10¹² nodes corresponding to non-trivial zeros of the Riemann zeta function ζ(s), serving as the computational substrate underlying observable physics. Central to NMSI is the π-indexing mechanism, wherein blocks of decimal digits from π provide deterministic addresses into RON. We derive the architectural threshold L* = 2·log₁₀(N) = 24, demonstrating that for block lengths L > 24, collision frequencies undergo structural transition from statistical independence to correlated behavior. This threshold emerges not as an arbitrary choice but as a mathematical necessity dictated by finite register addressing in RON. The framework introduces the DZO-OPF-RON triad as the minimal irreducible architecture for coherent physical systems: the Dynamic Zero Operator (DZO) provides dynamic regulation maintaining balance condition G[Ψ*] = 0, the Operational Phase Funnel (OPF) implements geometric mode selection via Gabriel Horn topology with aperture A(x) = A₀/x², and RON supplies the finite oscillatory substrate. We prove via six-case exhaustive analysis that elimination of any component leads either to persistent chaos or trivial collapse. Physical implementations include: CMB low-ℓ anomalies as OPF transition signatures at ℓc ≈ 24, where spectral entropy H(ℓ) exhibits regime change; BAO drift as DZO cyclic regulation with amplitude ε ≈ 1% tied to cosmic cycle parameter Z ∈ [−20, +20]; and early JWST high-redshift galaxies at z > 10 as structures inherited from previous cosmic cycles through baryon recycling mechanism at turnaround Z = −20. The tornado vortex serves as a terrestrial laboratory for validating the predicted constraint accumulation integral J(rc) = 55.26 ± 10 nats at the coherence transition radius, where J(r) = ∫ |∂Ω/∂r|/Ωref dr measures accumulated geometric constraint. Three coherence indicators I₁ (turbulence intensity), I₂ (normalized shear), and Ω (enstrophy) simultaneously satisfy threshold criteria at rc, providing direct experimental access to OPF-DZO dynamics. We provide twelve falsifiable predictions testable during 2025–2035 using DESI, JWST, LISA, CMB-S4, and Einstein Telescope, with explicit numerical thresholds and statistical confidence levels. Three computational tests using publicly available π digits (10¹² available) and CMB data (Planck 2018) are executable immediately: (1) CMB spectral entropy transition at ℓc = 24 ± 5, (2) π-block χ² transition at L = 24 ± 2, (3) π-ζ GUE correlation emergence for L ≥ 26. The framework challenges ΛCDM cosmology not through modification but through fundamental replacement, offering coherent alternatives to dark matter, dark energy, and the Big Bang singularity through cyclic informational dynamics.
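Editorial note: the quoted threshold is simple arithmetic on the stated node count, and the π-block test compares collision counts against a uniform-digit (birthday-type) expectation. The sketch below evaluates the threshold and illustrates that kind of collision count; the use of mpmath to obtain π digits, the small digit sample, and the short block length are illustrative assumptions, not the authors' pipeline.

```python
import math
from collections import Counter
from mpmath import mp

N = 1e12
print(f"L* = 2*log10(N) = {2 * math.log10(N):.0f}")   # threshold quoted in the abstract

# Collision counting for short blocks of pi digits (illustrative scale only).
mp.dps = 100_005
digits = str(mp.pi).replace('3.', '', 1)[:100_000]     # fractional digits of pi
L = 5
blocks = [digits[i:i + L] for i in range(0, len(digits) - L + 1, L)]  # non-overlapping blocks
n = len(blocks)
collisions = sum(c * (c - 1) // 2 for c in Counter(blocks).values())
expected = n * (n - 1) / 2 / 10**L                      # uniform-digit expectation
print(f"L={L}: observed colliding pairs {collisions}, expected ~{expected:.0f}")
```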
Article
Physical Sciences
Astronomy and Astrophysics

Yuxuan Zhang, Weitong Hu, Wei Zhang

Abstract: We propose an algebraic framework constructed from a finite-dimensional Z₃-graded Lie superalgebra g = g₀ ⊕ g₁ ⊕ g₂ of dimension 19 (12+4+3), featuring exact closure of the graded Jacobi identities (verified symbolically in key sectors and numerically in a faithful matrix representation, with residuals ≲ 10⁻¹² across 10⁷ random combinations) and a unique (up to scale) invariant cubic form on the grade-2 sector, driving a triality symmetry on the vacuum sector. Interpreting the grade-2 sector as the physical vacuum state, we explore whether representation-theoretic invariants and contractions within this algebraic structure can account for observed Standard Model parameters—including fermion masses, mixing angles, and gauge couplings—as well as the magnitude of the cosmological constant, black-hole entropy scaling, and certain qualitative features of quantum entanglement. The framework yields twelve quantitative predictions amenable to experimental scrutiny at forthcoming facilities such as the High-Luminosity LHC, Hyper-Kamiokande, DARWIN/XLZD, and LiteBIRD.
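Editorial note: the numerical-verification step described (sampling Jacobi residuals on random elements of a matrix representation) follows a generic recipe. The sketch below shows that recipe with the ordinary matrix commutator as a stand-in bracket, since the paper's Z₃-graded structure constants are not reproduced here.

```python
import numpy as np

rng = np.random.default_rng(0)

def bracket(x, y):
    """Stand-in bracket: the matrix commutator. The paper's graded bracket
    (with its Z3-grading signs) would replace this."""
    return x @ y - y @ x

def jacobi_residual(x, y, z):
    """Frobenius norm of [[x,y],z] + [[y,z],x] + [[z,x],y]; zero for a true Lie bracket."""
    r = (bracket(bracket(x, y), z) + bracket(bracket(y, z), x)
         + bracket(bracket(z, x), y))
    return np.linalg.norm(r)

dim = 19   # matching the quoted algebra dimension, purely illustrative here
worst = max(jacobi_residual(*(rng.standard_normal((dim, dim)) for _ in range(3)))
            for _ in range(10_000))
print(f"max residual over 10^4 random triples: {worst:.2e}")
```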
Article
Physical Sciences
Astronomy and Astrophysics

Junli Chen

Abstract: This article reviews the same-frequency mutual interference explanation of gravitational light bending and gravitational lensing, analyzes the observational data of the HerS-3 Einstein Cross, and uses the same-frequency mutual interference explanation of light bending to argue for the plausibility of the formation of the HerS-3 Einstein Cross. This article holds that when light passes by a massive luminous planet (galaxy), it bends under the influence of the electromagnetic waves continuously emitted by that massive luminous planet (galaxy). The degree of bending of the light is directly proportional to the brightness of the massive luminous planet and inversely proportional to the shortest distance between the light and the planet, regardless of the mass of the planet. Generally, the mass-to-light ratio (excluding dark matter) of planets (galaxies) in the universe is much smaller than the mass-to-light ratio of the Sun. Therefore, the degree of light bending calculated using the gravitational lens principle is much smaller than the actual value, and non-existent dark matter has to be invoked to make up the difference. The derivation of dark matter from the HerS-3 Einstein Cross is another such example. Using the same-frequency mutual interference explanation of light bending, however, no dark matter is deduced, which is consistent with observational reality.
Article
Physical Sciences
Mathematical Physics

Wawrzyniec Bieniawski, Andrzej Tomski, Szymon Łukaszyk, Piotr Masierak, Szymon Tworz

Abstract: Assembly theory defines structural complexity as the minimum number of steps required to construct an object in an assembly space. We formalize the assembly space as an acyclic digraph of strings. Key results include analytical bounds on the minimum and maximum assembly indices as functions of string length and alphabet size, and relations between the assembly index (ASI), assembly depth, depth index, Shannon entropy, and expected waiting times for strings drawn from uniform distributions. We identify patterns in minimum- and maximum-ASI strings and provide construction methods for the latter. While computing ASI is NP-complete, we develop efficient implementations that enable ASI computation for long strings. We establish a counterintuitive, inverse relationship between a string's ASI and its expected waiting time. Geometric visualizations reveal that ordered decimal representations of low-ASI bitstrings of even length N naturally cluster on diagonals and oblique lines of squares with sides equal to 2^(N/2). Comparison with grammar-based compression (Re-Pair) shows that ASI provides superior compression by exploiting global combinatorial patterns. These findings advance complexity measures with applications in computational biology (where DNA sequences must violate Chargaff's rules to achieve minimum ASI), graph theory, and data compression.
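Editorial note: computing the assembly index exactly is NP-complete, but for very short strings a brute-force search makes the definition concrete: start from single characters, repeatedly concatenate two already-built parts (reuse allowed), and count the minimum number of joins. The sketch below is a naive exponential-time illustration, not the efficient implementation the authors develop.

```python
from itertools import product

def assembly_index(target, max_depth=10):
    """Minimum number of join (concatenation) steps to build `target` from its
    characters, reusing previously built substrings. Exponential; short strings only."""
    if len(target) == 1:
        return 0
    start = frozenset(set(target))

    def dfs(pool, remaining):
        if remaining == 0:
            return False
        for a, b in product(pool, repeat=2):
            s = a + b
            if s == target:
                return True
            # prune: useful intermediates are always substrings of the target
            if s in pool or s not in target:
                continue
            if dfs(pool | {s}, remaining - 1):
                return True
        return False

    for depth in range(1, max_depth + 1):   # iterative deepening -> minimal join count
        if dfs(start, depth):
            return depth
    return None

for s in ("aaaaaaaa", "abababab", "abcde"):
    print(s, assembly_index(s))
```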
Article
Physical Sciences
Chemical Physics

Vyacheslav A. Kuznetsov

Abstract: This paper presents a concept of a new class of multifunctional adaptive elements for neuromorphic electronics, based on the mathematical framework of the Kuznetsov tensor. The proposed element integrates the functions of information storage, processing, and redistribution, providing high adaptability to changing system conditions while preventing overloads, singular states, and data losses. The Kuznetsov tensor enables modeling of multidimensional metrics of local and global flows of energy and information within neuromorphic networks, ensuring optimization of computational processes at both individual node and network-wide levels. The element demonstrates the potential of a self-regulating redistribution architecture, capable of dynamically adapting to workload variations and changes in connection topology, maintaining system stability and enhancing energy efficiency. This concept can be applied in neuromorphic processors, quantum computing devices, and artificial intelligence systems requiring predictable and reliable operation of complex multidimensional networks. The paper discusses the fundamental operating principles of the element, the mechanisms of interaction with information and energy flows, and integration possibilities within modern computational architectures. The proposed approach opens new avenues for the development of intelligent adaptive devices, capable of managing information and energy dynamics considering singularities and entropy-driven processes, which is of interest for both fundamental and applied research in neuromorphic electronics and information technology.
Article
Physical Sciences
Optics and Photonics

Xin Li, Dan Song, Yu-Xia Fan, Rong Miao, Dan Wang, Bao-Dong Yang, Hai-Tao Zhou, Jun-Xiang Zhang

Abstract: Optical amplification and spatial multiplexing technologies have important applications in quantum communication, quantum networks, and optical information processing. In this paper, based on the non-reciprocal amplification of a pair of co-propagating conjugate four-wave mixing (FWM) signals induced by a one-way pump field in a double-Λ-type hot atomic system, we demonstrate spatially multiplexed multiple FWM processes by introducing a counter-propagating collinear pump field. This configuration enables simultaneous amplification of bidirectional four-channel FWM signals. Furthermore, when the injected signal and pump beams are modulated to Laguerre-Gaussian beams carrying different optical orbital angular momentum (OAM), the OAM of the pump beam is transferred to each amplified field. Through the tilted lens method, we experimentally demonstrate that the OAM of the amplified signal light remains identical to that of the original injected signal light. In contrast, the OAM of the other three newly generated FWM fields is governed by the angular momentum conservation law of their respective FWM processes, which enables the precise manipulation of the OAM for the other generated amplified fields. Theoretical analysis of the dynamical transport equation for the density operator in light-matter interaction processes fully corroborates the experimental results. These findings establish a robust framework for developing OAM-compatible optical non-reciprocal devices based on complex structured light.
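Editorial note: the OAM bookkeeping referred to above follows the usual four-wave-mixing conservation rule; for a channel that absorbs two pump photons and emits against an injected signal photon, ℓ_new = ℓ_p1 + ℓ_p2 − ℓ_s. The sketch below is only generic bookkeeping under that assumption; the actual channel assignments of this experiment are not spelled out in the abstract.

```python
def fwm_oam(l_pump1, l_pump2, l_signal):
    """OAM of the field generated in a four-wave-mixing channel that absorbs two
    pump photons and emits against an injected signal photon (conservation rule)."""
    return l_pump1 + l_pump2 - l_signal

# Example: both pumps carry l = +1 Laguerre-Gaussian modes, injected signal l = 0.
print(fwm_oam(+1, +1, 0))   # -> 2: the generated field picks up the combined pump OAM
```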
Article
Physical Sciences
Theoretical Physics

Paulo Jorge Adriano

Abstract: We present a consolidated, test-driven account of geometric electric dipole moments (EDMs) and CP violation within the MMA-DMF framework, compiled from the December 2025 audit archive and its Gold/Platinum/Diamond validation artifacts. The central claim is operational: CP violation is dynamically active during the electroweak window (ϕ̇ ≠ 0) but becomes effectively static and screened at late times (ϕ̇ → 0), so present-day EDM searches must target transient spectra rather than only DC offsets. Crucially, the operational kernel is rigid and degree-of-freedom-free: the analysis is executed with a fixed “Golden” parameter set (no tunable degrees of freedom in the pipeline), and all detection statements are framed as falsifiable pass/fail criteria. We show how the density–time scaling law τ(ρ_env) induces a mandatory downward-chirp “Sad Trombone” transient, and we specify a matched-filter protocol with density-aware templates. We also provide a laboratory handoff for the T-Environment density-swap experiment, including hardware requirements, timing constraints, logging schema, and acceptance criteria needed for an independent replication campaign.
Article
Physical Sciences
Other

Johel Padilla

Abstract: The Discrete Extramental Clock Law proposes that objective time in chaotic systems emerges discretely from statistically significant ordinal conjunctions across multiple trajectories, modulated by a universal gating function g(τ_s) rooted in Kendall's rank correlation and Feigenbaum universality. This study provides numerical evidence for the ontological hierarchy: high local chaotic activity (e.g., positive Lyapunov exponents) does not advance objective time; only global ordinal coherence (high |τ_s|) generates effective temporal ticks. Using coupled logistic maps, the Lorenz attractor, fractional-order extensions, and empirical Aedes aegypti population data, we demonstrate negative correlation between local variance/Lyapunov activity and the rate of emergent time advance, fractal inheritance in t_n (D_{t_n} ≈ 1.98), and robust noise tolerance. These results challenge the universality of Newtonian time in chaotic regimes, supporting emergent discreteness even in classical chaos.
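Editorial note: as a minimal numerical illustration of the quantities involved (not the paper's full gating construction), the sketch below evolves globally coupled logistic maps and tracks Kendall's τ between successive cross-trajectory orderings, the kind of global ordinal coherence the law keys on. All parameter values are arbitrary assumptions.

```python
import numpy as np
from scipy.stats import kendalltau

rng = np.random.default_rng(0)
M, T, r, eps = 50, 1000, 3.9, 0.05          # trajectories, steps, logistic parameter, coupling
x = rng.random(M)

taus = []
for _ in range(T):
    f = r * x * (1 - x)                      # local (chaotic) logistic update
    x_new = (1 - eps) * f + eps * f.mean()   # weak global mean-field coupling
    tau, _ = kendalltau(x, x_new)            # ordinal coherence across the ensemble
    taus.append(tau)
    x = x_new

taus = np.abs(np.array(taus))
print(f"mean |tau| = {taus.mean():.3f}, fraction of steps with |tau| > 0.9: {(taus > 0.9).mean():.2f}")
```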
Short Note
Physical Sciences
Thermodynamics

Jordan Barton

Abstract: This paper advances Coherence Thermodynamics for understanding systems composed purely of information and coherence. It derives five laws of coherence thermodynamics and applies them to two case studies. Three canonical modes of coherent informational systems are developed: Standing State, Computation Crucible, and Holographic Projection. Each mode has its own dynamics and natural units, with thermodynamic coherence defined as the reciprocal of the entropy–temperature product. Within this theory, reasoning is proposed to emerge as an ordered, work‑performing process that locally resists entropy and generates coherent structure across universal features.

Review
Physical Sciences
Space Science

Simon Evetts, Beth Healey, Tessa Morris-Paterson, Vladimir Pletser

Abstract: The rapid expansion of commercial human spaceflight is forcing a re-examination of how we decide who is “fit to fly” in space. For six decades, astronaut selection has been dominated by national space agencies using stringent, mission-driven criteria grounded in risk minimisation and long-duration operational demands. Contemporary standards such as NASA-STD-3001 and agency-specific medical regulations embed a philosophy in which astronauts are rare, heavily trained national assets expected to tolerate extreme environments with minimal performance degradation. In contrast, commercial operators aim to fly large numbers of spaceflight participants (SFPs) with highly heterogeneous medical and psychological profiles, under a US regulatory regime that emphasises informed consent and currently imposes very limited prescriptive health requirements on passengers. This article reviews the evolution and structure of traditional astronaut selection, outlines emerging approaches to screening and certifying commercial spaceflight customers, and explores the conceptual and practical gap between “selection” and “screening”. Drawing on agency standards, psychological selection research, and recent proposals for commercial medical guidelines, it proposes a risk-informed, mission-specific framework that adapts lessons from government astronaut corps to the needs of commercial spaceflight. We argue that future practice must balance inclusion and market growth with transparent, evidence-based risk management, supported by systematic data collection across government and commercial flights.
Article
Physical Sciences
Condensed Matter Physics

Jian-Hua Wang

Abstract: The conventional framework for quantum statistics is built upon gauge theory, where particle exchanges generate path-dependent phases. However, the apparent consistency of this approach masks a deeper question: is gauge invariance truly sufficient to satisfy the physical requirement of indistinguishability? We demonstrate that gauge transformations, while preserving probabilities in a formal sense, are inadequate to capture the full constraints of identical particles, thereby allowing for unphysical statistical outcomes. This critical limitation necessitates a reconstruction of the theory by strictly enforcing indistinguishability as the foundational principle, thus moving beyond the conventional topological paradigm. This shift yields a radically simplified framework in which the statistical phase emerges as a path-independent quantity, \( \alpha = e^{\pm i\theta} \), unifying bosons, fermions, and anyons within a single consistent description. Building upon the operator-based formalism of Series I and the dual-phase theory of Series II, we further present an exact and computationally tractable approach for solving N-anyon systems.
Article
Physical Sciences
Theoretical Physics

Vladlen Shvedov

Abstract: We propose a geometrically motivated framework in which the large-scale evolution of the Universe is described by a coherent multidimensional wavefunction possessing a preferred direction of propagation. Within this formulation, the scalar envelope of the wavefunction defines a critical hypersurface whose temporal evolution provides an effective geometric description of cosmic expansion. The resulting picture naturally incorporates an arrow of time, large-scale homogeneity, and a nonsingular expansion history, without invoking an inflationary phase, a cosmological constant, or an initial singularity. The critical hypersurface takes the form of a three-dimensional sphere whose radius plays the role of a cosmological scale factor. Its evolution leads to a time-dependent expansion rate with a positive but gradually decreasing acceleration. The associated density evolution follows a well-defined scaling law that is consistent with the standard stress–energy continuity equation and corresponds to an effective equation-of-state parameter w = -1/3. As a consequence, the total mass–energy contained within the expanding hypersurface increases with time in a manner that remains fully compatible with the continuity relation. Analytical estimates derived from the model yield values for the present expansion rate and mean density that are in close agreement with current observational constraints. Within this geometric interpretation, the gravitational constant emerges as an invariant global potential associated with the critical hypersurface, linking the conserved properties of the wavefunction to observable gravitational coupling. The framework therefore provides a self-consistent, effective description in which cosmic expansion and gravitational dynamics arise from the geometry of a universal wavefunction, suggesting a deep connection between quantum structure, spacetime geometry, and cosmological evolution.
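Editorial note: the stated equation-of-state value fixes the scaling behaviour quoted above; for convenience, the standard continuity-equation step (no assumptions beyond w = -1/3) reads:

```latex
\dot{\rho} + 3\frac{\dot a}{a}\,(1+w)\,\rho = 0
\;\;\Longrightarrow\;\;
\rho \propto a^{-3(1+w)} = a^{-2} \quad \left(w = -\tfrac{1}{3}\right),
\qquad
M \sim \rho\,a^{3} \propto a ,
```

so the total mass–energy within the expanding hypersurface grows linearly with the scale factor while remaining consistent with the continuity relation, as the abstract states.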
Article
Physical Sciences
Quantum Science and Technology

Jussi Lindgren

Abstract: The Stueckelberg wave equation is solved for unitary solutions, which links the eigenvalues of the Hamiltonian directly to the oscillation frequency. As it has been shown previously that this PDE relates to the Dirac operator and, on the other hand, is a linearized Hamilton-Jacobi-Bellman PDE from which the Schrödinger equation can be deduced in a nonrelativistic limit, it is clear that it is the key equation in relativistic quantum mechanics. We give a stationary solution for the quantum telegraph equation and a Bayesian interpretation for the measurement problem. The stationary solution is understood as a maximum entropy prior distribution, and measurement is understood as a Bayesian update. We discuss the interpretation of single-electron experiments in the light of finite-speed propagation of the transition probability field and how it relates to the interpretation of quantum mechanics more broadly.
Article
Physical Sciences
Astronomy and Astrophysics

Grichshenko Valentina, Alibi Baden, Asemkhan Mukushev, Aigerim Kalybekova, Marat Nurtas

Abstract: The paper analyzes the energy spectra (ES) of cosmic ray (CR) nuclei H, Ne, Si, Fe in the energy range from E = 1 MeV nucleon⁻¹ to 1000 MeV nucleon⁻¹. The calculated values of the ES are compared with experimental data obtained from the GOES and ACE spacecraft over 7 years of operation. A new effect has been discovered in near-Earth space: a bend in the energy spectrum of nuclei in the energy range from 8 to 50 MeV nucleon⁻¹. A possible mechanism for the complex influence of space factors on CR fluxes in near-Earth space is discussed.
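Editorial note: a bend of this kind is usually quantified by fitting a smoothly broken power law and reading off the break energy. The sketch below shows such a fit on synthetic data generated from the model itself, purely to exercise the procedure; it is not GOES/ACE data, and all parameter values are placeholders.

```python
import numpy as np
from scipy.optimize import curve_fit

def broken_power_law(E, A, Eb, g1, g2, s=2.0):
    """Smoothly broken power law: spectral index g1 below the break Eb, g2 above."""
    return A * (E / Eb) ** g1 * (0.5 * (1 + (E / Eb) ** s)) ** ((g2 - g1) / s)

# Synthetic spectrum for demonstration only (placeholder parameters, 10% scatter).
rng = np.random.default_rng(1)
E = np.logspace(0, 3, 40)                                 # 1 to 1000 MeV/nucleon
flux = broken_power_law(E, A=1.0, Eb=20.0, g1=-1.2, g2=-2.6) * rng.normal(1.0, 0.1, E.size)

popt, _ = curve_fit(broken_power_law, E, flux, p0=(1.0, 30.0, -1.0, -2.0),
                    sigma=0.1 * flux, maxfev=20000)
print(f"fitted break energy: {popt[1]:.1f} MeV/nucleon, indices {popt[2]:.2f} / {popt[3]:.2f}")
```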
Article
Physical Sciences
Theoretical Physics

Mohamed Sacha

Abstract: This note clarifies an apparent tension between a low “structural” mass scale predicted by the QICT Golden Relation and the much higher mass scales foregrounded in collider publications. In the QICT framework, the Golden Relation fixes a reference band for the singlet-scalar mass around m₀ = 58.1 ± 1.5 GeV, interpreted as a baseline (matching-regime) branch. By contrast, values such as 335 GeV, 470 GeV, 790 GeV, and 910 GeV (ATLAS) and 840–880 GeV (CMS) arise in type-III seesaw heavy-lepton searches as 95% confidence-level exclusion lower limits, not as reconstructed resonance peaks. The note argues that, operationally, QICT associates the experimentally “highlighted” scale with a regime-dependent effective mass m_eff governed by audit depth and copy/certification latency. Introducing a synchronization gain κ ≥ 1 via τ_copy = τ₀/κ, one obtains m_eff = κ·m₀, so high quoted scales can be read as latency-compressed regimes (κ ≫ 1). A speed-limit bound τ_copy ≥ τ_min then implies an upper plateau, providing a natural mechanism for “plateau selection” across analyses. The specific emergence of 470 GeV in ATLAS Run-1 is attributed to channel expansion (notably inclusion of the three-lepton channel), consistent with the thesis that the foregrounded number is sensitivity- and procedure-dependent rather than an intrinsic single mass.
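Editorial note: taking the abstract's relation m_eff = κ·m₀ at face value, the quoted collider scales map onto synchronization gains as in the short sketch below (illustrative arithmetic only).

```python
m0 = 58.1                                  # GeV, Golden Relation baseline from the abstract
for m_eff in (335, 470, 790, 840, 910):    # GeV, exclusion-limit scales quoted in the abstract
    kappa = m_eff / m0                     # implied synchronization gain, m_eff = kappa * m0
    print(f"m_eff = {m_eff:4d} GeV  ->  kappa ~ {kappa:.1f}")
```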
Article
Physical Sciences
Theoretical Physics

Azzam AlMosallami

Abstract: We present a detailed investigation of Planck-scale black holes within the framework of Causal Lorentzian Theory (CLT), built upon velocity-dependent conformal Lorentz transformations (VDC-LT). CLT provides a singularity-free, causal, and energy-conserving classical background suitable for semi-classical quantum analysis. We derive smoothed mass distributions to regularize curvature, compute causal Hawking radiation, and evaluate gravitationally induced phase accumulation in quantum particles. Extending to multiple particles, we construct N-particle gravitational correlation networks. CLT resolves divergences, enforces causal propagation at speed c, and provides a predictive framework for gravitational correlations mimicking entanglement, without requiring gravitons. The framework offers measurable predictions for micro-scale quantum experiments and early-universe scenarios.
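Editorial note: for scale, the standard semiclassical Hawking temperature at the Planck mass (to which the CLT analysis would add its causal, smoothed-mass corrections, not reproduced here) can be evaluated directly:

```python
from scipy.constants import hbar, c, G, k, pi

def hawking_temperature(mass_kg):
    """Standard semiclassical Hawking temperature: T_H = hbar c^3 / (8 pi G M k_B)."""
    return hbar * c**3 / (8 * pi * G * mass_kg * k)

m_planck = (hbar * c / G) ** 0.5           # ~2.18e-8 kg
print(f"Planck mass: {m_planck:.2e} kg, T_H: {hawking_temperature(m_planck):.2e} K")
```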
Article
Physical Sciences
Theoretical Physics

Mauro Duarte, Thais Sanomiya, Fábio Dahia, Carlos Romero

Abstract: We present a detailed study of the geometrization of the Proca field in the so-called Weyl Invariant Theory, shedding new light on the physical interpretation of the Weyl field. We first describe the field equations of the theory. We then obtain a solution for the weak field using a spherically symmetric and static approximate metric. Our analysis revealed that the Weyl field, in the weak field approximation, exhibits a behaviour identical to the Yukawa potential, similar to the Proca field. Furthermore, the obtained metric solution is equivalent to the Einstein-Proca case, demonstrating that the description of the Weyl field in the Weyl Invariant Theory is consistent with Proca theory in the context of General Relativity. Finally, we conclude that the Weyl field can be formally interpreted as a Proca field of geometrical nature.
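Editorial note: for reference, the weak-field behaviour attributed to the Weyl field above is the standard static Proca (Yukawa) profile; writing μ for the Proca mass parameter (notation assumed here), the screened potential solves the massive field equation outside the source:

```latex
\left(\nabla^{2} - \mu^{2}\right)\phi(r) = 0 \quad (r > 0)
\;\;\Longrightarrow\;\;
\phi(r) = \phi_{0}\,\frac{e^{-\mu r}}{r},
```

and the μ → 0 limit recovers the Coulomb-like 1/r form.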
