Review
Environmental and Earth Sciences
Pollution

Soledad González-Juárez, Nora Ruiz-Ordaz, Juvencio Galindez-Mayer

Abstract: Diffuse pollution from agricultural runoff, characterized by intermittent discharges of complex contaminant mixtures—including nutrients, pesticides, and heavy metals (HMs)—poses a persistent threat to global water quality. Conventional "end-of-pipe" strategies often fail to address these decentralized, nonpoint sources. This review examines the evolution of Permeable Reactive Barriers (PRBs) from static, abiotic filters into modern Permeable Reactive Bio-Barriers (PRBBs), engineered as dynamic, fixed-bed biofilm reactors. A key advancement in PRBB efficacy is the exploitation of biofilm plasticity, particularly in response to coexistence with organic and inorganic pollutants. While heavy metals are traditionally viewed as inhibitors, this review synthesizes evidence showing that sub-inhibitory HM levels can act as structural and functional drivers. These metals induce the upregulation of Extracellular Polymeric Substances (EPS), creating a "protective shield" that sequesters metals and confers functional resilience on the microbial consortia responsible for nutrient removal and pesticide biodegradation. The review analyzes contaminant removal mechanisms, highlighting the bio-chemo synergy between reactive media and biofilms, and proposes a classification framework based on target contaminants, media, and technological integration. Significant focus is placed on emerging hybrid multi-media systems designed to protect the "biological engine" from toxic metal shocks, alongside the integration of artificial intelligence for predictive control. While challenges in hydraulic sustainability and field validation remain, PRBBs represent a compact, low-energy, and scalable eco-technology. They offer a strategically targeted solution within the Nature-Based Solutions toolkit for building resilient protection of aquatic ecosystems at the critical land-water interface.

Article
Engineering
Other

Kevin MacG. Adams, Irfan Ibrahim, Steven L. Krahn

Abstract: This paper proposes a formal method and associated techniques for completing the ISO/IEC/IEEE Standard 15288 technical process 6.4.2 – stakeholder needs and requirements definition – within the 15288-SysML grid framework. The paper is a companion work to Engineering systems with standards and digital models: Development of a 15288-SysML Grid, which describes an engineering design method that supports the tenets of the Industry 4.0 paradigm. The formal method presented here is grounded in established constructs from systems science, specifically the systems principles of hierarchy, emergence, requisite parsimony, minimum critical specification, and requisite saliency. The application of accepted principles ensures that stakeholders are able to objectively specify measurable criteria that can satisfy stakeholder needs and capabilities. The method: (1) uses international standards for systems (e.g., ISO/IEC/IEEE 15288); (2) adopts the four fundamental aspects of system design supported by model-based systems engineering (MBSE); (3) invokes the international standard for the systems modeling language (SysML); and (4) adopts a hierarchical requirements tree that specifies Mission, Goals, Objectives, and Sub-objectives (MGOS) to provide the stakeholder-analysis process for the articulation of system-level engineering requirements. Utilization of the MGOS is intended to have a positive impact on the system design process by ensuring reproducibility, replicability, transparency, and generalization.
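The MGOS hierarchy described above can be sketched as a simple tree with traceable identifiers. This is a minimal illustration in plain Python; the class names, fields, and example statements are all hypothetical and are not taken from the 15288-SysML grid itself.

```python
# Minimal sketch of an MGOS requirements tree, assuming simple parent-child
# containment (Mission -> Goals -> Objectives -> Sub-objectives).
# All names and example statements are illustrative only.
from dataclasses import dataclass, field

@dataclass
class Node:
    level: str           # "Mission", "Goal", "Objective", or "Sub-objective"
    text: str            # stakeholder statement at this level
    children: list = field(default_factory=list)

    def add(self, child):
        self.children.append(child)
        return child

    def trace(self, prefix=""):
        """Yield each node with a hierarchical identifier for traceability."""
        yield prefix or "M", self
        for i, c in enumerate(self.children, 1):
            yield from c.trace(f"{prefix}.{i}" if prefix else f"M.{i}")

mission = Node("Mission", "Provide safe autonomous shuttle service")
goal = mission.add(Node("Goal", "Maintain passenger safety"))
obj = goal.add(Node("Objective", "Detect obstacles within 50 m"))
obj.add(Node("Sub-objective", "Classify obstacle type in under 100 ms"))

for ident, node in mission.trace():
    print(ident, node.level, "-", node.text)
```

The hierarchical identifiers give each requirement a stable handle, which is one way such a tree supports the reproducibility and traceability goals named in the abstract.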

Hypothesis
Biology and Life Sciences
Biophysics

C. Leonard Neatu

Abstract: Biological coherence arises from coordinated integration of redox chemistry, hydration dynamics, electromagnetic interactions, and bioenergetic flux. Although substantial progress has been made in characterizing these processes individually, current frameworks do not fully explain how distributed biochemical events achieve stable temporal coordination across scales. In thermally noisy, dissipative environments, energy alone cannot account for sustained biological organization. A missing element is the establishment and renewal of phase reference - the temporal alignment that enables spatially distributed processes to act in synchrony. Here we propose a physical mechanism for phase reference access and anchoring based on cyclic nanodomain dynamics at a nanoscale redox-photonic interface previously termed the Redox Photonic Coupling System (RPCS). This interface supports an additional functional modality - phase breathing - a process mediated by molecular nitrogen (N₂) through which cyclic nanodomain nucleation and collapse anchors and sustains phase reference in living systems. Nitrogen-mediated oscillatory boundary dynamics create transient coherence windows that permit local access to phase reference, enabling phase-aligned oxidative-reductive resolution and anchoring of phase onto redox-generated Photonic Activation Quanta (PAQs). Absorption of phase-conditioned PAQs by adjacent hydration shells enables generation and accumulation of centropy, defined as stored organizational capacity that supports coordinated biological work. This framework identifies phase breathing as a previously unrecognized mechanism sustaining biological coherence and assigns molecular nitrogen a structural organizational role beyond respiratory dilution. By integrating nanodomain mechanics, photonic phase conditioning, and redox dynamics within a single interface, it provides a mechanistic basis for how coherent biological function is generated and maintained.

Review
Public Health and Healthcare
Other

Nikolaou I. Aikaterini, Soldatou Alexandra, Grantzi Georgia-Christiana, Giapros Vasileios, Ladomenou Fani

Abstract: Maternal vaccination against respiratory syncytial virus (RSV) represents a major advance in early-life infection prevention. Although clinical efficacy and early real-world effectiveness are well established, sustained population-level impact depends on equitable uptake. This review synthesizes determinants influencing maternal RSV vaccination within the evolving dual-strategy landscape that includes both maternal vaccination and infant monoclonal antibody prophylaxis. A structured narrative review was conducted following PRISMA principles. PubMed/MEDLINE and Google Scholar were searched for studies published between January 2022 and February 2026. Eligible studies examined behavioral, interpersonal, structural, economic, and policy determinants of maternal RSV vaccination uptake, as well as early implementation and modelling evidence. Findings were integrated within a multilevel analytical framework. Maternal uptake is shaped by interacting determinants across individual, healthcare provider, and health system domains. Key drivers include perceived infant disease severity, vaccine safety confidence, perceived effectiveness, and prior antenatal vaccination behavior. Healthcare provider recommendation consistently emerges as the strongest facilitator. Coverage variability reflects differences in reimbursement, antenatal care integration, and national policy endorsement. The coexistence of maternal vaccination and infant monoclonal antibody strategies introduces additional comparative decision-making complexity. Early implementation data indicate heterogeneous uptake and socioeconomic gradients, while modelling demonstrates sensitivity to coverage, timing, epidemiology, and cost. Translating biological efficacy into sustained public health benefit requires coordinated behavioral, structural, and policy strategies, strong provider engagement, and context-sensitive implementation frameworks to ensure equitable coverage.

Hypothesis
Biology and Life Sciences
Life Sciences

Cheng Wang

Abstract: The Central Dogma has provided a foundational framework for biological information flow, yet it does not fully explain how living systems preserve stable identity, functional robustness, and recoverability under continuous molecular noise and environmental perturbation. Here, I propose the Central Homeostatic Principle (CHP) as a complementary first-principle framework that shifts the explanatory center from information execution alone to the physical constraint architecture that makes biological execution possible. The CHP posits that, in living cells, a central homeostatic state functions as a system-level coordinating layer that defines the feasible state space within which genetic and biochemical programs can operate. This framework is motivated by convergent evidence across mechanical confinement, electrophysiological coupling, membrane contact-site transduction, phase-state regulation, and non-genetic phenotypic heterogeneity, all of which indicate that global physical states can gate, reshape, or buffer molecular outcomes. Building from systemic prerequisites and material constraints, I further argue through an exclusionary first-principle analysis that lipid-organized boundary systems occupy a near-irreplaceable physical position in implementing this central homeostatic constraint in aqueous cellular life, not as exclusive causal authors, but as the dominant substrate of feasibility control. To render the theory scientifically actionable, this manuscript provides a formal articulation of CHP, a three-tier realization model, operational corollaries, and a rule typology that distinguishes stronger and weaker forms. It then derives a set of falsifiable hypotheses spanning temporal commitment dynamics, non-genetic resistance, aging-related resilience loss, state-engineering-based reprogramming, and evolutionary primacy in prebiotic systems.
By reframing life as a problem of constrained state maintainability rather than information flow alone, the CHP offers a testable theoretical scaffold for integrating molecular biology, biophysics, systems biology, and translational state engineering.

Data Descriptor
Engineering
Mechanical Engineering

Krisztian Horvath

Abstract: This data descriptor provides a standardized and reproducible subsystem-level representation of the NREL wind turbine gearbox condition monitoring benchmarking dataset. The released records are derived from Healthy (H1–H10) and Damaged (D1–D10) measurement files and include subsystem-level standardized indices (KHI_HS, KHI_IMS, KHI_PL) together with a calibrated 0–1 Gearbox Health Index (GHI). The indices are generated using a fully specified and deterministic feature extraction and aggregation workflow based on established vibration indicators and healthy-referenced normalization. The Zenodo deposit contains machine-readable CSV tables intended to support transparent benchmarking across supervised classification and anomaly detection studies. The proposed GHI is introduced as an interpretable and reproducible reference baseline rather than an optimized diagnostic model. Technical validation demonstrates condition-level separability within the analyzed dataset while emphasizing the descriptive nature of the index. By releasing structured derived records and a documented regeneration procedure, this work enables implementation-independent comparison of gearbox condition monitoring approaches and supports reproducible evaluation of alternative health index formulations.
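The healthy-referenced normalization behind a calibrated 0–1 health index can be sketched as follows. This is a minimal illustration only, not the deposit's actual workflow: the linear mapping, the RMS-style stand-in features, and all variable names are assumptions.

```python
import numpy as np

def health_index(feature, healthy_ref, damaged_ref):
    """Map a vibration indicator onto [0, 1]: 1 at the healthy reference,
    0 at the damaged reference. A linear healthy-referenced normalization;
    the actual GHI aggregation in the deposit may differ."""
    h, d = np.mean(healthy_ref), np.mean(damaged_ref)
    x = (feature - h) / (d - h)          # 0 at healthy mean, 1 at damaged mean
    return float(np.clip(1.0 - x, 0.0, 1.0))

rng = np.random.default_rng(0)
healthy_rms = rng.normal(1.0, 0.05, 10)   # stand-ins for H1-H10 indicators
damaged_rms = rng.normal(3.0, 0.30, 10)   # stand-ins for D1-D10 indicators

print(health_index(1.0, healthy_rms, damaged_rms))   # near 1 (healthy)
print(health_index(3.0, healthy_rms, damaged_rms))   # near 0 (damaged)
```

Clipping to [0, 1] keeps the index interpretable when a new measurement falls outside the span of the two reference conditions.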

Article
Physical Sciences
Astronomy and Astrophysics

Shoude Li

Abstract: If the invariance of light speed is absolutely universal in covariant space rather than contravariant space, the following investigations of and conclusions on general covariance attain very high probability and reliability; as has been verified, it is. General covariance proves controversial under investigations of gravitational redshift and acceleration. Further inspection of the differential geometry indicates the possibility that mixed derivatives of the basis vectors are unequal for transformations between Riemannian spaces, which leads to inequality of Christoffel symbols with interchanged lower indices and hence to the failure of the classical equations for Christoffel symbols. That is one of the sources of controversy over general covariance. Even the negative sign of the time metric can be shown to be a false setting, in that space transformations do nothing with the negative sign inherited from Minkowski space. Nevertheless, after discussing transformations among the original spherical space, the distance-expressed spherical space, and Cartesian space, it is seen that the distance factors for the angular coordinates of a spherical space are improper to employ as metrics in general relativity; instead, the concept of gravitational metrics is suggested. In fact, Christoffel symbols and base derivatives are both valid methodologies for analysis in a non-Euclidean space. The concept of trajectory derivatives is introduced to define derivatives along the trajectories of matter in motion, which helps to revise the corresponding equations and calculations. Measurable experiments on gravitational redshifts and accelerations have been carried out to support the theoretical results.
It is concluded that light speed retains general covariance in gravitational fields but light energy-momentum does not; likewise, the motions of massive matter in gravitational fields do not exhibit general covariance thoroughly. It is impossible to geometrize the gravitational effects of massive matter with position-dependent metrics, in that the variable velocities cannot be eliminated thoroughly. Consequently, inferences on kinematics and relativistic release are derived, which may be verified in applications. With the concept of gravitational metrics, the so-called geodesic equations are falsified as kinematic equations. Velocities, mass-energy, and momentum should all have conservative forms, and seeking these conservative forms is the critical route to constructing kinematic or dynamic equations of motion for light rays and massive matter. The conservation of light angular momentum is discovered in a most surprising form. Renovated solutions for light rays as well as massive matter are derived that strongly challenge the traditional methodologies for trajectory kinematics and time delay, because they are the correct interpretations of the realities. It should be highlighted that the renovated mass equation, the general mass equation for free motions, can completely demonstrate energy variations, especially variations in gravitational fields; this helps to create more general dynamic equations and yields an irrelativistic solution of planetary perihelion precession. As a proof by contradiction, the traditional solution must therefore involve errors. A Newtonian ballistic method is put forward for numerical analysis of close-to-light-speed motions, and the invalidity of that method for light propagation can be seen as further support for the conclusions and inferences.
Another forceful falsification of energy-momentum conservation, carried out in the discussion of the traditional treatment of the time delay of close-to-light-speed particles, thoroughly exposes the essential methodological mistakes. Dynamic models of fluid planetary rings are founded to interpret the evolution of accretion in quasars and active galactic nuclei and the mechanism of relativistic release. It is predicted that the peak release of an inflow lies at 1.5 gravitational radii and that the peak luminosity of an accretion may be located at about 1.33 gravitational radii. Relativistic frequency shift interprets the mechanism of giant redshifts, predicting that observed redshifts might reach levels considerably higher than those observed in past years. The width equations of emission and absorption lines indicate the mechanism and positions of broad-line regions and narrow-line regions. It may be imagined that relativistic emissions and relativistic absorptions with relativistic redshifts involve the fascinating mystery of the intrinsic structures of matter, of which we know little.

Article
Computer Science and Mathematics
Artificial Intelligence and Machine Learning

Jineng Ren

Abstract: Since the beginning of modern computer history, the Turing machine has been the dominant architecture for most computational devices; it consists of three essential components: an infinite tape for input, a read/write head, and finite control. In this structure, what the head can read (i.e., bits) is the same as what it has written/output. This is actually different from the way humans think or perform thought/tool experiments. More precisely, what humans imagine or write on paper are images or texts, and these are not the abstract concepts that they represent in the human brain. This difference is neglected by the Turing machine, but it actually plays an important role in abstraction, analogy, and generalization, which are crucial in artificial intelligence. Compared with that architecture, the proposed architecture uses two different types of heads and tapes, one for traditional abstract bit inputs/outputs and the other for specific visual ones. The mapping rules between the abstract bits and the specific images/texts can be realized by neural networks with a high accuracy rate. Logical reasoning is thus performed through the transfer of mapping rules. The statistical decidability of the Halting Problem with an imperceptibly small error rate in reasoning steps is established for this type of machine. As an example, this paper presents how the new computer architecture (what we call the "Ren machine" for simplicity here) autonomously learns a distributive property/rule of multiplication in the specific domain and further uses the rule to generate a general method (mixed between the abstract domain and the specific domain) to compute the multiplication of any positive integers based on images/texts. The machine's strong reasoning ability is also corroborated by proving a theorem in plane geometry.
Moreover, a robotic architecture based on the Ren machine is proposed to address the challenges faced by Vision-Language-Action (VLA) models, namely unsound reasoning ability and high computational cost.
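The distributive rule the abstract refers to can be illustrated in plain Python. This sketch shows only the arithmetic rule applied over decimal digits; it does not model the Ren machine's image-based heads, tapes, or learned mappings.

```python
def multiply_distributive(a: int, b: int) -> int:
    """Multiply positive integers using the distributive rule
    a*(x + y) = a*x + a*y over the decimal digits of b, with each
    single-digit product built by repeated addition. This illustrates
    the rule itself, not any image-based learning mechanism."""
    total = 0
    for power, digit in enumerate(reversed(str(b))):
        d = int(digit)
        partial = sum(a for _ in range(d))  # a*d by repeated addition
        total += partial * 10 ** power      # shift: a*(d*10^k) = (a*d)*10^k
    return total

print(multiply_distributive(123, 456))   # 56088
```

Decomposing b digit by digit is exactly the distributive expansion 123*456 = 123*400 + 123*50 + 123*6, the kind of general method the abstract says the machine derives from the learned rule.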

Article
Physical Sciences
Theoretical Physics

Dennis Kahan

Abstract: Foundational tensions between special relativity and quantum mechanics, together with conflicts between general relativity and quantum gravity, and unresolved cosmogonical and cosmological anomalies, block theoretical unification and limit explanatory depth. Based on ontological first principles rather than mathematical constructs, this analysis integrates a “discrete,” background-independent, relativistic 4D spacetime with a physically co-located Planck Domain. Through a one-to-one identity, the Planck Domain mirrors the discrete spatial elements of 4D spacetime, enabling a single, unified set of physical laws across quantum and classical regimes. Under this framework, ontic single- and N-body quantum states evolve deterministically in 4D spacetime and collapse instantaneously in the Planck Domain. Current theoretical tensions between special relativity and quantum mechanics, including nonlocality, separability, time, simultaneity, total energy scaling, and probability conservation, are reappraised by replacing the Hilbert-space wavefunction with an ontic energy field and a single energy-based operator that governs both motion and gravitational response. The identical ontological framework and dynamical laws apply unchanged across general relativity and quantum gravity, recasting gravity as the relational dynamics of discrete energy rather than the coupling of the stress-energy and metric tensors and reappraising the equivalence principle and the black hole information paradox. Cosmogonically, the same model re-examines the origin of 4D spacetime, accounting for near-homogeneity, isotropy, and low gravitational entropy without ad hoc assumptions, fine-tuning, or perturbative techniques, and provides ontological foundations for the cosmological constant and global energy conservation. 
Eight descriptive mathematical validations, derived from a unified evolution law, Planck Domain collapse rule, and the relational gravity law, support (but do not govern) the analysis: (i) the low-ℓ CMB TT shape generated from a field with one global amplitude on power; (ii) CHSH correlations at Tsirelson’s bound from collapse; and (iii–viii) hard-mass relational dynamics, highlighted by a tilted Earth–Moon orbit, a tilted hierarchical three-body system, and a high-energy Mercury–Sun analog, all sustained for 1000 orbits or inner orbits.

Article
Physical Sciences
Quantum Science and Technology

Zhaoxu Ji, Huanguo Zhang

Abstract: Since its establishment a century ago, quantum mechanics has developed a very large theoretical system, but quantum mechanical phenomena still lack a generally accepted explanation, which indicates that the existing theoretical system is incomplete. Inspired by ancient Chinese philosophy, we propose a theoretical framework in this paper that provides a new perspective for explaining quantum mechanical phenomena, including superposition and entanglement. In addition, the proposed framework contributes to a profound understanding of the law of conservation of energy. We show through examples how basic superposition states and entangled states are constructed. Our work can inspire people to think deeply about the mysteries of nature, especially quantum mechanical phenomena.

Article
Computer Science and Mathematics
Artificial Intelligence and Machine Learning

Jinyu Chen, Feiyang Wang, Tian Guan, Yumeng Ma, Linghao Yang, Yutong Wang

Abstract: Large language model (LLM)-based multi-agent systems have demonstrated remarkable capabilities in collaborative task solving. However, the very mechanisms that facilitate seamless cooperation, such as shared contexts, role assignments, and iterative message passing, present significant risks of unintentional information disclosure. We present MIN-Trust, a trust orchestration framework that enforces Minimum Necessary Information (MNI) constraints, an operationalization of the data-minimization principle for inter-agent communication, while maintaining task effectiveness. Our approach introduces an MNI-Gate that automatically classifies and filters information into essential, summarized, or pointer-referenced subsets before transmission. Additionally, we propose a Trust-Gated Channel (TGC) that counterintuitively increases verification requirements, rather than relaxing information access, as inter-agent trust rises. Through experiments on four collaborative tasks using public benchmarks, we demonstrate that MIN-Trust reduces sensitive information exposure by 67.8% compared to baseline multi-agent frameworks while maintaining 93.3% of task success rates. Our evidence traceability mechanism achieves 84.2% claim-to-source attribution, significantly outperforming conventional approaches. These results suggest that privacy-preserving multi-agent collaboration is achievable under synthetic benchmark conditions with moderate performance trade-offs.
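A gate with the three disclosure tiers named in the abstract (essential, summarized, pointer-referenced) might look like the following minimal sketch. The classification heuristic, the sensitive-term list, and the pointer scheme are all illustrative assumptions, not the MIN-Trust implementation.

```python
import re

# Illustrative sensitive-term detector; a real gate would use a proper
# classifier rather than a keyword pattern.
SENSITIVE = re.compile(r"\b(ssn|password|api[_ ]?key|salary)\b", re.I)

def mni_gate(message: str, needed_for_task: bool) -> dict:
    """Filter a message into one of three tiers before inter-agent transmission."""
    if SENSITIVE.search(message):
        # Never transmit raw sensitive content; hand out an opaque pointer
        # the receiving agent must resolve through an audited store.
        return {"tier": "pointer", "payload": f"ref://{abs(hash(message)):x}"}
    if needed_for_task:
        return {"tier": "essential", "payload": message}
    # Merely helpful context gets truncated to a summary stub.
    return {"tier": "summarized", "payload": message[:60] + "..."}

print(mni_gate("The API_KEY is sk-123", needed_for_task=True)["tier"])   # pointer
print(mni_gate("Deploy at 14:00 UTC", needed_for_task=True)["tier"])     # essential
```

The key design point is that sensitivity overrides task need: even task-essential secrets travel only as references, which is one way to operationalize the data-minimization constraint described above.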

Brief Report
Medicine and Pharmacology
Obstetrics and Gynaecology

Prajwal Shetty, B A Sujeewa Fernando, B Anuthi Fernando, Sindhu Sekar, Lakshmi Jayaraj

Abstract: Background: Population ageing is accelerating worldwide, accompanied by a rising prevalence of multimorbidity and polypharmacy. Medicines with anticholinergic properties are commonly prescribed to older adults for a wide range of conditions, including depression, urinary incontinence, Parkinson’s disease, allergies, and respiratory disorders. While short-term anticholinergic effects such as dry mouth and constipation are well recognised, increasing evidence suggests that cumulative anticholinergic exposure may contribute to adverse cognitive outcomes in older populations. Objective: This review aims to discuss the concept of anticholinergic burden, outline commonly used tools to quantify exposure, and examine the evidence linking cumulative anticholinergic exposure with cognitive decline and other adverse health outcomes. It also explores strategies to identify and mitigate anticholinergic burden in clinical practice. Methods: Relevant literature on anticholinergic medications, burden scales, and associated clinical outcomes was reviewed. Attention was given to validated measurement tools such as the Anticholinergic Cognitive Burden (ACB) scale, Anticholinergic Risk Scale (ARS), and Anticholinergic Drug Scale (ADS), as well as studies examining associations between anticholinergic exposure and cognitive and functional outcomes. Results: Evidence from observational studies indicates that higher cumulative anticholinergic burden is associated with increased risks of cognitive impairment, delirium, falls, functional decline, and possibly dementia. Measurement tools allow clinicians and researchers to estimate cumulative exposure, with several studies identifying clinically meaningful risk at moderate to high burden scores. Conclusion: Anticholinergic burden represents a potentially modifiable contributor to adverse outcomes in ageing populations.
Routine assessment of anticholinergic exposure, careful medication review, and deprescribing strategies where appropriate may help reduce avoidable cognitive and functional harm in older adults. Integrating burden assessment into prescribing systems and clinical decision support tools may further support safer pharmacotherapy in an ageing society.
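Burden scales of the ACB type are additive over a patient's medication list. The following is a minimal sketch, assuming illustrative per-drug scores and the commonly cited convention that a total of 3 or more indicates clinically significant burden; real assessments should use the published scales, not these example values.

```python
# Illustrative per-drug ACB-style scores (0 = none ... 3 = definite
# anticholinergic activity). Example values only; consult the published
# ACB scale before any clinical use.
ACB_SCORES = {
    "amitriptyline": 3,
    "oxybutynin": 3,
    "loratadine": 1,
    "metformin": 0,
}

def acb_burden(medications):
    """Sum per-drug scores over a medication list; drugs missing from the
    table contribute 0. Flags totals at the >=3 threshold often treated
    as clinically significant."""
    total = sum(ACB_SCORES.get(m.lower(), 0) for m in medications)
    flag = "high (>=3): review for deprescribing" if total >= 3 else "low"
    return total, flag

total, flag = acb_burden(["Amitriptyline", "Loratadine"])
print(total, flag)   # 4 high (>=3): review for deprescribing
```

The additive structure is what makes cumulative exposure easy to embed in prescribing systems and decision-support tools, as the review suggests.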

Article
Computer Science and Mathematics
Computational Mathematics

Basem Ajarmah, Saber Syouri

Abstract: Managing inventory for perishable goods remains a persistent operational challenge, largely because conventional exponential decay models struggle to capture the irregular deterioration patterns observed in practice. This paper develops the Reliable Fractional Derivative (RFD) framework, which incorporates memory effects into the modeling of product decay through a time-shifted kernel. Unlike standard approaches that assume constant deterioration, this formulation accommodates both accelerating and decelerating patterns depending on product characteristics and storage conditions. We derive closed-form expressions for optimal ordering quantities under both deterministic and stochastic demand, then test the framework's performance through numerical experiments spanning two thousand parameter combinations. The analysis reveals that RFD models deliver the greatest improvements when deterioration rates are steep, holding costs are substantial, or storage horizons are extended—conditions under which switching from conventional methods yields average cost reductions approaching nineteen percent, with substantially larger gains in certain cases. A pharmaceutical application confirms savings between 3.6 and 9.1 percent relative to misspecified traditional models. These findings connect with recent industry movements toward more sophisticated safety-stock practices, offering managers a principled basis for selecting inventory policies aligned with actual product behavior rather than assuming decay conforms to simpler theoretical forms.
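The memory effect that distinguishes fractional from exponential decay can be illustrated with the standard Caputo-type relaxation solution, the Mittag-Leffler function. This is a generic sketch of memory-kernel decay, not the paper's time-shifted RFD kernel, and the parameter values are arbitrary.

```python
from math import gamma, exp

def mittag_leffler(z, alpha, terms=60):
    """Series E_alpha(z) = sum_k z^k / Gamma(alpha*k + 1); adequate for
    moderate |z|. E_1(z) reduces to exp(z)."""
    return sum(z ** k / gamma(alpha * k + 1) for k in range(terms))

def surviving_fraction(t, lam, alpha):
    """Fraction of stock not yet deteriorated at time t under fractional
    decay of order alpha. alpha = 1 recovers exponential decay exp(-lam*t);
    alpha < 1 gives fast early loss followed by a heavy late tail."""
    return mittag_leffler(-lam * t ** alpha, alpha)

lam, t = 0.3, 5.0
print(round(surviving_fraction(t, lam, 1.0), 4))        # matches exp(-1.5)
print(surviving_fraction(t, lam, 0.7) > exp(-lam * t))  # True: heavier tail
```

The single order parameter alpha is what lets a memory-kernel model capture both accelerating and decelerating deterioration that a fixed exponential rate cannot.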

Review
Public Health and Healthcare
Public Health and Health Services

Angyiba Serge Andigema, Dimalla Paola Aphrodite Olive

Abstract: Vaccine hesitancy has evolved from episodic resistance to a structural threat to global health systems. Although opposition to vaccination has accompanied immunisation since its inception, contemporary hesitancy reflects a transformation driven by digital information ecosystems, political polarisation, institutional mistrust, and shifting risk perceptions. Its consequences extend beyond individual vaccine refusal to systemic vulnerabilities within immunisation programs. Here, we synthesise historical and contemporary evidence to examine vaccine hesitancy as a multilevel phenomenon shaped by sociocultural identities, psychological heuristics, and political governance structures. Tracing its trajectory from early smallpox resistance to COVID-19-era polarisation, we identify recurring patterns of mistrust, moral framing, and autonomy-based resistance that re-emerge across contexts. We argue that vaccine hesitancy operates not merely as an attitudinal deficit but as a reflection of broader fractures in social trust and institutional legitimacy. We further analyse how clustering of under-immunised populations, digital misinformation amplification, and politicisation of public health undermine immunisation resilience. Evidence suggests that durable solutions require trust-centred governance, community co-production of health strategies, behavioural-insight-informed communication, and structural reforms that address inequity and historical injustice. Reconceptualising vaccine hesitancy as a systems-level vulnerability reframes immunisation programs as social contracts as much as biomedical interventions. Strengthening these contracts will be central to sustaining global vaccination gains in an era defined by misinformation, institutional fragility, and recurrent pandemic threats.

Case Report
Medicine and Pharmacology
Anesthesiology and Pain Medicine

Jeongsoo Choi, Ho Soon Jung, Da Hyung Kim, Yong Han Seo, Hea Rim Chun, Hyung Yoon Gong, Jae Young Ji, Jin Soo Park, Sangwoo Im

Abstract: Background and Clinical Significance: Patent ductus arteriosus (PDA) is a common cardiovascular disorder in extremely low birth weight (ELBW) infants, for which surgical ligation is indicated when pharmacologic closure fails. Sudden increases in afterload combined with immature myocardial contractility can lead to post-ligation cardiac syndrome, which usually occurs within hours after surgery. However, acute intraoperative hemodynamic collapse during PDA ligation has rarely been described. Case Presentation: A preterm infant born at 24 weeks and 3 days of gestation with a birth weight of 890 g underwent emergency PDA ligation for a hemodynamically significant PDA refractory to pharmacological treatment. Fifteen minutes after skin incision, the infant developed severe hypoxemia, bradycardia, and non-measurable noninvasive blood pressure, which required immediate hemodynamic resuscitation with manual ventilation, fluid administration, and dopamine and dobutamine infusions. Hemodynamics gradually recovered after completion of ductal ligation, whereas hypoxemia persisted. Postoperative chest radiography revealed a left-sided pneumothorax, and oxygen saturation stabilized after pleural air aspiration. The subsequent clinical course was uneventful, and typical post-ligation cardiac syndrome did not develop. Conclusions: This case suggests that intraoperative hemodynamic collapse during PDA ligation may share pathophysiologic features with post-ligation cardiac syndrome, and that concomitant pneumothorax can further aggravate hemodynamic instability by worsening hypoxemia and reducing venous return.

Article
Physical Sciences
Astronomy and Astrophysics

Mohamed Sacha

Abstract: We develop an information-theoretic route from microscopic conserved-charge dynamics to an infrared mass prediction in the minimal Z2 singlet-scalar Higgs-portal dark-matter model. We define an operational quantum information copy time \( \tau_{\mathrm{copy}}(Q) \) for a conserved charge Q and introduce a Liouvillian-squared information susceptibility \( \chi^{(2)} \) based on the Kubo--Mori metric. Empirically, across several decades in \( \chi^{(2)} \) we find the robust scaling \( \tau_{\mathrm{copy}}(Q)\propto (\chi^{(2)}_{Q})^{-1/2} \) (Table 1 and Figure 1). Analytically, a general linear-response/Cauchy-Schwarz inequality bounds the growth rate of any receiver-optimised overlap by \( \sqrt{\chi^{(2)}_Q} \); for a fixed operational threshold \( \eta \) and normalised sender/receiver operators this implies the conditional lower bound \( \tau_{\mathrm{copy}}\gtrsim \eta/\sqrt{\chi^{(2)}_Q} \) under mild regularity/monotonicity assumptions (Closure Supplement, Section ``Copy-time bound''). We also provide stabiliser-code diffusion benchmarks that illustrate the scaling and help calibrate normalisations in the diffusive universality class. We then argue that spatially varying copy times naturally define an ``optical'' geometry for coarse-grained information propagation: a local information speed \( v_{\mathrm{info}}(x)\propto \tau_{\mathrm{copy}}(x)^{-1} \) induces an effective metric, and diffeomorphism invariance in the long-wavelength description implies that the Einstein--Hilbert term is the leading infrared operator, with higher-derivative corrections controlled by gradients of \( \tau_{\mathrm{copy}} \). In this perspective, we define the scalar dressing parameter \( \kappa_{\text{eff}} \) intrinsically from microscopic QICT susceptibilities in the electroweak-symmetric regime; asymptotic-safety FRG results, when invoked, serve only as an external cross-check rather than as a foundational assumption.
Within a gauge-coded QCA realising a Standard-Model-like generation, anomaly cancellation singles out hypercharge Y as the unique non-trivial anomaly-free Abelian factor coupling to both quarks and leptons; we also provide a self-contained anomaly calculation (see the Closure Supplement, "Hypercharge from anomaly constraints") and emphasise that this selects a one-dimensional anomaly-free direction. This is a minimal-factor selection under the stated assumptions and does not exclude embeddings, additional hidden sectors, or discrete quotients. Matching to a thermal Standard Model plasma at a reference temperature \( T_\star\ \) in the electroweak-symmetric regime \( T_\star\gtrsim T_{\rm EW} \), and adopting benchmark inputs (with an explicit operational construction of \( T_\star\ \) given in the Closure Supplement (Point~(6)) and an explicit interacting thermal-QCA susceptibility protocol given in the Closure Supplement (Copy-time bound / Point~(6))), \( \frac{\chi_Y}{T_\star^2} = 0.145 \pm 0.010,\qquad \) \( \kappa_{\mathrm{eff}} = 0.1356 \pm 0.0714,\qquad \) \( C_\Lambda = 1.606 \pm 0.044 \), we obtain the Golden Relation \( m_S = C_\Lambda \sqrt{\kappa_{\mathrm{eff}}\,\chi_Y} \) and the prediction \( m_S = 58.5 \pm 15.6~\text{GeV},\qquad \) \( m_S \in [43,74]~\text{GeV}\ \text{(conservative)} \). We provide a minimal, fully analytic phenomenological consistency check of the Higgs-portal model in the vicinity of the Higgs resonance, using closed-form expressions for the Higgs invisible width and the spin-independent nucleon cross section. The mass prediction is conditional on the explicit benchmark intervals and on the stated matching assumptions; the copy--susceptibility exponent is universal in the variational sense above, while the overall normalisation entering the benchmark closure is calibrated using a diffusive benchmark class (a separate step, not used in the unconditional bound).
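As a quick arithmetic check of the quoted error band: since the Golden Relation is a product, \( m_S = C_\Lambda \sqrt{\kappa_{\mathrm{eff}}\,\chi_Y} \), the standard quadrature propagation of relative uncertainties (with the square root contributing half the relative error of each factor under it) reproduces the ±15.6 GeV interval from the benchmark inputs alone. A minimal sketch in plain Python, using only the numbers quoted in the abstract:

```python
import math

# Benchmark intervals quoted in the abstract
chi, d_chi = 0.145, 0.010        # chi_Y / T_star^2
kappa, d_kappa = 0.1356, 0.0714  # kappa_eff
C, d_C = 1.606, 0.044            # C_Lambda
m_central = 58.5                 # GeV, central value of m_S quoted above

# m_S = C * sqrt(kappa * chi): relative errors add in quadrature,
# the sqrt halving the relative error of kappa and chi
rel = math.sqrt((d_C / C) ** 2
                + (0.5 * d_kappa / kappa) ** 2
                + (0.5 * d_chi / chi) ** 2)
d_m = rel * m_central  # ~15.6 GeV, matching the quoted m_S = 58.5 +/- 15.6 GeV
```

The band is dominated by the wide \( \kappa_{\mathrm{eff}} \) interval, which is why the later listing with the tighter \( \kappa_{\mathrm{eff}} = 0.136 \pm 0.019 \) quotes the much narrower ±8.6 GeV.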

Article
Medicine and Pharmacology
Gastroenterology and Hepatology

Mihaela Cristina Brisc

,

Elena Emilia Babes

,

Sabina Florina Călugăr-Șolea

,

Simona Bota

,

Laura Maghiar

,

Ciprian Mihai Brisc

,

Ciprian Brisc

Abstract: During routine evaluation of hospitalized patients, discrepancies were frequently observed between the degree of liver steatosis assessed by conventional B-mode ultrasonography and Vibration-Controlled Transient Elastography (VCTE) with Controlled Attenuation Parameter (CAP). This study aimed to identify the factors contributing to these differences and to determine whether both imaging methods should be expected to produce comparable steatosis classifications. We conducted an observational retrospective cross-sectional study including 130 patients admitted over a two-year period who underwent laboratory testing, abdominal ultrasonography, and transient elastography. Variables analyzed included age, sex, environment, nutritional status, comorbidities, biochemical parameters (ALAT, total cholesterol, triglycerides, GGT), and the calculated FIB-4 score. Patients were classified into two groups: 61 with concordant steatosis grades across both methods and 69 with discordant results. Concordant results were more common in individuals with serum total cholesterol >200 mg/dL (45.9%) and in those with a FIB-4 score between 1.45 and 3.25 (44.2%). Additionally, a trend toward a stronger correlation was observed in patients with elevated triglycerides. Viral liver disease showed a significantly higher rate of discordant results (26.2%). Total serum cholesterol >200 mg/dL and a FIB-4 score between 1.45 and 3.25 were significantly associated with concordance in steatosis grading, while serum triglyceride levels showed a nonsignificant trend toward concordance. In contrast, viral hepatitis with concomitant steatosis was associated with discordant findings between the two imaging modalities. Although not statistically significant, a VCTE-measured fibrosis grade of F ≥ 2 and a FIB-4 score >3.25 also showed a trend toward discordance, suggesting that they may contribute to variability in steatosis assessment.
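The FIB-4 thresholds used for grouping above (1.45 and 3.25) refer to the standard index FIB-4 = (age × AST) / (platelets × √ALT). A minimal sketch of the calculation and banding, with hypothetical laboratory values chosen purely for illustration:

```python
import math

def fib4(age_years: float, ast_u_l: float, alt_u_l: float,
         platelets_1e9_l: float) -> float:
    """FIB-4 index: (age [yr] x AST [U/L]) / (platelets [10^9/L] x sqrt(ALT [U/L]))."""
    return (age_years * ast_u_l) / (platelets_1e9_l * math.sqrt(alt_u_l))

# Hypothetical patient values, for illustration only
score = fib4(age_years=55, ast_u_l=40, alt_u_l=36, platelets_1e9_l=220)

# Banding with the thresholds used in the study
if score < 1.45:
    band = "low (<1.45)"
elif score <= 3.25:
    band = "indeterminate (1.45-3.25)"
else:
    band = "high (>3.25)"
# score ~ 1.67 here, landing in the indeterminate band
```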

Article
Physical Sciences
Theoretical Physics

Sacha Mohamed

Abstract: We develop an information-theoretic route from microscopic conserved-charge dynamics to an infrared mass prediction in the minimal Z2 singlet-scalar Higgs-portal dark-matter model. We define an operational quantum information copy time \( \tau_{\mathrm{copy}}(Q)\ \) for a conserved charge Q and introduce a Liouvillian-squared information susceptibility \( \chi^{(2)}_{\mathrm{micro},Q}\ \) based on the Kubo--Mori metric. Under explicit locality, spectral-gap and hydrodynamic assumptions, we formulate a conditional scaling theorem implying \( \tau_{\mathrm{copy}}(Q)\propto \bigl(\chi^{(2)}_{\mathrm{micro},Q}\bigr)^{-1/2}\ \); we provide numerical evidence for this scaling in stabiliser-code diffusion models (Supplemental Material). We then argue that spatially varying copy times naturally define an "optical" geometry for coarse-grained information propagation: a local information speed \( v_{\mathrm{info}}(x)\propto \tau_{\mathrm{copy}}(x)^{-1}\ \) induces an effective metric, and diffeomorphism invariance in the long-wavelength description implies that the Einstein--Hilbert term is the leading infrared operator, with higher-derivative corrections controlled by gradients of \( \tau_{\mathrm{copy}}\ \). In this perspective, we define the scalar dressing parameter \( \kappa_{\text{eff}} \) intrinsically from microscopic QICT susceptibilities in the electroweak-symmetric regime; asymptotic-safety FRG results, when invoked, serve only as an external cross-check rather than as a foundational assumption. Within a gauge-coded QCA realising a Standard-Model-like generation, anomaly cancellation singles out hypercharge Y as the unique non-trivial anomaly-free Abelian factor coupling to both quarks and leptons.
Matching to a thermal Standard Model plasma at a reference temperature T⋆ in the electroweak-symmetric regime (T⋆≳TEW), and adopting benchmark inputs (with an explicit operational construction of T⋆ given in Supplement~S7), \( \frac{\chi_Y}{T_\star^2} = 0.145 \pm 0.010,\qquad \) \( \kappa_{\mathrm{eff}} = 0.136 \pm 0.019,\qquad \) \( C_\Lambda = 1.6 \pm 0.2 \), we obtain the Golden Relation \( m_S = C_\Lambda \sqrt{\kappa_{\mathrm{eff}}\,\chi_Y} \) and the prediction \( m_S = 58.4 \pm 8.6~\text{GeV},\qquad m_S \in [50,67]~\text{GeV}\ \text{(conservative)} \). We provide a minimal, fully analytic phenomenological consistency check of the Higgs-portal model in the vicinity of the Higgs resonance, using the closed-form expressions for the Higgs invisible width and the spin-independent nucleon cross section. We emphasise that the mass prediction is conditional on the input benchmark intervals and on the diffusive QICT universality class assumptions.

Article
Environmental and Earth Sciences
Sustainable Science and Technology

Jacek A. Biskupski

,

Miroslaw Dechnik

Abstract: The increasing prevalence of rooftop photovoltaics on European buildings has sparked interest in using façades and balconies as alternative surfaces for generating solar energy. This study examines the technical and economic performance of building-integrated photovoltaic (BIPV) installations on façades and balconies under real operating conditions. Four case studies from Poland are analysed using a combination of measured energy production data and simulations performed with the PVGIS tool. The analysis focuses on annual and seasonal energy yield, self-consumption potential, system costs, simple payback time and the role of module-level power electronics (MLPE) in mitigating the effects of shading and non-optimal orientations. The results demonstrate that, while façade-mounted PV systems generally have lower annual yields than optimally tilted rooftop installations, balcony and façade building-attached photovoltaic (BAPV) systems with MLPE can achieve high self-consumption rates, short payback periods (3–10 years) and favourable winter performance. These findings demonstrate that BIPV and BAPV systems on façades should be assessed using distinct technical and economic criteria, and highlight their potential to extend prosumer participation to apartment dwellers, thereby supporting a more inclusive urban energy transition.
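Simple payback time, as used above, is just the upfront system cost divided by the annual bill savings from self-consumed production. A minimal sketch with entirely hypothetical inputs (invented for illustration, chosen only to land inside the reported 3–10-year range):

```python
# All figures hypothetical; the study's actual costs and yields differ per case.
system_cost_eur = 1200.0          # assumed turnkey cost of a balcony system
annual_yield_kwh = 550.0          # assumed annual production
self_consumption_rate = 0.85      # share of production consumed on site (MLPE helps here)
electricity_price_eur_kwh = 0.30  # assumed retail tariff

# Annual savings come only from the self-consumed share at the retail price
annual_savings = annual_yield_kwh * self_consumption_rate * electricity_price_eur_kwh
payback_years = system_cost_eur / annual_savings
# 550 * 0.85 * 0.30 = 140.25 EUR/yr -> payback ~ 8.6 years
```

The sketch makes the key sensitivity visible: payback scales inversely with the self-consumption rate and the retail tariff, which is why high-self-consumption balcony systems can pay back quickly despite modest yields.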

Article
Social Sciences
Political Science

Irfan Ananda Ismail

Abstract: This paper proposes mengolah, a culturally embedded Indonesian term describing informal grassroots lobbying and political brokerage, as a decolonial methodology and medium of political communication for understanding youth political participation in Indonesia. Grounded in the everyday practices of Indonesian political culture, mengolah represents a distinct form of political engagement that operates through personal networks, informal negotiation, and relational trust rather than formal institutional channels. This study explicitly positions mengolah not as an inherently corrupt practice but as a legitimate cultural medium through which citizens engage with democratic processes, functioning analogously to constituent services and political networking in Western democracies while reflecting Indonesian values of kebersamaan (togetherness) and gotong royong (mutual cooperation). Drawing on data from the 2024 Indonesian general elections, where youth voters comprised 56% of the electorate, this study examines how mengolah functions as both a grassroots political methodology and a structured pathway for political mobility. Skilled practitioners of mengolah (pengolah) typically progress from grassroots volunteers to organizational leaders in organisasi masyarakat (mass organizations) and eventually to formal party cadres or elected officials. This trajectory demonstrates that mengolah serves as political apprenticeship, a medium for cultivating democratic capacities and connecting informal community leadership with institutional politics. Through analysis of social media data, electoral brokerage patterns, and youth political behavior, this study contributes to the project of decolonizing political science by centering indigenous Indonesian political practices as legitimate, functional, and epistemologically significant objects of scholarly inquiry.


Preprints.org is a free preprint server supported by MDPI in Basel, Switzerland.
