Article
Biology and Life Sciences
Toxicology

Sakthivela Anandhan

,

Kavitha K

Abstract: Parkinson’s disease (PD) is a neurodegenerative disorder with limited disease-modifying therapies. Computational models can provide predictive insights into drug properties, although critically limited datasets pose challenges. Fifteen FDA-approved Parkinson’s disease drugs were represented as hydrogen-suppressed molecular graphs. Twelve degree-based topological indices were computed and used as descriptors for predicting seven physicochemical properties (MR, P, MV, MW, nHA, nRotB, Complexity). Multi-layer perceptron artificial neural network (ANN) and Random Forest (RF) models were trained. Model performance was evaluated using Leave-One-Out Cross-Validation (LOOCV). The statistical robustness of the models was verified using a Y-randomization test. Shapley Additive Explanations (SHAP) were applied for interpretability. The ANN demonstrated high predictive correlation on the small dataset for MR (R² = 0.876), P (R² = 0.875), MW (R² = 0.837), and nHA (R² = 0.901). Lower predictive performance was observed for MV (R² = 0.729), molecular Complexity (R² = 0.706), and nRotB (R² = 0.308). RF provided comparable results but was generally outperformed by ANN. The Y-randomization test yielded consistently negative average R²rand values (lowest R²rand = -1.708), confirming the absence of chance correlation. SHAP analysis identified the most influential topological indices for each property in ANN. ANN-based QSPR modeling with degree-based descriptors can accurately predict physicochemical properties of PD drugs for certain endpoints. These models were proven statistically robust through Y-randomization validation. Limitations include the small dataset size and high-dimensional descriptor space, highlighting the need for external validation, larger datasets, and inclusion of additional 3D/quantum descriptors for more complex pharmacokinetic endpoints.
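The LOOCV and Y-randomization workflow described in this abstract can be illustrated with a minimal, self-contained sketch. Everything in it is a hypothetical stand-in: a single synthetic descriptor and a one-variable least-squares model replace the paper's twelve topological indices and ANN/RF models, and the data are made up. The point is only the mechanics: cross-validated Q² stays high for the real labels and collapses (typically going negative) once the labels are shuffled.

```python
import random

def fit_line(xs, ys):
    # Ordinary least squares for y = a*x + b.
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxx = sum((x - mx) ** 2 for x in xs)
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    a = sxy / sxx
    return a, my - a * mx

def loocv_q2(xs, ys):
    # Leave-one-out cross-validated Q^2 = 1 - PRESS / SS_tot.
    n = len(xs)
    my = sum(ys) / n
    press = 0.0
    for i in range(n):
        xt, yt = xs[:i] + xs[i + 1:], ys[:i] + ys[i + 1:]
        a, b = fit_line(xt, yt)
        press += (ys[i] - (a * xs[i] + b)) ** 2
    return 1.0 - press / sum((y - my) ** 2 for y in ys)

random.seed(7)
xs = [i / 2 for i in range(15)]                    # 15 "drugs", 1 descriptor
ys = [2.0 * x + random.gauss(0, 0.5) for x in xs]  # property correlated with descriptor

q2_true = loocv_q2(xs, ys)

# Y-randomization: shuffle the property values and re-validate.
q2_rand = []
for _ in range(20):
    ys_shuffled = ys[:]
    random.shuffle(ys_shuffled)
    q2_rand.append(loocv_q2(xs, ys_shuffled))
q2_rand_mean = sum(q2_rand) / len(q2_rand)

print(q2_true)       # high: the real relationship survives LOOCV
print(q2_rand_mean)  # low: no comparable chance correlation
```

A real QSPR pipeline would of course use the full descriptor matrix and a nonlinear learner, but the acceptance logic is the same: the model is trusted only if Q²(true) is high while the average Q²(randomized) is near or below zero.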

Essay
Arts and Humanities
Philosophy

D. John Doyle

Abstract: The rapid emergence of artificial intelligence (AI) language models has generated intense debate regarding their appropriate role in scholarly communication. Critics frequently argue that AI-assisted writing undermines intellectual authenticity by bypassing the traditional labor associated with authorship. This commentary proposes an analogy between AI-assisted writing and laboratory-grown diamonds. Both produce artifacts that are materially indistinguishable from their traditional counterparts—classically written prose and mined diamonds—yet provoke cultural discomfort because their provenance differs. By examining this analogy through the lenses of technological history, epistemic responsibility, and evolving definitions of craftsmanship, this paper argues that resistance to AI-assisted writing largely reflects cultural attachment to narratives of effort rather than objective differences in intellectual value. Historical parallels—including the adoption of statistical software, word processors, and digital literature databases—demonstrate that scholarly practices often undergo initial moral panic followed by normalization. AI does not eliminate authorship but relocates the locus of scholarly mastery from mechanical production toward conceptual clarity, judgment, and interpretive accountability. The critical ethical question is therefore not whether AI tools participate in writing, but whether authors retain responsibility for accuracy, reasoning, and intellectual integrity. Understanding this shift may help academic institutions develop policies that promote transparency without conflating technological assistance with intellectual fraud.

Review
Biology and Life Sciences
Cell and Developmental Biology

Dong-Joon Lee

,

Hyung-Jin Won

,

Jeong-Oh Shin

Abstract: Tooth development, or odontogenesis, is a complex morphogenetic process that requires tightly regulated interactions between the oral epithelium and the mesenchyme of neural crest origin. In this narrative review, we compile existing knowledge regarding gene regulatory networks and epigenetic factors throughout tooth development, from initiation to eruption. Signaling between epithelium and mesenchyme is mediated by four conserved pathways—Wnt/β-catenin, bone morphogenetic protein (BMP), fibroblast growth factor (FGF), and Sonic hedgehog (Shh)—which operate iteratively and interact through extensive crosstalk at each developmental stage. Transcription factors such as PAX9, MSX1, PITX2, and LEF1 interpret these signals to control cell fate decisions and differentiation. Epigenetic modifications, including DNA methylation, histone modifications, and microRNA-mediated regulation, provide additional layers of control that fine-tune gene expression programs. Unlike existing reviews that address these regulatory mechanisms separately, here we integrate signaling pathways, transcription factor networks, epigenetic regulation, human genetic disorders, dental stem cell biology, and recent single-cell transcriptomic insights into a unified framework. We discuss opportunities to apply developmental biology knowledge towards regenerative dentistry goals, including iPSC-derived dental models and spatially resolved multi-omics approaches, while acknowledging the considerable gap between preclinical findings and clinical application.

Article
Biology and Life Sciences
Biophysics

Enrique Rosario Aloma

,

Luis Rodriguez

,

Maymunah Ray

Abstract: Background: Tumor microenvironments (TMEs) frequently exhibit extracellular acidity (pH ~6.5), a biophysical feature known to play a critical role in cellular behavior, tumor progression, immune suppression, and altered therapeutic response. While synthetic regulatory circuits capable of sensing acidity have been proposed, quantitative frameworks describing how microenvironmental pH dynamics interact with tumor–immune systems remain limited. Methods: We developed a computational modeling framework describing acidity-mediated regulatory dynamics in coupled tumor–immune systems. The model integrates interacting processes including tumor population dynamics, effector T-cell activity under acid-dependent suppression, regulatory vector dynamics, pH-responsive promoter activation, buffering or alkalinization mechanisms, cytokine-mediated feedback, and proton concentration kinetics calibrated to physiological pH ranges (6.0–7.4). Alternative acidity-modulating strategies, including substrate-dependent and substrate-independent buffering mechanisms, were examined through parameter sweeps, sensitivity analysis, and spatial reaction–diffusion extensions. System behavior was analyzed using stability and regime characterization methods. Results: The model exhibits distinct dynamical regimes in which acidity modulation reshapes tumor–immune interactions. Simulation of the acidity-responsive regulatory module demonstrated that promoter-driven therapeutic activation reduces tumor burden through two mechanistically distinct pathways. The alkalinization strategy elevated steady-state pH (ΔpH ≈ 0.2–0.6), partially restoring immune activity and reducing tumor persistence via microenvironmental feedback. In contrast, immune reactivation enhanced cytotoxic pressure directly, producing more rapid tumor suppression without substantially normalizing extracellular pH. 
In both architectures, therapeutic output increased under acidic conditions and diminished as pH approached physiologic levels, demonstrating dynamically coupled and self-limiting behavior. Sensitivity and scaling analyses further revealed hierarchical parameter control and architectural differences between substrate-dependent and substrate-independent buffering mechanisms. Conclusions: This study provides a quantitative theoretical framework for understanding how microenvironmental acidity functions as a regulatory variable in tumor–immune dynamics. The results highlight generalizable principles governing acidity-mediated feedback, system stability, and scaling behavior, offering mechanistic insights relevant to microenvironment-responsive regulatory systems. These findings emphasize the importance of biophysical microenvironmental factors in shaping cellular system dynamics and provide a basis for future experimental investigation of acidity-responsive biological regulation.
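The qualitative behavior described above, acidity suppressing effector activity and buffering partially restoring it, can be sketched with a toy two-equation system. The equations, parameter values, and the fixed-acidity simplification below are illustrative assumptions of mine, not the paper's calibrated model (which additionally tracks proton kinetics, promoter activation, cytokines, and spatial diffusion):

```python
def simulate(acid, t_end=200.0, dt=0.01):
    """Toy tumor-immune dynamics under a fixed acidity level `acid`.

    dT/dt = r*T*(1 - T/K) - k*E*T     (logistic tumor growth, immune kill)
    dE/dt = s - d*E - sup*acid*E      (effector supply, decay, acid suppression)
    All parameter values are hypothetical.
    """
    r, K, k = 0.5, 1.0, 0.4        # tumor growth rate, carrying capacity, kill rate
    s, d, sup = 0.2, 0.1, 0.5      # effector source, decay, acid suppression strength
    T, E = 0.5, 0.1
    for _ in range(int(t_end / dt)):
        dT = r * T * (1 - T / K) - k * E * T
        dE = s - d * E - sup * acid * E
        T += dt * dT
        E += dt * dE
    return T, E

T_acidic, E_acidic = simulate(acid=1.0)      # untreated acidic microenvironment
T_buffered, E_buffered = simulate(acid=0.3)  # alkalinization lowers acidity

print(T_acidic, T_buffered)  # buffering raises effector activity, lowers tumor burden
```

Even this caricature reproduces the feedback direction the abstract reports: lowering acidity raises the effector steady state (here E* = s/(d + sup·acid)) and thereby reduces the tumor steady state, without any direct anti-tumor term.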

Article
Computer Science and Mathematics
Artificial Intelligence and Machine Learning

Mohsen Mostafa

Abstract: Understanding how gradient descent shapes neural network representations remains a fundamental challenge in deep learning theory. Recent work has revealed that neural networks behave as “racing” systems: neurons compete to align with task-relevant directions, and those that succeed experience exponential norm growth. However, the geometric principles governing this race—particularly when data lies on low-dimensional manifolds and networks employ adaptive normalization—remain poorly understood. This paper establishes a mathematical framework that unifies and extends these insights. We prove three fundamental theorems: (1) neuron weight vectors converge exponentially to the tangent space of the data manifold, with a rate determined by local curvature and gating dynamics; (2) for rotation-equivariant tasks, an angular momentum tensor is conserved under gradient flow, imposing topological constraints on neuronal rearrangements; (3) the distribution of high-norm “winning” neurons follows a von Mises-Fisher concentration on the manifold, with concentration parameter linked to initial angular variance. As a case study, we integrate Bayesian R-LayerNorm—a provably stable normalization method—into our framework, deriving a modified norm growth law that explains its empirical robustness on corrupted datasets. Together, these results provide a geometric foundation for understanding capacity adaptation, lottery tickets, and uncertainty-aware learning in neural networks.

Article
Medicine and Pharmacology
Obstetrics and Gynaecology

Nabeelah Mukadam

,

Lynne Emmerton

,

Petra Czarniak

,

Oksana Burford

,

Stephanie Wai Khuan Teoh

,

Tamara Lebedevs

Abstract: (1) Background: Access to reliable medicines information is essential to support safe medicine use during pregnancy and breastfeeding, where concerns regarding fetal and neonatal safety complicate clinical decision-making. Analgesics are widely used during these periods, yet uncertainty regarding safety persists due to evolving evidence, regulatory changes, and inconsistent information sources. Obstetric medicines information services play a critical role in addressing these information needs. This study aimed to evaluate patterns and characteristics of analgesic-related enquiries to a specialist obstetric medicines information service over a 20-year period. (2) Methods: A retrospective observational study was conducted using enquiry data from the King Edward Memorial Hospital Obstetric Medicines Information Service (KEMH OMIS), Western Australia. All enquiries recorded between 1 January 2001 and 31 December 2020 were extracted from the Microsoft Access® database. Records with incomplete data were excluded. Data were standardised, coded, and analysed using Microsoft Excel® and SPSS® Version 25. Descriptive statistics were used to summarise enquiry characteristics, caller type, timing of exposure, and analgesic medicines involved. Trends over time were analysed. (3) Results: A total of 48,458 enquiries were analysed, of which 4,978 (10.3%) related to analgesics, making this the third most common medicine class. Most enquiries related to breastfeeding (62.1%), followed by pregnancy (32.7%). The public accounted for 60.9% of calls, while health professionals contributed 39.1%. The highest frequency of breastfeeding enquiries occurred within the first four weeks postpartum, and pregnancy enquiries were most common in the second trimester. Paracetamol was the most frequently enquired analgesic (24.5%), followed by codeine (19.8%), ibuprofen (14.4%), tramadol (9.3%), and diclofenac (7.2%).
Analgesic-related enquiries declined significantly over time (p < 0.001), particularly codeine-related enquiries following regulatory safety warnings. (4) Conclusion: Analgesics represent a substantial proportion of medicines information enquiries in pregnancy and breastfeeding, reflecting widespread use and ongoing safety concerns. Pharmacist-led medicines information services play a critical role in supporting safe analgesic use. Continued surveillance and targeted education are essential to optimise maternal and infant medication safety.

Article
Public Health and Healthcare
Public Health and Health Services

Fernanda Dias Alves

,

Jacqueline de Torres Boesso

,

Renato Pereira de Torres

,

Elton Euler da Silva Reis

Abstract: Background: Depression is a major public health concern and remains a challenge despite traditional care approaches. This study aimed to describe perceptions of changes associated with depressive experience reported by participants in a program grounded in Permission Theory. Methods: This exploratory and descriptive study employed a quantitative and qualitative approach, grounded in Bardin’s Content Analysis to analyze 23 spontaneous accounts from participants who reported experiences related to depression. The participants evaluated their lives before and during the program. Results: The quantitative analysis showed an increase in self-reported scores of overall life evaluation during the program. Overall, the participants’ accounts indicated that they subjectively perceived changes in emotional, relational, and functional aspects of their everyday lives. Conclusions: These findings emphasize how participants interpret and describe changes in their emotional, relational, and functional lives, aspects that are often less visible in conventional mental health outcome research. These perceptions do not allow for inference of clinical effects or a causal relationship with program participation, reinforcing the need for controlled studies to investigate potential impacts on mental health outcomes.

Article
Physical Sciences
Astronomy and Astrophysics

Remi Cornwall

Abstract: This paper completes a series of earlier papers on the Cosmological Constant as a compressible, fluid-like zero-point energy that acts as a second-order perturbation in the stress-energy tensor; the series began with a sketch of a means for dark energy to gravitate. In the light of recent DESI and JWST discoveries, we propose the theory developed herein as a replacement for ad hoc MOND-type theories, presenting a fully covariant version of our earlier gravitating dark-energy model: we show that dark energy can gravitate if it is considered to be placed in compression by tidal effects, and that it sets a natural cusp-like size limit to galaxies and clusters. A relation is also found between the slope of the galactic rotation curve and the slope of the dark-energy/matter zone. An earlier proof by the author that MOND-type theories are incompatible with the Heisenberg Uncertainty Principle is cited. Finally, we ask whether an extra-repulsive form of dark energy may prevent collapse to a singularity in black-hole solutions. All in all, this paper is a semi-classical treatment of gravity with vacuum corrections that might manifest on the large scale.

Review
Environmental and Earth Sciences
Waste Management and Disposal

Felipe Anchieta-Silva

,

Amélia de Santana Cartaxo

,

Antônio Demouthié de Sales Rolim Esmeraldo

,

Elaine Meireles Senra

,

José Carlos Pinto

Abstract: The widespread utilization of plastic materials across various sectors has led to a significant increase in plastics demand over the decades. This growth has been accompanied by a mounting challenge in managing the generated plastic waste, as substantial portions of the plastic residue end up in landfills due to limited recycling efforts. Addressing this global concern demands the development of innovative strategies to better assess and recover polymer waste, which should be treated as a distinct feedstock. To that end, efficient sorting techniques are crucial for integrating valuable materials like plastics into municipal solid waste management and improving recycling outcomes. Technological innovations in this area have given rise to more sophisticated sorting methods, exploring automated sorting techniques to enhance recycling efficiency. Nevertheless, among traditional and modern sorting approaches, manual strategies are still used to perform plastic waste segregation. In this context, the present study aims to comprehensively review and assess pre-treatment classification techniques employed to transform waste streams into valuable compounds, focusing specifically on polyolefin materials present in large quantities in urban solid waste treatment environments.

Article
Chemistry and Materials Science
Metals, Alloys and Metallurgy

Lixin Fang

,

Liqin Qin

,

Limin Zhang

,

Hao Zhou

,

Xudong He

,

Zekun Ren

,

Tongyi Zhang

,

Yi Liu

Abstract: Machine learning interatomic potentials (MLIPs) are typically constructed for homogeneous crystalline systems that exhibit only minimal local deviations from equilibrium configurations. However, substitutional alloying elements in multicomponent engineering alloys are often distributed in a locally heterogeneous form. To address this, we develop a fine-tuned MLIP based on the MACE foundation model, specifically tailored for Mo-based dilute alloys containing one or two out of 20 substitutional elements: Cr, Fe, Mn, Nb, Re, Ta, Ti, V, W, Y, Zr, Al, Zn, Cu, Ag, Au, Hg, Co, Ni, and Hf. The model is trained on more than 7,000 non-equilibrium structures derived from first-principles density functional theory (DFT) calculations. The optimized large-scale fine-tuned model attains state-of-the-art accuracy, with mean absolute error (MAE) and root mean square error (RMSE) of 2.27 meV/atom and 3.79 meV/atom for energy predictions, and 13.83 meV/Å and 24.26 meV/Å for force predictions, respectively. Systematic evaluation of model transferability to unseen alloying elements under different data-splitting protocols demonstrates that incorporating even a modest set of new-element DFT data during refinement reduces the energy MAE below ~20 meV/atom. The fine-tuned models reduce the MAE by approximately 7–10 times compared to models trained from scratch, and by 10–20 times relative to zero-shot foundation models. This performance gain remains consistent across varying dataset sizes (equilibrium vs. non-equilibrium structures) and model scales. Our work illustrates the efficacy of transfer learning from globally homogeneous systems to locally heterogeneous multi-element alloy environments, delivering a robust MLIP tool for the accelerated design of multicomponent alloys.
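The headline accuracy figures (MAE and RMSE per atom) follow directly from the predicted and reference energies; a minimal sketch of those two metrics is below. The four per-atom energies are made-up placeholders, not the paper's DFT data:

```python
import math

def mae(pred, ref):
    # Mean absolute error over paired predictions and references.
    return sum(abs(p - r) for p, r in zip(pred, ref)) / len(pred)

def rmse(pred, ref):
    # Root mean square error; always >= MAE for the same data.
    return math.sqrt(sum((p - r) ** 2 for p, r in zip(pred, ref)) / len(pred))

# Hypothetical per-atom energies in eV/atom (illustrative only):
ref  = [-10.820, -10.790, -10.910, -10.750]
pred = [-10.821, -10.788, -10.913, -10.752]

print(mae(pred, ref) * 1000, "meV/atom MAE")
print(rmse(pred, ref) * 1000, "meV/atom RMSE")
```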

Review
Physical Sciences
Theoretical Physics

Joel Almeida

Abstract: The Dead Universe Theory (DUT) proposes a fundamental re-examination of cosmic dynamics, replacing the standard paradigm of an expanding universe from a hot singularity with a model of asymmetric thermodynamic retraction within a viscoelastic spacetime continuum. In this framework, the observable cosmos constitutes a localized photonic anomaly — a transient luminous fluctuation — embedded within the collapsed gravitational geometry of a prior cosmological phase. This work presents the complete mathematical foundation of DUT, deriving the entropic deformation tensor Ξ_μν from a variational principle and incorporating it into modified Einstein field equations. The central result is the emergence of a unique, non-adjustable growth index γ = (√5 − 1)/2 ≈ 0.6180339887, derived as the asymptotic attractor of the perturbation dynamics rather than as a free parameter fitted to observational data. This value — the golden ratio — arises directly from the characteristic equation governing irreversible thermodynamic asymmetry. We present a complete, gap-free derivation of this result in Appendix A, where the golden ratio emerges as the unique fixed point of the scale-invariant dissipation/organization partition of the viscoelastic vacuum — a geometric consequence requiring no phenomenological ansatz or external prescription. The theory yields additional testable predictions including a mildly negative curvature parameter Ω_K ≈ −0.07 ± 0.02, a cosmic energy exhaustion timescale of approximately 166 Gyr, and specific signatures in high-redshift galaxy populations consistent with JWST deep-field results. Decisive falsification tests are provided for the Euclid and Roman Space Telescopes.
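The numerical value claimed for the growth index can be checked independently of the theory: (√5 − 1)/2 is the positive root of γ² + γ − 1 = 0, equivalently the unique positive fixed point of the map γ ↦ 1/(1 + γ). The snippet below confirms only this arithmetic, not the DUT derivation itself:

```python
g = (5 ** 0.5 - 1) / 2
print(g)  # 0.6180339887...

# g solves g**2 + g - 1 = 0, i.e. it is the fixed point of x -> 1/(1 + x);
# iterating the map from any positive start converges to it.
x = 0.5
for _ in range(80):
    x = 1 / (1 + x)
print(abs(x - g))
```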

Review
Public Health and Healthcare
Primary Health Care

Erik Nilssen

,

James Thorp

,

Claire Rogers

,

Kirstin Cosgrove

,

Steven Hatfill

,

Drew Pinsky

,

Kelly Victory

,

Alejandro Diaz-Villalobos

,

Nicolas Hulscher

,

Peter McCullough

Abstract: Introduction: COVID-19 mRNA vaccines are associated with the development of a wide range of autoimmune diseases. Rare autoimmune conditions, such as polymyalgia rheumatica (PMR), have received limited attention in the medical literature. The purpose of this study is to review PMR, examine reports of PMR in the government database monitoring vaccine safety, and evaluate the potential association between PMR, COVID-19 vaccination, and spike protein antibody levels. Methods: Data were obtained from the U.S. Centers for Disease Control and Prevention (CDC) and the U.S. Food and Drug Administration (FDA). The CDC/FDA Vaccine Adverse Event Reporting System (VAERS) was queried for reports of polymyalgia rheumatica (PMR) from January 1, 1990, through January 30, 2026. This period encompasses 433 months for all vaccines; however, COVID-19 vaccines were available to the public for only 61 of those 433 months (January 1, 2021, through January 30, 2026). Odds ratios over time (ORt) were calculated by comparing the occurrence of PMR following the administration of specific vaccinations, including COVID-19, influenza, and all other vaccines combined. The CDC/FDA defines a safety signal as a disproportionality measure of ≥ 2. Data are presented as ORt with corresponding 95% confidence intervals, p-values, and Z statistics. Three cases of PMR from the authors’ recent clinical practices were reviewed. A literature review on PMR was conducted using PubMed, MEDLINE, and Google Scholar. Results: Significant safety signals were observed when comparing reports of PMR following COVID-19 vaccination with those following influenza vaccination. This association persisted when PMR following COVID-19 vaccination was compared with PMR following all other vaccines combined. There were 2,227 reported cases of PMR following COVID-19 vaccination during the 61 months after vaccine rollout.
In comparison, 233 cases were reported following influenza vaccination, and 526 cases were reported following all other vaccines combined over a 433-month period. The ORt for COVID-19 vaccination compared with influenza vaccination was 69.4 (95% CI 51.4–93.6, p < 0.0001, Z statistic 27.7). When PMR following COVID-19 vaccination was compared with PMR following all other vaccines combined, including influenza, a significant safety signal persisted: ORt 30.7 (95% CI 23.1–40.8, p < 0.0001, Z statistic 23.6). Three exemplary cases and a review of the literature are also presented. Conclusions: Strong safety signals were detected when comparing polymyalgia rheumatica (PMR) following COVID-19 vaccination with PMR following influenza vaccination and when compared with all other vaccines combined. The strength of the signal, its statistical robustness, and its consistency with observed clinical cases and biologically plausible immunoinflammatory mechanisms suggest the need for heightened clinical awareness of PMR occurring temporally following COVID-19 vaccination. These findings corroborate other research documenting the occurrence of PMR after COVID-19 vaccination and the pathophysiological pathway for spike protein-induced autoimmunity. Future research should prioritize validation of direct assays for spike protein detection rather than relying solely on surrogate antibody measurements. Additional investigation is also warranted to clarify the role of COVID-19 vaccinations and spike protein in musculoskeletal pathology and to evaluate preventive and therapeutic strategies.
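The study's exact ORt computation, which normalizes counts by months of vaccine availability, is not fully specified in the abstract. The standard 2×2 machinery it builds on can still be sketched: an odds-ratio point estimate, a Wald 95% confidence interval on the log scale, and a Z statistic. The counts in the usage example are made up for illustration and are not the study's data:

```python
import math

def odds_ratio(a, b, c, d):
    """Odds ratio for a 2x2 table with Wald 95% CI and Z statistic.

        a = exposed cases,   b = exposed non-cases
        c = unexposed cases, d = unexposed non-cases
    """
    or_ = (a * d) / (b * c)
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)  # SE of log(OR)
    lo = math.exp(math.log(or_) - 1.96 * se)
    hi = math.exp(math.log(or_) + 1.96 * se)
    z = math.log(or_) / se
    return or_, (lo, hi), z

# Hypothetical counts, chosen only to exercise the function:
or_, (lo, hi), z = odds_ratio(a=20, b=80, c=10, d=90)
print(round(or_, 2))  # 2.25
print(round(lo, 2), round(hi, 2))
print(round(z, 2))
```

A disproportionality measure of this kind flags a signal under the CDC/FDA threshold when the estimate is ≥ 2 and the interval excludes 1.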

Article
Environmental and Earth Sciences
Remote Sensing

Xiaoxia Xu

,

Wujian Yan

,

Ruixin Xiao

,

Xiaofeng Liu

,

Jie Hao

Abstract: This paper takes Tongwei, Gangu, Wushan, and Qin'an counties, the counties affected by the 1718 Tongwei earthquake, as the study area. Combining case investigations of earthquake damage with historical landslide data, a statistical analysis model is used to select 10 impact factors related to topography, geology, earthquakes, and human activities in the study area. The relative contributions and coupling effects of the different influencing factors on the spatial distribution of earthquake-induced loess landslides are then discussed by principal component analysis. The results show that: (1) the loess seismic landslides in the study area are concentrated in south-facing slope areas with elevations of 1000–1300 m, slopes of 10–20°, topographic relief of 0–30 m, distances of 1200–1600 m from rivers, distances of 2–8 km from active faults, seismic intensity of X, grass and farmland land-use types, and relatively weak lithology. (2) Further verification shows that the landslide-prone areas in the study area are the areas near rivers with elevations of 1600–1900 m and slopes of 10–20°, and the areas covered by thick loess layers at distances of 4–12 km from active faults.

Review
Medicine and Pharmacology
Dermatology

Jaap-Jan Roukens

Abstract: Hidradenitis suppurativa (HS) is a severe inflammatory dermatosis characterized by profound localized pain. While current models of pathophysiological feedback loops (vicious cycles) focus on microscopic molecular networks and microbiological dysbiosis, the psychosocial and behavioral burdens of HS are individually well documented. These behavioral maladaptations, however, have not been integrated into a unified macroscopic model. This review proposes such a model, a self-sustaining system that operates across three interacting domains: (1) a biomechanical-metabolic loop, where sustained immobility accelerates visceral adiposity and insulin resistance; (2) a psychosocial-physiological loop, where pain-induced sleep disruption and chronic stress drive neuroendocrine dysregulation and maladaptive coping behaviors; and (3) a socioeconomic loop, where economic instability decreases healthcare security. Consequently, these behavioral, psychological, and socioeconomic burdens structurally feed back into the systemic inflammatory core, perpetuating disease chronicity. Moreover, this review explores kinesiophobia (the anticipatory fear of movement) as a potentially critical and overlooked component of the biomechanical-metabolic feedback loop. Currently, there is a notable absence of primary psychometric data quantifying kinesiophobia in the HS population. Future research must first measure this phenomenon to establish its prevalence and role. On a macroscopic level, clinicians should aim to systematically break the broader interconnected behavioral feedback loops through multidisciplinary interventions, including cognitive-behavioral therapy and structured patient education. Ultimately, dismantling these psychological and behavioral barriers may prove biologically imperative to halt systemic inflammatory amplification and improve long-term clinical outcomes.

Article
Physical Sciences
Astronomy and Astrophysics

Thomas J. Buckholtz

Abstract: We discuss gravitational concepts and candidate specifications for dark matter that, together, can help explain known ratios of dark-matter effects to ordinary-matter effects and can help explain eras in the rate of expansion of the universe. The ratios pertain to galaxies and galaxy evolution, galaxy clusters, and densities of the universe. The candidate specifications for dark matter reuse, with variations, a set of known elementary particles. Regarding galaxy evolution and the rate of expansion of the universe, we deploy multipole-expansion methods that combine Newtonian gravity, aspects of motions of sub-objects of gravitationally interacting objects, and Lorentz invariance. One outgrowth from our work suggests relationships among some physics constants. Another outgrowth from our work suggests a basis for a candidate specification for quantum gravity.

Article
Social Sciences
Political Science

Yiping Cheng

Abstract: This paper proposes Scheme M, a new presidential design that evolves the American model by introducing flexibility in election timing while preserving executive stability. Its flexible elements draw inspiration from the post-2017 Turkish presidential system, where variable terms are enabled by early general elections. However, unlike Türkiye—where the Assembly can also trigger early presidential elections, creating perceived insecurity—Scheme M removes this reciprocal power, assigning sole responsibility to the president to identify, assess, and resolve executive-legislative deadlocks. The scheme adapts the established American practice of midterm elections by adding contingent, flexible-timing elements: the mechanism is triggered exclusively by presidential decree, limited to once per five-year term and only within the first three years. It keeps the president's fixed term secure while allowing strategic timing—or avoidance—of midterm legislative elections to refresh or realign parliament at low personal cost. Additional safeguards include a mixed SMDP-PR electoral system to prevent chronic presidential majorities, parliamentary confirmation for the vice-presidential nominee, narrowly defined decree powers, and robust term limits. The scheme has two variants: Scheme FM and Scheme VM. Scheme FM features fixed-time general elections, enhancing predictability, cost efficiency, and campaign depth. Scheme VM introduces variable terms, ensuring near-certain same-party succession, empowering a lame-duck president to renew both branches—avoiding paralysis or premature resignation—and allowing strategic general election timing akin to Westminster practices. Scheme M therefore offers a viable blueprint for stable yet responsive presidential governance.

Article
Chemistry and Materials Science
Medicinal Chemistry

Shrikant S Nilewar

,

Apurva D. Chavan

,

Ankita R. Pradhan

,

Anshuman A. Tripathy

,

Nagaraju Bandaru

,

Prashik Dudhe

,

Perli Kranti Kumar

,

Sandesh Lodha

,

Ghazala Muteeb

,

Ivan Peredo-Valderrama

+2 authors

Abstract: Alzheimer’s disease (AD) represents an escalating global neuropharmacological crisis, with prevalence in high-growth demographic regions such as India projected to exceed 14 million by 2040. This study addresses the urgent need for high-potency, dual-site acetylcholinesterase (AChE) inhibitors through an integrated computational pipeline. Background: We address the failure of mono-target paradigms by designing scaffolds capable of simultaneously anchoring the Catalytic Active Site (CAS) and the Peripheral Anionic Site (PAS). Methods: A robust GA-MLR QSAR model was developed from 115 quinoline analogues using 11,135 descriptors. Lead candidates were prioritized via blind molecular docking (7XN1) and 100-ns molecular dynamics (MD) simulations. Results: The five-descriptor model (R² = 0.7569, Q²LOO = 0.7244) was validated by an external set of 8 experimental compounds (R²ext = 0.8620). Lead Compound 19 emerged as a superior candidate (ΔG = -11.1 kcal/mol), exhibiting a stable MD trajectory (PL-RMSD ≈ 2.4 Å) and preserving essential Gly121-His447 catalytic anti-correlations. Conclusions: This study provides a statistically validated scaffold and mechanistic foundation for future biomimetic chromatography validation, advancing the high-throughput screening of neuroprotective agents on a global scale.

Article
Physical Sciences
Astronomy and Astrophysics

Jiazheng Liu

Abstract: We prove that \delta^{(4)}(x - y) \notin L^{2}(\mathbb{R}^{4}) is not a legitimate physical Green's function under the quantum-mechanical postulate of finite energy (A1). A fourth postulate of closed sourcelessness (A4)—methodologically analogous to Einstein's postulate of the constancy of the speed of light—is derived as a theorem from the quantum-gravity result \dim \mathcal{H}_{\mathrm{universe}} = 1 [1-8]. Under three independent postulates A1-A3 together with this result, we derive the unique physical Green's function G = \sin(\Omega \sqrt{-\sigma^{2} - i\epsilon}) / (\Omega \sqrt{-\sigma^{2} - i\epsilon}), \Omega = \pi/t_{P}. The bandlimited two-point function K of the resulting Paley-Wiener space \mathrm{PW}_{\pi/t_{P}} admits the spherical Bessel decomposition
K(x,x^{\prime}) = \frac{\Omega^{3}}{2\pi^{2}}\sum_{l = 0}^{\infty}(2l + 1)j_{l}(\Omega r)j_{l}(\Omega r^{\prime})P_{l}(\cos \theta).
We prove: (i) the l = 0,1,2 sectors are precisely the scalar, photon, and graviton propagators; (ii) gauge symmetry emerges as the zero-set geometry of j_{l}; (iii) restriction to the light cone \sigma^{2} = 0 yields the celestial sphere S^{2} with 2D CFT two-point structure and conformal dimensions \Delta_{l} = l + 1, parameter-free; (iv) tensor structure \Pi_{l} follows from the \mathrm{SO}(4,2) representation theory of massless fields on the six-dimensional light cone [9, 10]; (v) fermions arise necessarily from the spinor representations of \mathrm{SO}(4,2) via \mathcal{H}_{\mathrm{tot}} = \mathcal{H}_{\mathrm{pos}} \otimes \mathcal{H}_{\mathrm{int}}.
All four physical regimes (QFT, quantum gravity, gauge fields, dissipation) are restrictions of the single entire function f(z) = \sin(z)/z to different domains of \mathbb{C}. Bandlimitedness is a theorem, not an assumption. Since all cosmological observables—CMB (TT, TE, EE), large-scale structure, and weak lensing—are recorded along null geodesics (\sigma^{2} = 0, where G = 1 exactly, with no dimensional suppression), they jointly probe the same \Delta_{l} = l + 1 structure on the celestial sphere. Their combined Bayesian posterior P(\theta|\mathrm{data}) \propto \prod_{i}\mathcal{L}_{i} compresses the posterior width as 1/\sqrt{N_{\mathrm{datasets}}}, providing a simultaneous, parameter-free observational test.
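The spherical Bessel decomposition quoted in this abstract is, up to the \Omega^{3}/2\pi^{2} normalization, an instance of the Gegenbauer addition theorem for j_{0}. Setting \Omega = 1, the identity can be checked numerically with SciPy; the truncation at l = 40 is an illustrative choice, not taken from the paper:

```python
import numpy as np
from scipy.special import eval_legendre, spherical_jn

# Addition theorem: sin(d)/d = sum_l (2l+1) j_l(r) j_l(r') P_l(cos theta),
# where d = |x - x'| = sqrt(r^2 + r'^2 - 2 r r' cos theta). Omega = 1 here.
r, rp, theta = 2.0, 3.0, 0.7
d = np.sqrt(r**2 + rp**2 - 2 * r * rp * np.cos(theta))
lhs = np.sin(d) / d

# Truncated partial-wave sum; terms decay rapidly once l exceeds Omega*r.
rhs = sum(
    (2 * l + 1) * spherical_jn(l, r) * spherical_jn(l, rp)
    * eval_legendre(l, np.cos(theta))
    for l in range(40)
)
print(abs(lhs - rhs))  # agreement to near machine precision
```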

Hypothesis
Biology and Life Sciences
Biochemistry and Molecular Biology

Jiaxing Liao

Abstract: Irreversible loss of neurons in the adult mammalian central nervous system is a core driver of cognitive decline, yet existing "repair after damage" strategies cannot reverse established injury. Here, we propose a disruptive hypothesis: utilizing mitochondrial outer membrane permeabilization (MOMP) as the molecular switch for "irreversible" apoptosis, we construct a closed-loop system for real-time seamless replacement of apoptotic neurons. The system comprises two core modules: a labeling module that performs specific membrane modification (PS acetylation) on neurons at the earliest stage of irreversible apoptosis, and a replacement module (engineered autologous neural progenitor cells) that precisely targets apoptotic sites via dual-signal recognition (modified PS + chemokine CX3CL1), accomplishing timed clearance of apoptotic debris and in situ neuronal differentiation before cellular disintegration, achieving "zero-latency replacement." The core innovation of this hypothesis lies in not pursuing "pixel-level replication" of the apoptotic neuron's connections. Instead, it relies on the nervous system's inherent plasticity: after precise delivery of newborn neurons to the apoptotic site, subsequent synapse outgrowth, competition, and stabilization are accomplished by the neuron's intrinsic growth programs and local network activity-dependent plasticity. The human nervous system is inherently in a state of continuous synaptic turnover and remodeling; newborn neurons, as participants in this dynamic process, will manifest their functional contributions over time. Therefore, even partial synaptic functional replacement is sufficient to make a substantial contribution to neural network homeostasis—this itself represents a paradigm shift from 0 to 1. 
All core designs of this hypothesis are grounded in established consensus findings, with clear stepwise validation pathways and strict falsifiability, providing a novel theoretical framework for neural repair and intervention in cognitive aging.

Article
Social Sciences
Geography, Planning and Development

Khang The Nguyen

Abstract: This study investigates the relationship between economic growth, technological innovation, renewable energy consumption, and CO₂ emissions in Vietnam from 1988 to 2021, using a Vector Error Correction Model. Three key findings emerged. First, economic growth remains strongly coupled with carbon emissions in the long run, indicating a fossil fuel-dependent economic structure. Second, technological innovation yields positive but limited short-term effects, requiring extended periods to achieve a full impact. Third, renewable energy exerts strong positive short-term effects, but negative long-term effects, reflecting structural economic shifts. This study proposes five policy recommendations: commercializing patent innovations, rapidly expanding renewable energy for immediate growth, decoupling growth from emissions, combining clean energy with technological advancement, and implementing policy reforms immediately rather than relying on long-term strategies alone.


Preprints.org is a free preprint server supported by MDPI in Basel, Switzerland.


© 2026 MDPI (Basel, Switzerland) unless otherwise stated