
Article
Computer Science and Mathematics
Artificial Intelligence and Machine Learning

Marina Barulina

,

Sergey Okunkov

,

Ivan Ulitin

Abstract: This study examines the impact of data augmentation on machine learning performance, focusing on how synthetic data influences various neural network architectures. Common issues such as limited data, class imbalance, and poor coverage often lead to low model metrics, and data augmentation is frequently used to address these problems. The research aims to identify the optimal proportion of synthetic data, assess its effects across different architectures, and analyze the impact of augmenting only specific classes in a multi-class medical image classification task. Twelve widely used architectures were selected for the experiments, including classical convolutional networks, visual transformers, and the hybrid ConvNeXt model. Results showed that no universal optimal augmentation ratio exists, as model robustness to synthetic data varies, even within the same architecture family. Transformer and hybrid models demonstrated greater stability, while convolutional networks exhibited inconsistent behavior, likely due to higher sensitivity to data bias.

Hypothesis
Medicine and Pharmacology

Stuart G. Ashbaugh

Abstract: Severe COVID-19 follows a cliff-edge trajectory: patients appear stable, then deteriorate rapidly and irreversibly. This paper identifies molecular oxygen as the dual control variable governing two previously unconnected biological systems: the DUOX-Lactoperoxidase-Iodine (DLI) airway antiviral defense and HIF-1α, the transcription factor that drives COVID-19 severity. Both share the same oxygen-dependent enzymes (DUOX and PHD, Km approximately 20 μM O₂ corresponding to approximately 94% SpO₂). When SpO₂ falls below this threshold via AT2 cell destruction with surfactant loss, ventilation-perfusion mismatch, and microvascular thrombosis, both systems fail simultaneously, initiating three concurrent cascade arms: (1) collapse of DLI mucosal defense through O₂ substrate depletion; (2) HIF-1α-driven Furin upregulation accelerating viral spike cleavage and entry, with a viral amplification feedback loop; and (3) IL-6-mediated cytokine storm depleting thyroid iodide reserves. These three arms interact multiplicatively, not additively. A Monte Carlo simulation across four populations demonstrates a 40.3% steeper cliff-edge signature than an additive null model. The framework generates three falsifiable clinical predictions and identifies supplemental oxygen initiated before the HIF-1α threshold (SpO₂ 94–95%) as the primary actionable intervention, suppressing all three cascade arms simultaneously.

Article
Environmental and Earth Sciences
Waste Management and Disposal

Jizhong Gan

,

Xiantao Liang

,

Yang Song

,

Bingxu Chen

,

Dongsheng Liu

,

Wanzhi Cao

,

Danhua Chen

Abstract: Gravelly soil is widely distributed in the central and western regions of China and serves as a crucial fill material for transportation infrastructure. However, its poor gradation, poor water stability, and low freeze-thaw resistance limit its direct application. To address the high energy consumption and carbon emissions of existing solidifying agents (such as cement and lime) and to achieve resource utilization of waste foam concrete, this study used waste foam concrete as the raw material, prepared a novel gravelly soil stabilizer through crushing, ball milling, and high-temperature calcination, and systematically studied the solidification performance (unconfined compressive strength, water stability, freeze-thaw resistance) of the prepared stabilizer on gravelly soil and its solidification mechanism. The results show that the stabilizer significantly improves the mechanical properties of gravelly soil. At a dosage of 30%, the unconfined compressive strength reached 6.5 MPa after 28 days, an increase of 333% over the control group. Water stability improved with increasing dosage, with the water stability coefficient rising markedly at a dosage of 30%. In terms of freeze-thaw resistance, at a dosage of 30% the mass loss rate was only 2% after 5 freeze-thaw cycles, and the unconfined compressive strength reached 9.56 MPa, an increase of 437% over the control group. XRD and SEM analyses indicate that the stabilizer generates cementitious products such as calcium silicate hydrate gel and katoite through hydration reactions, which fill the pores of the gravelly soil, cement its particles, and optimize the microstructure, thereby improving its mechanical properties, water stability, and freeze-thaw resistance.
This study provides a new route for the efficient resource utilization of waste foam concrete and offers a low-energy, environmentally friendly stabilizer for reinforcing gravelly soil subgrades in cold regions.

Article
Computer Science and Mathematics
Artificial Intelligence and Machine Learning

Andrey Timofeev

,

Alexander Anufriev

,

Oleg Ergashev

,

Irina Isakova-Sivak

Abstract: Hemagglutinin (HA) is the primary surface protein of the influenza A virus, determining its subtype and antigenic properties. Traditional subtype classification methods rely on DNA or amino acid sequence analysis, which does not account for protein spatial folding. In this work, we propose EpitopeGNN — a graph neural network (GNN) that constructs a residue interaction network (RIN) from the 3D structure of HA and classifies the virus subtype. The model was trained on 249 structures from the Protein Data Bank (PDB), containing H1N1, H3N2, H5N1, and other subtypes. By utilizing physicochemical properties of amino acids and topological centrality measures, we achieved 100% classification accuracy on the test set and 97.6% with five-fold cross-validation. A significant correlation was found between the obtained structural embeddings and phylogenetic distances (r = 0.48, p < 0.001), confirming their biological relevance and opening opportunities for structural monitoring of virus evolution, as well as rapid analog searching for novel strains.
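As a toy illustration of the residue interaction network (RIN) idea in the abstract above, the sketch below builds contact edges from Cα coordinates with a fixed distance cutoff and computes degree centrality as one topological feature. The coordinates, the 6 Å cutoff, and the helper names are illustrative assumptions, not EpitopeGNN's actual pipeline.

```python
import math
from itertools import combinations

# Toy C-alpha coordinates (Angstrom) for a 6-residue fragment; values are hypothetical.
residues = {
    0: (0.0, 0.0, 0.0),
    1: (3.8, 0.0, 0.0),
    2: (7.6, 0.0, 0.0),
    3: (7.6, 3.8, 0.0),
    4: (3.8, 3.8, 0.0),
    5: (0.0, 3.8, 0.0),
}

CUTOFF = 6.0  # contact threshold in Angstrom, a common RIN choice

def build_rin(coords, cutoff):
    """Add an edge between every residue pair whose C-alpha atoms lie within the cutoff."""
    edges = set()
    for i, j in combinations(coords, 2):
        if math.dist(coords[i], coords[j]) <= cutoff:
            edges.add((i, j))
    return edges

def degree_centrality(n, edges):
    """One simple topological node feature of the kind usable alongside physicochemical ones."""
    deg = {k: 0 for k in range(n)}
    for i, j in edges:
        deg[i] += 1
        deg[j] += 1
    return {k: v / (n - 1) for k, v in deg.items()}

edges = build_rin(residues, CUTOFF)
print(sorted(edges))
print(degree_centrality(len(residues), edges))
```

In a real pipeline these per-node features would be stacked into the GNN's input matrix; here they only demonstrate the graph construction step.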

Article
Computer Science and Mathematics
Mathematics

Alexandros S. Kalafatelis

Abstract: We study shell kernels for the odd-to-odd Syracuse dynamics generated by uniformly distributed initial windows. For backstepped first-passage shells, we prove short-time localization, derive an exact inverse-affine representation of the fixed-time kernel, and reduce the shell-slice discrepancy to weighted primitive-frequency correlations. We also prove a quantitative boundary-layer estimate and identify a formal renewal model for the corresponding shell mechanism. On the arithmetic side, we obtain an exact block decomposition for the primitive-frequency transfer operator, prove that no naive operator gap is available, and reduce the unresolved step to explicit incomplete principal-unit exponential sums modulo powers of 3. Thus the paper is unconditional up to a final primitive-frequency estimate, which is formulated explicitly.

Article
Computer Science and Mathematics
Computer Vision and Graphics

Xiaoming Zhang

,

Rundong Zhuang

Abstract: In unmanned pharmacy and home-care medicine management applications, reliable pillbox localization is a prerequisite for automated dispensing and grasping. However, existing detectors still perform poorly in complex environments where dense stacking, occlusion, weak illumination, and high inter-class similarity occur simultaneously. To address this problem, GSPM-YOLO is proposed as an improved detector built on the YOLOv11 framework for complex pillbox recognition, and four novel plug-and-play lightweight modules are developed: GSimConv, a lightweight dual-branch convolution module that incorporates the Attention Weight Calculation Algorithm in HardSAM for edge-preserving feature extraction; PSCAM, for position-sensitive coordinate attention; MSAAM, a multi-scale strip-pooling module that integrates the Horizontal Context-Aware Attention weight calculation algorithm to strengthen occluded targets; and LGPFH, for bidirectional ghost pyramid fusion. To simulate the complex operating environments of dispensing robots, we construct MBox-Complex, a dataset of 3,041 images with 8,153 annotations across 25 drug categories. Ablation experiments first validate the effectiveness of the four-module composition, with F1 rising from 0.641 to 0.714, and each module is then individually compared with advanced replacement schemes in dedicated substitution experiments to verify its effectiveness. The integrated model is then benchmarked against advanced detectors and domain-specific methods on the self-constructed MBox-Complex dataset, achieving 0.727 mAP@50 and 0.427 mAP@50-95 with 3.8M parameters and surpassing YOLOv11 by 7.1 and 4.0 percentage points and YOLOv12 by 4.3 and 3.1 percentage points, respectively. Further cross-dataset evaluation on the VOC and Brain Tumor benchmark datasets verifies the transferability of the proposed model.
Grad-CAM is adopted to visualize the detector's attention distribution, and the resulting heatmaps together with detection visualizations confirm that the proposed model focuses more precisely on stacked and occluded regions.

Article
Computer Science and Mathematics
Artificial Intelligence and Machine Learning

Bing Han

,

Jian Kang

,

Meng Zhang

,

Qian Wu

Abstract: This study proposes a novel hybrid prediction model (QGCN-LSTM) that combines Quantum Graph Convolutional Networks (QGCN) with classical Long Short-Term Memory (LSTM). The model takes spatiotemporal data as input and achieves quantum information conversion through a quantum encoding layer. Multi-scale features are extracted through the collaborative computation of QGCN and quantum gated recurrent units, and a quantum attention module is introduced to dynamically screen key information. Finally, the prediction results are generated through quantum measurement and a classical output layer. For the spatiotemporal prediction task of urban traffic flow, a benchmark model suite covering classical, cutting-edge, and traditional architectures was constructed. The experimental results show that QGCN-LSTM utilizes quantum entanglement gates to establish non-local road network associations, dynamically allocates feature weights to enhance the impact of critical time steps, and achieves deep circuit compression through quantum circuit pruning, effectively alleviating the “barren plateau” problem common in quantum neural network training. In terms of prediction accuracy, the mean absolute error (MAE) at key hub nodes is reduced by 34.1% compared to the graph convolutional LSTM (GCN-LSTM) model, and the Spatial Correlation Index (SCI) is improved to 0.89. The model also performs well in dynamic response, edge-computing efficiency, and other respects, meeting the real-time requirements of traffic signal control systems. This study provides an effective paradigm for applying quantum collaborative architectures to complex spatiotemporal prediction tasks.

Article
Computer Science and Mathematics
Mathematics

Michel Planat

Abstract: The nontrivial zeros of the Riemann zeta function are parameterized by the spectral variable \( s\in\mathbb{C} \), and the isomonodromic deformation parameter t of the Painlevé III equation of type \( D_6 \) is connected to s by \( t=s(1-s) \), which maps the critical line \( \Re(s)=\frac12 \) to the positive real ray \( t\in[\frac14,\infty) \). Any de Branges realization of the Riemann Hypothesis within this framework requires four explicit conditions: (C1) geometric feasibility ---the positive lambda-length slice of the \( \mathrm{PIII}_{D_6} \) character variety defines a real form of the wild Stokes and monodromy data; (C2) global positivity---the Riemann--Hilbert jump matrices yield a Herglotz Weyl--Titchmarsh function; (C3) embedding compatibility---the functional equation involution \( s\mapsto 1-\bar{s} \) preserves the positive slice; and (C4) analytic regularity---the tau-function composed with \( t=s(1-s) \) is entire of finite order after gauge removal. We prove all four conditions unconditionally. For (C1), an explicit birational map \( \Phi \) expresses all Stokes multipliers as positive monomials in the lambda-lengths. For (C2), the Painlevé/gauge theory correspondence identifies the \( \mathrm{PIII}_{D_6} \) oper with a Schrödinger operator whose real coefficients force \( \Im m(\lambda,t)>0 \) via a Wronskian argument; isomonodromic uniqueness and Remling's inverse theorem complete the proof. For (C4), integrality of the local exponent \( \alpha\in\mathbb{Z}_{\ge0} \) is the precise criterion, satisfied on an explicit sublocus of the positive slice. With all four conditions established, the Riemann Hypothesis reduces to the Bridge Conjecture alone. 
We test the direct form of the Bridge Conjecture---the identification \( E_{D_6}(s)=C\,\xi(s) \)---and show it fails for all constant monodromy phases and for all Dirichlet L-functions, because the tau-zero counting \( \mathcal{N}_{D_6}(T)\sim 2T/\pi \) lacks the \( \log T \) factor of the Riemann--von Mangoldt law. This leads to the identification of \( E_{D_6}(s) \) as a new explicit element of the Hermite--Biehler class \( \mathcal{HB}(1/2) \), whose canonical form is the isomonodromic cosine \( F(s)=\cos(2\sqrt{s(1-s)}) \). We prove that \( F\in\mathcal{HB}(1/2) \) is entire of order 1, satisfies \( F(s)=F(1-s) \), and has all zeros on \( \Re s=\frac12 \) at \( \gamma_n=\sqrt{(2n-1)^2\pi^2-4}\,/\,4 \), with asymptotic spacing \( \pi/2 \) identified as the WKB semiclassical level spacing of the \( \mathrm{PIII}_{D_6} \) oper arising from the Seiberg--Witten period \( a_{D_6}(t)=2\sqrt{t} \). A four-tier falsifiability diagnostic and the character \( \chi_4 \) scorecard are presented.
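The stated zero set of the isomonodromic cosine can be checked directly: on the critical line \( s=\tfrac12+i\gamma \) one has \( s(1-s)=\tfrac14+\gamma^2 \), and with \( \gamma_n \) as above, \( 2\sqrt{s(1-s)}=(2n-1)\pi/2 \), so \( F \) vanishes. A short numerical sanity check of the zeros, the functional equation, and the \( \pi/2 \) spacing:

```python
import cmath
import math

def F(s: complex) -> complex:
    """Isomonodromic cosine F(s) = cos(2*sqrt(s(1-s))) from the abstract."""
    return cmath.cos(2 * cmath.sqrt(s * (1 - s)))

def gamma_n(n: int) -> float:
    """Claimed zero ordinates gamma_n = sqrt((2n-1)^2 * pi^2 - 4) / 4."""
    return math.sqrt((2 * n - 1) ** 2 * math.pi ** 2 - 4) / 4

# s = 1/2 + i*gamma_n should be a zero of F, and F should satisfy F(s) = F(1-s).
for n in range(1, 6):
    s = 0.5 + 1j * gamma_n(n)
    assert abs(F(s)) < 1e-12
assert abs(F(0.3 + 0.2j) - F(1 - (0.3 + 0.2j))) < 1e-12
print(gamma_n(6) - gamma_n(5))  # approaches pi/2 as n grows
```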

Article
Physical Sciences
Theoretical Physics

Markolf H. Niemz

Abstract: Physics makes two questionable assumptions: (1) Distant galaxies are accelerating relative to Earth. (2) Entangled objects are spatially separated from each other. Why questionable? Acceleration relative to Earth has never been observed in a single galaxy. Observers perceive entangled objects as spatially separated, yet 3D space is relative. We show that physical realities are projections of a mathematical background reality: 4D Euclidean space (ES). In Euclidean relativity (ER), all objects move through ES at the speed C. There is no time coordinate in ES. All action is due to a monotonically increasing, absolute, external evolution parameter θ. An observer experiences two projections of ES as space and time. The axis of his current 4D motion is his proper time τ. Three orthogonal axes form his 3D space x1, x2, x3. His physical reality is his spacetime x1(ϑ), x2(ϑ), x3(ϑ), τ(ϑ), where τ is a natural time coordinate and θ converts to absolute parameter time ϑ. Without gravity, his spacetime is Minkowski-like. As in general relativity (GR), gravity in ER is the curvature of spacetime. Since coordinates in GR are merely labels, the Einstein field equations also hold in systems that use τ as the time coordinate. ER predicts time’s arrow, relativistic effects, galactic motion, the Hubble tension, and entanglement. Remarkably, ER manages without cosmic inflation, expanding space, dark energy, and non-locality. ER tells us: (1) Distant galaxies maintain their recession speeds. (2) From their perspective, entangled objects have never been spatially separated, yet their proper time flows in opposite 4D directions.

Article
Social Sciences
Education

Boris Gorelik

Abstract: Generative AI has not created the governance crisis in higher education credentialing. It has forced it into view. The academic degree is the principal instrument through which higher education systems govern access to occupations and distribute social recognition. In many fields, it can no longer perform that function reliably. When AI-generated work consistently receives first-class grades and detection tools remain unreliable, the inference from submitted artifact to certified competence collapses. Strengthening surveillance restores procedural control at the cost of assessment validity. This paper proposes a degree-free model as a governance intervention. Collins (1979) and Dore (1976) established credentialism as an administrative proxy for competence that serves institutional convenience more than it measures capability. Spence’s (1973) signaling framework specifies the conditions under which credentials function as information devices. Generative AI systematically violates those conditions. The governance implication is institutional redesign, not pedagogical adjustment. The proposal draws on the yeshiva as a historical existence proof: a non-credentialing institution organized around formation, community, and recognized mastery. It is supported by two well-evidenced findings. AI has substantially weakened the validity of conventional assessment formats. Employers already discount the degree, substituting direct performance evaluation within three to five years of hire. The degree-free model formalizes what labor markets have already enacted. Three policy recommendations follow. In non-safety-critical fields, institutions should cease issuing degrees; teaching and formation continue. Public investment in surveillance-based assessment should be redirected toward authentic evaluation. Reform must be field-differentiated: mandatory credentialing remains justified in licensed and safety-critical professions. 
The degree was a historically contingent governance solution. Its limits are now structurally visible.

Article
Physical Sciences
Fluids and Plasmas Physics

Shin-ichi Inage

Abstract: We develop a unified dynamical framework for the three-dimensional incompressible Navier–Stokes equations in which global regularity and turbulent inertial-range structure emerge from a common underlying mechanism. Building on a recent result establishing global regularity via coherent-core reduction and phase non-persistence, we reformulate the nonlinear dynamics in terms of triadic interactions and their associated phase evolution. We show that nonlinear amplification is confined to a High–High interaction channel, which can be further localized to a coherent core characterized by low phase drift. The phase dynamics within this core exhibits a curvature-driven instability, implying that persistent phase coherence is dynamically impossible. As a consequence, nonlinear transfer is temporally localized, preventing cumulative growth and ensuring global regularity. Using this structure, we derive the inertial-range energy cascade directly from deterministic dynamics. The combination of time-localized interactions and scale-dependent triadic multiplicity yields a constant energy flux across scales without invoking statistical assumptions or closure models, leading to a first-principles derivation of the Kolmogorov −5/3 scaling law. Furthermore, we show that the Kolmogorov constant is not an empirical parameter but a dynamically determined quantity arising from phase-averaged triadic interactions. At the continuum level, the theory yields a structural formula together with a finite admissible interval. This remaining indeterminacy is resolved by extracting the coherent-phase quantities from a GOY shell model, used as a dynamically consistent reduced system that preserves local triadic interactions. The resulting value is thereby obtained without introducing phenomenological closure assumptions. 
These results establish that Navier–Stokes regularity, inertial-range cascade, and the determination of the Kolmogorov constant are not independent phenomena, but three manifestations of a single triadic phase dynamic. The mechanism that suppresses finite-time blow-up is identical to the mechanism that generates energy transfer across scales and fixes the Kolmogorov constant, providing a unified deterministic foundation for fluid dynamics.

Article
Environmental and Earth Sciences
Water Science and Technology

Joseph Higginbotham

,

John Walker

Abstract: We describe a harmonic analysis system for predicting annual peak snow water equivalent (SWE) at SNOTEL monitoring stations operated by the Natural Resources Conservation Service (NRCS) across the western United States. The algorithm, frqsrchX, performs greedy harmonic regression on historical SWE records, identifying persistent periodic climate signals and superimposing volcanic impulse functions to account for episodic radiative forcing from major eruptions. A rigorous five-phase characterization pipeline applies distinct band-limited search strategies per site, and a two-winner selection system identifies optimal configurations by both maximum pass rate and a reliability score that balances accuracy with period stability. Validation uses out-of-sample holdout testing across 15–18 years (2008–2025), graded by an asymmetric scale that penalizes over-prediction more harshly than under-prediction. We report results for 771 SNOTEL and SNOW SENSOR stations across eight western states. Average pass rates range from 88.4% (Montana, 94 sites) to 49.3% (California, 122 sites, including 87 SNOW SENSOR stations). The three commercially targeted states—Colorado (113 sites), Montana (94 sites), and Wyoming (87 sites)—achieve average pass rates of 86.4%, 88.4%, and 84.2% respectively, with 84–90% of sites meeting the ≥80% operational pass-rate threshold using identical universal parameter search procedures and no state-specific tuning. Idaho (85 sites) and Washington (76 sites) show strong intermediate performance at 83.3% and 81.5%. Utah and Oregon show mixed results, while California falls well below operational thresholds. Period stability analysis indicates that 55–62% of qualifying sites in the five strongest states achieve stable signal detection, demonstrating consistent identification of physical climate periodicities. 
These results demonstrate that periodic climate signals—principally in the ENSO band (2,700–2,900 mY), a mid-range band (~6,000–7,500 mY), and an extended long-period band (10,500–17,000 mY)—carry actionable predictive information about annual peak snowpack at individual station scale.
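The internals of frqsrchX are not published in this abstract; as a hedged sketch under stated assumptions, greedy harmonic regression can be illustrated as repeatedly projecting the residual onto sinusoids from a candidate period list and keeping the strongest one. The function name, the projection scheme, and the synthetic data below are illustrative, not the paper's actual algorithm.

```python
import math

def greedy_harmonic_fit(t, y, candidate_periods, n_terms=2):
    """Greedy harmonic regression sketch: at each step, project the residual
    onto a sinusoid for each candidate period and keep the period that removes
    the most residual energy (approximate orthogonal projection)."""
    n = len(t)
    resid = [v - sum(y) / n for v in y]   # remove the mean first
    chosen = []
    for _ in range(n_terms):
        best = None
        for p in candidate_periods:
            w = 2 * math.pi / p
            b = 2 / n * sum(r * math.cos(w * ti) for r, ti in zip(resid, t))
            c = 2 / n * sum(r * math.sin(w * ti) for r, ti in zip(resid, t))
            power = b * b + c * c
            if best is None or power > best[0]:
                best = (power, p, b, c)
        _, p, b, c = best
        w = 2 * math.pi / p
        resid = [r - b * math.cos(w * ti) - c * math.sin(w * ti)
                 for r, ti in zip(resid, t)]
        chosen.append(p)
    return chosen

# Synthetic demo: two planted periods should be recovered from the candidates.
t = list(range(64))
y = [3.0 * math.cos(2 * math.pi * ti / 8) + 1.5 * math.sin(2 * math.pi * ti / 16)
     for ti in t]
print(greedy_harmonic_fit(t, y, candidate_periods=[5, 8, 11, 16, 23]))
```

A production system would additionally hold out years for validation and score period stability across refits, as the pipeline described above does.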

Article
Medicine and Pharmacology
Cardiac and Cardiovascular Systems

Victor F. F. Joseph

,

Edmond L. Jim

Abstract: Cardiovascular diseases remain the leading cause of mortality worldwide, with coronary artery disease being the most significant contributor. The management of coronary artery disease, including stable ischemic heart disease and acute coronary syndrome, through non-surgical revascularization procedures has been widely practiced and extensively discussed in the literature, particularly regarding the benefits of complete revascularization. Complete revascularization has been associated with better prognostic outcomes and improved functional capacity in patients compared to incomplete revascularization. This study aims to compare the functional capacity, as measured by the six-minute walk test (6MWT), between patients undergoing complete and incomplete revascularization. The study employed a cross-sectional design and was conducted at Prof. Dr. R.D. Kandou General Hospital, Manado, within the Division of Cardiac Prevention and Rehabilitation. The study population consisted of hospitalized coronary artery disease patients who had undergone revascularization procedures and completed the 6MWT. Data collection took place from October 2020 to October 2023, yielding a total sample of 303 patients. The findings of this study demonstrate a significant difference in the functional capacity, as assessed by the 6MWT, between patients who underwent complete and incomplete revascularization procedures. Patients who underwent complete revascularization exhibited better functional capacity, as indicated by the greater distance covered during the 6MWT, compared to those who underwent incomplete revascularization.

Review
Computer Science and Mathematics
Security Systems

Chetan Mhaske

,

Sarthak Dharam

,

Kalpesh Mali

,

Kalyani Zore

Abstract: Cyberattacks have grown in sophistication with the emergence of advanced ransomware, zero-day payloads, and complex network intrusions. Existing security systems often focus only on detection, lacking comprehensive real-time response mechanisms. This survey explores the state of the art in AI-powered network monitoring, intrusion detection and prevention, ransomware detection, automated backup and recovery, and autonomous AI-driven ransom negotiation. By analyzing recent IEEE research on ransomware recovery [1], ML-based intrusion detection [2], proactive defense [3], network traffic analysis [4], anti-ransomware vulnerabilities [5], targeted ransomware mitigation [6], and Windows forensic investigations [7], this paper presents a unified framework that integrates machine learning, local large language models (LLMs) via Ollama, and automated self-healing processes. The proposed architecture offers a scalable, privacy-preserving, and intelligent approach to modern cybersecurity challenges.

Communication
Public Health and Healthcare
Primary Health Care

Michael Williams

,

Raeed Kabir

,

Tariq Nakhooda

Abstract: Objective: This perspective piece examines the role of Large Language Models (LLMs) in healthcare, arguing that despite significant investment, these models have had only a limited impact. Moreover, we argue that LLMs must replicate key phases of primary healthcare delivery to be a force multiplier, a necessary condition for addressing the global burden of disease. Discussion: We argue that LLMs lack the metacognitive capacity for ranked, dynamic reasoning. This is evidenced by clinically dangerous hallucinations and an inability to perform unless complete information is provided. We extend clinical critiques with a statistical argument and a simulation exercise demonstrating that LLM-based diagnosis is not merely impractical but structurally incapable of converging on correct diagnoses in realistic clinical settings. Conclusion: Unless LLMs can independently collect patient history and triage, eliminate differential diagnoses, provide a treatment plan, and generate encounter notes, these models will not succeed in improving the efficiency of primary care delivery by human doctors. A different approach grounded in cognitive AI and structured reasoning is necessary. AI models should instead be seeded with weights provided by a panel of expert physicians to approximate an independent robot doctor.

Article
Engineering
Control and Systems Engineering

Yuelinyi Ma

,

Zonghao Seven Zhang

,

Camus Hu

,

Chengxi Wei

,

Yanzhe Xiao

Abstract: Texas's AI data centers face dual existential threats: ERCOT's independent grid operation and vulnerability to extreme weather—as demonstrated by Winter Storm Uri's 52 GW outage—coupled with escalating water scarcity threatening cooling system reliability. Existing microgrid solutions address power balance through gas turbines and energy storage but neglect cooling-water constraints as a co-equal design challenge. This study proposes a resilient microgrid architecture with water-electricity coupling for a 100 MW AI data center in West Texas, centered on a 50 MW gas turbine, 20 MWh BESS, and closed-loop cooling reservoir (310 kL) to ensure both power continuity and water security during grid outages. Nonlinear modeling reveals water as the binding constraint: an initial 120 kL design yields only 1.55 h operation under 70 MW critical load, necessitating 309 kL for 4 h survival. A three-phase rule-based IF-THEN strategy governs operation: (1) immediate grid-to-island transition with non-critical load shedding; (2) staged load reductions triggered by SOC and water thresholds; and (3) seamless grid reconnection. MATLAB/Simulink simulations replicating the 2021 Winter Storm Uri blackout scenario validate 100% critical-load (70 MW) supply over 4.2 h of islanded operation with zero external water consumption. Emergency costs amount to only 0.33% of conventional outage losses, while an optional Local Exchange Interface captures $922,500 in annual arbitrage value. The proposed framework transforms resilience from a cost center into a dual-purpose asset supporting both extreme events and daily economic optimization.
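A minimal sketch of the three-phase rule-based IF-THEN strategy the abstract describes; the SOC and water thresholds, the return strings, and the function name are hypothetical illustration values, not the paper's tuned parameters.

```python
def islanding_controller(grid_available, soc, water_kl,
                         soc_low=0.30, water_low=100.0):
    """Rule-based sketch of the three-phase strategy:
    phase 1 - grid loss: island immediately and shed non-critical loads;
    phase 2 - SOC or cooling-water reserve below threshold: staged reductions;
    phase 3 - grid restored: reconnect and restore shed loads.
    Thresholds (soc_low fraction, water_low in kL) are illustrative."""
    if grid_available:
        return "phase3: reconnect to grid and restore shed loads"
    if soc < soc_low or water_kl < water_low:
        return "phase2: staged reduction of critical load"
    return "phase1: island mode, shed non-critical loads"

# Example: islanded with healthy battery and water reserves -> phase 1 behavior.
print(islanding_controller(grid_available=False, soc=0.8, water_kl=250.0))
```

In the paper's setting such rules would be evaluated each control step inside the MATLAB/Simulink simulation; this fragment only shows the decision structure.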

Article
Computer Science and Mathematics
Mathematics

Ward Blondé

Abstract: This paper proposes an axiomatization of the absolute infinite and argues against width and height potentialism in set theory. It builds on an unrestricted language and a non-recursively-enumerable class theory, called MKmeta, that extends formal MK, Morse-Kelley set theory with global choice (GC). Class ordinals and class cardinals avoid the Burali-Forti paradox, and GC is assumed to warrant comparability of class cardinals. Meta-formality then receives a maximal fixed-point definition under consistency filtering of recursively enumerable formality. By showing that the concept of maximal meta-consistent height (MMH) of an axiom is theory-independent, it follows that no Ord can exceed Ordmeta, the proper class ordinal of MKmeta, so that the absolute infinite Ωmeta = Ordmeta. Unlike formal and infinitary formal-based theories, which are fundamentally incomplete, MKmeta achieves completeness by having absolutely infinitely many formal-based axioms. Moreover, potentialism is countered by MKmeta, which accepts those formal axioms that maximize its models, all of which are elementarily equivalent to the representative Vmeta. Finally, only the meta-formal level can capture the entire mathematical reality in a single theory and thus give definite answers.

Review
Computer Science and Mathematics
Artificial Intelligence and Machine Learning

Lahiru Dilshan Peellawalage

,

Sayanton Dibbo

,

Sudip Vhaduri

Abstract: Backdoor attacks enable adversaries to embed malicious behavior into machine learning models by poisoning training data with triggers. Research has focused largely on backdoors in unimodal models. However, the rise of multimodal systems, e.g., vision–language models (VLMs) and multimodal large language models (MLLMs), has significantly increased the attack surface. Multimodal backdoors can exploit cross-modal triggers, representation-level manipulation, instruction-conditioned behaviors, and test-time activation pathways that are not available in unimodal models. Nevertheless, quantifying progress in this field remains challenging due to fragmented datasets, inconsistent threat models, and the lack of standardized evaluation protocols. This methodological inconsistency limits comparative analysis and impedes a systematic understanding of robustness in multimodal settings. This paper presents a meta-research study of multimodal backdoor attacks and analyzes how methodological fragmentation undermines reproducibility and cumulative scientific understanding. We argue that standardized benchmarks and backward-compatible evaluation protocols are necessary for reliable and systematic advancement of multimodal backdoor research.

Article
Environmental and Earth Sciences
Environmental Science

Guilherme D. S. Rios

,

Joaquim E. B. Ayer

,

Derielsen B. Santana

,

Victor H. F. D. Silva

,

Marcelo A. R. Pires

,

Talyson D. M. Bolleli

,

Fellipe S. Gomes

,

Mariana Raniero

,

Pedro F. R. Grande

,

Velibor Spalevic

+2 authors

Abstract: This study assessed the spatial and temporal variability of rainfall erosivity (R factor) and its implications for potential soil loss in the Velhas River Basin, Minas Gerais, Brazil. Rainfall erosivity was estimated using data from 48 rain gauge stations and precipitation derived from the CHIRPS product, processed in the cloud-based Google Earth Engine environment. Between 2014 and 2024, annual R values exhibited high variability, ranging from 3,900 to more than 9,000 MJ mm ha⁻¹ h⁻¹ yr⁻¹, with peak values recorded in the wettest year (2022) and the lowest values in 2014. Potential soil loss was estimated using the RUSLE model for the years of minimum and maximum erosivity, yielding values between 0.60 and 274.17 Mg ha⁻¹ yr⁻¹. The highest soil losses occurred in areas of exposed soil and agricultural land, whereas forest formations exhibited lower rates even under high rainfall erosivity conditions. The comparison between observed and estimated datasets revealed strong spatial and statistical agreement according to the Pearson correlation coefficient (r ≈ 0.999), although CHIRPS slightly underestimated extreme values. These results demonstrate the strong potential of integrating observed and remote sensing data in hydrosedimentological analyses at the basin scale.
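RUSLE estimates potential soil loss as the factor product A = R · K · LS · C · P. A one-line sketch with illustrative (hypothetical) factor values chosen to fall inside the 0.60–274.17 Mg ha⁻¹ yr⁻¹ range reported above:

```python
def rusle(R, K, LS, C, P):
    """Revised Universal Soil Loss Equation: A = R * K * LS * C * P,
    giving potential soil loss A in Mg ha^-1 yr^-1."""
    return R * K * LS * C * P

# Hypothetical factors for an exposed-soil pixel in a high-erosivity year:
# R in MJ mm ha^-1 h^-1 yr^-1, K in Mg h MJ^-1 mm^-1, LS/C/P dimensionless.
print(round(rusle(R=9000.0, K=0.012, LS=2.0, C=1.0, P=1.0), 2))  # about 216
```

A forested pixel with the same R but a much smaller C factor would yield a correspondingly smaller A, matching the land-cover contrast the study observed.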

Article
Physical Sciences
Theoretical Physics

Li Yazhe

Abstract: Based on 1000 sets of cross-scale experimental data from public authoritative databases (including measurements of microcosmic particle vibration characteristics, observations of macrocosmic celestial gravity-vibration coupling, and detection of consciousness activity vibration correlation), this paper systematically verifies the core hypothesis that spacetime quantum vibration is the fundamental interactive carrier of the universe, and constructs a full-scale vibration unified field theoretical system. The quantitative coupling deviation between particle vibration frequency and rest mass is less than 5%, the coincidence degree of the inverse proportional correlation between celestial vibration period and gravitational field strength is over 89%, and the non-local correlation between consciousness vibration and quantum entanglement breaks the Bell inequality limit (S=2.87). The vibration unified field equation derived from the above data integrates the properties of microcosmic particles, macrocosmic gravitational phenomena and the laws of consciousness activities into different evolutionary forms of spacetime quantum vibration parameters (frequency, amplitude, phase), realizing the cross-disciplinary unification of physics and cognitive science for the first time. This theory innovatively proposes that dark matter is “spacetime quanta with reversed vibration phase”, and predicts the specific deflection effect of ultra-high-energy cosmic ray trajectories and the vacuum modulation effect of collective consciousness. It provides a brand-new path for solving cutting-edge problems such as the essence of dark matter/dark energy, the scale gap between quantum mechanics and relativity, and the consciousness-matter interaction. All the adopted experimental data are sourced from authoritative platforms including the International Vibration Physics Database (No. Vib-Unity-2024), with complete traceability and verifiability.



Preprints.org is a free preprint server supported by MDPI in Basel, Switzerland.


© 2026 MDPI (Basel, Switzerland) unless otherwise stated