Article
Medicine and Pharmacology
Pharmacology and Toxicology

Sharhabil Amgad Eltahir, Mukhtar Ibrahim Yousef

Abstract: Drug-induced toxicity remains a principal driver of attrition in pharmaceutical development, yet conventional screening paradigms typically address individual toxicity endpoints in isolation. Here, we introduce MultiEndpointTox, a chemoinformatics platform that simultaneously predicts seven critical drug toxicity endpoints—hERG cardiotoxicity, hepatotoxicity (DILI), nephrotoxicity (DIKI), Ames mutagenicity, skin sensitization, cytotoxicity, and reproductive toxicity (exploratory)—from molecular structures using curated datasets totaling over 18,000 compounds. The platform employs optimized classical machine learning models with systematic benchmarking of 2D topological descriptors (2240 features), enhanced multi-conformer 3D descriptors (1975 features from 5-conformer ensembles incorporating AUTOCORR3D, RDF, WHIM, and pharmacophore fingerprints), and hybrid representations. Under the tested conditions, 2D descriptors achieved the highest classification performance (AUC-ROC 0.859 ± 0.02), while enhanced 3D descriptors substantially narrowed the previously reported gap (AUC-ROC 0.833 ± 0.03 versus 0.69–0.73 for basic 14-feature 3D). Scaffold-based splitting provided rigorous generalization assessment, with an average performance reduction of approximately 8%. A multi-task learning framework via stacked generalization demonstrated that cross-endpoint information sharing improves performance for 5 of 6 endpoints (average +2.1% AUC). The platform integrates leverage-based applicability domain assessment (31–100% coverage), SHAP-based feature importance analysis, and a confidence-weighted multi-endpoint risk scoring system validated on known drugs (AUC = 0.83, p = 4.06 × 10⁻¹⁴, Cliff's δ = 0.66), with sensitivity analysis confirming robustness across five weight configurations (AUC range 0.72–0.98). External validation on independent benchmark datasets revealed the challenge of cross-dataset domain shift in computational toxicology. MultiEndpointTox is deployed as a production-ready REST API and publicly available at https://github.com/sharhabileltahir/MultiEndpointTox.
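
The cross-endpoint stacking described in the abstract can be illustrated with a minimal sketch: single-task base models produce out-of-fold probabilities for each endpoint, and a per-endpoint meta-learner then sees the other endpoints' predictions alongside the molecular descriptors. Function names, model choices, and hyperparameters below are illustrative assumptions, not the platform's actual pipeline.

```python
# Minimal sketch of cross-endpoint stacked generalization, assuming precomputed
# 2D descriptor vectors and one binary label vector per toxicity endpoint.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_predict

def stacked_multi_endpoint(X, Y):
    """X: (n_compounds, n_descriptors) array; Y: dict endpoint -> binary labels."""
    # 1) Out-of-fold base predictions for every endpoint (single-task models).
    oof = {}
    for ep, y in Y.items():
        base = RandomForestClassifier(n_estimators=300, random_state=0)
        oof[ep] = cross_val_predict(base, X, y, cv=5, method="predict_proba")[:, 1]
    # 2) Each endpoint's meta-learner sees its descriptors plus the other
    #    endpoints' out-of-fold probabilities (cross-endpoint information sharing).
    meta_models = {}
    for ep, y in Y.items():
        other = np.column_stack([oof[o] for o in Y if o != ep])
        Z = np.hstack([X, other])
        meta = LogisticRegression(max_iter=1000)
        meta.fit(Z, y)
        meta_models[ep] = meta
    return oof, meta_models
```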

Review
Chemistry and Materials Science
Electrochemistry

Paolo Yammine, Ali Al-Zein, Tony Tannous, Hanna El-Nakat, Doris Homsi, Elie Atieh, Zeinab Matar, Pierre J. Obeid, Ayman Chmayssem

Abstract: This paper serves as a practical guide to help readers select electrochemical instruments, with a focus on potentiostats/galvanostats. It is intended for professionals in industry, students, and researchers from various fields. We provide an overview of the main potentiostats/galvanostats and related electrochemical instruments currently available on the market and the main suppliers worldwide. For each device, we summarize its technical specifications, including current and potential ranges as well as the methods the instrument is able to support. We also discuss the limitations of each instrument in order to give readers a clear and comprehensive understanding. Finally, the paper aims to help readers select the most suitable instrument for their needs while considering performance and budget.

Review
Social Sciences
Behavior Sciences

Narcis Eduard Mitu, George Teodor Mitu, Mihaela Zglavoci

Abstract: The European Union relies heavily on voluntary tax compliance, yet evidence on how tax literacy (TL) and tax education (TE) relate to tax morale (TM) and voluntary tax compliance or compliance intentions (VTC) remains fragmented across partly disconnected strands of literature. This systematic review examined EU-relevant evidence on the stakeholder contexts in which TL/TE are discussed in relation to TM and VTC, with particular attention to schools, communities, and public institutions. Following PRISMA 2020, searches in Scopus and Web of Science (2000–2025) applied two complementary query streams focused on TL/TE and TM/VTC-related mechanisms. The searches identified 1327 records; after deduplication and screening, 402 studies were included. Based on structured coding of titles, abstracts, and author keywords, the review maps patterns of emphasis and framing rather than causal effects. Public-institutional and education-related contexts were the most frequently signposted stakeholder environments, while digital and outreach-oriented delivery cues were more visible than classroom-based cues. Trust and fairness/justice dominated the explanatory vocabulary. Overall, the review supports an ecosystem-oriented interpretation of stakeholder coordination in EU tax literacy research.

Article
Computer Science and Mathematics
Artificial Intelligence and Machine Learning

Mohsen Mostafa

Abstract: The choice of activation function is a fundamental design decision in deep learning, yet most popular options like ReLU, GELU, or Swish are static and treat all inputs uniformly. This one-size-fits-all approach can be suboptimal when data contains heterogeneous noise, where the ideal non-linearity might depend on the input's statistical context. In this paper, we introduce Bayesian Probabilistic Adaptive Sigmoidal Activation (Bayesian PASA), an activation function that adapts its behavior based on input uncertainty. The method frames activation selection as a Bayesian model averaging problem, adaptively mixing sigmoidal, linear, and noise-aware behaviors. Mixing weights are derived from a variational evidence lower bound (ELBO) and regularized by a ψ-function that bounds the influence of local noise estimates. We provide theoretical analysis showing Lipschitz continuity, gradient bounds, and convergence under standard assumptions. Due to computational constraints (Google Colab, limited epochs), we evaluate Bayesian PASA on CIFAR-100 (50 epochs, 3 seeds) and CIFAR-10-C (100 epochs, 3 seeds). Despite these limitations, Bayesian PASA achieves 76.38% accuracy on CIFAR-100, slightly outperforming ReLU (75.68%) and GELU (75.98%) under the same constrained conditions. On corrupted CIFAR-10-C, Bayesian PASA combined with Bayesian R-LayerNorm achieves an average accuracy of 53.91%, a +1.87% improvement over the ReLU+LayerNorm baseline. These results, though modest, are consistent across seeds and suggest that Bayesian PASA offers a promising direction for uncertainty-aware activation functions, particularly when training resources are limited. Code is available at: https://github.com/BayesianPASA/BayesianPASA/
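
The adaptive mixing described here can be sketched as a small module: several candidate non-linearities are blended with input-dependent weights driven by a local noise estimate. The gate below is a toy softmax over a per-sample statistic, standing in for the paper's ELBO-derived, ψ-regularized weights; the class name, branch choices, and noise statistic are illustrative assumptions.

```python
# Toy PyTorch sketch of an uncertainty-gated activation in the spirit of
# Bayesian PASA: mix sigmoidal, linear, and bounded branches with weights that
# depend on a crude local noise estimate (not the paper's variational derivation).
import torch
import torch.nn as nn

class AdaptiveSigmoidalActivation(nn.Module):
    def __init__(self):
        super().__init__()
        self.gate = nn.Linear(1, 3)  # maps a scalar noise statistic to 3 branch logits

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # Local noise estimate: per-sample feature standard deviation.
        noise = x.std(dim=-1, keepdim=True)                  # (batch, 1)
        w = torch.softmax(self.gate(noise), dim=-1)          # (batch, 3) mixing weights
        branches = torch.stack(
            [torch.sigmoid(x) * x,   # sigmoidal (Swish-like) branch
             x,                      # linear branch
             torch.tanh(x)],         # bounded branch as the "noise-aware" fallback
            dim=-1)                                          # (batch, features, 3)
        return (branches * w.unsqueeze(-2)).sum(dim=-1)      # (batch, features)
```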

Article
Engineering
Aerospace Engineering

Stephen A. Whitmore, Jared S. Coen, Ryan J. Thibaudeau

Abstract: Utah State University has developed a high-performance "green" hybrid propulsion technology based on the unique electrical breakdown properties of 3-D printed acrylonitrile butadiene styrene. Using 3-D printed ABS as fuel, typical startup sequences require approximately 5–15 joules; and once started, the system can be sequentially fired with no additional energy inputs required. The number of possible ignitions is limited only by the amount of fuel. The most technologically mature version uses gaseous oxygen (GOX) as oxidizer and 3-D printed ABS as fuel. While GOX is mass efficient, it lacks volumetric efficiency unless highly pressurized. Nytrox, a blend of GOX and nitrous oxide, improves propellant density and volumetric efficiency while maintaining acceptable levels of mass efficiency (specific impulse). Nytrox can safely self-pressurize, eliminating the need for a separate oxidizer pressurization system and reducing overall complexity. However, using Nytrox as a direct replacement for GOX reduces ignition reliability, significantly increasing cold-start ignition latency. This paper quantifies the latency, explores its sources, and analyzes expected behaviors. Solutions include raising combustion and storage pressures to boost oxygen content in Nytrox's liquid phase and increasing combustion chamber pressure to reduce ignition delays.

Article
Computer Science and Mathematics
Probability and Statistics

Peter Gács

Abstract: In the context of the dynamical systems of classical mechanics, we introduce two new notions called “algorithmic fine-grain and coarse-grain entropy”. The fine-grain algorithmic entropy is, on the one hand, a simple variant of the randomness tests of Martin-Löf (and others) and is, on the other hand, a connecting link between description (Kolmogorov) complexity, Gibbs entropy and Boltzmann entropy. The coarse-grain entropy is a slight correction to Boltzmann’s coarse-grain entropy. Its main advantage is that it is less partition-dependent, because algorithmic entropies for different coarse-grainings are approximations of one and the same fine-grain entropy. It has the desirable properties of Boltzmann entropy in a wider range of systems, including those of interest in the “thermodynamics of computation”. It also helps explain the behavior of some unusual spin systems arising from cellular automata.

Review
Medicine and Pharmacology
Anesthesiology and Pain Medicine

Daniel John Doyle

Abstract: Respiratory diseases such as chronic obstructive pulmonary disease (COPD), asthma, tuberculosis, and acute respiratory distress syndrome (ARDS) remain leading causes of morbidity and mortality worldwide. Traditional respiratory care faces challenges in early diagnosis, personalized treatment, efficient resource allocation, and optimal mechanical ventilation management. Artificial intelligence (AI) has emerged as a transformative tool, offering applications in diagnosis, monitoring, treatment optimization, critical care support, and automated ventilator control. This comprehensive review examines AI's role across the respiratory care continuum, from diagnostic imaging and spirometry interpretation to autonomous multiparameter ventilator adjustment during maintenance and weaning phases. The article highlights quantitative evidence of clinical impact, regulatory status, challenges including algorithmic bias and health equity concerns, implementation strategies, and detailed analysis of AI-driven mechanical ventilation systems. Case studies illustrate real-world outcomes with specific effect sizes. The discussion emphasizes both the promise and limitations of AI, preparing healthcare professionals and students to critically evaluate its role in clinical practice.

Review
Medicine and Pharmacology
Other

Simona Wójcik, Monika Tomaszewska, Anna Rulkiewicz

Abstract: Background/Objectives: Obesity is a chronic, relapsing disease with a widening gap between clinical need and the availability of specialist care. Artificial intelligence (AI) may enable earlier risk detection, more precise phenotyping, and scalable behavioural support across obesity treatment pathways. This narrative review synthesises contemporary AI applications across the obesity care continuum and evaluates their translational readiness. Methods: A targeted search of PubMed/MEDLINE and Google Scholar (January 2024–January 2026) was conducted, complemented by citation chaining. Evidence was synthesised across four domains: (1) risk prediction and screening, (2) environmental and behavioural determinants, (3) multimodal phenotyping and precision stratification, and (4) AI-enabled lifestyle interventions and behavioural coaching (AIBC). Results: EHR-based models demonstrate clinically useful discrimination for early risk identification. Multimodal approaches refine stratification beyond BMI-centric classification. AIBC platforms show emerging evidence of clinically meaningful weight loss, including non-inferiority to human coaching, but long-term effectiveness, generalisability, and equity remain insufficiently established. Conclusions: AI is positioned to become a core enabler of personalised obesity pathways. Safe translation requires external validation, bias auditing, transparent reporting, human oversight, and post-deployment surveillance aligned with clinical guidelines and regulatory expectations.

Article
Engineering
Bioengineering

Lafi Hamidat, Dilber Uzun Ozsahin, Berna Uzun

Abstract: The development of biodegradable scaffolds for load-bearing bone tissue engineering (BTE) presents a fundamental multi-criteria optimization challenge, requiring a simultaneous balance among mechanical performance, biological integration, and degradation kinetics. These criteria are inherently conflicting: composite formulations with the highest compressive strength frequently exhibit suboptimal porosity, while those with superior osteoconductivity often lack sufficient load-bearing capacity. To address this challenge rigorously, this study establishes a hybrid Fuzzy Analytic Hierarchy Process–Technique for Order of Preference by Similarity to Ideal Solution (Fuzzy AHP-TOPSIS) framework to evaluate and rank five clinically relevant biodegradable polymer–ceramic composite candidates: PLA/Hydroxyapatite (PLA/HA), PCL/Hydroxyapatite (PCL/HA), PLGA/Bioactive Glass (PLGA/BG), PLA/Carbon Nanotubes (PLA/CNT), and PLA/Magnesium (PLA/Mg). Quantitative property data were systematically extracted from ten peer-reviewed experimental studies published between 2021 and 2025, and converted into Triangular Fuzzy Numbers (TFNs) to explicitly model inter-study variability arising from differences in fabrication methods, filler loading, and testing conditions. Fuzzy AHP analysis identified Compressive Strength (w = 25.2%) and Cell Viability (w = 21.5%) as the dominant decision criteria for load-bearing cortical bone repair. The Fuzzy TOPSIS ranking identified PLA/HA as the optimal composite candidate (Closeness Coefficient, CCᵢ = 0.677), demonstrating the superior multi-criteria balance required for cortical bone repair applications. Although PLA/CNT achieved the highest mechanical strength, it was outranked due to lower osteoconductivity and elevated cytotoxicity uncertainty at high nanotube concentrations (CCᵢ = 0.544). Sensitivity analysis across five distinct weighting scenarios confirmed the robustness of PLA/HA as the primary candidate. These findings provide a validated, replicable computational blueprint for evidence-based scaffold material selection, with direct implications for reducing the burden of costly trial-and-error experimentation in BTE research.
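
As a rough illustration of the ranking step, the sketch below computes TOPSIS closeness coefficients from triangular fuzzy numbers. It simplifies by defuzzifying the TFNs before the distance computation, whereas the study's full fuzzy TOPSIS works with distances between fuzzy values; all scores, weights, and criterion labels are placeholders, not the values extracted from the literature.

```python
# Simplified fuzzy TOPSIS closeness coefficient: TFNs encoded as (l, m, u) triples,
# defuzzified by centroid before computing distances to the ideal solutions.
import numpy as np

def defuzzify(tfn):                      # centroid of a triangular fuzzy number
    l, m, u = tfn
    return (l + m + u) / 3.0

def closeness_coefficients(decision, weights, benefit):
    """decision: (n_alternatives, n_criteria, 3) TFNs; weights sum to 1;
    benefit[j] is True for criteria to maximize (e.g. strength, cell viability)."""
    crisp = np.apply_along_axis(defuzzify, 2, decision) * weights  # weighted crisp scores
    ideal = np.where(benefit, crisp.max(axis=0), crisp.min(axis=0))
    anti  = np.where(benefit, crisp.min(axis=0), crisp.max(axis=0))
    d_pos = np.linalg.norm(crisp - ideal, axis=1)
    d_neg = np.linalg.norm(crisp - anti,  axis=1)
    return d_neg / (d_pos + d_neg)       # CC_i in [0, 1]; higher ranks better
```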

Hypothesis
Medicine and Pharmacology
Neuroscience and Neurology

Artyom Dyupin

Abstract: Manual medicine has long outgrown explanations that rest solely on structural-biomechanical correction. While the techniques reliably alleviate musculoskeletal pain and functional complaints, the evidence suggests that durable benefit depends far less on lasting mechanical realignment than on a distributed set of neurophysiological, autonomic, interoceptive and contextual processes. A persistent translational gap nevertheless remains between these abstract predictive models of bodily regulation and the tangible, regional tissue dynamics that clinicians encounter in practice. We propose PULSE-V (Predictive Updating of Local Somatic Errors via Vasomotion) as a hypothesis-generating framework that seeks to narrow that gap. The central suggestion is that coherent low-frequency vasomotor oscillations (~0.1 Hz) within angiosomes may serve as a candidate biophysical substrate capable of organising ascending interoceptive signals. When coherence is disrupted, the resulting noisy afferent stream may contribute to interoceptive prediction error. Chronic somatic dysfunction can then be understood as a form of allostatic interoceptive overload — a self-stabilising loop in which ambiguous peripheral input, impaired sensory attenuation and entrenched top-down priors reinforce one another. PULSE-V is offered as a deliberately falsifiable programme rather than a settled theory. It generates testable predictions concerning regional vasomotor patterns, multimodal biomarker signatures and the differential contributions of vasomotor, affective-touch and relational elements in treatment. Should the evidence support it, the model may help account for the frequently observed discrepancy between the modest mechanical effects of manual intervention and the substantial clinical outcomes that follow.

Article
Medicine and Pharmacology
Ophthalmology

Phanna Han, Marady Hun, Fulgencio Nsue Eyene Nfumu, Bing Jiang

Abstract: Recent studies have demonstrated that microRNAs hold potential as diagnostic biomarkers and therapeutic targets for ophthalmic diseases. However, there is a lack of bibliometric research focused on the role of microRNAs in ophthalmology. In this study, we conducted a bibliometric analysis to examine the trends and research hotspots in the field of microRNAs in eye-related diseases, providing a visual map of both established and emerging trends. We retrieved publications from the Web of Science database covering the period from 1999 to 2025. Visual representations were created using VOSviewer, CiteSpace, Venn diagrams, UpSet RStudio, and Microsoft Excel to perform co-occurrence and co-citation analyses, highlighting trends, hotspots, and contributions from authors, institutions, journals, and countries/regions. China and the United States emerged as the leading contributors, while Investigative Ophthalmology and Experimental Eye Research were the most prolific journals in this field. Over the past 26 years, the number of publications and citations has grown exponentially across various countries, organizations, and authors. Notably, we found that the dysregulation of let-7, miR-184, miR-181, miR-155, miR-146, miR-21, and miR-9 occurred most frequently in various ocular-related diseases. This study outlines the current trends, hotspots, and emerging frontiers in the field, offering new insights into the identification of diagnostic biomarkers and the design of future clinical trials for microRNAs in ophthalmic diseases. Additionally, international collaborations are essential for expanding and advancing research on microRNAs in eye-related diseases.

Article
Computer Science and Mathematics
Mathematics

Mohammad Abu-Ghuwaleh

Abstract: We develop the strip-analytic sequel to the master-integral-transform program with entire kernels by replacing the discrete Taylor-spectrum model with a continuous spectral model on the dilation side. The central object is a Hardy-strip orbit kernel whose boundary representation induces a continuous dilation-convolution operator acting on the Fourier transform of a weighted signal. In this setting, the Abu-Ghuwaleh transform admits two complementary inversion mechanisms: Mellin contour inversion and contour-free Wiener--Mellin inversion. We prove exact factorization formulas on named weighted function spaces, derive branchwise Mellin diagonalization formulas, obtain inversion theorems under nonvanishing assumptions on the continuous symbol, and show that logarithmic coordinates convert the transform into an additive convolution equation. This yields a practical FFT-based inversion framework together with a stability bound on frequency windows away from zeros of the multiplier. We also prove an explicit injectivity-and-stability proposition for a resolvent-type kernel family with Gamma-type symbol. The paper is designed as the natural continuous-spectrum successor to the entire-kernel and finite-Laurent stages of the program.
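
The contour-free, FFT-based side of the inversion can be pictured with a generic numerical sketch: in logarithmic coordinates the transform acts as an additive convolution, so recovery amounts to dividing Fourier transforms on frequency windows bounded away from zeros of the multiplier. The kernel, grid, and threshold below are placeholders, not the paper's Hardy-strip orbit kernel or its Wiener–Mellin machinery.

```python
# Generic FFT deconvolution on a uniform log-grid: recover f from g = k * f,
# restricting the division to frequencies where the multiplier is bounded below.
import numpy as np

def fft_deconvolve(g, k, eps=1e-6):
    """g, k: samples of the data and kernel on the same uniform (log-coordinate) grid."""
    G, K = np.fft.fft(g), np.fft.fft(k)
    mask = np.abs(K) > eps               # frequency window away from multiplier zeros
    F = np.zeros_like(G)
    F[mask] = G[mask] / K[mask]          # stable only where |K| stays bounded below
    return np.real(np.fft.ifft(F))
```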

Article
Physical Sciences
Theoretical Physics

Salim Yasmineh

Abstract: We present a geometric model in which each particle is associated with its own private spacetime manifold—a world-block—constructed from Fermi–Walker coordinates. The intrinsic spatial metric on each proper-time slice is treated as a dynamical field with a universal stiffness constant. This single assumption leads to a consistent model where self-gravity is repulsive and mutual gravity is attractive and exactly Newtonian. Newton’s constant emerges from the fundamental stiffness. The model provides a geometric derivation of the inverse-square law and suggests a connection to cosmology: the constant part of the strain field on large scales can be interpreted as a cosmological constant whose magnitude is set by the Hubble radius.

Article
Computer Science and Mathematics
Computer Science

Shuriya B

Abstract: Inclusive communication remains a critical challenge for individuals with hearing impairments, speech disorders, or multilingual barriers, particularly in educational and urban settings. This paper proposes a multimodal LLM framework that transforms auditory inputs such as noisy speech, accents, or sign language into coherent speech synthesis and written outputs, enabling seamless accessibility. Leveraging transformer-conformer architectures, our system fuses audio spectrograms, lip-reading visuals, and textual context via cross-modal attention mechanisms, achieving superior performance in real-time transcription (WER < 5% on diverse datasets) and voice cloning tailored to user prosody. Key innovations include adaptive noise suppression for hearing aid integration, ethical personalization to preserve speaker identity, and deployment on edge devices for low-latency applications like VR classrooms. Evaluations on benchmarks (e.g., LibriSpeech, VoxCeleb) and user trials with 50 participants (including seniors and hard-of-hearing students) demonstrate 30% improvements in comprehension accuracy and user satisfaction over baselines like Whisper and GPT-4V. By bridging auditory-to-text/speech gaps, this framework advances AI pedagogies for immersive learning, promotes equity in communication, and sets foundations for scalable IoT-enhanced inclusive tools. Future directions explore federated learning for privacy-preserving multilingual expansions.
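
The cross-modal attention fusion mentioned above can be sketched as audio tokens attending over visual (lip-reading) and textual context tokens. Dimensions, module names, and the residual fusion scheme below are illustrative assumptions rather than the system's actual transformer-conformer architecture.

```python
# Toy cross-modal fusion block: audio queries attend over visual and text keys/values.
import torch
import torch.nn as nn

class CrossModalFusion(nn.Module):
    def __init__(self, d_model=256, n_heads=4):
        super().__init__()
        self.attn_av = nn.MultiheadAttention(d_model, n_heads, batch_first=True)
        self.attn_at = nn.MultiheadAttention(d_model, n_heads, batch_first=True)
        self.norm = nn.LayerNorm(d_model)

    def forward(self, audio, visual, text):
        # audio/visual/text: (batch, seq_len_modality, d_model) token embeddings
        a_v, _ = self.attn_av(audio, visual, visual)   # audio queries visual keys/values
        a_t, _ = self.attn_at(audio, text, text)       # audio queries textual context
        return self.norm(audio + a_v + a_t)            # residual fusion of the modalities
```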

Article
Biology and Life Sciences
Immunology and Microbiology

Marcos Mancilla, Adriana Ojeda, Yassef Yuivar, Maritza Grandón, Sebastián Valderrama, Marcela Oyarzún, Horst Grothusen, Pablo Ibarra, Patricio Bustos

Abstract: The incidence of furunculosis in juvenile Atlantic salmon, Salmo salar, has increased in recent years in Chile, with isolates of Aeromonas salmonicida being the primary cause. However, in some cases, molecular diagnostics failed to identify the etiological agent. We previously demonstrated that a proportion of undiagnosed cases was produced by a new A. salmonicida strain. In those cases where the pathogen remained unidentified, we isolated colonies with an A. salmonicida-like appearance. Subsequent phylogenetic analysis presented in this work grouped those A. salmonicida-like isolates within the Aeromonas piscicola clade. Whole genome sequencing confirmed the taxonomic affiliation, giving additional insights into virulence and antibiotic resistance markers. Indeed, one of the strains showed reduced susceptibility to oxytetracycline. Virulence potential was assessed by in vivo testing on S. salar, which resulted in disease with pathognomonic signs of furunculosis. Although the pathogen presents common antigens with A. salmonicida, the current vaccine triggered only a modest IgM response against A. piscicola in the field. Our results support the hypothesis that the incidence of furunculosis in Chile cannot be ascribed to the emergence of the new A. salmonicida strain, but may partially result from infections caused by A. piscicola strains which exhibit a comparable virulence level.

Article
Engineering
Aerospace Engineering

Stephen A. Whitmore, Ryan J. Thibaudeau, Ava T. Wilkey

Abstract: Hybrid rocket technologies are gaining recognition as eco-friendly alternatives to traditional propulsion systems. Utah State University's Propulsion Research Laboratory has developed a High-Performance Green Hybrid Propulsion (HPGHP) technology, leveraging 3D-printed ABS fuel for reliable, low-energy ignition. Among tested materials, only ABS shows suitable electrical-breakdown properties for arc ignition. Unfortunately, due to the proprietary formulations of commercial ABS blends and its limited use as a rocket propellant, related composition and combustion data are limited. This study uses spectroscopic evaluation and bomb calorimetry to estimate material compositions, enthalpies of formation, and combustion energies for multiple commercially available 3D-printable ABS feedstocks, finding minimal differences among the samples tested. Based on these test results, “representative” ABS properties, including chemical formula, mean molecular weight, enthalpy of formation, and Higher Heating Value, are recommended. Follow-on tests with five alternative, commonly used 3D-printable thermoplastic feedstocks demonstrate that ABS has significantly higher energy content. This result supports ABS’s advantages and utility as a conveniently fabricated hybrid rocket fuel.

Article
Computer Science and Mathematics
Information Systems

Tomaž Podobnikar

Abstract: Spatial data quality (SDQ) is commonly assessed through technical verification. However, empirical evidence demonstrates that perceived data quality often diverges from objectively measured quality due to cognitive, institutional, and lifecycle-related factors. This paper proposes a multi-layered SDQ framework that integrates technical admissibility, process and lifecycle stewardship, visual and interpretive diagnostics, and governance indicators to enable holistic quality assessment within a socio-technical system. Rather than treating quality elements in isolation, the framework supports the diagnosis of emergent quality states and associated risk patterns. The framework is demonstrated through two empirical cases: validation of planned land use data using the OPIAvalid toolkit, and semantic conflation of multiple digital elevation models (DEMs) with heterogeneous lineage. Results show that governance failures, specification misuse, and degradation of lineage can undermine trust and decision-making even when datasets formally comply with ISO-based indicators. Visual spatial forensics and lineage-aware integration proved essential for detecting undocumented methodological shortcuts and restoring justified trust in authoritative data. Artificial intelligence is positioned as a diagnostic and explanatory support, assisting in anomaly detection, prioritization, and communication of quality risks, while deterministic validation and expert judgment remain mandatory. Overall, the framework shifts SDQ management from isolated technical validation toward lifecycle-oriented, transparent, and sustainable data governance.

Essay
Physical Sciences
Astronomy and Astrophysics

Raheb Ali Mohammed Saleh Aoudh

Abstract: We present a mathematically rigorous formulation of the Fundamental Speed Theory (FST), a vector-tensor theory of gravity featuring a dimensionless vector field \( \mathcal{V}^{\mu} \). The theory introduces characteristic scales \( M_{0} = \hbar /(cL_{0}) \) and \( L_{0} = 10~\mathrm{kpc} \) to ensure complete dimensional consistency, with explicit inclusion of \( \hbar \) and \( c \) in all physical expressions. Galactic dynamics obey \( \frac{d^{2}\tilde{\mathcal{V}}}{d\xi^{2}} + \frac{2}{\xi}\frac{d\tilde{\mathcal{V}}}{d\xi} = \beta_{\mathrm{eff}}\tilde{\mathcal{V}}^{3} \), where \( \xi = r / L_{0} \) and \( \beta_{\mathrm{eff}} = \frac{\lambda}{6}\mathcal{V}_{0}^{2} = 2.0 \times 10^{7} \). We perform a hierarchical validation at three distinct levels of parameter freedom. Level 3 (zero free parameters): fixed \( M = 1.0\times10^{10}\,M_{\odot} \) and \( r_d = 3.0\,\mathrm{kpc} \) for all 175 galaxies; even with no galaxy-specific parameters, FST correctly describes 65.7% of galaxies with mean \( \chi^{2}_{\nu} = 0.809 \). Level 2 (estimated parameters): mass and scale length estimated from scaling relations (no fitting); the success rate rises to 94.9% with mean \( \chi^{2}_{\nu} = 0.283 \). Level 1 (fully fitted): mass and scale length fitted per galaxy; the success rate reaches 100% with mean \( \chi^{2}_{\nu} = 0.170 \). This hierarchical validation demonstrates that FST captures the essential physics of galactic rotation without overfitting. The theory achieves a mean reduced chi-squared of \( \langle \chi^{2}_{\nu} \rangle = 0.170 \) across all 171 SPARC galaxies, with 91.2% of galaxies having \( \chi^{2}_{\nu} < 0.5 \) (excellent fit) and only 1.8% (three galaxies) having \( \chi^{2}_{\nu} > 1.0 \). The characteristic transition scale is \( \xi_c = \sqrt{2/\beta_{\mathrm{eff}}} = 3.16\times 10^{-4} \), corresponding to a fundamental scale \( r_c = \xi_c L_0 \approx 3.16 \) pc. Cluster analysis reveals three distinct dynamical families of galaxies. Solar System constraints are satisfied through a screening mechanism derived directly from the velocity field, with a characteristic screening length \( \lambda_{\mathrm{screen}} \sim 10^{13}\,\mathrm{m} \) (about 200 AU). Complete mathematical derivation and an open-source implementation ensure full reproducibility.
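
The quoted field equation can be integrated numerically in a few lines; the sketch below uses SciPy with placeholder initial conditions and a toy range of ξ, purely to illustrate the equation's structure (it is not the paper's fitting procedure).

```python
# Integrate d2V/dxi2 + (2/xi) dV/dxi = beta_eff * V**3, starting slightly off
# xi = 0 to avoid the coordinate singularity; initial data are placeholders.
import numpy as np
from scipy.integrate import solve_ivp

beta_eff = 2.0e7

def rhs(xi, y):
    V, dV = y
    return [dV, beta_eff * V**3 - (2.0 / xi) * dV]

xi_span = (1e-4, 1.0)                        # dimensionless radius xi = r / L0
sol = solve_ivp(rhs, xi_span, y0=[1e-3, 0.0], dense_output=True, rtol=1e-8)
xi = np.linspace(*xi_span, 200)
V = sol.sol(xi)[0]                           # field profile V(xi) for the toy initial data
```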

Article
Biology and Life Sciences
Biophysics

Bernard Delalande, Hirohisa Tamgawa, Vladimir Matveev

Abstract: The Hodgkin-Huxley (HH) model has dominated quantitative neuroscience since 1952. Its authors explicitly acknowledged its phenomenological character and called for a deeper mechanistic account. We propose that this account is the IMH model of nerve conduction. The model rests on three biophysical foundations: (1) the polyelectrolyte gel framework of Ling, in which intracellular K+ is adsorbed on protein sites and the resting ionic distribution is a thermodynamically stable Donnan equilibrium requiring no metabolic pump [6,7]; (2) the Hofmeister ion series, which governs differential adsorption of K+ versus Na+ [8]; and (3) the hydraulic wave equation for a fluid-filled elastic tube, which predicts conduction velocity from myelin elastic modulus rather than sodium channel density. In this framework, the action potential is a coupled ionic-hydraulic phase transition propagating as a pressure wave in the periaxonal space. Electrical events are causally secondary—the electromagnetic shadow of the hydraulic wave, not its cause. We demonstrate that the model resolves a 75-year-old anomaly identified but left unexplained by Huxley and Stämpfli in 1949 [10]: positive current enters a node before the membrane potential reaches its maximum, a relation the authors themselves described as “impossible in a system of resistances and capacities.” We present eight falsifiable predictions distinguishing the IMH model from HH, covering myelin mechanics, mechanoreceptor adaptation, terminal arborisation geometry as the physical substrate of the Umwelt, motor tremor as hydraulic interference, and the temporal basis of conscious perception.
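
For reference, the classical wave-speed relation for a fluid-filled, thin-walled elastic tube is the Moens–Korteweg formula; whether the IMH model uses exactly this expression is an assumption here, but it illustrates how a conduction velocity can follow from an elastic modulus rather than from channel density.

```latex
% Moens--Korteweg pulse-wave speed in a thin-walled elastic tube:
% E = wall elastic modulus, h = wall thickness, \rho = fluid density, D = tube diameter.
c \;=\; \sqrt{\frac{E\,h}{\rho\,D}}
```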

Article
Social Sciences
Decision Sciences

Xiaoyi Meng, Shaochun Liu

Abstract: The accuracy of financing demand prediction has a direct impact on the return on investment and risk exposure in fintech investment and asset allocation. Nevertheless, real-world financial transaction data often display significant nonstationary features — for example, cyclical fluctuations, event shocks, and short-term anomalies — which make traditional forecasting approaches unstable in real investment scenarios. This study builds a dataset that includes 34 reproducible variables — including daily financing requirements, transaction peaks, capital occupation duration, and risk exposure levels — on the basis of 180 consecutive days of investment and operating data from a leading financial services firm. It systematically compares ARIMA, Prophet, Random Forest, and XGBoost models for financing demand forecasting. Empirical results show that XGBoost maintains a low forecast error (MAPE of 8.2%) under market fluctuations and unusual events, reducing the average error by about 22% compared with the baseline model. Based on these results, a model is built to analyze the effect of forecast errors on the stability of investment returns and the efficiency of capital turnover. Results show that keeping the forecast error under 10% significantly reduces the risk of capital misallocation in times of high volatility while improving the stability of overall investment returns. This study provides a reusable modeling workflow and an engineering reference for building investment allocation and risk management systems at financial institutions.
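
The model comparison reported here amounts to fitting several forecasters on the same features and scoring held-out predictions with MAPE; the sketch below shows that loop for two of the models. Feature construction, the train/test split, and hyperparameters are placeholders, and the xgboost package is assumed to be available.

```python
# Compare forecasters by mean absolute percentage error (MAPE) on held-out data.
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from xgboost import XGBRegressor

def mape(y_true, y_pred):
    return np.mean(np.abs((y_true - y_pred) / y_true)) * 100.0

def compare_models(X_train, y_train, X_test, y_test):
    models = {
        "random_forest": RandomForestRegressor(n_estimators=300, random_state=0),
        "xgboost": XGBRegressor(n_estimators=300, learning_rate=0.05, random_state=0),
    }
    scores = {}
    for name, model in models.items():
        model.fit(X_train, y_train)
        scores[name] = mape(y_test, model.predict(X_test))
    return scores   # maps model name -> MAPE (%) on the held-out period
```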
