Article
Physical Sciences
Quantum Science and Technology

Cheng Jinjun

,

Cheng Dian

Abstract: This paper represents a further academic deepening and upgrading of the authors' 2019 publication A Hypothesis on the Spatial Motion Mode of Photons. It should be explicitly stated that this paper falls within the category of natural philosophical thought experiments—its core value lies in constructing a unified physical image of the nature of light through rigorous logical deduction, and proposing verifiable theoretical hypotheses and experimental schemes; the validity of all conclusions must ultimately be verified by rigorous and extensive scientific experiments before being incorporated into the theoretical system of physics. As a foundational concept of quantum mechanics, the wave-particle duality of light has been accompanied by profound philosophical perplexities and theoretical tensions since its proposal, becoming a core bottleneck in the integration of classical and quantum physics. This paper systematically sorts out the logical incompleteness in the current quantum interpretation system—including the self-negation of the complementarity concept, the problem of photon localization, the fundamental opposition between the statistical and non-statistical interpretations of the wave function, and the philosophical controversy over the Heisenberg Uncertainty Principle, revealing the inherent contradictions of the traditional wave-particle duality framework. On this basis, adopting classical physical images and the logic of reduction to absurdity, and based on six axioms and six preparatory propositions, this paper puts forward a natural philosophical hypothesis on the essence of photons: a photon is an energetic mass point with a diameter smaller than the Planck length, moving in a uniform spiral linear motion in space.
The paper deduces the core characteristics such as velocity, frequency, and wavelength of the photon's uniform spiral linear motion, and designs three operable, repeatable, and quantifiable physical experimental schemes to provide specific paths for the empirical verification of the hypothesis. The research deduces that the angular momentum of photon spatial motion (excluding photon spin motion) is always the reduced Planck constant ℏ, the energy E=mc² is naturally unified with E=hν (the standard formula for wave energy), and the standard expression of the Heisenberg Uncertainty Principle ΔxΔpₓ≥ℏ/2 can be given a classical physical interpretation from the perspective of superposition of measurement deviations. This paper systematically responds to potential questions regarding the origin of photon particle nature, wave nature, and compatibility with relativity, arguing that the hypothesis provides a logically consistent and clearly visualized path for understanding the nature of light, builds a new natural philosophical framework for the integration of quantum and classical theories of light, and also offers a new thinking perspective for the paradigm shift in the study of the nature of light.

Review
Biology and Life Sciences
Biophysics

Benjamin Drukarch

,

Micha Wilhelmus

Abstract: Neuronal excitability manifests itself mainly in the form of non-linear, self-regenerative waves of electricity moving along the surface of neuronal axons. These waves are commonly known as action potentials (APs). Theorizing and experimental investigation of the physical and functional characteristics of APs have broadly followed along the lines of the ionic hypothesis and the associated mathematical model introduced by Hodgkin and Huxley (HH). In the current form of this bioelectrical framework, adopted in mainstream physiology and other biological sciences, the axonal membrane is conceptualized as an electronic circuit where electric current is generated and propelled as the result of time-dependent opening and closure of voltage-operated ion channel proteins, allowing passive flow of specific ions across and along the membrane powered by their respective electrochemical gradients. Although representing mainstream research, the bioelectric perspective has been criticized for its narrow focus on the electrical characteristics of APs, whilst ignoring other physical manifestations of the nerve signal, in particular mechanical and thermal changes coinciding with AP propagation. As an alternative, a thermodynamics-based acoustic theory has been outlined in which all, electric and non-electric, manifestations of the nerve signal are considered as the result of a single density pulse in the axonal membrane carried by a reversible lipid membrane phase transition and momentum conservation. Representing a minority view, however, this unified, thermodynamic perspective on the physical nature of neuronal excitability is largely ignored by representatives of the bioelectric perspective. Here we draw special attention to the philosophical dimension of the communication failure between the two communities of scientists.
We argue that adherents of the bioelectric perspective favor a mechanist-type of explanation, whilst supporters of the thermodynamic perspective are committed to so-called covering-law types of explanation. We conclude that it is this, thus far unrecognized, philosophical rift, rather than specific scientific differences of opinion that blocks fruitful interdisciplinary cooperation necessary for building a comprehensive, fully integrated, notion of the physical nature of neuronal excitability. Suggestions of how to bridge this conceptual gap are formulated.

Article
Public Health and Healthcare
Public Health and Health Services

Taiwo Opeyemi Aremu

,

Olihe Nnenna Okoro

,

Caroline Gaither

,

S. Bruce Benson

,

Drissa M Toure

,

Jon C. Schommer

Abstract: Background: During the COVID-19 pandemic, Nigeria relied largely on imported vaccines, underscoring vulnerabilities in supply chains and the absence of domestic vaccine manufacturing. Understanding supply-related barriers and the resources required for local vaccine production is critical for future pandemic preparedness and population health outcomes. The objective of the study was to identify stakeholder-perceived barriers to COVID‑19 vaccine manufacturing in Nigeria and to describe the resources and enabling conditions required for local production. Methods: We conducted a qualitative needs assessment using semi-structured interviews with senior personnel from Nigerian pharmaceutical manufacturing firms and regulatory agencies. Participants were recruited purposively and consecutively. Interviews (30–60 minutes) were conducted via Zoom, audio-recorded with consent, transcribed, and analyzed using inductive thematic analysis following established six-phase procedures. Reporting adheres to the Consolidated Criteria for Reporting Qualitative Research (COREQ). Results: Six participants (two regulators and four pharmaceutical executives) identified three interrelated barriers to domestic COVID-19 vaccine production: (1) technical and knowledge gaps (loss of hands-on expertise, absence of operational vaccine manufacturing facilities, limited technology transfer), (2) financial and infrastructure barriers (high cost of capital, serial taxation, unreliable electricity and logistics constraints), and (3) systemic and institutional barriers (inconsistent political commitment, policy discontinuity, regulatory capacity gaps, and concerns about public confidence). To enable local production, participants emphasized coordinated investment in workforce development, technology-transfer partnerships, modern utilities and cold chain systems, access to specialized equipment and high-quality inputs, and predictable policy, financing, and regulatory environments.
Conclusions: Participants perceived Nigeria’s current capacity as insufficient for COVID-19 vaccine manufacturing but identified actionable levers, particularly human capital development, infrastructure strengthening, and regulatory and financing reforms, to enable sustainable local production. These findings provide a practical roadmap for policymakers, regulators, and industry leaders seeking to strengthen Nigeria’s biomanufacturing and long-term pandemic preparedness.

Technical Note
Engineering
Mechanical Engineering

Han Haitjema

Abstract: For the calibration of surface plates, the classical Moody method is still commonly used. In this method, the straightness of a number of lines over the surface plate in a union-jack configuration is measured, and the results are combined into a flatness measurement. The measurement of the two center lines is commonly omitted in the evaluation and only used to determine so-called closure errors. These two lines can be incorporated in the measurement evaluation in a least-squares sense, giving an 18% reduction of the uncertainty. A further reduction in the uncertainty is possible when using the gravity vector as a common reference, as can be done when using electronic levels. A least-squares evaluation of measurements taken in this way gives a further reduction of the uncertainty of 29% relative to the traditional evaluation according to the Moody method. This is illustrated with an actual measurement example and additional Monte-Carlo simulations.
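The least-squares idea at the heart of the improved evaluation can be illustrated generically: fit a reference plane to the measured heights and take the residuals as the flatness deviations. A minimal sketch, assuming gridded height data (this is not the full Moody union-jack evaluation, and the function name and data layout are illustrative):

```python
import numpy as np

def flatness_residuals(x, y, z):
    """Fit a least-squares reference plane z ~ a + b*x + c*y and return
    the residuals, i.e. the flatness deviations about the best-fit plane."""
    A = np.column_stack([np.ones_like(x), x, y])  # design matrix [1, x, y]
    coef, *_ = np.linalg.lstsq(A, z, rcond=None)  # least-squares plane fit
    return z - A @ coef                           # deviations from the plane

# A perfectly planar (tilted) surface should give near-zero residuals.
x, y = np.meshgrid(np.arange(5.0), np.arange(5.0))
x, y = x.ravel(), y.ravel()
z = 2.0 + 0.1 * x - 0.05 * y
r = flatness_residuals(x, y, z)
```

In the actual method, the fitted quantities are the measured straightness profiles of the union-jack lines rather than a raw height grid, but the principle of minimizing squared residuals against a common reference is the same.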

Article
Public Health and Healthcare
Health Policy and Services

Claudia Chaufan

,

Maryanne Dias

,

Natalie Hemsing

,

Olga Collins

Abstract: Background: During the Covid-19 event, Ontario hospitals implemented healthcare worker vaccination policies under Directive #6, a provincial framework that formally permitted multiple compliance pathways, including mandatory vaccination. Despite this formal flexibility, institutional responses converged. This study examines how vaccination mandates were implemented and justified across institutional, legal, and media domains, with particular attention to the operation of discretion within a decentralized governance framework. Methods: An environmental scan was conducted using document analysis of publicly available materials from a purposive sample of Ontario hospitals. Sources included hospital policy documents, institutional communications, court decisions, and media coverage. Materials were analyzed to identify patterns of mandate implementation, justification, and representation across domains. The term “Covid-19 event” is used as a neutral temporal descriptor that does not presuppose epidemiological classification. The study emphasizes descriptive mapping of institutional responses rather than causal inference. Results: Across the documentary corpus, vaccination was consistently framed as a baseline condition of healthcare employment, while alternatives permitted under provincial policy were rarely presented as durable or equivalent options. Hospitals adopted highly similar implementation models despite formal discretion. Legal decisions generally treated mandates as matters of institutional or employer authority, emphasizing jurisdictional and procedural considerations while limiting substantive review of scientific and proportionality claims central to the litigation. Media coverage largely mirrored institutional and legal framings, presenting vaccination as a settled professional expectation and employment exclusion as a routine administrative consequence. 
Taken together, these domains exhibited parallel patterns of normalization and policy alignment. Conclusions: This environmental scan documents convergence toward restrictive vaccination mandate implementation across institutional, legal, and media domains despite a formally flexible policy framework. By tracing how discretion was exercised and legitimated, the study provides an empirical account of how vaccination mandates stabilized as routine institutional practice. These findings establish a foundation for subsequent interpretive analysis of authority, dissent, and policy problem representation within governance frameworks during declared public health emergencies.

Article
Social Sciences
Government

Igor Calzada

,

Itziar Eizaguirre

Abstract: Artificial Intelligence (AI) is increasingly embedded in public governance, raising questions about how institutions can anticipate its societal implications while safeguarding democratic accountability amid expanding computational infrastructures. This article examines how anticipatory AI governance can be operationalised in the age of supercomputing through a mixed-methods multistakeholder approach in the Basque Country (Spain). The study focuses on the city-regional governance setting of Gipuzkoa, a devolved historical territory with fiscal autonomy and a growing advanced-computing ecosystem centred in Donostia–San Sebastián, where regional initiatives are positioning the Basque Country as an emerging “quantum territory” within Europe’s high-performance and quantum computing landscape, including the installation of IBM Quantum System Two. Methodologically, the study combines action research with three stakeholder groups and a quantitative online survey of citizens (N = 911). The action research engaged six civil society organisations, seven provincial directorates, and eleven municipalities. Results indicate that city-regional administrations can function as laboratories for public AI governance when policy experimentation is combined with empirical evidence and advanced computational infrastructures. The findings suggest policy recommendations for supercomputing ecosystems, including transparent AI experimentation, public-interest data governance, and policy sandboxes linking advanced computing, civic participation, and accountable digital public services.

Article
Biology and Life Sciences
Life Sciences

Ting-Chao Chou

Abstract: The Universe has two domains, Life and Non-Life, which manifest the dimensionless relativity ratio with basic codes. For Life it is a/b = a/(1-a) = (1-b)/b (the Floating Ratio), and for Non-Life it is a/b = (a + b)/a = 1 + b/a (the Golden Ratio). Life and Non-Life can be connected and co-exist through the two fractional distribution functions of “1”. The Mass Action Law (MAL) Median Effect Principle leads to the Unified General Dynamics Theory and algorithm, which provides an interdisciplinary and cross-disciplinary common linkage of parameters for computerized digital research and development informatics.
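The two ratio codes quoted in the abstract can be checked numerically. A minimal sketch (the sample value a = 0.3 and the variable names are illustrative, not from the paper):

```python
import math

# Golden Ratio code: a/b = (a + b)/a = 1 + b/a,
# satisfied exactly when a/b equals phi = (1 + sqrt(5)) / 2.
phi = (1 + math.sqrt(5)) / 2
a, b = phi, 1.0
assert math.isclose(a / b, (a + b) / a)
assert math.isclose(a / b, 1 + b / a)

# "Floating Ratio" code: a/b = a/(1 - a) = (1 - b)/b,
# which holds whenever a and b are fractional parts of "1" (a + b = 1).
a = 0.3
b = 1 - a
assert math.isclose(a / b, a / (1 - a))
assert math.isclose(a / b, (1 - b) / b)
```

Note the two identities pick out different constraints: the Golden Ratio fixes the value of a/b itself, while the Floating Ratio holds for any pair summing to 1.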

Article
Computer Science and Mathematics
Mathematics

Mohammad Abu-Ghuwaleh

Abstract: We develop a unified operator- and matrix-valued strip-analytic extension of the Abu-Ghuwaleh transform program. The central object is a strongly measurable operator-valued orbit density whose boundary representation induces a continuous dilation-convolution operator acting on the Fourier transform of a weighted Hilbert-space-valued signal. In this setting the transform admits two complementary inversion mechanisms: Mellin contour inversion and contour-free Wiener–Mellin inversion on the logarithmic scale. We prove exact factorization formulas on named weighted signal spaces, derive branchwise Mellin diagonalization formulas with operator-valued system symbols, obtain inversion theorems under bounded invertibility assumptions, and formulate a log-scale Fourier multiplier representation suitable for FFT-based recovery. We then prove Young-type boundedness on the logarithmic side and stability estimates on frequency windows away from singularities of the multiplier. The finite-dimensional matrix case is obtained as a direct specialization of the Hilbert-space theory, and in that setting the Wiener inverse is derived from a standard matrix Wiener criterion. Finally, we isolate an explicit Gamma-type kernel family for which the system symbol is computable in closed form and yields concrete injectivity and stability constants. The paper is intended as the natural operator-theoretic successor to the scalar strip-analytic stage of the master-integral-transform program.

Article
Computer Science and Mathematics
Artificial Intelligence and Machine Learning

Zirawani Binti Baharum

,

Abdulbaset Salem Albaour

Abstract: The rapid diffusion of Artificial Intelligence (AI) across public sector institutions is reshaping governance practices and service delivery worldwide. However, most existing AI governance frameworks have been developed within technologically advanced and institutionally stable environments, limiting their applicability to developing and fragile states characterised by institutional volatility, regulatory gaps, and socio-political complexity. Addressing this gap, this study proposes an empirically grounded Ethical AI Governance Framework designed to support responsible AI adoption in public sector institutions operating within fragile governance contexts. The research adopts a sequential explanatory mixed-methods design, integrating quantitative and qualitative approaches to ensure methodological rigor and contextual validity. The quantitative phase involved a structured survey assessing organisational readiness for AI adoption, including dimensions of institutional capacity, technological infrastructure, governance preparedness, and stakeholder trust. The results provided baseline indicators of AI readiness across public sector entities. Building on these findings, the qualitative phase employed semi-structured interviews with policymakers, digital transformation experts, and institutional stakeholders to explore deeper governance challenges related to AI deployment, including ethical accountability, regulatory constraints, and institutional trust. The integration of both datasets enabled methodological triangulation and facilitated a comprehensive understanding of AI governance dynamics. 
Based on the empirical insights derived from this mixed-methods analysis, the study develops an Ethical AI Governance Framework structured around three interdependent pillars: Institutional Readiness, emphasising organisational capacity, regulatory alignment, and technical infrastructure; Ethical Oversight, ensuring transparency, accountability, fairness, and responsible algorithmic governance; and Participatory Design, promoting citizen engagement, stakeholder inclusion, and trust-building mechanisms throughout the AI lifecycle. The proposed framework contributes to the emerging literature on AI governance in fragile and developing states by providing a context-sensitive governance model grounded in empirical evidence from the Libyan public sector. By bridging theoretical governance principles with practical policy considerations, the study offers a scalable blueprint for enabling responsible, trustworthy, and inclusive AI-driven digital transformation in complex institutional environments.

Article
Physical Sciences
Mathematical Physics

Deep Bhattacharjee

,

Onwuka Frederick

,

Riddhima Sadhu

,

Susmita Bhattacharjee

,

Shounak Bhattacharya

,

Soumendra Nath Thakur

,

Priyanka Samal

,

Pallab Nandi

,

Tarun Bhattacharjee

,

Sanjeevan Singha Roy

+2 authors

Abstract: This paper develops a unified and comprehensive framework for Hopf-like fibrations on Calabi–Yau spaces, with emphasis on when topological fibration data is compatible with Ricci-flat Kähler geometry and with compactification constraints from string/M-theory. We prove obstruction statements for smooth compact settings by combining characteristic-class constraints, Leray–Serre transgression, and rational formality, and we contrast these with constructive local models in hyperkähler and singular regimes where circle and higher-sphere fiber structures remain geometrically meaningful. New contributions (v4). This version resolves all major open problems identified in the prior literature and in earlier versions of this manuscript. We prove: (1) a complete finite classification of Hopf-like fibrations on compact CY3 orbifolds (16 admissible isotropy types, ≤ 47 diffeomorphism classes); (2) the sharp constant C(n) = n/(4π²) in the Ricci-flat Hopf inequality; (3) an exact spectral gap formula for CY submersions; (4) a complete classification of MCF singularities preserving Hopf-like structure (Types I/II/III, with Type III being conifold transitions); (5) finiteness and explicit count (2741 for the quintic) of Hopf-like flux vacua; (6) a Hopf-like analogue of the Cardy formula with logarithmic corrections from CFT twist operators; (7) a foundational p-adic theory of Hopf-like fibrations with crystalline Euler class and p-adic instanton sums; (8) a constructive proof of the Cobordism Conjecture for CY3 compactifications via Hopf-like geometric transitions; (9) an L-function factorization theorem establishing a Hopf-like BSD analogy (proved for K3 surfaces). These results together constitute a resolution of the main structural questions in Hopf-like fibration theory on CY manifolds, from both geometric/topological and string-theoretic perspectives.
The manuscript includes explicit diagnostic workflows—minimal-model growth estimates, low-degree homotopy exact-sequence tests, and spectral-page bookkeeping—designed for reproducible analysis. The main conclusion is precise: strict Hopf behavior is severely limited on smooth compact Calabi–Yau manifolds, while robust Hopf-like structures naturally appear in local, singular, and effective-field-theory phases, and these are now completely classified.

Article
Medicine and Pharmacology
Pharmacology and Toxicology

Sharhabil Amgad Eltahir

,

Mukhtar Ibrahim Yousef

Abstract: Drug-induced toxicity remains a principal driver of attrition in pharmaceutical development, yet conventional screening paradigms typically address individual toxicity endpoints in isolation. Here, we introduce MultiEndpointTox, a chemoinformatics platform that simultaneously predicts seven critical drug toxicity endpoints—hERG cardiotoxicity, hepatotoxicity (DILI), nephrotoxicity (DIKI), Ames mutagenicity, skin sensitization, cytotoxicity, and reproductive toxicity (exploratory)—from molecular structures using curated datasets totaling over 18,000 compounds. The platform employs optimized classical machine learning models with systematic benchmarking of 2D topological descriptors (2240 features), enhanced multi-conformer 3D descriptors (1975 features from 5-conformer ensembles incorporating AUTOCORR3D, RDF, WHIM, and pharmacophore fingerprints), and hybrid representations. Under the tested conditions, 2D descriptors achieved the highest classification performance (AUC-ROC 0.859 ± 0.02), while enhanced 3D descriptors substantially narrowed the previously reported gap (AUC-ROC 0.833 ± 0.03 versus 0.69–0.73 for basic 14-feature 3D). Scaffold-based splitting provided rigorous generalization assessment, with an average performance reduction of approximately 8%. A multi-task learning framework via stacked generalization demonstrated that cross-endpoint information sharing improves performance for 5 of 6 endpoints (average +2.1% AUC). The platform integrates leverage-based applicability domain assessment (31–100% coverage), SHAP-based feature importance analysis, and a confidence-weighted multi-endpoint risk scoring system validated on known drugs (AUC = 0.83, p = 4.06 × 10⁻¹⁴, Cliff’s δ = 0.66), with sensitivity analysis confirming robustness across five weight configurations (AUC range 0.72–0.98). External validation on independent benchmark datasets revealed the challenge of cross-dataset domain shift in computational toxicology.
MultiEndpointTox is deployed as a production-ready REST API and publicly available at https://github.com/sharhabileltahir/MultiEndpointTox.

Review
Chemistry and Materials Science
Electrochemistry

Paolo Yammine

,

Ali Al-Zein

,

Tony Tannous

,

Hanna El-Nakat

,

Doris Homsi

,

Elie Atieh

,

Zeinab Matar

,

Pierre J. Obeid

,

Ayman Chmayssem

Abstract: This paper serves as a practical guide to help the readers select electrochemical instruments with a focus on potentiostats / galvanostats. It is dedicated to professionals in industry, students and researchers from various fields. We provide an overview of the main potentiostats / galvanostats and related electrochemical instruments currently available on the market and the main suppliers worldwide. For each device, we summarize its technical specifications including current and potential ranges as well as the methods the instrument is able to support. We also discuss the limitations of each instrument in order to provide the readers with a clear and comprehensive understanding. Finally, the paper aims to help the readers in selecting the most suitable instrument for their needs while considering performance and budget.

Review
Social Sciences
Behavior Sciences

Narcis Eduard Mitu

,

George Teodor Mitu

,

Mihaela Zglavoci

Abstract: The European Union relies heavily on voluntary tax compliance, yet evidence on how tax literacy (TL) and tax education (TE) relate to tax morale (TM) and voluntary tax compliance or compliance intentions (VTC) remains fragmented across partly disconnected strands of literature. This systematic review examined EU-relevant evidence on the stakeholder contexts in which TL/TE are discussed in relation to TM and VTC, with particular attention to schools, communities, and public institutions. Following PRISMA 2020, searches in Scopus and Web of Science (2000–2025) applied two complementary query streams focused on TL/TE and TM/VTC-related mechanisms. The searches identified 1327 records; after deduplication and screening, 402 studies were included. Based on structured coding of titles, abstracts, and author keywords, the review maps patterns of emphasis and framing rather than causal effects. Public-institutional and education-related contexts were the most frequently signposted stakeholder environments, while digital and outreach-oriented delivery cues were more visible than classroom-based cues. Trust and fairness/justice dominated the explanatory vocabulary. Overall, the review supports an ecosystem-oriented interpretation of stakeholder coordination in EU tax literacy research.

Article
Computer Science and Mathematics
Artificial Intelligence and Machine Learning

Mohsen Mostafa

Abstract: The choice of activation function is a fundamental design decision in deep learning, yet most popular options like ReLU, GELU, or Swish are static and treat all inputs uniformly. This one-size-fits-all approach can be suboptimal when data contains heterogeneous noise, where the ideal non-linearity might depend on the input’s statistical context. In this paper, we introduce Bayesian Probabilistic Adaptive Sigmoidal Activation (Bayesian PASA), an activation function that adapts its behavior based on input uncertainty. The method frames activation selection as a Bayesian model averaging problem, adaptively mixing sigmoidal, linear, and noise-aware behaviors. Mixing weights are derived from a variational evidence lower bound (ELBO) and regularized by a ψ-function that bounds the influence of local noise estimates. We provide theoretical analysis showing Lipschitz continuity, gradient bounds, and convergence under standard assumptions. Due to computational constraints (Google Colab, limited epochs), we evaluate Bayesian PASA on CIFAR-100 (50 epochs, 3 seeds) and CIFAR-10-C (100 epochs, 3 seeds). Despite these limitations, Bayesian PASA achieves 76.38% accuracy on CIFAR-100, slightly outperforming ReLU (75.68%) and GELU (75.98%) under the same constrained conditions. On corrupted CIFAR-10-C, Bayesian PASA combined with Bayesian R-LayerNorm achieves an average accuracy of 53.91%, a +1.87% improvement over the ReLU+LayerNorm baseline. These results, though modest, are consistent across seeds and suggest that Bayesian PASA offers a promising direction for uncertainty-aware activation functions, particularly when training resources are limited. Code is available at: https://github.com/BayesianPASA/BayesianPASA/
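The core mechanism the abstract describes, a convex mixture of several activation behaviors with input-dependent weights, can be sketched in a few lines. The following is a hypothetical illustration only: the actual ELBO-derived weights, the ψ-regularizer, and the noise-aware branch are defined in the paper, not reproduced here, and the branch choices below are stand-ins.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def mixed_activation(x, logits):
    """Hypothetical stand-in for the Bayesian PASA mixture: a convex
    combination of sigmoidal, linear, and saturating branches.
    The real method derives the mixing weights from a variational ELBO,
    regularized by a psi-function bounding local noise estimates."""
    w = np.exp(logits - logits.max())
    w /= w.sum()                               # softmax mixing weights
    branches = np.stack([sigmoid(x), x, np.tanh(x)])
    return np.tensordot(w, branches, axes=1)   # weighted sum over branches

# Equal logits give a uniform mixture of the three branches.
y = mixed_activation(np.array([0.0, 2.0]), np.zeros(3))
```

In training, the logits would themselves be functions of the input statistics, so the effective non-linearity shifts with the estimated noise level rather than being fixed in advance.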

Article
Engineering
Aerospace Engineering

Stephen A. Whitmore

,

Jared S. Coen

,

Ryan J. Thibaudeau

Abstract: Utah State University has developed a high-performance "green" hybrid propulsion technology based on the unique electrical breakdown properties of 3-D printed acrylonitrile butadiene styrene (ABS). Using 3-D printed ABS as fuel, typical startup sequences require approximately 5–15 joules, and once started, the system can be sequentially fired with no additional energy inputs required. The number of possible ignitions is limited only by the amount of fuel. The most technologically mature version uses gaseous oxygen (GOX) as oxidizer and 3-D printed ABS as fuel. While GOX is mass efficient, it lacks volumetric efficiency unless highly pressurized. Nytrox, a blend of GOX and nitrous oxide, improves propellant density and volumetric efficiency, while maintaining acceptable levels of mass efficiency (specific impulse). Nytrox can safely self-pressurize, eliminating the need for a separate oxidizer pressurization system and reducing overall complexity. However, using Nytrox as a direct replacement for GOX decreases ignition reliability, significantly increasing cold-start ignition latency. This paper quantifies the latency, explores its sources, and analyzes expected behaviors. Solutions include raising combustion and storage pressures to boost oxygen content in Nytrox’s liquid phase and increasing combustion chamber pressure to reduce ignition delays.

Article
Computer Science and Mathematics
Probability and Statistics

Peter Gács

Abstract: In the context of the dynamical systems of classical mechanics, we introduce two new notions called “algorithmic fine-grain and coarse-grain entropy”. The fine-grain algorithmic entropy is, on the one hand, a simple variant of the randomness tests of Martin-Löf (and others) and is, on the other hand, a connecting link between description (Kolmogorov) complexity, Gibbs entropy and Boltzmann entropy. The coarse-grain entropy is a slight correction to Boltzmann’s coarse-grain entropy. Its main advantage is that it is less partition-dependent, owing to the fact that algorithmic entropies for different coarse-grainings are approximations of one and the same fine-grain entropy. It has the desirable properties of Boltzmann entropy in a wider range of systems, including those of interest in the “thermodynamics of computation”. It also helps explain the behavior of some unusual spin systems arising from cellular automata.

Review
Medicine and Pharmacology
Anesthesiology and Pain Medicine

Daniel John Doyle

Abstract: Respiratory diseases such as chronic obstructive pulmonary disease (COPD), asthma, tuberculosis, and acute respiratory distress syndrome (ARDS) remain leading causes of morbidity and mortality worldwide. Traditional respiratory care faces challenges in early diagnosis, personalized treatment, efficient resource allocation, and optimal mechanical ventilation management. Artificial intelligence (AI) has emerged as a transformative tool, offering applications in diagnosis, monitoring, treatment optimization, critical care support, and automated ventilator control. This comprehensive review examines AI's role across the respiratory care continuum, from diagnostic imaging and spirometry interpretation to autonomous multiparameter ventilator adjustment during maintenance and weaning phases. The article highlights quantitative evidence of clinical impact, regulatory status, challenges including algorithmic bias and health equity concerns, implementation strategies, and detailed analysis of AI-driven mechanical ventilation systems. Case studies illustrate real-world outcomes with specific effect sizes. The discussion emphasizes both the promise and limitations of AI, preparing healthcare professionals and students to critically evaluate its role in clinical practice.

Review
Medicine and Pharmacology
Other

Simona Wójcik

,

Monika Tomaszewska

,

Anna Rulkiewicz

Abstract: Background/Objectives: Obesity is a chronic, relapsing disease with a widening gap between clinical need and the availability of specialist care. Artificial intelligence (AI) may enable earlier risk detection, more precise phenotyping, and scalable behavioural support across obesity treatment pathways. This narrative review synthesises contemporary AI applications across the obesity care continuum and evaluates their translational readiness. Methods: A targeted search of PubMed/MEDLINE and Google Scholar (January 2024–January 2026) was conducted, complemented by citation chaining. Evidence was synthesised across four domains: (1) risk prediction and screening, (2) environmental and behavioural determinants, (3) multimodal phenotyping and precision stratification, and (4) AI-enabled lifestyle interventions and behavioural coaching (AIBC). Results: EHR-based models demonstrate clinically useful discrimination for early risk identification. Multimodal approaches refine stratification beyond BMI-centric classification. AIBC platforms show emerging evidence of clinically meaningful weight loss, including non-inferiority to human coaching, but long-term effectiveness, generalisability, and equity remain insufficiently established. Conclusions: AI is positioned to become a core enabler of personalised obesity pathways. Safe translation requires external validation, bias auditing, transparent reporting, human oversight, and post-deployment surveillance aligned with clinical guidelines and regulatory expectations.

Article
Engineering
Bioengineering

Lafi Hamidat

,

Dilber Uzun Ozsahin

,

Berna Uzun

Abstract: The development of biodegradable scaffolds for load-bearing bone tissue engineering (BTE) presents a fundamental multi-criteria optimization challenge, requiring a simultaneous balance among mechanical performance, biological integration, and degradation kinetics. These criteria are inherently conflicting: composite formulations with the highest compressive strength frequently exhibit suboptimal porosity, while those with superior osteoconductivity often lack sufficient load-bearing capacity. To address this challenge rigorously, this study establishes a hybrid Fuzzy Analytic Hierarchy Process–Technique for Order of Preference by Similarity to Ideal Solution (Fuzzy AHP-TOPSIS) framework to evaluate and rank five clinically relevant biodegradable polymer–ceramic composite candidates: PLA/Hydroxyapatite (PLA/HA), PCL/Hydroxyapatite (PCL/HA), PLGA/Bioactive Glass (PLGA/BG), PLA/Carbon Nanotubes (PLA/CNT), and PLA/Magnesium (PLA/Mg). Quantitative property data were systematically extracted from ten peer-reviewed experimental studies published between 2021 and 2025, and converted into Triangular Fuzzy Numbers (TFNs) to explicitly model inter-study variability arising from differences in fabrication methods, filler loading, and testing conditions. Fuzzy AHP analysis identified Compressive Strength (w = 25.2%) and Cell Viability (w = 21.5%) as the dominant decision criteria for load-bearing cortical bone repair. The Fuzzy TOPSIS ranking identified PLA/HA as the optimal composite candidate (Closeness Coefficient, CCᵢ = 0.677), demonstrating the superior multi-criteria balance required for cortical bone repair applications. Although PLA/CNT achieved the highest mechanical strength, it was outranked due to lower osteoconductivity and elevated cytotoxicity uncertainty at high nanotube concentrations (CCᵢ = 0.544). Sensitivity analysis across five distinct weighting scenarios confirmed the robustness of PLA/HA as the primary candidate. 
These findings provide a validated, replicable computational blueprint for evidence-based scaffold material selection, with direct implications for reducing the burden of costly trial-and-error experimentation in BTE research.
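To illustrate the closeness-coefficient ranking described in the abstract above, here is a minimal crisp (non-fuzzy) TOPSIS sketch. The study itself uses a fuzzy variant built on Triangular Fuzzy Numbers; the criteria, weights, and scores below are hypothetical placeholders, not the paper's extracted data.

```python
import numpy as np

def topsis(scores, weights, benefit):
    """Crisp TOPSIS: rank alternatives by closeness to the ideal solution.

    scores  -- (alternatives x criteria) decision matrix
    weights -- criterion weights summing to 1
    benefit -- boolean per criterion; True if larger values are better
    """
    norm = scores / np.linalg.norm(scores, axis=0)   # vector normalisation per criterion
    v = norm * weights                               # weighted normalised matrix
    ideal = np.where(benefit, v.max(axis=0), v.min(axis=0))   # best value per criterion
    anti = np.where(benefit, v.min(axis=0), v.max(axis=0))    # worst value per criterion
    d_pos = np.linalg.norm(v - ideal, axis=1)        # distance to ideal solution
    d_neg = np.linalg.norm(v - anti, axis=1)         # distance to anti-ideal solution
    return d_neg / (d_pos + d_neg)                   # closeness coefficient CC_i in (0, 1)

# Hypothetical criteria for three illustrative composites:
# compressive strength (MPa, benefit), cell viability (%, benefit),
# degradation mismatch (months, cost -- smaller is better).
scores = np.array([[80.0, 95.0, 2.0],
                   [60.0, 90.0, 1.0],
                   [110.0, 70.0, 4.0]])
weights = np.array([0.45, 0.35, 0.20])
cc = topsis(scores, weights, np.array([True, True, False]))
print(cc)  # the alternative with the highest CC_i ranks first
```

As in the abstract, an alternative that dominates on a single criterion (here, the third row's compressive strength) can still be outranked once all weighted criteria are balanced.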

Hypothesis
Medicine and Pharmacology
Neuroscience and Neurology

Artyom Dyupin

Abstract: Manual medicine has long outgrown explanations that rest solely on structural-biomechanical correction. While the techniques reliably alleviate musculoskeletal pain and functional complaints, the evidence suggests that durable benefit depends far less on lasting mechanical realignment than on a distributed set of neurophysiological, autonomic, interoceptive and contextual processes. A persistent translational gap nevertheless remains between these abstract predictive models of bodily regulation and the tangible, regional tissue dynamics that clinicians encounter in practice. We propose PULSE-V (Predictive Updating of Local Somatic Errors via Vasomotion) as a hypothesis-generating framework that seeks to narrow that gap. The central suggestion is that coherent low-frequency vasomotor oscillations (~0.1 Hz) within angiosomes may serve as a candidate biophysical substrate capable of organising ascending interoceptive signals. When coherence is disrupted, the resulting noisy afferent stream may contribute to interoceptive prediction error. Chronic somatic dysfunction can then be understood as a form of allostatic interoceptive overload — a self-stabilising loop in which ambiguous peripheral input, impaired sensory attenuation and entrenched top-down priors reinforce one another. PULSE-V is offered as a deliberately falsifiable programme rather than a settled theory. It generates testable predictions concerning regional vasomotor patterns, multimodal biomarker signatures and the differential contributions of vasomotor, affective-touch and relational elements in treatment. Should the evidence support it, the model may help account for the frequently observed discrepancy between the modest mechanical effects of manual intervention and the substantial clinical outcomes that follow.


Preprints.org is a free preprint server supported by MDPI in Basel, Switzerland.

