Article
Engineering
Aerospace Engineering

Jie Hu

,

Shuai Zhang

,

Xiaorong Feng

,

Xinglong Wang

Abstract: The Aircraft Landing Problem (ALP) poses significant challenges for traditional Monte Carlo Tree Search (MCTS) due to the problem's vast search space and MCTS's reliance on inefficient random simulations. To overcome these limitations, this paper proposes a novel Transformer-Augmented Monte Carlo Tree Search (TMCTS) algorithm. Our approach integrates a reinforcement learning framework that incorporates key operational constraints, including wake turbulence separation and time windows, and employs a cost function aimed at minimizing both delay time and fuel consumption. A core innovation is the replacement of the conventional random simulation phase in MCTS with a Transformer-based value predictor. This leverages the Transformer's superior capability in sequence modeling and in capturing global dependencies among flights, thereby dramatically accelerating search convergence. Specifically, we design a two-head Transformer network (comprising policy and value heads) to provide informed prior knowledge, which effectively guides the selection and expansion steps of the MCTS tree. The model is trained within an Actor-Critic framework, using behavior cloning for pre-training followed by reinforcement learning for fine-tuning. Experimental evaluations on the standard OR-Library benchmark demonstrate that our TMCTS method significantly reduces scheduling deviation compared to state-of-the-art baselines (including DPALO+GA, DPALO+PSO, and DALP). Moreover, it achieves a 90.6% reduction in computation time relative to the DALP method, highlighting its superior efficiency and practical applicability for real-time scheduling.
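The loop described above (selection and expansion guided by learned priors, with the random rollout replaced by a value estimate at the leaf) can be sketched as follows. This is a minimal illustrative sketch on a toy domain, not the paper's implementation; `predict` is a hypothetical stand-in for the two-head Transformer.

```python
import math

class Node:
    def __init__(self, state, prior=1.0, parent=None):
        self.state, self.prior, self.parent = state, prior, parent
        self.children = {}            # action -> Node
        self.visits, self.value_sum = 0, 0.0

    def ucb(self, c=1.4):
        # PUCT-style score: mean value plus prior-weighted exploration bonus
        q = self.value_sum / self.visits if self.visits else 0.0
        u = c * self.prior * math.sqrt(self.parent.visits) / (1 + self.visits)
        return q + u

def legal_actions(state):
    # toy domain: choose one of three "slot offsets" until the plan has length 3
    return [0, 1, 2] if len(state) < 3 else []

def predict(state):
    """Stand-in for the two-head Transformer: returns (action -> prior)
    and a scalar value estimate for `state` (here, a dummy heuristic)."""
    actions = legal_actions(state)
    p = 1.0 / len(actions) if actions else 0.0
    return {a: p for a in actions}, -abs(sum(state)) / 10.0

def search(root_state, n_sim=50):
    root = Node(tuple(root_state))
    for _ in range(n_sim):
        node = root
        # 1. selection: descend by UCB until a leaf
        while node.children:
            node = max(node.children.values(), key=Node.ucb)
        # 2. expansion + evaluation: the value head replaces the random rollout
        priors, value = predict(node.state)
        for a, p in priors.items():
            node.children[a] = Node(node.state + (a,), prior=p, parent=node)
        # 3. backpropagation of the predicted value
        while node:
            node.visits += 1
            node.value_sum += value
            node = node.parent
    # return the most-visited root action
    return max(root.children.items(), key=lambda kv: kv[1].visits)[0]
```

Swapping a learned evaluation in at step 2 is what removes the cost of long random simulations while keeping the rest of the MCTS machinery intact.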

Article
Chemistry and Materials Science
Nanotechnology

Congyi Zhang

,

Haotian Wu

,

Xiaotong Chen

,

Wenze Yin

,

Shizhuan Huang

,

Dixiang Wen

,

Xueting Song

,

Xiaoyan Xu

,

Changmei Zhang

,

Sheng Tai

Abstract: This study developed a novel nanoplatform targeting tumor-associated macrophages (TAMs): sialic acid-disulfide bond-camptothecin (SA-SS-CPT) nanowires. This system significantly improved the solubility and bioavailability of camptothecin (CPT) and achieved active targeted drug delivery by utilizing sialic acid as a targeting ligand to specifically recognize the Siglec-E receptor, which is highly expressed on TAMs. Upon internalization into TAMs, the disulfide bond in the SA-SS-CPT nanowires was cleaved in response to intracellular glutathione (GSH), leading to the controlled release of CPT. SA-SS-CPT induced DNA damage in TAMs, thereby activating the cGAS-STING signaling pathway, promoting the polarization of TAMs toward the M1 phenotype, enhancing pro-inflammatory and anti-tumor immune responses, and effectively inhibiting tumor immune escape. Furthermore, the SA-SS-CPT nanowires synergistically enhanced the efficacy of PD-L1 blockade immunotherapy, collectively remodeling the tumor immune microenvironment and ultimately facilitating significant tumor clearance.

Article
Computer Science and Mathematics
Artificial Intelligence and Machine Learning

Mohsen Mostafa

Abstract: This paper introduces Bayesian R-LayerNorm, a normalization layer that extends the previously proposed R-LayerNorm with uncertainty quantification. Building upon R-LayerNorm, we draw connections to statistical field theory, renormalization group methods, and information geometry to motivate the design. The method incorporates uncertainty estimation through a stable ψ-function, enabling adaptive noise suppression based on local entropy estimates. We provide theoretical analysis of numerical stability, gradient stability, and training convergence under standard assumptions. A key practical contribution is the integration of uncertainty quantification directly into the normalization operation, providing confidence estimates for each normalized activation without additional cost. The method adapts to local noise, varying normalization strength spatially based on estimated noise levels. The implementation is simple, adding only two learnable parameters per layer, and serves as a drop-in replacement for existing normalization layers. Due to computational constraints (Kaggle P100 GPU, limited epochs), we evaluate Bayesian R-LayerNorm on CIFAR-10-C using 50 training epochs and 3 random seeds. Under these limitations, it achieves average accuracy gains of +0.49% over standard LayerNorm across four common corruptions, with the largest improvement of +0.74% on shot noise. While these gains are modest, they are consistent across seeds. The method requires minimal computational overhead (~10%) and we provide a complete open-source implementation. We further show that the learned λ parameters offer interpretability, revealing which layers adapt most strongly to different corruptions. The framework suggests promising directions for trustworthy normalization in safety-critical applications where uncertainty matters alongside accuracy.

Article
Physical Sciences
Theoretical Physics

Jef Zerrudo

Abstract: We derive a quantum conjugacy between spacetime diffusivity and inertial mass from relativistic information-transport kinematics. Two Lorentz-invariant laws—(i)~an invariant-time gauge for timelike segments, \( ds=c\,dt \), and (ii)~diffusive evolution \( d\epsilon/ds=c \)—yield a first-order action whose canonical quantization gives \( [\hat\epsilon,\hat m]=i\hbar \) and the emergent Cosmological Uncertainty Principle~(CUP), \( \Delta\epsilon\,\Delta m\ge\hbar/2 \). Independence across coherence cells of size \( \ell_{\rm coh} \) amplifies the bound to \( \Delta\epsilon\,\Delta m\ge(\hbar/2)\,N_{\rm eff} \) with \( N_{\rm eff}=D/\ell_{\rm coh} \), extending quantum uncertainty to cosmic baselines. A single area-diffusion parameter provides an operational unification of Planck and Hubble times across \( {\sim}\,61 \) orders of magnitude. Applied to black-hole horizons, the CUP reproduces Hawking's temperature exactly. For de~Sitter space, a naive \( 1/H \) correlation window overshoots by a factor \( \pi \), while KMS/Unruh calibration (\( \tau=\pi/H \)) recovers the standard Gibbons–Hawking result \( T_{\rm dS}=\hbar H/(2\pi k_B) \). Unlike generalised or extended uncertainty principles that deform the position--momentum commutator, the CUP introduces a new conjugate pair (\( \epsilon,m \)) while leaving the Heisenberg sector intact. These results position the CUP as an emergent, testable quantum--informational constraint on cosmological observables rather than an added axiom.
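The step from the stated commutator to the amplified bound follows the standard Robertson inequality; a schematic restatement using only the quantities named above (the quadrature-addition argument for independent cells is our gloss on the abstract's claim, not a reproduction of the paper's derivation):

```latex
% Robertson bound for the new conjugate pair (\epsilon, m):
[\hat\epsilon,\hat m]=i\hbar
\quad\Longrightarrow\quad
\Delta\epsilon\,\Delta m \;\ge\; \tfrac{1}{2}\bigl|\langle[\hat\epsilon,\hat m]\rangle\bigr|
\;=\; \frac{\hbar}{2}.

% If N_eff = D / l_coh coherence cells fluctuate independently, the spreads
% add in quadrature, each growing as sqrt(N_eff), so their product grows as N_eff:
\Delta\epsilon\,\Delta m \;\ge\; \frac{\hbar}{2}\,N_{\rm eff},
\qquad N_{\rm eff}=\frac{D}{\ell_{\rm coh}}.
```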

Article
Biology and Life Sciences
Plant Sciences

Swetaleena Mishra

,

Suchismita Prusty

,

Sowmya Poosapati

,

Durga Madhab Swain

,

Ranjan Kumar Sahoo

Abstract: Salinity stress is one of the major obstacles worldwide for glycophytic crop production, including rice. It alters cellular metabolism, causing significant crop damage and substantial reductions in yield. Through genetic engineering, oxidative stress can be decreased and photosynthetic capability increased by producing C3 transgenic plants that express C4 enzymes, such as phosphoenolpyruvate carboxykinase (PEPCK), at high levels. In this research, we evaluated the ability of transgenic rice plants (Oryza sativa L. cv. IR64) over-expressing the PEPCK gene to counter salinity stress and to increase photosynthetic efficiency. The T1 transgenics showed increased levels of several biochemical markers, including ascorbate peroxidase (APX), malondialdehyde (MDA), glutathione reductase (GR), and guaiacol peroxidase (GPX) activities, suggesting an effective antioxidant defense mechanism that helps the plants cope with oxidative damage driven by salt stress. Photosynthetic parameters such as chlorophyll content, net photosynthetic rate, intercellular CO2 concentration, and stomatal conductance were elevated in transgenic plants compared with the control plants (null segregant). The transgenic plants also exhibited better agronomic characteristics than the control plants. Our findings provide conclusive evidence of the PEPCK gene's potential role in regulating the salt stress response and tolerance of rice plants.

Article
Computer Science and Mathematics
Artificial Intelligence and Machine Learning

Mohsen Mostafa

Abstract: Deep learning classifiers deployed in scientific applications often encounter inputs that violate physical laws (e.g., due to sensor failure or corruption). Standard methods cannot detect such violations and may produce confident but wrong predictions. We propose UA-PBR, a framework that combines a physics-informed autoencoder (to detect physics violations) with a Bayesian CNN (to quantify predictive uncertainty). Inputs are rejected if either the PDE residual exceeds a threshold or the predictive entropy is too high. As a proof-of-concept, we evaluate UA-PBR on a synthetic Darcy flow dataset (32 × 32 grid) under severe computational constraints (Google Colab, 10 seeds). Despite these limitations, UA-PBR reduces classification risk by over 90% on heavily corrupted samples while accepting 89.7% of clean inputs with 99.99% accuracy on accepted samples. Ablation studies confirm that both components contribute synergistically. These preliminary results on a synthetic benchmark illustrate the potential of physics-aware rejection and motivate further investigation with larger-scale experiments. Code is available at: https://github.com/UA-PBR/UA-PBR.
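The dual rejection rule described above (reject an input if the PDE residual exceeds a threshold or predictive entropy is too high) reduces to a few lines. The function name and threshold values below are illustrative, not taken from the paper:

```python
import numpy as np

def reject(pde_residual, class_probs, tau_res=0.1, tau_ent=1.0):
    """UA-PBR-style dual gate (illustrative thresholds): reject an input
    if it fails the physics check OR the classifier is too uncertain.
    `pde_residual` comes from the physics-informed autoencoder;
    `class_probs` is the Bayesian CNN's predictive distribution."""
    entropy = -np.sum(class_probs * np.log(class_probs + 1e-12))
    return bool(pde_residual > tau_res or entropy > tau_ent)
```

A clean, confidently classified sample passes both gates; either a large residual (physics violation) or a near-uniform predictive distribution (high entropy) triggers rejection.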

Article
Biology and Life Sciences
Neuroscience and Neurology

Natalia Shamantseva

,

Arseniy Polyakov

,

Vsevolod Lyakhovetskii

,

Margarita Bystrova

,

Ivan Sakun

,

Sergey Ananyev

,

Yury Gerasimenko

,

Tatiana Moshonkina

Abstract: Pupillometry is a reliable method of pain assessment. However, there are experimental conditions under which standard pupillometry equipment cannot be used; studying the effects of different pulse forms used for transcutaneous spinal cord stimulation (tSCS) is one such task. The aim was to create a system for recording pupil diameter based on a web camera, because such a system can be synchronized with external equipment, allowing the diameter to be recorded simultaneously with other physiological signals. A markerless system for recording and analyzing pupil diameter using deep neural networks was developed based on a commercially available web camera. The accuracy of this system was compared with that of manual analysis in ImageJ. The system was tested in a study of the dependence of tolerance to tSCS on the shape of stimulating pulses, involving volunteers (n=12). The results of the developed pupillometry were compared with the pain rating scale traditionally used in such studies. The developed system determines pupil diameter with accuracy comparable to that of a human analyst. The pupillometry results reproduced those obtained using a subjective pain scale. The developed method was thus found to be reliable for recording pain in electrophysiological studies.

Article
Public Health and Healthcare
Public Health and Health Services

Donghyoun Lee

,

Beom Jun Lee

Abstract: We performed a retrospective analysis of data from a total of 241 patients (n=241). A total of 161 (66.8%) patients received a peripherally inserted central catheter (PICC) for long-term intravenous access, 172 (71.4%) had no history of previous catheters, and 142 (58.9%) received the PICC on the right side. Target veins included the basilic vein (42.7% [103/241]), brachial vein (41.9% [101/241]), and cephalic vein (15.4% [37/241]). There were five cases (2.1%) of PICC-related infection. Of these, one case (0.4%) was a PICC-related bloodstream infection; Candida parapsilosis was identified from both the PICC tip and blood samples. A total of 224 patients (92.9%) had optimal PICC tip positions. Male sex (OR 0.183; 95% CI 0.050-0.675, p=0.011), catheter length (OR 0.794; 95% CI 0.657-0.960, p=0.017), and right-sided insertion (OR 4.711; 95% CI 1.227-18.091, p=0.024) were significant predictors of non-optimal catheter position. The time-to-event was estimated at 56.02±1.37 days (95% CI 53.33-58.71). Here, we describe our single-center, retrospective experience with bedside ultrasound (US)-guided PICCs in elderly ICU patients in a small-volume center.

Article
Medicine and Pharmacology
Epidemiology and Infectious Diseases

José Oñate-Gutiérrez

,

Carlos Alvarez-Moreno

,

Claudia Cañadas-Aragón

,

Hernán Vergara-Samur

Abstract: Invasive candidiasis is a severe opportunistic infection whose incidence may be influenced by major disruptive events. The COVID-19 pandemic substantially altered hospital dynamics in Colombia. This study aimed to evaluate temporal trends, seasonality, and potential changes in the incidence of invasive candidiasis between 2019 and 2024. We conducted an observational time-series study using confirmed cases of invasive candidiasis from medium- and high-complexity hospitals in three major Colombian cities. Cases were aggregated quarterly. An interrupted time-series (ITS) analysis was performed. A total of 1,294 cases were analyzed. An increasing trend was observed until mid-2022, followed by a decline during 2023. Seasonal decomposition revealed persistent seasonality with recurrent peaks in the second and fourth quarters. The ITS analysis did not demonstrate statistically significant changes in level or slope after the interruption (p > 0.05), although clinically relevant fluctuations were observed. No significant differences in temporal trends were identified across Candida species. Invasive candidiasis in Colombia exhibited a complex temporal evolution during and after the COVID-19 pandemic characterized by sustained seasonality and an increase followed by a decline. Although the ITS analysis did not identify statistically significant post-pandemic changes, the findings support the use of time-series models as valuable tools for epidemiological surveillance and trend monitoring.
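An interrupted time-series analysis of the kind described is typically fit as a segmented regression with level-change and slope-change terms. The sketch below is illustrative (the study's exact model specification, covariates, and interruption point are not reproduced here):

```python
import numpy as np

def its_fit(y, t0):
    """Segmented-regression ITS model (illustrative):
        y_t = b0 + b1*t + b2*post_t + b3*(t - t0)*post_t
    where post_t flags periods at/after the interruption t0.
    b2 estimates the level change, b3 the slope change."""
    t = np.arange(len(y), dtype=float)
    post = (t >= t0).astype(float)
    # design matrix: intercept, pre-trend, level shift, slope shift
    X = np.column_stack([np.ones_like(t), t, post, (t - t0) * post])
    beta, *_ = np.linalg.lstsq(X, np.asarray(y, dtype=float), rcond=None)
    return beta  # [b0, b1, level_change, slope_change]
```

On noise-free synthetic data with a known break, the fit recovers the level and slope changes exactly, which is a useful sanity check before applying the model to quarterly case counts.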

Article
Medicine and Pharmacology
Epidemiology and Infectious Diseases

Mahmud Azbida

,

Sana Ferjani

,

Omar Elahmer

,

Rmadhan Osman

,

Salem Shenaisheh

,

Amal Barakat

,

Salma Abid

,

Adem Eljerbi

,

Abdulwahab Kammon

,

Haider El-Saeh

+2 authors

Abstract: Influenza sentinel surveillance in Libya was formally established in 2022 by the Libyan National Center for Disease Control (NCDC), initially comprising a single sentinel site in Tripoli. By the end of 2025, the network had expanded to 15 sites across five cities nationwide. Between 2022 and 2024, a total of 1,864 nasopharyngeal specimens were collected from patients presenting with influenza-like illness and tested using the GeneXpert for influenza A virus, influenza B virus, severe acute respiratory syndrome coronavirus 2 (SARS-CoV-2), and respiratory syncytial virus (RSV). Influenza A virus was detected in 21.1% (393/1,864) of samples and influenza B virus in 5.4% (100/1,864). SARS-CoV-2 and RSV were identified in 11.6% (216/1,864) and 4.1% (77/1,864) of specimens, respectively. A subset of 29 influenza A–positive samples was randomly selected for confirmatory testing and further molecular characterization. Real-time RT-PCR subtyping identified 13 A(H1N1)pdm09 and five A(H3N2) viruses. Whole-genome sequencing was successfully performed for 13 isolates, followed by phylogenetic analysis. Genetic characterization revealed that all A(H1N1)pdm09 viruses belonged to clade 6B.1A.5a.2a (5a.2a), while A(H3N2) viruses clustered within clade 3C.2a1b.2a.2a.3a.1 (2a.3a.1) based on hemagglutinin gene mutations. No neuraminidase mutations associated with antiviral resistance were detected. This study represents the first molecular and phylogenetic characterization of circulating human influenza viruses in Libya, with sequence data submitted to the Global Initiative on Sharing All Influenza Data (GISAID). These findings establish baseline genetic data for influenza viruses in Libya and support the strengthening of national respiratory virus surveillance.

Article
Computer Science and Mathematics
Artificial Intelligence and Machine Learning

Francis Frydman

Abstract: Large language models (LLMs) have demonstrated logical reasoning abilities, but their inferences remain non-traceable and lack formal guarantees. We introduce eXa-LM, a controlled natural language (CNL) interface between LLMs and first-order logic solvers. Our approach aims to create an explicit, verifiable, and interpretable bridge between text and formal logic. It relies on three main components: (1) a reformulation prompt that constrains the LLM to produce a set of facts and rules in CNL, (2) the semantic analyzer eXaSem, which translates this CNL into a Prolog program made of extended Horn clauses, and (3) the logic engine eXaLog, which integrates a second-order meta-interpreter capable of inferring ontological properties. We evaluate eXa-LM on three standard benchmarks—PrOntoQA, ProofWriter, and FOLIO—comparing it to GPT-4o baselines including Standard prompting, Chain-of-Thought, Logic-LM, LINC, and LLM-TP. Results show that eXa-LM matches or exceeds recent neuro-symbolic systems while providing full traceability of reasoning and intrinsic explainability. On FOLIO, eXa-LM achieves 92.9% accuracy, a +5.5 point gain over LLM-TP, the strongest competing GPT-4o-based method in our comparison. This approach demonstrates the feasibility of a transparent neuro-symbolic reasoning pipeline in which LLMs produce not direct inferences but formally controlled linguistic representations. eXa-LM opens the way to neuro-symbolic architectures that are safer, verifiable, and extensible, ultimately integrating hypothetical, abductive, or inductive reasoning. Program and data will be made publicly available upon publication.
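Inference over Horn clauses, the representation eXaSem targets, can be illustrated with naive forward chaining over ground atoms. This toy loop is our illustration only; the real eXaLog engine handles variables, extended Horn clauses, and second-order meta-interpretation:

```python
def forward_chain(facts, rules):
    """Naive forward chaining over ground Horn clauses.
    `facts` is a set of atoms; `rules` is a list of (body, head) pairs,
    where `body` is a set of atoms that must all hold for `head` to fire."""
    derived = set(facts)
    changed = True
    while changed:
        changed = False
        for body, head in rules:
            # fire a rule when its whole body is satisfied and the head is new
            if body <= derived and head not in derived:
                derived.add(head)
                changed = True
    return derived
```

For example, from the fact `cat(tom)` and the rules "cats are mammals" and "mammals are animals", the loop derives `animal(tom)`, and every derived atom is traceable to the rule that produced it.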

Article
Engineering
Architecture, Building and Construction

Paola Altamura

,

Gabriele Rossini

,

Gaia Garofali

,

Serena Baiani

,

Fabrizio Tucci

Abstract: In line with circular bioeconomy goals, the reported research focuses on circular building materials, intended as reused components and recycled and bio-based materials, including those derived from by-products and waste, as a strategic solution to simultaneously cut embodied and operational carbon emissions in buildings. In particular, the research aims to provide a methodology for an early, rapid, and effective assessment of the contribution that circular materials can make to reducing climate-altering emissions and resource consumption. The research started with the collection, selection, and analysis of multiple case studies of buildings using circular materials and adopting different circular design strategies. The paper reports in particular the mapping of circular design strategies and materials in ten case studies representing different approaches. Moreover, by collecting and comparing fifteen existing frameworks of indicators for circularity evaluation at the building and product level, selecting relevant indicators, and integrating specific ones, the research develops a set of eight KPIs: a specific evaluation framework for assessing the effects of alternative combinations of reused, bio-based, and recycled building materials. The KPI set was tested on a selection of three relevant case studies of buildings using circular materials, to verify the effectiveness of the indicators in supporting designers in making material-related choices.

Article
Engineering
Transportation Science and Technology

Raj Bridgelall

Abstract: Highway–rail grade crossing (HRGC) safety research relies on federal incident and inventory datasets that span multiple decades. However, inconsistencies in geographic identifiers and incomplete reconstruction of crossing denominators can distort exposure-based rate metrics. This study develops, documents, and validates a reproducible nine-stage reconciliation pipeline applied to 51 years (1975–2025) of national HRGC incident data from the Federal Railroad Administration Form 57 and Form 71 datasets. The hierarchical pipeline integrated deterministic alignment and AI-assisted inference to produce an audited, geographically consistent dataset. The study formalizes four longitudinal county-level exposure metrics that quantify spatiotemporal risk. These metrics include accumulated incidents per million population (AIPM), accumulated incidents per crossing (AIPC), crossings per million population (CPM), and crossings per 100 square miles (CPHSM). All four metrics exhibited pronounced right-skewness: AIPM, CPM, and CPHSM approximated exponential forms, and AIPC approximated a log-normal form. Anderson–Darling tests detected statistically significant tail deviations in three metrics; CPM did not reject the exponential fit at conventional significance levels. Spatial analysis shows coherent regional concentration in incident rates in the Central Plains and lower Mississippi corridors. The national time series exhibits a late-1970s plateau, sustained exponential decline beginning around 1980, and stabilization but persistent incident rates after 2001. Population-normalized AIPM remained statistically indistinguishable between the reconciled and record-dropped datasets; however, crossing-based metrics changed materially when reconstructing denominators from the reconciled crossing universe. Median ratio comparisons confirmed that incident-only denominators introduced substantial measurement bias in local risk assessment. 
State-level rank reversals persisted even when omnibus distributional tests failed to reject equality. By formalizing multistage data cleaning and quantifying its analytical impact over an unprecedented longitudinal horizon, this study establishes denominator integrity and geographic reconciliation as prerequisites for valid HRGC exposure assessment and provides a replicable platform for future predictive modeling.
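Under a direct reading of the four metric names (the paper's exact accumulation windows and reconstructed denominators are not reproduced here), the county-level exposure metrics reduce to simple ratios:

```python
def exposure_metrics(incidents, crossings, population, area_sqmi):
    """County-level HRGC exposure metrics, read off the metric names
    (illustrative; inputs are accumulated counts, persons, square miles)."""
    aipm = incidents / (population / 1e6)    # accumulated incidents per million population
    aipc = incidents / crossings             # accumulated incidents per crossing
    cpm = crossings / (population / 1e6)     # crossings per million population
    cphsm = crossings / (area_sqmi / 100.0)  # crossings per 100 square miles
    return aipm, aipc, cpm, cphsm
```

The denominator-integrity point in the text maps directly onto these ratios: reconstructing `crossings` from the reconciled crossing universe changes AIPC, CPM, and CPHSM, while the population-normalized AIPM is unaffected by the crossing denominator.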

Review
Biology and Life Sciences
Behavioral Sciences

Chayan Munshi

,

Farhan Jamil

,

Ishika Pal

,

Swapnanil Mondal

,

Bithi Khan

,

Upama Das

Abstract: Behavioural ecotoxicology is a field of applied ecotoxicology in which researchers study alterations in behavioural markers caused by environmental toxicants or contaminants. Understanding changes in behavioural manifestations helps to reveal the underlying neurological mechanisms in organisms and therefore helps to describe or predict the neuro-behavioural context of behavioural modifications caused by exposure to anthropogenic pollutants. Through this review, we address how environmentally available chemicals (such as pesticides, heavy metals and metalloids, plastics, and pharmaceuticals) can have significant acute and/or long-term impacts on the behavioural profile of organisms (bioindicator species).

Article
Physical Sciences
Quantum Science and Technology

Cheng Jinjun

,

Cheng Dian

Abstract: This paper represents a further academic deepening and upgrading of the authors' 2019 publication A Hypothesis on the Spatial Motion Mode of Photons. It should be explicitly stated that this paper falls within the category of natural philosophical thought experiments: its core value lies in constructing a unified physical image of the nature of light through rigorous logical deduction and proposing verifiable theoretical hypotheses and experimental schemes; the validity of all conclusions must ultimately be verified by rigorous and extensive scientific experiments before being incorporated into the theoretical system of physics. As a foundational concept of quantum mechanics, the wave-particle duality of light has been accompanied by profound philosophical perplexities and theoretical tensions since its proposal, becoming a core bottleneck in the integration of classical and quantum physics. This paper systematically sorts out the logical incompleteness in the current quantum interpretation system, including the self-negation of the complementarity concept, the problem of photon localization, the fundamental opposition between the statistical and non-statistical interpretations of the wave function, and the philosophical controversy over the Heisenberg Uncertainty Principle, revealing the inherent contradictions of the traditional wave-particle duality framework. On this basis, adopting classical physical images and the logic of reduction to absurdity, and based on six axioms and six preparatory propositions, this paper puts forward a natural philosophical hypothesis on the essence of photons: a photon is an energetic mass point with a diameter smaller than the Planck length, moving in a uniform spiral linear motion in space. The paper deduces the core characteristics, such as velocity, frequency, and wavelength, of the photon's uniform spiral linear motion, and designs three operable, repeatable, and quantifiable physical experimental schemes to provide specific paths for the empirical verification of the hypothesis. The research deduces that the angular momentum of photon spatial motion (excluding photon spin) is always the reduced Planck constant ℏ, that the energy E=mc² is naturally unified with E=hν (the standard formula for wave energy), and that the standard expression of the Heisenberg Uncertainty Principle ΔxΔpₓ≥ℏ/2 can be given a classical physical interpretation from the perspective of superposition of measurement deviations. This paper systematically responds to potential questions regarding the origin of photon particle nature, wave nature, and compatibility with relativity, arguing that the hypothesis provides a logically consistent and clearly visualized path for understanding the nature of light, builds a new natural philosophical framework for the integration of quantum and classical theories of light, and offers a new thinking perspective for a paradigm shift in the study of the nature of light.

Review
Biology and Life Sciences
Biophysics

Benjamin Drukarch

,

Micha Wilhelmus

Abstract: Neuronal excitability manifests itself mainly in the form of non-linear, self-regenerative waves of electricity moving along the surface of neuronal axons. These waves are commonly known as action potentials (APs). Theorizing about and experimental investigation of the physical and functional characteristics of APs have broadly followed the lines of the ionic hypothesis and the associated mathematical model introduced by Hodgkin and Huxley (HH). In the current form of this bioelectrical framework, adopted in mainstream physiology and other biological sciences, the axonal membrane is conceptualized as an electronic circuit in which electric current is generated and propelled as the result of time-dependent opening and closure of voltage-operated ion channel proteins, allowing passive flow of specific ions across and along the membrane powered by their respective electrochemical gradients. Although representing mainstream research, the bioelectric perspective has been criticized for its narrow focus on the electrical characteristics of APs, whilst ignoring other physical manifestations of the nerve signal, in particular the mechanical and thermal changes coinciding with AP propagation. As an alternative, a thermodynamics-based acoustic theory has been outlined in which all manifestations of the nerve signal, electric and non-electric, are considered the result of a single density pulse in the axonal membrane carried by a reversible lipid membrane phase transition and momentum conservation. Representing a minority view, however, this unified, thermodynamic perspective on the physical nature of neuronal excitability is largely ignored by representatives of the bioelectric perspective. Here we draw special attention to the philosophical dimension of the communication failure between the two communities of scientists. We argue that adherents of the bioelectric perspective favor a mechanist type of explanation, whilst supporters of the thermodynamic perspective are committed to so-called covering-law types of explanation. We conclude that it is this, thus far unrecognized, philosophical rift, rather than specific scientific differences of opinion, that blocks the fruitful interdisciplinary cooperation necessary for building a comprehensive, fully integrated notion of the physical nature of neuronal excitability. Suggestions for how to bridge this conceptual gap are formulated.

Article
Public Health and Healthcare
Public Health and Health Services

Taiwo Opeyemi Aremu

,

Olihe Nnenna Okoro

,

Caroline Gaither

,

S. Bruce Benson

,

Drissa M Toure

,

Jon C. Schommer

Abstract: Background: During the COVID-19 pandemic, Nigeria relied largely on imported vaccines, underscoring vulnerabilities in supply chains and the absence of domestic vaccine manufacturing. Understanding supply-related barriers and the resources required for local vaccine production is critical for future pandemic preparedness and population health outcomes. The objective of the study was to identify stakeholder-perceived barriers to COVID-19 vaccine manufacturing in Nigeria and to describe the resources and enabling conditions required for local production. Methods: We conducted a qualitative needs assessment using semi-structured interviews with senior personnel from Nigerian pharmaceutical manufacturing firms and regulatory agencies. Participants were recruited purposively and consecutively. Interviews (30-60 minutes) were conducted via Zoom, audio-recorded with consent, transcribed, and analyzed using inductive thematic analysis following established six-phase procedures. Reporting adheres to the Consolidated Criteria for Reporting Qualitative Research (COREQ). Results: Six participants (two regulators and four pharmaceutical executives) identified three interrelated barriers to domestic COVID-19 vaccine production: (1) technical and knowledge gaps (loss of hands-on expertise, absence of operational vaccine manufacturing facilities, limited technology transfer), (2) financial and infrastructure barriers (high cost of capital, serial taxation, unreliable electricity, and logistics constraints), and (3) systemic and institutional barriers (inconsistent political commitment, policy discontinuity, regulatory capacity gaps, and concerns about public confidence). To enable local production, participants emphasized coordinated investment in workforce development, technology-transfer partnerships, modern utilities and cold chain systems, access to specialized equipment and high-quality inputs, and predictable policy, financing, and regulatory environments. Conclusions: Participants perceived Nigeria's current capacity as insufficient for COVID-19 vaccine manufacturing but identified actionable levers, particularly human capital development, infrastructure strengthening, and regulatory and financing reforms, to enable sustainable local production. These findings provide a practical roadmap for policymakers, regulators, and industry leaders seeking to strengthen Nigeria's biomanufacturing and long-term pandemic preparedness.

Technical Note
Engineering
Mechanical Engineering

Han Haitjema

Abstract: For the calibration of surface plates, the classical Moody method is still commonly used. In this method, the straightness of a number of lines over a surface plate in a union-jack configuration is measured, and the measurements are combined into a flatness measurement. The measurement of the two center lines is commonly omitted from the evaluation and used only to determine so-called closure errors. These two lines can be incorporated into the measurement evaluation in a least-squares sense, giving an 18% reduction in the uncertainty. A further reduction in the uncertainty is possible when using the gravity vector as a common reference, as can be done when using electronic levels. A least-squares evaluation of measurements taken in this way gives a further reduction in the uncertainty of 29% relative to the traditional evaluation according to the Moody method. This is illustrated with an actual measurement example and additional Monte Carlo simulations.

Article
Public Health and Healthcare
Health Policy and Services

Claudia Chaufan

,

Maryanne Dias

,

Natalie Hemsing

,

Olga Collins

Abstract: Background: During the Covid-19 event, Ontario hospitals implemented healthcare worker vaccination policies under Directive #6, a provincial framework that formally permitted multiple compliance pathways, including mandatory vaccination. Despite this formal flexibility, institutional responses converged. This study examines how vaccination mandates were implemented and justified across institutional, legal, and media domains, with particular attention to the operation of discretion within a decentralized governance framework. Methods: An environmental scan was conducted using document analysis of publicly available materials from a purposive sample of Ontario hospitals. Sources included hospital policy documents, institutional communications, court decisions, and media coverage. Materials were analyzed to identify patterns of mandate implementation, justification, and representation across domains. The term “Covid-19 event” is used as a neutral temporal descriptor that does not presuppose epidemiological classification. The study emphasizes descriptive mapping of institutional responses rather than causal inference. Results: Across the documentary corpus, vaccination was consistently framed as a baseline condition of healthcare employment, while alternatives permitted under provincial policy were rarely presented as durable or equivalent options. Hospitals adopted highly similar implementation models despite formal discretion. Legal decisions generally treated mandates as matters of institutional or employer authority, emphasizing jurisdictional and procedural considerations while limiting substantive review of scientific and proportionality claims central to the litigation. Media coverage largely mirrored institutional and legal framings, presenting vaccination as a settled professional expectation and employment exclusion as a routine administrative consequence. 
Taken together, these domains exhibited parallel patterns of normalization and policy alignment. Conclusions: This environmental scan documents convergence toward restrictive vaccination mandate implementation across institutional, legal, and media domains despite a formally flexible policy framework. By tracing how discretion was exercised and legitimated, the study provides an empirical account of how vaccination mandates stabilized as routine institutional practice. These findings establish a foundation for subsequent interpretive analysis of authority, dissent, and policy problem representation within governance frameworks during declared public health emergencies.

Article
Social Sciences
Government

Igor Calzada

,

Itziar Eizaguirre

Abstract: Artificial Intelligence (AI) is increasingly embedded in public governance, raising questions about how institutions can anticipate its societal implications while safeguarding democratic accountability amid expanding computational infrastructures. This article examines how anticipatory AI governance can be operationalised in the age of supercomputing through a mixed-methods multistakeholder approach in the Basque Country (Spain). The study focuses on the city-regional governance setting of Gipuzkoa, a devolved historical territory with fiscal autonomy and a growing advanced-computing ecosystem centred in Donostia–San Sebastián, where regional initiatives are positioning the Basque Country as an emerging “quantum territory” within Europe’s high-performance and quantum computing landscape, including the installation of IBM Quantum System Two. Methodologically, the study combines action research with three stakeholder groups and a quantitative online survey of citizens (N = 911). The action research engaged six civil society organisations, seven provincial directorates, and eleven municipalities. Results indicate that city-regional administrations can function as laboratories for public AI governance when policy experimentation is combined with empirical evidence and advanced computational infrastructures. The findings suggest policy recommendations for supercomputing ecosystems, including transparent AI experimentation, public-interest data governance, and policy sandboxes linking advanced computing, civic participation, and accountable digital public services.

Preprints.org is a free preprint server supported by MDPI in Basel, Switzerland.
