
Article
Business, Economics and Management
Business and Management

Shian Dee Hoo, Kidong Lee

Abstract: Automobiles have been the pivotal instrument shaping city structures and urban life. While promising numerous benefits, the successful introduction of self-driving cars depends not only on their technical maturity but, above all, on market acceptance of these disruptive transformations. To gauge the market response, we first summarize the 15 most recent studies applying the technology acceptance model (TAM), comparing their objectives, findings, and applied constructs across countries. We then identify influential factors in whether the public will use self-driving vehicles, using a sample of 519 Korean respondents. We group the constructs into system characteristics, social influence, and individual differences, and analyze the quantitative survey data with structural equation modeling (SEM). The findings show that technical completeness, laws and regulations, media support, perceived cost, and trial and experience significantly influence perceived usefulness, and in turn usage intention, while perceived safety does not significantly influence perceived usefulness. These results strengthen existing knowledge about self-driving cars by highlighting the key elements that drive the intention to use them in the future.
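The path structure the abstract describes (antecedent constructs → perceived usefulness → usage intention) can be illustrated with a simplified two-stage least-squares proxy. This is a toy sketch on synthetic data, not the authors' SEM fit; the construct names, coefficients, and noise levels are all assumptions.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 519  # sample size matching the abstract

# Synthetic stand-ins for five antecedent constructs (assumed, not study data):
# technical completeness, law/regulation, media support, perceived cost, trial.
X = rng.normal(size=(n, 5))
true_paths = np.array([0.4, 0.3, 0.25, -0.2, 0.3])   # assumed signs/magnitudes
pu = X @ true_paths + rng.normal(0, 0.5, n)           # perceived usefulness
intention = 0.6 * pu + rng.normal(0, 0.5, n)          # usage intention

# Stage 1: constructs -> perceived usefulness; Stage 2: PU -> intention.
beta1, *_ = np.linalg.lstsq(np.c_[np.ones(n), X], pu, rcond=None)
beta2, *_ = np.linalg.lstsq(np.c_[np.ones(n), pu], intention, rcond=None)

# The recovered PU -> intention path should be positive, mirroring the finding
# that perceived usefulness drives usage intention.
assert beta2[1] > 0
```

A full SEM would estimate both stages jointly with latent variables and fit indices; the two-stage regression above only conveys the direction of the hypothesized paths.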

Article
Physical Sciences
Quantum Science and Technology

José Tito Mendonça, José Luis Figueiredo, Hugo Terças

Abstract: We study temporal effects associated with surface plasmon polaritons (SPPs) in a quantum slab of a plasma material, such as a thin metal film, a semiconductor, or a graphene plate, where quantum dispersion effects, and in particular exchange effects, are retained.

Article
Chemistry and Materials Science
Polymers and Plastics

F Valdebenito, CP Quezada, D Parra, Valentina Rivera Concha, Elizabeth Elgueta, Rodrigo Cáceres, R Cabezas, C Farkas, M Pereira, L Azocar, +1 author

Abstract: This study evaluates the antimicrobial properties of nanocomposite materials based on polyvinyl alcohol (PVA) reinforced with cellulose nanofibrils (CNFs) and/or supplemented with biobased additives derived from blueberry pruning wastes, with the objective of developing biodegradable food packaging systems with antimicrobial properties. The nanocomposites were prepared using a solvent-casting processing approach, and their thermal, physicochemical, and antimicrobial properties were assessed. All the nanocomposites exhibited thermal stability up to 200 °C, confirming their suitability for conventional food packaging processing conditions. Antimicrobial activity tests revealed inhibitory effects against both Gram-positive and Gram-negative bacteria. Bleached PVA/CNFs films showed complete growth inhibition (100%) against E. coli and S. aureus. In contrast, unbleached PVA/CNFs and PVA/CNFsB supplemented with blueberry-derived additives exhibited selective inhibition against E. coli, highlighting the influence of nanofibril composition and additive incorporation on antimicrobial performance. Zeta potential measurements revealed values of –35.3 mV for the CNFs, confirming their negatively charged surface, which may contribute to interactions with bacterial membranes. Additionally, scanning electron microscopy (SEM) showed that the incorporation of CNFs generates nanostructured surfaces with exposed fibrillar domains, where bacterial cells adhere and become immobilized. These topographical features suggest that the antimicrobial behavior of the nanocomposites is associated with direct bacteria–surface interactions, supporting a contact-active antimicrobial mechanism associated with the CNFs.

Article
Engineering
Safety, Risk, Reliability and Quality

Jesús Manuel Ballesteros-Álvarez, Álvaro Romero-Barriuso, Blasa María Villena-Escribano, Ángel Rodríguez-Sáiz

Abstract: In architecture and construction, it is common to use acrylic products with a highly flammable content, from lacquers that improve the curing of concrete and mortar to resins that offer protection, sealing, flexibility, and elasticity. The drying process of the treated surface involves the formation of volatile organic compound (VOC) vapours. To prevent these from degenerating into a potentially dangerous flammable atmosphere, a procedure is presented that establishes the maximum application yield for solvent-based products, providing equations that relate the maximum application surface area and minimum drying time to the air velocity available in the work area. Results are provided for both indoor and outdoor applications. A maximum application rate is established to prevent the generation of areas classified as fire or explosion hazards: 1.5 m²/h indoors and 1 m²/h outdoors, for application at an ambient temperature of 20 °C. Above 40 °C, it is not possible to apply varnishes without generating a flammable atmosphere.
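The stated maximum safe application rates imply a minimum application time for any given surface area. The two rates below come from the abstract (at 20 °C ambient); the helper function itself is an illustrative sketch, not the paper's equations, which additionally account for air velocity and drying time.

```python
# Maximum safe application rates from the abstract, in m² per hour (20 °C).
MAX_RATE_M2_PER_H = {"indoor": 1.5, "outdoor": 1.0}

def min_application_hours(area_m2: float, setting: str) -> float:
    """Hours needed to coat an area while staying at or below the
    maximum safe application rate for the given setting."""
    return area_m2 / MAX_RATE_M2_PER_H[setting]

# Example: a 6 m² surface takes at least 4 h indoors, 6 h outdoors.
assert min_application_hours(6.0, "indoor") == 4.0
assert min_application_hours(6.0, "outdoor") == 6.0
```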

Article
Public Health and Healthcare
Nursing

Eva M Montoro-Ramírez, Isabel M López-Medina, Daniel Puente-Fernández, Laura Parra-Anguita

Abstract: Introduction: Climate change is increasingly affecting the health of older people. This study aimed to determine the knowledge, skills, and attitudes of nurses and undergraduate nursing students regarding the effects of climate change on older people’s health. Material and Methods: A descriptive cross-sectional study was conducted between January and April 2024 with 708 participants (210 nurses and 498 undergraduate nursing students). The Nursing Competencies Questionnaire on Environmental Health of Older People (NCQ-OPEH) was used to assess environmental competencies. Descriptive values were calculated and interrelationships between knowledge, attitudes, and skills were analysed. Results: A total of 115 nurses (54.75%) and 185 students (37.15%) demonstrated good to excellent knowledge. Similarly, a higher percentage of nurses (50.77%) than students (42.52%) reported good perceived skills. However, the majority of both samples (98.97% and 87.85%, respectively) had good to excellent attitudes. These differences were significant for knowledge (p < .001) and attitudes (p = .013), but not for skills (p = .054). Furthermore, a significant relationship was found between prior education on climate change and health and greater knowledge (p = .019) and skills (p = .027) among nurses, and better skills and attitudes (p < .001 for both) among nursing students. Conclusion: Nurses have better environmental competencies than undergraduate nursing students. Education on climate change and older people's health should therefore be included in the academic curriculum of university nursing degrees, and nurses need to reinforce these competencies through specific educational programmes. The NCQ-OPEH can also be used to evaluate educational and training sessions on climate change and the health of older adults.

Review
Environmental and Earth Sciences
Pollution

Soledad González-Juárez, Nora Ruiz-Ordaz, Juvencio Galindez-Mayer

Abstract: Diffuse pollution from agricultural runoff, characterized by intermittent discharges of complex contaminant mixtures—including nutrients, pesticides, and heavy metals (HMs)—poses a persistent threat to global water quality. Conventional "end-of-pipe" strategies often fail to address these decentralized, nonpoint sources. This review examines the evolution of Permeable Reactive Barriers (PRBs) from static, abiotic filters into modern Permeable Reactive Bio-Barriers (PRBBs), engineered as dynamic, fixed-bed biofilm reactors. A key advancement in PRBB efficacy is the exploitation of biofilm plasticity, particularly in response to coexistence with organic and inorganic pollutants. While heavy metals are traditionally viewed as inhibitors, this review synthesizes evidence showing that sub-inhibitory HM levels can act as structural and functional drivers. These metals induce the upregulation of Extracellular Polymeric Substances (EPS), creating a "protective shield" that sequesters metals and confers functional resilience on the microbial consortia responsible for nutrient removal and pesticide biodegradation. The review analyzes contaminant removal mechanisms, highlighting the bio-chemo synergy between reactive media and biofilms, and proposes a classification framework based on target contaminants, media, and technological integration. Significant focus is placed on emerging hybrid multi-media systems designed to protect the "biological engine" from toxic metal shocks, alongside the integration of artificial intelligence for predictive control. While challenges in hydraulic sustainability and field validation remain, PRBBs represent a compact, low-energy, and scalable eco-technology. They offer a strategically targeted solution within the Nature-Based Solutions toolkit for building resilient protection of aquatic ecosystems at the critical land-water interface.

Article
Engineering
Other

Kevin MacG. Adams, Irfan Ibrahim, Steven L. Krahn

Abstract: This paper proposes a formal method and associated techniques for completing the ISO/IEC/IEEE Standard 15288 technical process 6.4.2 (stakeholder needs and requirements definition) within the 15288-SysML grid framework. The paper is a companion to Engineering systems with standards and digital models: Development of a 15288-SysML Grid, which describes an engineering design method that supports the tenets of the Industry 4.0 paradigm. The formal method presented here is grounded in established constructs from systems science, specifically the systems principles of hierarchy, emergence, requisite parsimony, minimum critical specification, and requisite saliency. The application of accepted principles ensures that stakeholders are able to objectively specify measurable criteria that can satisfy stakeholder needs and capabilities. The method: (1) uses international standards for systems (e.g., ISO/IEC/IEEE 15288); (2) adopts the four fundamental aspects of system design supported by model-based systems engineering (MBSE); (3) invokes the international standard for the systems modeling language (SysML); and (4) adopts a hierarchical requirements tree that specifies Mission, Goals, Objectives, and Sub-objectives (MGOS) to provide the stakeholder-analysis process for the articulation of system-level engineering requirements. Utilization of the MGOS is intended to have a positive impact on the system design process by ensuring reproducibility, replicability, transparency, and generalization.
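The MGOS hierarchy described above is, structurally, a four-level requirements tree. A minimal data-structure sketch follows; the example mission and its entries are hypothetical, invented only to show the Mission → Goal → Objective → Sub-objective nesting, and do not come from the paper.

```python
from dataclasses import dataclass, field

@dataclass
class Node:
    """One entry in a hypothetical MGOS requirements tree."""
    name: str
    level: str  # "Mission" | "Goal" | "Objective" | "Sub-objective"
    children: list["Node"] = field(default_factory=list)

    def add(self, child: "Node") -> "Node":
        self.children.append(child)
        return child

# Hypothetical example tree (illustrative names, not from the paper).
mission = Node("Provide safe lunar cargo delivery", "Mission")
goal = mission.add(Node("Land payloads within target zone", "Goal"))
obj = goal.add(Node("Terminal guidance accuracy <= 50 m", "Objective"))
obj.add(Node("Sensor fusion update rate >= 10 Hz", "Sub-objective"))

def flatten(node: Node, depth: int = 0):
    """Yield (depth, node) pairs in document order, for traceability reports."""
    yield depth, node
    for child in node.children:
        yield from flatten(child, depth + 1)

# The four MGOS levels appear at depths 0 through 3.
assert [d for d, _ in flatten(mission)] == [0, 1, 2, 3]
```

A tree like this makes the traceability property the abstract emphasizes concrete: every sub-objective is reachable from exactly one mission via its parent chain.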

Hypothesis
Biology and Life Sciences
Biophysics

C. Leonard Neatu

Abstract: Biological coherence arises from coordinated integration of redox chemistry, hydration dynamics, electromagnetic interactions, and bioenergetic flux. Although substantial progress has been made in characterizing these processes individually, current frameworks do not fully explain how distributed biochemical events achieve stable temporal coordination across scales. In thermally noisy, dissipative environments, energy alone cannot account for sustained biological organization. A missing element is the establishment and renewal of phase reference, the temporal alignment that enables spatially distributed processes to act in synchrony. Here we propose a physical mechanism for phase reference access and anchoring based on cyclic nanodomain dynamics at a nanoscale redox-photonic interface previously termed the Redox Photonic Coupling System (RPCS). This interface supports an additional functional modality, phase breathing, a process mediated by molecular nitrogen (N₂) through which cyclic nanodomain nucleation and collapse anchors and sustains phase reference in living systems. Nitrogen-mediated oscillatory boundary dynamics create transient coherence windows that permit local access to phase reference, enabling phase-aligned oxidative-reductive resolution and anchoring of phase onto redox-generated Photonic Activation Quanta (PAQs). Absorption of phase-conditioned PAQs by adjacent hydration shells enables generation and accumulation of centropy, defined as stored organizational capacity that supports coordinated biological work. This framework identifies phase breathing as a previously unrecognized mechanism sustaining biological coherence and assigns molecular nitrogen a structural organizational role beyond respiratory dilution. By integrating nanodomain mechanics, photonic phase conditioning, and redox dynamics within a single interface, it provides a mechanistic basis for how coherent biological function is generated and maintained.

Review
Public Health and Healthcare
Other

Nikolaou I. Aikaterini, Soldatou Alexandra, Grantzi Georgia-Christiana, Giapros Vasileios, Ladomenou Fani

Abstract: Maternal vaccination against respiratory syncytial virus (RSV) represents a major advance in early-life infection prevention. Although clinical efficacy and early real-world effectiveness are well established, sustained population-level impact depends on equitable uptake. This review synthesizes determinants influencing maternal RSV vaccination within the evolving dual-strategy landscape that includes both maternal vaccination and infant monoclonal antibody prophylaxis. A structured narrative review was conducted following PRISMA principles. PubMed/MEDLINE and Google Scholar were searched for studies published between January 2022 and February 2026. Eligible studies examined behavioral, interpersonal, structural, economic, and policy determinants of maternal RSV vaccination uptake, as well as early implementation and modelling evidence. Findings were integrated within a multilevel analytical framework. Maternal uptake is shaped by interacting determinants across individual, healthcare provider, and health system domains. Key drivers include perceived infant disease severity, vaccine safety confidence, perceived effectiveness, and prior antenatal vaccination behavior. Healthcare provider recommendation consistently emerges as the strongest facilitator. Coverage variability reflects differences in reimbursement, antenatal care integration, and national policy endorsement. The coexistence of maternal vaccination and infant monoclonal antibody strategies introduces additional comparative decision-making complexity. Early implementation data indicate heterogeneous uptake and socioeconomic gradients, while modelling demonstrates sensitivity to coverage, timing, epidemiology, and cost. Translating biological efficacy into sustained public health benefit requires coordinated behavioral, structural, and policy strategies, strong provider engagement, and context-sensitive implementation frameworks to ensure equitable coverage.

Hypothesis
Biology and Life Sciences
Life Sciences

Cheng Wang

Abstract: The Central Dogma has provided a foundational framework for biological information flow, yet it does not fully explain how living systems preserve stable identity, functional robustness, and recoverability under continuous molecular noise and environmental perturbation. Here, I propose the Central Homeostatic Principle (CHP) as a complementary first-principle framework that shifts the explanatory center from information execution alone to the physical constraint architecture that makes biological execution possible. The CHP posits that, in living cells, a central homeostatic state functions as a system-level coordinating layer that defines the feasible state space within which genetic and biochemical programs can operate. This framework is motivated by convergent evidence across mechanical confinement, electrophysiological coupling, membrane contact-site transduction, phase-state regulation, and non-genetic phenotypic heterogeneity, all of which indicate that global physical states can gate, reshape, or buffer molecular outcomes. Building from systemic prerequisites and material constraints, I further argue through an exclusionary first-principle analysis that lipid-organized boundary systems occupy a near-irreplaceable physical position in implementing this central homeostatic constraint in aqueous cellular life, not as exclusive causal authors, but as the dominant substrate of feasibility control. To render the theory scientifically actionable, this manuscript provides a formal articulation of the CHP, a three-tier realization model, operational corollaries, and a rule typology that distinguishes stronger and weaker forms. It then derives a set of falsifiable hypotheses spanning temporal commitment dynamics, non-genetic resistance, aging-related resilience loss, state-engineering-based reprogramming, and evolutionary primacy in prebiotic systems. By reframing life as a problem of constrained state maintainability rather than information flow alone, the CHP offers a testable theoretical scaffold for integrating molecular biology, biophysics, systems biology, and translational state engineering.

Data Descriptor
Engineering
Mechanical Engineering

Krisztian Horvath

Abstract: This data descriptor provides a standardized and reproducible subsystem-level representation of the NREL wind turbine gearbox condition monitoring benchmarking dataset. The released records are derived from Healthy (H1–H10) and Damaged (D1–D10) measurement files and include subsystem-level standardized indices (KHI_HS, KHI_IMS, KHI_PL) together with a calibrated 0–1 Gearbox Health Index (GHI). The indices are generated using a fully specified and deterministic feature extraction and aggregation workflow based on established vibration indicators and healthy-referenced normalization. The Zenodo deposit contains machine-readable CSV tables intended to support transparent benchmarking across supervised classification and anomaly detection studies. The proposed GHI is introduced as an interpretable and reproducible reference baseline rather than an optimized diagnostic model. Technical validation demonstrates condition-level separability within the analyzed dataset while emphasizing the descriptive nature of the index. By releasing structured derived records and a documented regeneration procedure, this work enables implementation-independent comparison of gearbox condition monitoring approaches and supports reproducible evaluation of alternative health index formulations.
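The core idea of a healthy-referenced, 0–1 health index can be sketched in a few lines. This is a generic illustration on synthetic data: the feature values, the deviation measure, and the squashing function are all assumptions, not the descriptor's documented KHI/GHI formulas.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic vibration features per run, one column per subsystem
# (loosely mirroring the HS / IMS / PL split named in the abstract).
healthy = rng.normal(1.0, 0.1, size=(10, 3))   # stand-ins for H1-H10
damaged = rng.normal(1.6, 0.2, size=(10, 3))   # stand-ins for D1-D10

# Healthy-referenced normalization: deviation from the healthy baseline,
# squashed monotonically into [0, 1): 0 ~ baseline, near 1 ~ strong deviation.
mu, sigma = healthy.mean(axis=0), healthy.std(axis=0)

def subsystem_indices(x):
    z = np.abs(x - mu) / sigma
    return 1.0 - np.exp(-z / 3.0)

# Aggregate subsystem indices into a single per-run health index
# (a plain mean here; the real GHI calibration may differ).
ghi = subsystem_indices(damaged).mean(axis=1)
assert ghi.mean() > subsystem_indices(healthy).mean()
```

Because the map is monotone in the deviation, damaged runs land near 1 and healthy runs near 0, which is the condition-level separability the validation section refers to.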

Article
Physical Sciences
Astronomy and Astrophysics

Shoude Li

Abstract: If the invariance of light speed is absolutely universal in covariant space rather than contravariant space, the following investigations and conclusions on general covariance would attain very high probability and reliability. As has been verified, it is. General covariance proves controversial after investigations of gravitational redshift and acceleration. Further inspection of the differential geometry indicates the possibility that mixed derivatives of bases are unequal for transformations between Riemannian spaces, which then leads to the inequality of Christoffel symbols with alternated sub-indices and hence to the failure of the classical equations for Christoffel symbols. That is one of the reasons for the controversies over general covariance. Even the negative form of the time metric can be shown to be a false setting, in that space transformations have nothing to do with the negative sign inherited from Minkowski space. Nevertheless, after discussing transformations between the original spherical space, the distance-expressed spherical space, and Cartesian space, it is seen that the distance factors for the angular coordinates of a spherical space are improper to employ as the metrics in general relativity; instead, the concept of gravitational metrics is suggested. In fact, Christoffel symbols and base derivatives are both valid methodologies for analysis in a non-Euclidean space. The concept of trajectory derivatives is introduced to define derivatives along the trajectories of matter motions, which could help revise those equations and calculations. Measurable experiments on gravitational redshifts and accelerations have been carried out to support the theoretical results.
Conclusions are drawn that light speed retains general covariance in gravitational fields but light energy-momentum does not; likewise, the motions of massive matter in gravitational fields do not exhibit general covariance thoroughly. It is impossible to geometrize the gravitational effects on massive matter with position-dependent metrics, in that the variable velocities cannot be eliminated thoroughly. Consequently, inferences on kinematics and relativistic release are derived, which might be forcefully verified in applications. With the concept of gravitational metrics, the so-called geodesic equations are shown no longer to be kinematic equations. Velocities, mass-energy, and momentum should all have conservative forms, and seeking these conservative forms is the critical route to creating kinematic or dynamic equations of motion for light rays and massive matter. The conservativeness of light angular momentum is discovered in a most surprising form. Renovated solutions for light rays as well as massive matter are derived that forcefully impact the traditional methodologies for trajectory kinematics and time delay, because they are the correct interpretations of the realities. It should be highlighted that the renovated mass equation, the general mass equation for free motions, can completely demonstrate energy variations, especially variations in gravitational fields; this helps to create more general dynamic equations and leads to an irrelativistic solution of planetary perihelion precession. As a proof by contradiction, the traditional solution must therefore involve errors. A Newtonian ballistic method is put forward for numerical analyses of close-to-light-speed motions, and the invalidity of that method for light propagation can be seen as further support for the conclusions and inferences.
Another forceful falsification of energy-momentum conservativeness, carried out in the discussion of the traditional treatment of the time delay of close-to-light-speed particles, thoroughly exposes the essential methodological mistakes. Dynamic models of fluid planetary rings are founded to interpret the evolution of the accretions of quasars and active galactic nuclei and the mechanism of relativistic release. It is predicted that the peak release of an inflow is at 1.5 gravitational radii and that the peak luminosity of an accretion may be located at about 1.33 gravitational radii. Relativistic frequency shift interprets the mechanism of giant redshifts, predicting that observed redshifts might reach levels considerably higher than those observed in past years. The width equations of emission and absorption lines indicate the mechanism and positions of broad-line regions and narrow-line regions. It may be imagined that relativistic emissions and absorptions with relativistic redshifts are involved with the deep mystery of the intrinsic structure of matter, about which we know little.

Article
Computer Science and Mathematics
Artificial Intelligence and Machine Learning

Jineng Ren

Abstract: Since the beginning of modern computer history, the Turing machine has been the dominant architecture for most computational devices; it consists of three essential components: an infinite tape for input, a read/write head, and finite control. In this structure, what the head can read (i.e., bits) is the same as what it has written/outputted. This is actually different from the ways in which humans think or conduct thought/tool experiments. More precisely, what humans imagine or write on paper are images or texts, not the abstract concepts that they represent in the human brain. This difference is neglected by the Turing machine, but it plays an important role in abstraction, analogy, and generalization, which are crucial in artificial intelligence. Compared with this architecture, the proposed architecture uses two different types of heads and tapes, one for traditional abstract bit inputs/outputs and the other for specific visual ones. The mapping rules between the abstract bits and the specific images/texts can be realized by neural networks with a high accuracy rate. Logical reasoning is thus performed through the transfer of mapping rules. The statistical decidability of the Halting Problem with an imperceptibly small error rate in reasoning steps is established for this type of machine. As an example, this paper presents how the new computer architecture (what we call the "Ren machine" for simplicity) autonomously learns a distributive property/rule of multiplication in the specific domain and further uses the rule to generate a general method (mixed in both the abstract and the specific domains) to compute the multiplication of any positive integers based on images/texts. The machine's strong reasoning ability is also corroborated in proving a theorem in plane geometry. Moreover, a robotic architecture based on the Ren machine is proposed to address the challenges faced by Vision-Language-Action (VLA) models in unsound reasoning ability and high computational cost.

Article
Physical Sciences
Theoretical Physics

Dennis Kahan

Abstract: Foundational tensions between special relativity and quantum mechanics, together with conflicts between general relativity and quantum gravity, and unresolved cosmogonical and cosmological anomalies, block theoretical unification and limit explanatory depth. Based on ontological first principles rather than mathematical constructs, this analysis integrates a “discrete,” background-independent, relativistic 4D spacetime with a physically co-located Planck Domain. Through a one-to-one identity, the Planck Domain mirrors the discrete spatial elements of 4D spacetime, enabling a single, unified set of physical laws across quantum and classical regimes. Under this framework, ontic single- and N-body quantum states evolve deterministically in 4D spacetime and collapse instantaneously in the Planck Domain. Current theoretical tensions between special relativity and quantum mechanics, including nonlocality, separability, time, simultaneity, total energy scaling, and probability conservation, are reappraised by replacing the Hilbert-space wavefunction with an ontic energy field and a single energy-based operator that governs both motion and gravitational response. The identical ontological framework and dynamical laws apply unchanged across general relativity and quantum gravity, recasting gravity as the relational dynamics of discrete energy rather than the coupling of the stress-energy and metric tensors and reappraising the equivalence principle and the black hole information paradox. Cosmogonically, the same model re-examines the origin of 4D spacetime, accounting for near-homogeneity, isotropy, and low gravitational entropy without ad hoc assumptions, fine-tuning, or perturbative techniques, and provides ontological foundations for the cosmological constant and global energy conservation. 
Eight descriptive mathematical validations, derived from a unified evolution law, Planck Domain collapse rule, and the relational gravity law, support (but do not govern) the analysis: (i) the low-ℓ CMB TT shape generated from a field with one global amplitude on power; (ii) CHSH correlations at Tsirelson’s bound from collapse; and (iii–viii) hard-mass relational dynamics, highlighted by a tilted Earth–Moon orbit, a tilted hierarchical three-body system, and a high-energy Mercury–Sun analog, all sustained for 1000 orbits or inner orbits.

Article
Physical Sciences
Quantum Science and Technology

Zhaoxu Ji, Huanguo Zhang

Abstract: Since its establishment a century ago, quantum mechanics has developed into a very large theoretical system, but quantum mechanical phenomena still lack a generally accepted explanation, which undoubtedly shows that the existing theoretical system is incomplete. Inspired by ancient Chinese philosophy, we propose a theoretical framework in this paper that provides a new perspective for explaining quantum mechanical phenomena, including superposition and entanglement. In addition, the proposed framework contributes to a profound understanding of the law of conservation of energy. We show through examples how basic superposition states and entangled states are constructed. Our work can inspire people to think deeply about the mysteries of nature, especially quantum mechanical phenomena.

Article
Computer Science and Mathematics
Artificial Intelligence and Machine Learning

Jinyu Chen, Feiyang Wang, Tian Guan, Yumeng Ma, Linghao Yang, Yutong Wang

Abstract: Large language model (LLM)-based multi-agent systems have demonstrated remarkable capabilities in collaborative task solving. However, the mechanisms that facilitate seamless cooperation, such as shared contexts, role assignments, and iterative message passing, present significant risks of unintentional information disclosure. We present MIN-Trust, a trust orchestration framework that enforces Minimum Necessary Information (MNI) constraints, an operationalization of the data minimization principle for inter-agent communication, while maintaining task effectiveness. Our approach introduces an MNI-Gate that automatically classifies and filters information into essential, summarized, or pointer-referenced subsets before transmission. Additionally, we propose a Trust-Gated Channel (TGC) that counterintuitively increases verification requirements rather than relaxing information access as inter-agent trust rises. Through experiments on four collaborative tasks using public benchmarks, we demonstrate that MIN-Trust reduces sensitive information exposure by 67.8% compared to baseline multi-agent frameworks while maintaining 93.3% of task success rates. Our evidence traceability mechanism achieves 84.2% claim-to-source attribution, significantly outperforming conventional approaches. These results suggest that privacy-preserving multi-agent collaboration is achievable under synthetic benchmark conditions with moderate performance trade-offs.
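The three-way disclosure decision the MNI-Gate makes (essential, summarized, or pointer-referenced) can be sketched as a simple policy function. This is a toy illustration of the gating idea only: the sensitivity and relevance scores, the thresholds, and the redaction rules below are invented for the sketch and are not the paper's classifier.

```python
from dataclasses import dataclass
from enum import Enum

class Disclosure(Enum):
    ESSENTIAL = "essential"      # content sent verbatim
    SUMMARIZED = "summarized"    # truncated/redacted form sent
    POINTER = "pointer"          # only a reference id sent

@dataclass
class Message:
    content: str
    sensitivity: float    # 0 = public, 1 = highly sensitive (assumed score)
    task_relevance: float # 0 = irrelevant, 1 = essential to the task

def mni_gate(msg: Message) -> tuple[Disclosure, str]:
    """Toy MNI-Gate: release only the minimum necessary form of a message.
    Thresholds are illustrative, not from the paper."""
    if msg.sensitivity < 0.3 and msg.task_relevance > 0.7:
        return Disclosure.ESSENTIAL, msg.content
    if msg.task_relevance > 0.4:
        return Disclosure.SUMMARIZED, msg.content[:40] + "...[summary]"
    return Disclosure.POINTER, f"ref://msg/{id(msg)}"

level, payload = mni_gate(Message("Q3 revenue draft, unreleased figures", 0.9, 0.5))
assert level is Disclosure.SUMMARIZED
```

A real gate would score sensitivity and relevance with learned classifiers rather than fixed fields, but the control flow, never sending more than the task needs, is the same.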

Brief Report
Medicine and Pharmacology
Obstetrics and Gynaecology

Prajwal Shetty, B A Sujeewa Fernando, B Anuthi Fernando, Sindhu Sekar, Lakshmi Jayaraj

Abstract: Background: Population ageing is accelerating worldwide, accompanied by a rising prevalence of multimorbidity and polypharmacy. Medicines with anticholinergic properties are commonly prescribed to older adults for a wide range of conditions, including depression, urinary incontinence, Parkinson’s disease, allergies, and respiratory disorders. While short-term anticholinergic effects such as dry mouth and constipation are well recognised, increasing evidence suggests that cumulative anticholinergic exposure may contribute to adverse cognitive outcomes in older populations. Objective: This review aims to discuss the concept of anticholinergic burden, outline commonly used tools to quantify exposure, and examine the evidence linking cumulative anticholinergic exposure with cognitive decline and other adverse health outcomes. It also explores strategies to identify and mitigate anticholinergic burden in clinical practice. Methods: Relevant literature on anticholinergic medications, burden scales, and associated clinical outcomes was reviewed. Attention was given to validated measurement tools such as the Anticholinergic Cognitive Burden (ACB) scale, Anticholinergic Risk Scale (ARS), and Anticholinergic Drug Scale (ADS), as well as studies examining associations between anticholinergic exposure and cognitive and functional outcomes. Results: Evidence from observational studies indicates that higher cumulative anticholinergic burden is associated with increased risks of cognitive impairment, delirium, falls, functional decline, and possibly dementia. Measurement tools allow clinicians and researchers to estimate cumulative exposure, with several studies identifying clinically meaningful risk at moderate to high burden scores. Conclusion: Anticholinergic burden represents a potentially modifiable contributor to adverse outcomes in ageing populations. Routine assessment of anticholinergic exposure, careful medication review, and deprescribing strategies where appropriate may help reduce avoidable cognitive and functional harm in older adults. Integrating burden assessment into prescribing systems and clinical decision support tools may further support safer pharmacotherapy in an ageing society.

Article
Computer Science and Mathematics
Computational Mathematics

Basem Ajarmah

,

Saber Syouri

Abstract: Managing inventory for perishable goods remains a persistent operational challenge, largely because conventional exponential decay models struggle to capture the irregular deterioration patterns observed in practice. This paper develops the Reliable Fractional Derivative (RFD) framework, which incorporates memory effects into the modeling of product decay through a time-shifted kernel. Unlike standard approaches that assume constant deterioration, this formulation accommodates both accelerating and decelerating patterns depending on product characteristics and storage conditions. We derive closed-form expressions for optimal ordering quantities under both deterministic and stochastic demand, then test the framework's performance through numerical experiments spanning two thousand parameter combinations. The analysis reveals that RFD models deliver the greatest improvements when deterioration rates are steep, holding costs are substantial, or storage horizons are extended—conditions under which switching from conventional methods yields average cost reductions approaching nineteen percent, with substantially larger gains in certain cases. A pharmaceutical application confirms savings between 3.6 and 9.1 percent relative to misspecified traditional models. These findings connect with recent industry movements toward more sophisticated safety-stock practices, offering managers a principled basis for selecting inventory policies aligned with actual product behavior rather than assuming decay conforms to simpler theoretical forms.
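The fractional-order decay idea in this abstract can be illustrated with a minimal sketch. The snippet below is a hypothetical illustration, not the authors' RFD code: it uses the one-parameter Mittag-Leffler function, a standard solution of fractional-order relaxation equations, where the order alpha = 1 recovers classical exponential decay and alpha < 1 produces fast early deterioration with a heavy tail (the memory effect the abstract describes). The function names and parameter values are illustrative assumptions.

```python
import math

def mittag_leffler(z, alpha, terms=50):
    """One-parameter Mittag-Leffler function E_alpha(z) via its power series."""
    return sum(z**k / math.gamma(alpha * k + 1) for k in range(terms))

def surviving_fraction(t, theta, alpha):
    """Fraction of stock surviving to time t under fractional-order decay.

    alpha = 1 recovers classical exponential decay exp(-theta * t);
    alpha < 1 yields rapid early loss followed by a heavy tail,
    i.e. deterioration that depends on the product's history.
    """
    return mittag_leffler(-theta * t**alpha, alpha)

# With alpha = 1 the fractional model collapses to the exponential one.
print(round(surviving_fraction(2.0, 0.3, 1.0), 4))  # ≈ 0.5488 (= exp(-0.6))
```

Fitting alpha to observed spoilage data (rather than fixing alpha = 1) is what lets such a model accommodate both accelerating and decelerating deterioration patterns.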

Review
Public Health and Healthcare
Public Health and Health Services

Angyiba Serge Andigema

,

Dimalla Paola Aphrodite Olive

Abstract: Vaccine hesitancy has evolved from episodic resistance to a structural threat to global health systems. Although opposition to vaccination has accompanied immunisation since its inception, contemporary hesitancy reflects a transformation driven by digital information ecosystems, political polarisation, institutional mistrust, and shifting risk perceptions. Its consequences extend beyond individual vaccine refusal to systemic vulnerabilities within immunisation programs. Here, we synthesise historical and contemporary evidence to examine vaccine hesitancy as a multilevel phenomenon shaped by sociocultural identities, psychological heuristics, and political governance structures. Tracing its trajectory from early smallpox resistance to COVID-19-era polarisation, we identify recurring patterns of mistrust, moral framing, and autonomy-based resistance that re-emerge across contexts. We argue that vaccine hesitancy operates not merely as an attitudinal deficit but as a reflection of broader fractures in social trust and institutional legitimacy. We further analyse how clustering of under-immunised populations, digital misinformation amplification, and politicisation of public health undermine immunisation resilience. Evidence suggests that durable solutions require trust-centred governance, community co-production of health strategies, behavioural-insight-informed communication, and structural reforms that address inequity and historical injustice. Reconceptualising vaccine hesitancy as a systems-level vulnerability reframes immunisation programs as social contracts as much as biomedical interventions. Strengthening these contracts will be central to sustaining global vaccination gains in an era defined by misinformation, institutional fragility, and recurrent pandemic threats.

Case Report
Medicine and Pharmacology
Anesthesiology and Pain Medicine

Jeongsoo Choi

,

Ho Soon Jung

,

Da Hyung Kim

,

Yong Han Seo

,

Hea Rim Chun

,

Hyung Yoon Gong

,

Jae Young Ji

,

Jin Soo Park

,

Sangwoo Im

Abstract: Background and Clinical Significance: Patent ductus arteriosus (PDA) is a common cardiovascular disorder in extremely low birth weight (ELBW) infants, for which surgical ligation is indicated when pharmacologic closure fails. Sudden increases in afterload combined with immature myocardial contractility can lead to post-ligation cardiac syndrome, which usually occurs within hours after surgery. However, acute intraoperative hemodynamic collapse during PDA ligation has rarely been described. Case Presentation: A preterm infant born at 24 weeks and 3 days of gestation with a birth weight of 890 g underwent emergency PDA ligation for a hemodynamically significant PDA refractory to pharmacological treatment. Fifteen minutes after skin incision, the infant developed severe hypoxemia, bradycardia, and non-measurable noninvasive blood pressure, which required immediate hemodynamic resuscitation with manual ventilation, fluid administration, and dopamine and dobutamine infusions. Hemodynamics gradually recovered after completion of ductal ligation, whereas hypoxemia persisted. Postoperative chest radiography revealed a left-sided pneumothorax, and oxygen saturation stabilized after pleural air aspiration. The subsequent clinical course was uneventful, and typical post-ligation cardiac syndrome did not develop. Conclusions: This case suggests that intraoperative hemodynamic collapse during PDA ligation may share pathophysiologic features with post-ligation cardiac syndrome, and that concomitant pneumothorax can further aggravate hemodynamic instability by worsening hypoxemia and reducing venous return.
Preprints.org is a free preprint server supported by MDPI in Basel, Switzerland.
© 2026 MDPI (Basel, Switzerland) unless otherwise stated