Article
Computer Science and Mathematics
Applied Mathematics

Florentin Șerban

,

Bogdan Vrinceanu

Abstract: Modern financial markets are increasingly shaped by algorithmic trading systems and artificial intelligence techniques that process large volumes of financial data in real time. However, machine learning–based trading systems often suffer from signal instability and excessive sensitivity to market noise, which may lead to overtrading and increased financial risk. In highly volatile environments such as cryptocurrency markets, the reliability of trading signals becomes a critical issue for both portfolio allocation and risk management. This study proposes an entropy-filtered machine learning framework designed to enhance the stability and risk-awareness of algorithmic trading strategies. The proposed approach integrates entropy-based filtering techniques with machine learning classifiers in order to reduce noise in market signals and improve the risk-adjusted stability of algorithmic trading strategies. Entropy measures are employed as a filtering mechanism that evaluates the informational content of market signals and suppresses unreliable predictions generated by the learning model. The empirical analysis is conducted using cryptocurrency market data, where the entropy-filtered machine learning framework is applied to trading signal generation and portfolio decision making. The results indicate that the proposed approach improves the stability of trading signals and reduces the occurrence of false signals compared to conventional machine learning trading models. Moreover, the integration of entropy filtering contributes to a more balanced risk–return profile and enhances the overall robustness of algorithmic trading strategies. The findings suggest that combining information-theoretic measures with machine learning techniques represents a promising direction for developing more reliable and risk-aware financial decision systems.
The results suggest that entropy-based filtering can substantially improve the robustness and risk-awareness of machine learning trading systems, providing a promising direction for future AI-driven financial decision frameworks.
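The entropy-filtering idea described above can be illustrated with a minimal, stdlib-only sketch (not the authors' implementation): compute the Shannon entropy of a classifier's predicted class probabilities and suppress any signal whose entropy exceeds a threshold; the 0.7-bit cutoff below is purely hypothetical.

```python
import math

def shannon_entropy(probs):
    """Shannon entropy (in bits) of a discrete probability distribution."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

def filter_signal(class_probs, max_entropy=0.7):
    """Return the predicted class index only when the classifier's output
    distribution is confident enough; otherwise abstain (no trade).
    max_entropy is a hypothetical tuning threshold in bits."""
    if shannon_entropy(class_probs) > max_entropy:
        return None  # suppress unreliable, high-entropy prediction
    return max(range(len(class_probs)), key=lambda i: class_probs[i])

# A confident prediction passes the filter...
assert filter_signal([0.92, 0.08]) == 0
# ...while a near-uniform (noisy) prediction is suppressed.
assert filter_signal([0.55, 0.45]) is None
```

In a trading loop, abstaining (returning `None`) simply means no order is placed for that bar, which is how entropy filtering reduces overtrading.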

Article
Engineering
Energy and Fuel Technology

Ndemuhanga V. Nghuumbwa

,

T. Wanjekeche

,

E. Hamatwi

,

M. Kanime

Abstract: Namibia’s rural communities continue to experience limited and unreliable electricity access despite the country’s exceptional solar, wind, and biomass renewable energy resource potential. Conventional grid extension remains financially and technically impractical for dispersed off-grid settlements, underscoring the need for cost-effective, renewable-based alternatives. This paper presents a resource-driven design and multi-objective optimization framework for Hybrid Renewable Energy Systems (HRESs) tailored to Namibia’s off-grid communities. The proposed model integrates solar PV, wind turbines, biomass generators, and hydrogen-based fuel cells with hybridized energy storage consisting of batteries, supercapacitors, and hydrogen tanks. Using the Non-dominated Sorting Genetic Algorithm II (NSGA-II), the system simultaneously minimizes Total Life Cycle Cost (TLCC), Levelized Cost of Electricity (LCOE), Loss of Power Supply Probability (LPSP), carbon dioxide (CO₂) emissions, and Wasted Renewable Energy (WRE). The framework is applied to three rural villages (Oluundje, Ombudiya, and Onguati) using high-resolution, site-specific renewable resource datasets and community-level load forecasts. Results demonstrate that resource-aligned configurations substantially improve system reliability (up to 99.28%), reduce LCOE (0.0023–0.0811 USD/kWh), and optimize dispatch behavior across seasonal variations. Storage hybridization further enhances stability by balancing transient and long-duration deficits. Compared to existing diesel mini-grids, the optimized HRESs achieve markedly superior techno-economic and environmental performance. The proposed framework offers a scalable, adaptable, and policy-ready tool for accelerating sustainable rural electrification in Namibia.
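The core ranking step NSGA-II builds on is non-dominated sorting over the objective vectors. A minimal sketch of Pareto dominance and first-front extraction (the candidate systems and their objective values below are illustrative, not the paper's data):

```python
def dominates(a, b):
    """True if objective vector a Pareto-dominates b (all objectives minimized)."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def first_front(population):
    """Rank-1 (non-dominated) set, the first front NSGA-II assigns."""
    return [p for p in population
            if not any(dominates(q, p) for q in population if q != p)]

# Toy candidate systems scored on (cost, LPSP, CO2) -- made-up values.
candidates = [(1.0, 0.02, 5.0), (0.8, 0.05, 6.0), (1.2, 0.01, 4.0), (1.3, 0.06, 7.0)]
front = first_front(candidates)
assert (1.3, 0.06, 7.0) not in front   # dominated by (1.0, 0.02, 5.0)
assert len(front) == 3                 # the other three trade off against each other
```

NSGA-II then repeats this peeling to assign subsequent fronts and applies crowding distance within each front; the sketch shows only the dominance ranking itself.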

Article
Public Health and Healthcare
Physical Therapy, Sports Therapy and Rehabilitation

Clément Lévêque

,

Adam Moussati

,

Julien Verraver

,

Grégory Vervloet

,

Pierre Lafère

,

Michele Salvagno

,

Costantino Balestra

Abstract: Background: Whether inspired oxygen fraction (FiO₂) can modulate the internal metabolic cost of supramaximal high-intensity interval training (HIIT) and thereby direct training adaptation remains unclear. We tested whether hyperoxic versus hypoxic exposure during Tabata-format HIIT induces distinct adaptive phenotypes. Methods: Twenty-three physically active men completed 3 weeks of supramaximal Tabata HIIT (3 sessions·week⁻¹; 8 × 20 s with 10 s recovery) under hyperoxia (FiO₂ = 0.60, n = 13) or hypoxia (FiO₂ = 0.16, n = 10). Training intensity was regulated to maintain a comparable internal physiological stimulus rather than an identical external workload. Pre- and post-intervention assessments included maximal oxygen uptake (VO₂max), first and second ventilatory thresholds (VT1, VT2), peak blood lactate, and session rating of perceived exertion (RPE). Post-intervention between-group differences were analysed using ANCOVA adjusted for baseline values; RPE was analysed using a linear mixed-effects model. Results: VO₂max improved in both groups but increased more after hyperoxic training than after hypoxic training (+3.69 vs. +1.50 mL·kg⁻¹·min⁻¹; β = 2.18 mL·kg⁻¹·min⁻¹, 95% CI [1.77–2.59], p < 0.001). Hyperoxia also produced larger gains in VT1 (β = 29.99 W, 95% CI [17.09–42.89], p < 0.001) and VT2 (β = 20.74 W, 95% CI [9.43–32.05], p = 0.001). Peak lactate responses diverged bidirectionally, decreasing in hyperoxia (−0.77 mmol·L⁻¹) and increasing slightly in hypoxia (+0.27 mmol·L⁻¹), with a significant adjusted between-group effect (β = −1.02 mmol·L⁻¹, 95% CI [−1.47 to −0.57], p < 0.001). RPE declined across sessions in both groups, with a steeper decrease under hyperoxia (Condition × Session: β = −0.36, 95% CI [−0.44 to −0.28], p < 0.001). Conclusions: Hyperoxic and hypoxic supramaximal HIIT elicited distinct functional adaptive profiles.
Hyperoxia induced greater improvements in aerobic capacity and ventilatory thresholds, reduced peak lactate accumulation, and accelerated the decline in perceived exertion, whereas hypoxia was associated with a more glycolytic response pattern. These findings support the interpretation that FiO₂ acts as a modulator of internal physiological load and shapes the metabolic phenotype of adaptation during supramaximal interval training.

Article
Medicine and Pharmacology
Internal Medicine

Oznur Oner

,

Canan Akkus

,

Doga Demircioglu

,

Ilhan Karanlik

,

Cevdet Duran

Abstract: Background/Aim: Albuminuria is an established marker of endothelial dysfunction and an independent predictor of cardiovascular risk. Polycystic ovary syndrome (PCOS) is associated with early metabolic and vascular abnormalities; however, whether urinary albumin excretion differs across PCOS phenotypes remains unclear. This study aimed to evaluate urinary albumin excretion using the urinary albumin-to-creatinine ratio (U-ACR) across distinct PCOS phenotypes and to examine its association with metabolic parameters. Materials and Methods: In this cross-sectional study, 180 women aged 18-35 years with PCOS and 51 age-matched healthy controls were included. PCOS phenotypes were classified according to the Rotterdam criteria as Phenotype A (n = 96), Phenotype B (n = 19), Phenotype C (n = 35), and Phenotype D (n = 30). Insulin resistance was assessed using the homeostasis model assessment for insulin resistance (HOMA-IR). Urinary albumin and creatinine levels were measured in morning urine samples, and U-ACR was calculated. Results: Age was comparable across all groups. Body mass index, waist circumference, diastolic blood pressure, and HOMA-IR were significantly higher in Phenotype A compared with controls and other phenotypes, indicating a more adverse metabolic profile. Serum creatinine levels were similar across all groups. Despite this unfavorable metabolic profile in Phenotype A, U-ACR was significantly elevated only in Phenotype B compared with controls (p = 0.018) and Phenotype D (p = 0.016). No significant correlations were observed between U-ACR and age, body mass index, or HOMA-IR. When participants were categorized according to U-ACR levels (< 30, 30-299.9, and ≥ 300 mg/g creatinine), no significant differences in category distribution were observed between the total PCOS cohort, phenotype subgroups, and controls. Conclusion: Among PCOS phenotypes, U-ACR elevation was observed exclusively in Phenotype B despite similar renal function markers. 
Notably, this occurred even though Phenotype A exhibited a more adverse metabolic profile, suggesting a dissociation between metabolic burden and early microvascular involvement across PCOS phenotypes. These findings indicate that vascular risk in PCOS may be phenotype-dependent and support the potential value of phenotype-oriented cardiovascular risk assessment.

Review
Computer Science and Mathematics
Artificial Intelligence and Machine Learning

Monica Khadgi

Abstract: Artificial Intelligence (AI) has developed over the years from rudimentary symbolic-reasoning systems in the middle of the twentieth century to the sophisticated data-driven and generative architectures that underpin modern society. The acceleration of machine learning, deep neural networks, and large-scale computational infrastructures has turned AI into a foundational technology across economic and social sectors. This paper investigates the history of AI's development and critically discusses its influence on society in the 21st century. Following a narrative review approach, the paper summarises interdisciplinary literature on technological innovation, economic transformation, social change, ethical governance, and sustainability issues. The analysis yields several findings. First, AI has greatly increased productivity and operational efficiency in industry while redefining labor markets and skill requirements. Second, AI-centered systems have enhanced the provision of services in the education, health, transportation, and government sectors, though issues of bias, privacy, transparency, and accountability persist. Third, the spread of AI into safety-critical systems highlights the value of reliability, regulation, and human-oriented design. Finally, the environmental impact of large-scale AI models underscores the necessity of sustainable development practices. The paper concludes that AI is both a transformative opportunity and a governance challenge. Future implications include the emergence of human-focused AI models, the creation of control measures, and the introduction of sustainability indicators into technological change. Fair and responsible implementation of AI will be required to maximise its positive impacts on society and reduce long-term risks.

Article
Medicine and Pharmacology
Internal Medicine

María de-Castro-García

,

Sara Nuñez-Palomares

,

Juan Miguel Antón-Santos

,

Alejandro Estrada-Santiago

,

Yolanda Majo-Carbajo

,

Pilar García de la Torre Rivera

,

Francisco Javier García-Sánchez

,

Pilar Cubo-Romano

Abstract: Background: Hypernatremia is an infrequent but clinically relevant electrolyte disorder in older adults and is associated with poor outcomes. Patients managed through Hospital-at-Home (HaH) programs, particularly those living in institutional settings, are especially vulnerable due to functional dependency and cognitive impairment. Evidence regarding the prevalence and prognostic impact of hypernatremia in HaH settings remains limited. Methods: We conducted a retrospective observational cohort study including all patients admitted to a Hospital-at-Home unit between 2019 and 2024. Patients were classified according to care setting as home-dwelling or institutionalized. Hypernatremia was defined as a serum sodium concentration >145 mmol/L. Sociodemographic, functional (Barthel Index), and cognitive (Global Deterioration Scale) variables were collected. Mortality during HaH admission and at 30, 60, and 90 days was analyzed, and survival was assessed using Kaplan–Meier methods. Results: A total of 4,501 patients were included, of whom 2,701 were treated at home and 1,800 in institutional settings. Hypernatremia was significantly more prevalent among institutionalized patients than among home-dwelling patients (3.1% vs. 0.8%, p < 0.001). Institutionalized patients with hypernatremia showed greater functional dependency (Barthel Index 11 vs. 15, p = 0.041) and more advanced cognitive impairment (GDS 6 vs. 5.5, p = 0.033) compared with those without hypernatremia. Mortality among institutionalized patients with hypernatremia was high, reaching 32.9% during HaH admission, 61.2% at 30 days, 70.6% at 60 days, and approximately 79% at 90 days.
Kaplan–Meier analysis demonstrated a rapid decline in survival during the first month following diagnosis. Conclusions: In Hospital-at-Home programs, hypernatremia is more prevalent among institutionalized older adults and is strongly associated with severe functional and cognitive impairment and very high short- and medium-term mortality. These findings suggest that hypernatremia should be considered a marker of advanced frailty rather than an isolated electrolyte disturbance and highlight the need for enhanced preventive and monitoring strategies in institutional and HaH care settings.

Article
Medicine and Pharmacology
Epidemiology and Infectious Diseases

Salvador Domènech-Montoliu

,

Óscar Pérez-Olaso

,

Diego Sala-Trull

,

Paloma Satorres-Martinez

,

Laura López-Diago

,

Isabel Aleixandre-Gorriz

,

Maria Rosario Pac-Sa

,

Manuel Sánchez-Urbano

,

Cristina Notari-Rodriguez

,

Juan Casanova-Suárez

+6 authors

Abstract: Background and Objective: After SARS-CoV-2 infection, a Long COVID (LC) syndrome affecting patients' health occurred in a high proportion of cases. The aim of our study was to estimate the incidence and the risk and protective factors of LC. Material and Methods: We performed a prospective population-based cohort study on the Borriana COVID-19 cohort (Castellon province, Valencia Community, Spain) from May 2020 to August 2023, with a follow-up of 40 months, using the LC definition of the World Health Organization. We used inverse probability weighted regression. Results: With a response rate of 63.8%, a total of 722 participants were included; the average age was 37.7±17.4 years, 460 (62.3%) were female, 644 had suffered a SARS-CoV-2 infection, and 184 suffered LC, a cumulative incidence of 28.6%. A total of 135 patients with LC remained affected, and a death associated with the syndrome occurred in 0.54% of them. Significant risk factors for LC were older age, female sex, chronic disease, SARS-CoV-2 exposure, reinfections, and severity. Asymptomatic infection and SARS-CoV-2 vaccination were significantly protective. Conclusions: A high incidence of LC with a low recovery rate was found, along with several risk and protective factors. Continued follow-up of non-recovered LC patients, surveillance of infections, and SARS-CoV-2 vaccination for at-risk populations can be recommended.

Article
Computer Science and Mathematics
Mathematics

Deep Bhattacharjee

Abstract: We prove Convex Seed Universality for the Kreuzer–Skarke classification of four-dimensional reflexive polytopes. Every reflexive polytope in the Kreuzer–Skarke dataset arises from a primitive convex seed through a finite sequence of four toric operations: unimodular transformations, stellar subdivisions, polar duality, and lattice translations. Seed orbits coincide with connected components of the GKZ secondary fan, and the Hodge numbers of the associated Calabi–Yau hypersurfaces remain constant on each orbit. The seed invariant matrix is identified with the GLSM charge matrix, providing a natural toric-geometric interpretation of the construction. Four structural theorems (Seed Completeness, Orbit Connectivity, Hodge Invariance, and Exhaustiveness) together establish seed universality for the entire Kreuzer–Skarke dataset.

Article
Biology and Life Sciences
Animal Science, Veterinary Science and Zoology

Kanokpan Sanigavatee

,

Chanoknun Poochipakorn

,

Kiattisak Pimpjong

,

Salinthip Tippayaratsoontorn

,

Phimsiri Simma

,

Wanwalee Srisujja

,

Pannawat Kranpan

,

Phiravich Permpool

,

Tawan Koedkasem

,

Boonbaramee Wanichayanon

+3 authors

Abstract: Bedding serves a vital role in horse stable management. Although earlier research has examined stress responses to bedding materials, the effect of bedding types on these responses in horses of different sizes has not been studied. This study assessed the influence of bedding materials (straw vs. rice husk) and body size (horses taller than the upper bar of the front gate [H1] vs. shorter ones [H2]) on stress responses. Stress was evaluated using behavioural scores, recumbent time, and autonomic regulation via heart rate (HR) and heart rate variability (HRV). Microenvironmental variables such as relative humidity (RH), air temperature (AT), volatile organic compounds (VOCs), carbon dioxide (CO2), particle counts (PC1, PC2.5 and PC5) and fine particulate matter (PM1, PM2.5 and PM5) were also recorded. No differences were observed in RH, AT, VOCs or CO2 between stables with different bedding materials. However, PC (PC1, PC2.5 and PC5) and PM (PM1, PM2.5 and PM5) levels were higher in stables with rice husk than in those with straw. Beat-to-beat (RR) intervals increased, and HR decreased in both H1 and H2 horses on straw during the night (p < 0.05). H1 horses generally showed lower HR and higher RR intervals than H2 horses during housing, regardless of the bedding material used (p < 0.05–0.0001). Although bedding and body size affected behavioural scores and several HRV metrics, no significant within- or between-group differences were detected. These findings suggest that both bedding material and body size influence stress responses; however, short-term individual box housing may not provide sufficient stressors to produce significant differences.

Article
Physical Sciences
Theoretical Physics

Michael Bush

Abstract: Quantum mechanics predicts measurement outcomes with remarkable accuracy, yet the physical mechanism responsible for measurement remains unspecified. Standard formulations treat collapse as an external postulate or informational update, leaving the origin of measurement outside the theory’s physical description. This paper proposes a mass-induced mechanism for quantum measurement within the framework of Quantum Substrate Dynamics (QSD), in which the observer is not a privileged entity but a coherence structure formed by stable matter interacting with propagating excitations. In QSD, stable matter forms mass-phase structures possessing finite coherence envelopes that evolve through discrete Causality Intervals (CIs) governing how the substrate can reconfigure. Massless excitations, such as photons, lack coherence envelopes and therefore cannot initiate collapse; they propagate only through geometric constraints imposed by nearby mass-phase structures. Measurement occurs when the coherence envelope of a mass-phase structure intersects a propagating excitation and enforces local CI pacing and curvature-compliance limits on the substrate. Collapse is therefore realized as a structural re-locking of the substrate, in which only configurations compatible with the local mass-phase environment can persist. This mechanism reproduces key empirical features of quantum experiments, including the material dependence of diffraction and detection, the emergence of interference patterns at mass-phase boundaries, and the absence of photon–photon interaction in free space. Within this framework, longstanding interpretational paradoxes, including Wigner's friend, Schrödinger's cat, contextuality, and delayed-choice interference, admit consistent physical explanations without invoking observer-dependent realities, global wavefunction collapse, or branching worlds.
Quantum measurement therefore emerges as a mass-induced structural process, in which observation reflects the deterministic reconfiguration of the substrate under finite coherence and curvature constraints rather than an epistemic update or interpretational supplement.

Hypothesis
Physical Sciences
Theoretical Physics

Mohamed Lefliti

Abstract: This paper introduces a novel theoretical framework in which wave energy serves as the fundamental basis for understanding the structure and dynamics of the universe. Within this model, elementary particles are interpreted as distinct vibrational modes of wave energy, while fundamental forces—such as gravity and electromagnetism—are represented as interference patterns arising from the interaction of these waves. Furthermore, spacetime is conceptualized as an emergent energy network formed by the multidimensional interference of wave energy. The framework incorporates the influence of extra spatial dimensions, offering new insights into phenomena such as dark matter and dark energy. Additionally, it provides corrected energy level calculations for the hydrogen atom that account for the effects of higher-dimensional contributions. The mathematical formulation is based on generalized wave equations in D-dimensions, with testable predictions for modified energy spectra.

Review
Biology and Life Sciences
Life Sciences

Noemi Császár-Nagy

,

István Bókkon

Abstract: The enteric nervous system (ENS) can function independently of the central nervous system (CNS) in regulating complex gastrointestinal processes and may possess forms of learning and memory. Recently, we have proposed that maternal stress, depression, or anxiety during pregnancy may induce stress-related long-term epigenetic implicit memory (SLEIM) in the foetal ENS via mechanisms distinct from those of the CNS. These stress imprints may persist throughout life. Through the bidirectional microbiota–gut–brain axis (MGBA), SLEIM signals originating in the embryonic ENS may influence the CNS. Because these signals are implicit and unrelated to conscious representation, the CNS cannot directly interpret them, yet it must regulate their physiological consequences. This interaction may activate stress-response systems, including the hypothalamic–pituitary–adrenal (HPA) axis and immune pathways, leading to cortisol release, mast cell activation, and cytokine imbalance. A self-reinforcing cycle may thus develop between the ENS and CNS. The frequent comorbidity of fibromyalgia (FM) and irritable bowel syndrome (IBS) suggests shared pathogenic mechanisms, particularly central sensitization and MGBA dysfunction. Clinically, patients with FM often display childlike behavioural traits. We hypothesize that early developmental influences—potentially linked to maternal SLEIM effects on the foetal ENS—may contribute to personality immaturity in FM, similarly to IBS. Under varying biopsychosocial conditions, IBS-related mechanisms may later manifest as systemic symptoms characteristic of FM.

Article
Physical Sciences
Theoretical Physics

Natalia Gorobey

,

Alexander Lukyanenko

,

Alexander V. Goltsev

Abstract: Within the framework of a new formalism of quantum theory, the quantum principle of least action, the initial state of the universe is determined as an analogue of the Hartle-Hawking no-boundary wave function. The quantum evolution of the universe is modified by additional conditions in a certain compact region of space-time, called the observation region. The additional conditions are Noether identities related to the general covariance of the theory and the internal symmetries of matter fields. The consequences of the local conservation law of the energy-momentum tensor of matter are considered in detail. One consequence is the deterministic nature of the motion of the energy and momentum densities of matter in the observation region. The geometric parameters of the region's boundary are likewise determined by the deterministic motion of the matter fields inside. The choice of boundary conditions for the energy-momentum flow at the boundary serves as a mechanism for decoherence of the quantum evolution of the universe. The result of decoherence is a certain correspondence between the final state of the universe and the state of the observer in the specified region. This correspondence allows us to formulate an extremum principle in quantum cosmology, in which the action functional constructed using the final state determines the world history of the universe as the observer sees it.

Article
Business, Economics and Management
Human Resources and Organizations

Marcin Nowak

,

Marta Pawłowska-Nowak

,

Joachim Lisiak

Abstract: The study aimed to identify employee profiles reflecting combinations of quiet quitting, passive quitting, and work engagement. Using a person-centred approach and unsupervised learning, survey data from 1,040 employees were analysed. Clustering relied on composite indices derived from abbreviated quiet and passive quitting scales and the Utrecht Work Engagement Scale-9 (UWES-9). Multiple algorithms (k-means, hierarchical clustering, spectral clustering, Gaussian mixture models) were compared, and the optimal solution was selected using separation metrics (Silhouette coefficient, Davies–Bouldin index, Calinski–Harabasz index), information criteria (Bayesian Information Criterion [BIC], Akaike Information Criterion [AIC]), and bootstrap stability (Adjusted Rand Index [ARI]). Four distinct employee profiles emerged, differing in boundaries, exhaustion, and energy. Findings suggest quiet quitting and passive quitting are related but distinct withdrawal mechanisms. The study advances profile-based research on employee withdrawal and highlights implications for targeted human resources (HR) interventions.
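The model-selection step described above, comparing clustering solutions by separation metrics, can be sketched with a stdlib-only silhouette computation. The one-dimensional "engagement index" data and the two candidate labelings below are hypothetical, and real pipelines would use library implementations over multivariate composites.

```python
def silhouette(points, labels):
    """Mean silhouette coefficient for 1-D points (higher = better-separated)."""
    n = len(points)
    total = 0.0
    for i in range(n):
        # a: mean distance to own cluster (excluding the point itself)
        same = [abs(points[i] - points[j]) for j in range(n)
                if labels[j] == labels[i] and j != i]
        a = sum(same) / len(same)
        # b: mean distance to the nearest other cluster
        b = min(
            sum(abs(points[i] - points[j]) for j in range(n) if labels[j] == l)
            / sum(1 for j in range(n) if labels[j] == l)
            for l in set(labels) if l != labels[i]
        )
        total += (b - a) / max(a, b)
    return total / n

# Toy scores with two obvious groups; a well-separated labeling wins.
scores = [0.1, 0.2, 0.15, 0.9, 0.95, 1.0]
good = [0, 0, 0, 1, 1, 1]
bad  = [0, 1, 0, 1, 0, 1]
assert silhouette(scores, good) > silhouette(scores, bad)
```

The same comparison logic extends to choosing among k-means, hierarchical, spectral, and Gaussian-mixture solutions: compute each metric per candidate partition and keep the partition that scores best.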

Article
Computer Science and Mathematics
Algebra and Number Theory

José Antoine Séqueira

Abstract: In this article, we introduce a hypercomplex algebra based on a binary superposition structure. Each algebraic unit is defined by a pair (ƒ; S), where ƒ ∈ {0; 1} encodes the logical presence of a base component and S ∈ {−1; 1} encodes a geometric phase or orientation. This framework allows us to define an imaginary product that is both commutative and associative, properties rarely combined in higher-dimensional algebras. We demonstrate the consistency of this product through a binary and superposed formalism. This result provides a solid foundation for representing multi-level logic states, with potential applications in quantum information processing.

Article
Computer Science and Mathematics
Artificial Intelligence and Machine Learning

Ruobing Yan

,

Yingxin Ou

,

Shihao Sun

,

Nuo Chen

,

Kan Zhou

,

Yingyi Shu

Abstract: Business risk prediction tasks such as fraud detection, credit default prediction, and equipment failure forecasting face two fundamental challenges simultaneously: severe class imbalance where anomalous events are extremely rare, and distribution shift where data patterns evolve over time due to changing business conditions or adversarial behavior. While existing approaches address these challenges in isolation, real-world deployment requires handling both simultaneously. We propose DualShiftNet, a unified framework that jointly addresses class imbalance and distribution shift through a two-stage architecture. The first stage learns imbalance-aware representations using synthetic minority oversampling, focal loss optimization, and class-balanced contrastive learning to create discriminative embeddings. The second stage employs Maximum Mean Discrepancy (MMD) based drift detection coupled with importance reweighting to adapt predictions under distribution shift. Additionally, we introduce an uncertainty-driven threshold calibration mechanism that dynamically adjusts decision boundaries based on detected shift intensity. Experiments on three benchmark datasets demonstrate that DualShiftNet achieves relative improvements of approximately 3–4% in AUC-ROC scores and 10–22% in F1-scores compared to state-of-the-art methods that address only one challenge. Our ablation studies confirm that both stages contribute meaningfully to performance, with the joint approach outperforming sequential or isolated solutions.
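The MMD-based drift detection named above can be sketched in a few lines. This is a biased RBF-kernel estimator of squared MMD on toy one-dimensional samples, not the DualShiftNet code, and the 0.5 drift threshold is purely illustrative:

```python
import math

def rbf(x, y, gamma=1.0):
    """Gaussian (RBF) kernel between two scalars."""
    return math.exp(-gamma * (x - y) ** 2)

def mmd2(xs, ys, gamma=1.0):
    """Biased estimate of squared Maximum Mean Discrepancy between samples."""
    kxx = sum(rbf(a, b, gamma) for a in xs for b in xs) / len(xs) ** 2
    kyy = sum(rbf(a, b, gamma) for a in ys for b in ys) / len(ys) ** 2
    kxy = sum(rbf(a, b, gamma) for a in xs for b in ys) / (len(xs) * len(ys))
    return kxx + kyy - 2 * kxy

reference = [0.0, 0.1, 0.2, 0.3, 0.4]       # training-time feature sample
similar   = [0.05, 0.15, 0.25, 0.35, 0.45]  # same regime, slightly perturbed
shifted   = [2.0, 2.1, 2.2, 2.3, 2.4]       # drifted regime
assert mmd2(reference, shifted) > mmd2(reference, similar)
# Hypothetical deployment rule: flag drift when MMD^2 exceeds a threshold.
assert mmd2(reference, shifted) > 0.5 > mmd2(reference, similar)
```

Once drift is flagged, the framework's second stage would respond (e.g., by reweighting or recalibrating thresholds); the sketch covers only the detection statistic.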

Article
Environmental and Earth Sciences
Ecology

Marco Casazza

Abstract: Immovable cultural heritage—archaeological sites, historic buildings, and culturally significant infrastructure—has traditionally been approached through conservation, material preservation, and identity-based perspectives. However, the evolution of heritage theory and the emergence of systemic paradigms such as One Health call for its reinterpretation as an active component within interconnected human, animal, and environmental systems. Although One Health recognizes the interdependence of these domains, no operational framework currently assesses the functional contribution of immovable cultural heritage. This study develops a formal methodological framework that operationalizes immovable cultural heritage as a functional element within the One Health system. The framework integrates environmental, animal, and human health domains through structured indicators, mathematical formalization, and internal validation procedures. It explicitly incorporates the coexistence of tangible and intangible heritage dimensions, acknowledging their embedded socio-ecological relationships. The plausibility and coherence of the framework are validated against established scientific literature, environmental assessment models, and foundational One Health principles. Results demonstrate that the proposed approach enables systematic, reproducible, and domain-complete assessment of immovable cultural heritage within the One Health paradigm, overcoming methodological fragmentation and supporting integration with sustainability analysis, environmental governance, resilience planning, and long-term socio-ecological stability.

Article
Computer Science and Mathematics
Computer Science

Alona Kudriashova

,

Iryna Pikh

,

Vsevolod Senkivskyy

,

Liubomyr Sikora

,

Nataliia Lysa

Abstract: The quality of vector images depends on a significant set of geometric and structural factors, which makes objective assessment a challenging task. This paper proposes a comprehensive approach to identifying and prioritizing these factors. Recursive feature elimination based on a random forest model was applied. A reachability matrix of factors was constructed to analyze direct and indirect relationships. Models describing relationships between the factors were developed. The rank and weight of each factor were calculated using a dependency-weighting system. An information system was developed to automate the process of prioritizing factors based on the proposed methodology. The software architecture was implemented in Python using the Tkinter, NumPy, and NetworkX libraries. Experimental results confirmed that the factor «coordinate accuracy» has the highest level of significance, whereas «file format» has the smallest influence on the quality of vector images. Due to the lack of dependence on specific selected factors, the developed system is universal and suitable for prioritizing factors in any application domain. Future research will focus on integrating the developed information system into a fuzzy-logic-based system for assessing the quality of vector images.
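A reachability matrix like the one described above is the transitive closure of a direct-influence (adjacency) matrix, capturing both direct and indirect relationships. A minimal Warshall-style sketch follows; the influence pattern and the two middle factor names are hypothetical, with only "coordinate accuracy" and "file format" taken from the abstract:

```python
def reachability(adj):
    """Warshall transitive closure: r[i][j] = 1 iff factor i reaches factor j."""
    n = len(adj)
    r = [row[:] for row in adj]
    for k in range(n):
        for i in range(n):
            for j in range(n):
                r[i][j] = r[i][j] or (r[i][k] and r[k][j])
    return r

# Hypothetical direct influences among four quality factors:
# 0: coordinate accuracy  1: node structure  2: path smoothness  3: file format
adj = [
    [0, 1, 1, 0],  # coordinate accuracy influences structure and smoothness
    [0, 0, 1, 0],  # node structure influences smoothness
    [0, 0, 0, 0],
    [0, 0, 0, 0],
]
r = reachability(adj)
# Rank factors by how many others they (directly or indirectly) influence.
rank = sorted(range(4), key=lambda i: sum(r[i]), reverse=True)
assert rank[0] == 0      # "coordinate accuracy" is most influential
assert sum(r[3]) == 0    # "file format" influences nothing in this toy graph
```

The row sums here play the role of a simple influence weight; the paper's dependency-weighting system is more elaborate, but it operates on the same reachability structure.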

Article
Business, Economics and Management
Business and Management

Luis Saráuz-Estevez

,

Jessica Pupiales-Proaño

,

Danilo Cuaical-Tapia

Abstract: Information sustainability has emerged as a key dimension of social sustainability, as it supports equitable access to information, informed opinion formation, and institutional engagement. However, the expansion of digital platforms does not necessarily guarantee inclusive information ecosystems, particularly in local contexts characterized by structural inequalities. This study examines information sustainability and the digital divide by identifying media consumption profiles within a territorial context in Ecuador. Using data from a survey conducted in the province of Imbabura with 1,784 observations, a hybrid methodological approach combining cluster analysis and Random Forest (RF) algorithms was applied. Audience profiles were identified and validated based on media consumption patterns, levels of digitalization, and institutional engagement. The results reveal four distinct audience profiles with different levels of digital integration and institutional linkage. Findings indicate that the intensity and diversity of media consumption play a more decisive role than mere technological access. Digital access alone is insufficient to ensure information sustainability or foster institutional opinion formation; instead, differences in exposure, usage intensity, and media habits shape audience engagement. These findings highlight the need for segmented, territory-based communication strategies to strengthen information sustainability, reduce the digital divide, and reinforce the role of university media within local media ecosystems.
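A minimal, stdlib-only sketch of the profiling idea: cluster respondents by media-consumption features (here a hand-rolled Lloyd's k-means), then inspect cluster centers as audience profiles. The feature values and k = 2 are illustrative assumptions, not the survey data; the study itself pairs clustering with a Random Forest validation step, which is omitted here.

```python
# Toy audience profiling via Lloyd's k-means (assumed data, 2 features).
import random

def kmeans(points, k, iters=20, seed=0):
    random.seed(seed)
    centers = random.sample(points, k)
    for _ in range(iters):
        groups = [[] for _ in range(k)]
        for p in points:
            # assign each respondent to the nearest center (squared distance)
            j = min(range(k),
                    key=lambda c: sum((a - b) ** 2 for a, b in zip(p, centers[c])))
            groups[j].append(p)
        centers = [tuple(sum(xs) / len(xs) for xs in zip(*g)) if g else centers[j]
                   for j, g in enumerate(groups)]
    return centers, groups

# (digital usage intensity, institutional engagement), both on a 0-1 scale
respondents = [(0.9, 0.8), (0.85, 0.7), (0.8, 0.9),   # highly digital, engaged
               (0.2, 0.1), (0.15, 0.2), (0.1, 0.15)]  # low-access profile
centers, groups = kmeans(respondents, k=2)
print(sorted(round(c[0], 2) for c in centers))
```

On real survey data one would use more features (exposure, usage intensity, media habits) and a larger k, then train a classifier on the cluster labels to check that the profiles are recoverable.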

Article
Physical Sciences
Astronomy and Astrophysics

Huang Hai

Abstract: General Relativity (GR) has long been confronted with a fragmentation dilemma regarding black hole singularities and galaxy rotation curves: the former requires undetectable higher-dimensional quantum gravity to circumvent infinite curvature, while the latter similarly relies on undetectable dark matter to provide additional gravitational force. In this paper, we abandon the hypothesis of undetectable entities and reveal that the two challenges may share an intrinsic geometric solution: the universal asymptotic behavior of mainstream dark matter halo models is equivalent to a logarithmically corrected gravitational potential \( \Phi(r) \sim -(\ln r + 1)/r \), which originates from the self-response of the curvature divergence at the GR singularity (\( R^{r}{}_{trt} \propto r^{-3} \)) via Poisson integration. At the microscopic scale, the sign reversal of \( \ln r \) generates a repulsive effect, thereby avoiding the singularity. The constructed logarithmically corrected Schwarzschild metric is rigorously solved via the Lambert W function, revealing a layered internal structure determined by the black hole mass \( M \) (with thickness \( \propto 1/M \)), which realizes the holographic screen of the renormalization-group flow under the AdS/CFT correspondence. On this basis, we present parameter-free a priori predictions for the black hole shadows of Sgr A* and M87* that are consistent with Event Horizon Telescope (EHT) observations, and provide strict, falsifiable predictions for unobserved black holes, in particular a crucial discriminative prediction for NGC 315. On the galactic scale, the logarithmic term fits the rotation curves of the Milky Way, Andromeda, and NGC 2974 without any additional gravitational force from dark matter, and also passes the test of the gravitational lensing of the Bullet Cluster with good agreement with observations. Meanwhile, the calculated Solar System tidal difference (\( \Delta g \sim 10^{-18}\,\mathrm{m/s^2} \)) is far below current experimental limits, ensuring the validity of the equivalence principle without the need for a shielding mechanism, and the Solar System Parameterized Post-Newtonian (PPN) tests are also consistent with GR. This work demonstrates that gravitational phenomena from black holes to galaxies are governed by the spacetime self-response triggered by the GR singularity. It further reveals that macroscopic gravitational systems may be "holographic projections" of quantum topological structures (quantum vortices). This framework thus pulls quantum gravity research from pure mathematical modeling back to energy scales accessible to contemporary observations, and provides a new direction for thinking about the unification of General Relativity and quantum mechanics.
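A numerical sketch of the rotation-curve mechanism the abstract claims: for a potential of the form \( \Phi(r) = -A(\ln r + 1)/r \), the circular speed obeys \( v^2 = r\,d\Phi/dr = A \ln(r)/r \), which falls off far more slowly than the Keplerian \( v^2 = A/r \) of an uncorrected point mass. The amplitude A = 1 and the dimensionless radii are illustrative assumptions; real fits would use physical units and galaxy-specific parameters.

```python
# Compare circular speeds from a log-corrected vs. a pure Keplerian potential.
import math

A = 1.0  # plays the role of G*M in these toy units (assumption)

def v2_log(r):
    # From Phi(r) = -A*(ln r + 1)/r  =>  dPhi/dr = A*ln(r)/r^2, so v^2 = A*ln(r)/r
    return A * math.log(r) / r

def v2_kepler(r):
    # From Phi(r) = -A/r (point mass, no correction): v^2 = A/r
    return A / r

for r in (10.0, 100.0, 1000.0):
    boost = v2_log(r) / v2_kepler(r)   # equals ln(r): the log term's enhancement
    print(f"r={r:7.1f}  v2_log/v2_kepler = {boost:6.2f}")
```

The ratio grows as ln(r), so at large radii the corrected potential supports substantially higher rotation speeds than the Keplerian fall-off, which is the dark-matter-free boost the abstract attributes to the logarithmic term.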

Preprints.org is a free preprint server supported by MDPI in Basel, Switzerland.


© 2026 MDPI (Basel, Switzerland) unless otherwise stated