Article
Medicine and Pharmacology
Epidemiology and Infectious Diseases

Ibrahim Al-Busaidi, Mariam Al-Muqbali

Abstract: Background: Invasive pulmonary aspergillosis (IPA) is a major cause of morbidity and mortality in patients with hematological malignancies. Serum galactomannan (GM) is widely used for diagnosis, but the prognostic value of the initial GM level is not well established. Objective: To assess the association between the initial serum GM level at IPA diagnosis and the radiological and clinical outcomes at 42 and 90 days. Methods: We retrospectively reviewed adult patients with hematological malignancies, including hematopoietic stem cell transplant (HSCT) recipients, diagnosed with proven or probable IPA at Sultan Qaboos University Hospital between 2014 and 2017, according to the EORTC/MSG criteria. Demographic, microbiological, radiological, and clinical data were collected. Outcomes were assessed using three GM cut-off categories. Results: Seventy-eight patients were included. The median age was 44.5 years (range 18–76); 53.8% were male. Lymphoma (30.7%) and acute leukemia (25.6%) were the leading underlying diseases. Voriconazole was the most frequently used antifungal agent (74.3%). Prolonged neutropenia was present in 61.9%, and 30.7% had received HSCT. The mean serum GM at diagnosis was 1.95 (range 0.5–7.89). At 90 days, 30% had a complete radiological response, 51.4% partial, and 32.8% no response. Overall, 90-day mortality was 35.9%. There was no statistically significant association between initial GM level and 90-day mortality across categories (GM 0.5–3.0: 34.2% mortality; GM > 3.0: 60%; p = 0.243). Within the GM 0.5–3.0 group, complete radiological response was strongly associated with survival (95.2% alive at 90 days; p < 0.001). Conclusions: The initial serum GM level was not significantly associated with clinical or radiological outcomes at 42 or 90 days in patients with hematological malignancies and IPA. 
However, an early complete radiological response was strongly associated with improved survival, supporting the use of follow-up chest CT imaging to guide management.

Article
Engineering
Other

Apidul Kaewkabthong, Jedsada Saijai, Pisitwitthaya Sriphuk, Agustami Sitorus, Vasu Udompetaikul

Abstract: Sugarcane harvester performance varies substantially with field geometry, crop, and operator factors, yet separating these sources from telematics data while preserving engineering interpretability remains a methodological gap. This study models field efficiency (Eff) and harvesting capacity (Ca) separately from JDLink telematics, aligning model structure with each target's response behavior. Operational data covered 105 plots across four seasons (2019/20–2022/23) from three John Deere chopper harvesters in eastern Thailand. Six engineering-relevant predictors were retained after multicollinearity screening, and linear (MLR), additive nonlinear (GAM), and tree-based models were compared under 5-fold grouped cross-validation by BaseField (87 groups). Eff was assigned to GAM (R²CV = 0.621 ± 0.114) on the basis of its threshold-like response to turning frequency; Ca was retained for MLR (R²CV = 0.681 ± 0.121), with GAM essentially tied. Train–validation gaps were substantially smaller for additive models (0.096–0.118) than for tuned tree-based candidates (GBR 0.210–0.302, RF 0.322–0.358). Turning frequency (TF) and perimeter-to-area ratio (PAR) were the strongest predictors, and a constant-turn-time partial-out test indicated that TF's univariate effect on Eff is largely mediated by the time-budget identity. Tactical interventions (path planning, operator training, machine–field allocation) are immediately feasible, although strategic field-layout change remains constrained by smallholder land tenure.
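The grouped cross-validation described above (folds split by BaseField so that plots from the same field never appear in both training and validation) can be sketched as follows. The field IDs, plot records, and fold-assignment heuristic here are illustrative assumptions, not the study's actual dataset or pipeline.

```python
# Minimal sketch of k-fold grouped cross-validation: every record from
# the same group (here, BaseField) must land in exactly one fold, so no
# field leaks between the train and validation sides of a split.
from collections import defaultdict

def grouped_kfold(records, group_key, k=5):
    """Yield (train_idx, val_idx) lists; groups never straddle folds."""
    groups = defaultdict(list)
    for i, rec in enumerate(records):
        groups[rec[group_key]].append(i)
    # Deal whole groups round-robin into k folds, largest group first,
    # to keep fold sizes roughly balanced.
    folds = [[] for _ in range(k)]
    for j, (_, idxs) in enumerate(
            sorted(groups.items(), key=lambda kv: -len(kv[1]))):
        folds[j % k].extend(idxs)
    for f in range(k):
        val = folds[f]
        train = [i for g in range(k) if g != f for i in folds[g]]
        yield train, val

# Hypothetical plot records: 30 plots spread over 9 fields.
plots = [{"BaseField": f"F{i % 9}", "TF": i, "Eff": 0.6} for i in range(30)]
for train, val in grouped_kfold(plots, "BaseField"):
    tr = {plots[i]["BaseField"] for i in train}
    va = {plots[i]["BaseField"] for i in val}
    assert tr.isdisjoint(va)  # no field appears on both sides
```

Any regression model (MLR, GAM, or tree-based) would then be fit on each training split and scored on the held-out fields to obtain the cross-validated R² reported in the abstract.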

Article
Medicine and Pharmacology
Otolaryngology

Andra-Lavinia Greța-Oanță, Alexandra Roman, Ioana Berindan-Neagoe, Ștefan Strilciuc, Stefan Cristian Vesa, Laura-Ancuta Pop, Veronica-Elena Trombitaș, Silviu Albu

Abstract: Bitter taste receptors (T2Rs), specifically T2R38, are present in the respiratory epithelium and react with bacterial quorum-sensing molecules to induce an innate immunity response. Although T2R38 polymorphisms have been correlated with susceptibility to chronic rhinosinusitis (CRS), they have not yet been explored in odontogenic rhinosinusitis (ORS), a distinct form of CRS with particular microbial and inflammatory features. Objectives: We aim to establish a proof-of-concept methodology for investigating T2R38 genetic variants in ORS using direct maxillary sinus tissue analysis and demonstrate the feasibility of this translational approach. Methods: We conducted a prospective case-control study of 36 ORS patients and 37 controls undergoing septoplasty without sinonasal disease. Maxillary sinus mucosal biopsies were obtained intraoperatively with informed consent. Genomic DNA was extracted using the PureLink Genomic DNA Mini Kit and quantified via NanoDrop spectrophotometry. T2R38 haplotypes were determined and classified as taster (PAV/PAV), non-taster (AVI/AVI), or intermediate (PAV/AVI) phenotype. Results: T2R38 phenotype distributions between ORS patients and controls were: tasters 11.1% vs 18.9%, non-tasters 27.8% vs 18.9%, and intermediate phenotypes 50.0% vs 37.8%, respectively. Statistical analysis revealed no significant association between T2R38 phenotypes and ORS susceptibility (Pearson χ² = 0.372, df = 1, p = 0.542; Fisher's exact test p = 0.595). The effect size was minimal (φ = 0.07). Non-taster phenotype showed a non-significant trend toward higher prevalence in ORS patients (OR = 1.4, 95% CI: 0.5–3.9, p > 0.5), though this finding lacks statistical power given the sample size. Conclusion: This proof-of-concept study successfully demonstrates the feasibility of T2R38 genotyping from maxillary sinus mucosa in ORS patients, establishing a novel methodological framework for investigating genetic factors in odontogenic sinonasal disease. 
While preliminary findings suggest potential phenotype differences (non-taster prevalence: 27.8% vs 18.9%), the study's primary value lies in validating the translational approach and informing power calculations for definitive multicenter investigations. This methodology provides the foundation for future studies to elucidate the role of taste receptor genetics in ORS pathogenesis and potentially guide personalized therapeutic strategies.

Brief Report
Medicine and Pharmacology
Psychiatry and Mental Health

Justin Mausz, Elizabeth A. Donnelly, Alan M. Batt, Meghan M. McConnell, Nadia Aleem, Walter Tavares

Abstract: Objectives: Paramedics are at elevated risk for adverse mental health outcomes due to occupational exposures including trauma, workplace violence, and chronic operational stress. Community Paramedicine (CP) represents an evolving model of care in which paramedics provide scheduled, non-urgent clinical and psychosocial support, potentially altering exposure profiles and the associated mental health risks. Our objective was to estimate the prevalence of mental health concerns among community paramedics and compare their risk of adverse mental health outcomes with paramedics working in 9-1-1 emergency response roles. Methods: We conducted a cross-sectional survey of paramedics from two Ontario services during compulsory in-person continuing medical education sessions from September to December 2024. Participants completed validated self-report screening tools for posttraumatic stress disorder, major depressive disorder, generalized anxiety disorder, insomnia, alcohol use, and burnout. We used logistic regression models adjusted for demographic variables to assess the association between CP role and mental health outcomes. Results: A total of 995 paramedics participated (96% of eligible), including 63 (6%) assigned to CP roles. Overall, 12% screened positive for PTSD, 25% for major depressive disorder, 23% for generalized anxiety disorder, 30% for insomnia, and 36% for at least moderate burnout. CPs had significantly lower adjusted risk of major depressive disorder compared to paramedics in 9-1-1 response roles (adjusted odds ratio [aOR] 0.45, 95% confidence interval [CI] 0.20-0.98). For other outcomes, our point estimates favored lower risk among CPs but did not reach statistical significance, including a composite outcome of PTSD, major depressive disorder, or generalized anxiety disorder (aOR 0.81, 95% CI 0.46-1.45). 
Conclusions: Community paramedics demonstrated a lower adjusted risk of major depressive disorder and a consistent, though non-significant, pattern toward lower risk across multiple mental health outcomes compared to paramedics in 9-1-1 response roles. These findings suggest a potentially different occupational risk profile associated with CP practice environments. Further longitudinal and mixed-methods research is warranted.

Article
Medicine and Pharmacology
Surgery

Ozan Baskurt, Mehmet Arda Inan, Kubilay Ukinc, Nurperi Gazioglu

Abstract: Background/Objectives: Sellar solitary fibrous tumors (SFTs) are exceptionally rare mesenchymal neoplasms that frequently mimic non-functioning pituitary adenomas both clinically and radiologically. Because of their nonspecific imaging characteristics, accurate preoperative diagnosis remains challenging and often requires histopathological and immunohistochemical confirmation. Nuclear STAT6 expression has become a key diagnostic marker for this entity. Methods: We present a case-based diagnostic analysis of a high-grade (WHO grade 3) sellar SFT initially misdiagnosed as a pituitary adenoma. Clinical, radiological, intraoperative, and histopathological findings were systematically evaluated and correlated. In addition, previously reported sellar SFT cases were reviewed to identify recurring diagnostic patterns and pitfalls. Results: A 65-year-old male presented with headache, progressive visual impairment, and hypopituitarism. Magnetic resonance imaging demonstrated a heterogeneously enhancing sellar mass with suprasellar extension and cavernous sinus involvement, leading to a presumptive diagnosis of pituitary adenoma. Intraoperatively, the lesion was markedly hypervascular and fibrous, raising suspicion for an alternative diagnosis. Histopathological examination revealed a spindle-cell neoplasm with a hemangiopericytoma-like vascular pattern, increased mitotic activity, and strong nuclear STAT6 positivity, confirming a WHO grade 3 SFT. Literature analysis showed that most reported sellar SFTs share overlapping MRI features with pituitary adenomas and are frequently misdiagnosed preoperatively. Conclusions: Sellar SFT should be considered in the differential diagnosis of atypical sellar lesions, particularly when imaging findings are inconclusive and intraoperative features suggest a hypervascular and fibrous tumor. Radiological–pathological correlation, including STAT6 immunohistochemistry, is critical for accurate diagnosis. 
Increased awareness of these diagnostic pitfalls may improve recognition of this rare entity and guide surgical and pathological decision-making.

Article
Engineering
Bioengineering

Mark Korang Yeboah, Ahmad Addo, Nana Yaw Asiedu

Abstract: Consolidated bioprocessing (CBP), where enzyme production, substrate hydrolysis, and fermentation occur in a single bioreactor, provides a promising pathway for lignocellulosic ethanol production. Nevertheless, CBP operation involves trade-offs among ethanol titer, productivity, substrate conversion, soluble sugar accumulation, batch cycle time, and the operating severity associated with temperature and pH profiles. This study introduces a feasibility-aware multi-objective dynamic optimization approach for identifying Pareto-optimal operating policies for batch CBP processes. A simplified, mechanistically driven dynamic model is developed to represent biomass growth, enzyme activity, insoluble substrate hydrolysis, soluble sugar formation and consumption, ethanol production, and inhibition under time-varying temperature and pH profiles. The multi-objective optimization simultaneously maximizes ethanol titer, productivity, and substrate conversion while minimizing sugar accumulation, operating severity, control effort, and batch time. In the main simulation run, 120,000 dynamic policies were evaluated, resulting in 5,017 feasible policies and 328 feasible Pareto-optimal policies under a minimum conversion threshold of 0.42. The optimized dynamic policy achieved an ethanol titer of 1.265 g L−1, a maximum productivity of 0.017 g L−1 h−1, and a maximum conversion of 0.440. Compared with the best static policies, the dynamic Pareto policies improved ethanol titer, productivity, and conversion by 10.6%, 8.3%, and 14.3%, respectively. The feasibility analysis showed that a conversion threshold of 0.42 is stringent but achievable, whereas thresholds of 0.44 and 0.55 were not attainable under the current dynamic model and operating range. Independent-seed repetition confirmed the existence of a consistent high-performing region across different stochastic searches. 
The resulting Pareto front and operating-policy charts provide a useful basis for selecting temperature and pH profiles for CBP process operation.
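The screen-then-rank structure described above (feasibility filtering under a minimum conversion threshold, followed by Pareto extraction) can be sketched as follows. The candidate policy values are invented examples; only the 0.42 conversion threshold and the objective names follow the abstract.

```python
# Minimal sketch: filter candidate operating policies by a feasibility
# constraint, then keep only the non-dominated (Pareto-optimal) ones.
# Each policy is summarized here by (titer g/L, productivity g/L/h,
# conversion), all oriented so that larger is better.

def dominates(a, b):
    """True if a is at least as good as b on every objective and
    strictly better on at least one."""
    return (all(x >= y for x, y in zip(a, b))
            and any(x > y for x, y in zip(a, b)))

def pareto_front(policies, min_conversion=0.42, conv_index=2):
    feasible = [p for p in policies if p[conv_index] >= min_conversion]
    return [p for p in feasible
            if not any(dominates(q, p) for q in feasible if q is not p)]

# Hypothetical candidate policies, not values from the study's search.
candidates = [
    (1.265, 0.017, 0.440),
    (1.100, 0.015, 0.430),  # dominated by the first policy
    (1.300, 0.012, 0.425),  # trades titer against productivity
    (1.400, 0.020, 0.400),  # infeasible: conversion below 0.42
]
front = pareto_front(candidates)
```

The study's search evaluates full dynamic temperature/pH policies against seven objectives rather than three, but the dominance and feasibility logic is the same.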

Article
Public Health and Healthcare
Public, Environmental and Occupational Health

Urban Schwegler, Mahesh Sarki, George Austin-Cliff, Albert Marti, Martin WG Brinkhof

Abstract: Vocational integration (VI) services aim to support sustainable employment for persons with disabilities. However, in individuals with spinal cord injury (SCI), evidence on effective intervention targets and the evaluation of sustainable integration remains limited. The Work-Life Study aims to build an evidence base for supporting sustainable employment in Switzerland, by: (1) identifying typical work-life trajectories; (2) examining key work-life transitions and their predictors; (3) establishing a multi-state model for intervention targets; (4) exploring individual work-life narratives; and (5) developing guidelines for personalized VI practice. The study combines a mixed-methods design with a collaborative Integrated Knowledge Translation approach, actively involving VI professionals and individuals with SCI. Participants are recruited from the Swiss SCI Cohort Study (SwiSCI). Work-life history data are collected through a Biographical Survey and Biographical Interviews and analyzed alongside SwiSCI data. Guideline development includes a stakeholder meeting with representatives from the Swiss Paraplegic Group, SCI clinics, individuals with SCI, employers, and disability insurers. Of 2,041 eligible SwiSCI participants, 478 (23.4%) completed the Biographical Survey (median age 57.5 years; median time since SCI 19.1 years), with responders and non-responders showing comparable characteristics. Work-life data closely matched existing SwiSCI data (rho > 0.8), indicating good recall. The resulting guidelines will help VI providers coordinate rehabilitation services to optimally promote sustainable employment for individuals with SCI.

Article
Computer Science and Mathematics
Artificial Intelligence and Machine Learning

Ziyad Azzaz, Omar Mohamed, Esraa Khatab, Hany Said, Omar Shalash

Abstract: Identity verification across pre-operative and post-operative facial images remains a challenging task, particularly following eyelid surgery, where localized periocular changes can disrupt conventional face recognition systems. This research introduces a novel verification framework using an ensemble-based autoencoder-initialized Siamese eye-region periocular verification network designed to remain resilient to surgically induced appearance variation. The proposed approach integrates anatomy-guided periocular normalization with a Siamese deep metric learning architecture initialized through unsupervised autoencoder pretraining, allowing the model to acquire periocular-specific representations prior to supervised learning. Robustness in this data-limited clinical setting is further enhanced through staged hard-negative mining, validation-weighted multi-seed ensemble learning, and bootstrap-based threshold calibration. Ensemble Grad-CAM is employed to provide visual explanations that support clinical interpretability. Experimental evaluation demonstrates strong and consistent performance, achieving recognition rates of 94.71% on training data, 96.77% on validation, and 96.08% on the test set, with an overall recognition rate of 95.24%, compared with previously reported periocular verification methods that achieved an overall recognition rate of only 91.8% under similar conditions. These results highlight the effectiveness and stability of the proposed framework for post-surgical periocular identity verification in clinical and forensic applications.
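The final verification step of a Siamese metric-learning system reduces to comparing two embeddings by distance against a calibrated threshold. The toy embeddings and threshold below are assumptions for illustration; in the paper the embeddings come from the trained network and the threshold is calibrated by bootstrap.

```python
# Minimal sketch of embedding-distance verification: accept the pair as
# the same identity when the distance between the pre- and post-operative
# embeddings falls below a calibrated threshold.
import math

def l2_distance(a, b):
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def verify(emb_pre, emb_post, threshold=0.8):
    """True if the two embeddings are judged to belong to one person.
    The threshold value here is a placeholder, not the paper's."""
    return l2_distance(emb_pre, emb_post) < threshold

# Toy 3-d embeddings (real networks produce far higher-dimensional ones).
same_person = ([0.1, 0.9, 0.2], [0.15, 0.85, 0.25])
different   = ([0.1, 0.9, 0.2], [0.9, 0.1, 0.7])
```

An ensemble variant would average (or weight by validation score) the distances from several independently seeded networks before thresholding.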

Review
Medicine and Pharmacology
Medicine and Pharmacology

Paraskevi Zagana, Alexandra Paxinou, Athina Latsi

Abstract: Cancer drug development still relies heavily on preclinical models that often fail to predict clinical efficacy. Although two-dimensional (2D) cell cultures and animal models have contributed significantly to cancer research, they do not adequately capture the complexity, heterogeneity, and microenvironmental conditions of human tumors. As a result, pharmacological findings generated with these systems frequently show limited clinical translation. This review discusses the conceptual distinction between drug activity and predictive pharmacology, arguing that successful target modulation in simplified experimental systems does not necessarily predict therapeutic benefit in patients. The limitations of conventional preclinical approaches, including homogeneous drug exposure in 2D cultures and species-specific differences in animal models, are briefly examined. This review further highlights the potential of human-relevant models, such as patient-derived organoids and microphysiological systems, to improve the predictive value of preclinical testing. These platforms allow more realistic evaluation of drug response, resistance mechanisms, and functional biomarkers under conditions that better resemble human tumor biology. Altogether, the integration of functionally informative models into drug development pipelines may support more accurate and clinically relevant pharmacological decision-making.

Article
Medicine and Pharmacology
Surgery

Külahcıoğlu Emre, Özçelik Sinan, Koçak Nuh Can, Çiçekyurt Emre, Akkaya Bekir Boğaçhan, Aytekin Bahadır, İşcan Hakkı Zafer

Abstract: Objective: To evaluate the association of preoperative morphometric and morphovolumetric parameters with post-endovascular aneurysm repair (EVAR) sac remodeling, endoleak development, and secondary interventions, and to assess the role of volumetric analysis in post-EVAR surveillance. Methods: This retrospective single-center study included 383 patients who underwent elective EVAR for infrarenal abdominal aortic aneurysm between 2016 and 2024, with available pre- and postoperative computed tomography angiography and at least 1 year of follow-up. Diameter- and volume-based sac dynamics were analyzed using standardized morphometric and 3-dimensional morphovolumetric measurements. Endoleak subtype distribution, risk factors, secondary interventions, and survival were assessed using regression and survival analyses. Results: Endoleaks were detected in 26.1% of patients (n = 100), with type II endoleak being the most frequent subtype (12.3%, n = 47), followed by type Ib (6.8%, n = 26), type III (5.5%, n = 21), type Ia (4.2%, n = 16), and 1 patient with a type V endoleak. Secondary interventions were required in 14.1% of patients (n = 54), mainly for type I and III endoleaks, with a mean time to reintervention of 21.7 ± 10 months. Diameter and volume changes were strongly correlated; a 10% increase in aneurysm volume corresponded to an average 4 mm increase in diameter (R² = 0.72, p < 0.001). Volumetric analysis detected sac change earlier than diameter measurements, particularly in stable sacs and type II endoleaks. Significant predictors of overall endoleak included dual antiplatelet therapy, aneurysm length >133 mm, elevated pre- and postoperative D-dimer levels, aneurysm diameter >59 mm, aneurysm volume >164 cm³, and thrombus volume >89 cm³. Subtype-specific analyses identified distinct risk profiles for type Ia, Ib, II, and III endoleaks.
Overall survival did not differ significantly between patients with and without endoleaks (p = 0.227), although worse survival was observed in type Ia and III endoleaks than in type II and Ib endoleaks. Conclusion: Preoperative morphovolumetric parameters are significant predictors of post-EVAR endoleaks and secondary interventions. Volumetric analysis appears more sensitive than diameter-based assessment for early detection of sac remodeling, especially in type II endoleaks. Post-EVAR management should integrate endoleak subtype, sac behavior, and patient-specific morphovolumetric risk factors to improve surveillance and treatment selection.

Article
Public Health and Healthcare
Health Policy and Services

Aleksej Omeljančiuk, Eimantas Peičius, Aušra Urbonienė, Gvidas Urbonas

Abstract: Background/Objectives: Artificial intelligence is reshaping clinical practice, and its effect on the physician–patient relationship requires reconsideration of the frameworks that have shaped modern medical ethics. When physicians delegate expertise to algorithms they cannot verify, it becomes unclear who bears clinical responsibility. Methods: This article applies a theoretically grounded normative approach to explore the ethical conditions under which artificial intelligence can be integrated into clinical practice without compromising the moral foundations of medicine. The analysis is primarily based on Pellegrino and Thomasma’s concept of the internal morality of medicine and the physician’s act of profession. It further draws on Kantian ethics of human dignity, Levinasian relational ethics, virtue ethics, and Vallor’s concept of technomoral wisdom. Results: AI systems do not satisfy the conditions under which moral responsibility can be ascribed to them. Clinical moral agency lies in the capacity to bear three distinct responsibilities – epistemic, relational, and phronetic – none of which can be fulfilled by AI. The implementation of AI in healthcare must therefore occur strictly under the condition of Meaningful Human Control (MHC), rather than as a merely technical function of human oversight over algorithmic outputs. To ensure that MHC can function as an effective and ethically grounded safeguard, we propose five normative requirements: primacy of clinical judgement, prohibition of forced automation, traceability and explainability, transparency towards patients, and clinical authority over diagnostic tools. Dialogue between the physician and the patient should remain the foundation of clinical decision-making. The proposed normative requirements aim to preserve the internal morality of medicine in a form that harmoniously combines technological progress with established medical ethics.

Review
Medicine and Pharmacology
Pulmonary and Respiratory Medicine

Amrit Kooner, Lee Man, Justin Best, Nicholas Litsky, Brianna Yee, Justin Jeffries

Abstract: Sleep-related breathing disorders (SRBD) encompass a spectrum of diseases that disrupt ventilation during sleep, leading to fragmented sleep and impaired gas exchange. Their high prevalence and substantial neurocognitive and mental health consequences make SRBD clinically significant across multiple medical disciplines. Traditional management includes lifestyle modifications and positive airway pressure (PAP) therapy. When non-surgical measures fail or anatomical factors predominate, a range of surgical approaches may be employed, such as uvulopalatopharyngoplasty (UPPP) or maxillomandibular advancement (MMA). Notable emerging surgical advancements, such as hypoglossal nerve stimulation, transoral robotic surgery, and minimally invasive radiofrequency technologies, have offered improved outcomes for select patients. Evolving diagnostic tools, such as portable home sleep technologies and drug-induced sleep endoscopy, further support precision-based care. Collectively, the expanding range of therapeutic and diagnostic innovations is enabling clinicians to deliver individualized care and improve long-term outcomes for patients with SRBD.

Article
Engineering
Telecommunications

Majd Hamdan, Lina Yılmaz, Ibraheem Shayea, Leila Rzayeva

Abstract: The combination of ultra-dense network deployments and high mobility renders handover more difficult than in environments typical of previous generations. 5G and 6G necessitate the deployment of heterogeneous networks and small cells to meet demand, which at the same time introduces certain challenges. This scenario introduces small cells (such as femtocells, picocells, and microcells) that have very limited coverage areas, which, combined with the high speed of user equipment, create an excessive number of handover triggers, leading to the “ping-pong effect,” which wastes network resources and degrades the overall Quality of Service. Furthermore, high mobility means that a user might enter and exit a cell in less time than the mobile terminal’s dwell time, dropping the connection and resulting in handover failures and radio link failures. Conventional handover methods that rely on thresholds of certain factors, such as received signal strength, can be insufficient for these environments. Different criteria should be balanced to avoid the drop, such as the user’s velocity, dwell time, target cell load, available bandwidth, device battery, and application latency requirements. Predictive methods could be a more efficient alternative to the existing reactive ones. This paper presents a decision-tree-based algorithm as one predictive method that learns the patterns among all the criteria mentioned and is particularly useful for avoiding ping-pongs and limiting handover failures. The classifier is trained on real multi-operator drive-test data with ping-pong events excluded from the positive class, and evaluated under Leave-One-Trace-Out cross-validation on 16 traces covering UMTS, HSUPA, HSPA+, and LTE cells. The proposed system achieves F1 = 0.642 and AUC = 0.797 under LOTO, with a +0.052 F1 lift over the best threshold-based baseline, while remaining interpretable and deployable in real time.
The paper aims to present a solution applicable also to 5G NR and 6G.
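A learned decision tree over the criteria listed above ultimately encodes a nested set of threshold rules. A hand-written rule of that shape can be sketched as follows; the feature names and threshold values are illustrative assumptions, whereas the paper learns the actual splits from drive-test data.

```python
# Hand-written sketch of the kind of multi-criteria rule a handover
# decision tree encodes: trigger a handover only when signal margin,
# predicted dwell time, and target-cell load jointly favor it.
# All thresholds below are placeholders, not learned values.

def should_handover(rsrp_serving_dbm, rsrp_target_dbm,
                    predicted_dwell_s, target_load):
    if rsrp_target_dbm - rsrp_serving_dbm <= 3:  # hysteresis margin (dB)
        return False                             # target not clearly better
    if predicted_dwell_s < 5:                    # short dwell: ping-pong risk
        return False
    if target_load > 0.9:                        # target cell congested
        return False
    return True
```

A reactive threshold scheme would use only the first test; the point of the tree is that the remaining branches suppress triggers that a signal-strength rule alone would fire.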

Article
Biology and Life Sciences
Food Science and Technology

Evangelia A. Karamani, Eirini Kerousi, Margarita Adosidi, Georgios Vafeiadis, Ioannis S. Boziaris, Efstathios Giaouris, Foteini F. Parlapani

Abstract: This study examined the biofilm-forming capacity and antibiotic resistance of 93 L. monocytogenes isolates from poultry to better understand how this pathogen persists and how it can be more effectively controlled throughout the poultry production process. Biofilms were evaluated on polystyrene microtiter plates at 12 and 30 °C in a nutrient-rich laboratory medium (Brain Heart Infusion, BHI). Susceptibility to eight clinically and food-relevant antibiotics was tested using disk diffusion and interpreted according to European Committee on Antimicrobial Susceptibility Testing (EUCAST) breakpoints when available. Most isolates produced detectable but generally weak biofilms at both temperatures, with a subset shifting to moderate biofilm formation after prolonged incubation at 12 °C; no strong biofilm producers were identified. This highlights the significant influence of incubation time and temperature on surface colonization. Overall, the isolates remained largely susceptible to ampicillin, penicillin G, vancomycin, tetracycline, and chloramphenicol, although resistant or low-susceptibility subpopulations were noted for trimethoprim–sulphamethoxazole (TMP-SMX) and, particularly, erythromycin and streptomycin. No consistent correlation was found between biofilm-forming ability and antibiotic susceptibility, indicating that these phenotypic traits are largely independent in this collection. These findings reveal that poultry-derived L. monocytogenes isolates can form weak to moderate biofilms under the tested monoculture conditions while generally maintaining susceptibility to first-line antibiotics. However, the development of macrolide- and aminoglycoside-resistant subpopulations, along with the potential for increased colonization within complex multispecies biofilms in real processing environments, emphasizes the importance of ongoing integrated surveillance across animal food systems.

Article
Public Health and Healthcare
Public Health and Health Services

Debora Di Mauro, Fabrizio Calapai, Ilaria Ammendolia, Mariaconcetta Currò, Fabio Trimarchi, Carmen Mannucci

Abstract: Background/Objectives: L-carnitine is a naturally occurring compound involved in energy metabolism, while Coenzyme Q10 (CoQ10) is primarily indicated for CoQ10 deficiency and as adjuvant therapy in chronic heart failure. Both are widely used off-label in sports to enhance performance, reduce fatigue, and improve recovery. Despite their popularity, their safety profiles are mainly derived from pre-marketing studies conducted in deficient or clinical populations, not in athletes. Given this limitation, the present study aimed to evaluate and compare the real-world safety profiles of L-carnitine and CoQ10 using spontaneous reports of adverse drug reactions (ADRs) from the EudraVigilance database. Methods: EudraVigilance, managed by the European Medicines Agency (EMA), collects spontaneous reports of suspected ADRs related to authorized medicines. ADRs associated with L-carnitine and CoQ10 were analyzed and compared at the System Organ Class (SOC) level using reporting odds ratio (ROR) and proportional reporting ratio (PRR). Results: For L-carnitine, the most frequently reported ADRs were gastrointestinal disorders, followed by skin and subcutaneous tissue disorders, general disorders, and nervous system disorders. For CoQ10, the most common ADRs were general disorders and administration site conditions, followed by nervous system disorders, investigations, and gastrointestinal disorders. Comparative analysis (ROR and PRR) showed that CoQ10 was associated with a higher probability of reporting certain ADR categories, particularly blood and lymphatic disorders, musculoskeletal and connective tissue disorders, and nervous system disorders. Conclusions: Although L-carnitine and CoQ10 are widely perceived as safe and commonly used by athletes, real-world data highlight the need for increased awareness of potential risks. 
Continuous monitoring and periodic reassessment of their benefit–risk profile are essential, especially considering their widespread off-label use.
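The reporting odds ratio used in the comparative analysis above is computed from a 2×2 contingency of spontaneous reports. The sketch below shows the standard ROR formula with its 95% confidence interval; the counts are invented examples, not EudraVigilance figures.

```python
# Minimal sketch of the reporting odds ratio (ROR) from a 2x2 table of
# spontaneous reports. Counts here are hypothetical, for illustration.
import math

def reporting_odds_ratio(a, b, c, d):
    """a: target drug & target reaction, b: target drug & other reactions,
    c: other drugs & target reaction, d: other drugs & other reactions.
    Returns (ROR, 95% CI lower bound, 95% CI upper bound)."""
    ror = (a / b) / (c / d)
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)  # SE of log(ROR)
    lo = math.exp(math.log(ror) - 1.96 * se)
    hi = math.exp(math.log(ror) + 1.96 * se)
    return ror, lo, hi

# Example: 20 of 200 reports for the drug mention the reaction, versus
# 50 of 1800 reports for all other drugs.
ror, lo, hi = reporting_odds_ratio(20, 180, 50, 1750)
```

A signal is conventionally flagged when the lower confidence bound exceeds 1; the PRR is computed analogously from proportions rather than odds.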

Review
Computer Science and Mathematics
Artificial Intelligence and Machine Learning

Byunghyun Ban

Abstract: Artificial intelligence (AI) research in logistics has rapidly shifted since 2024 toward operationally specific systems for forecasting, routing, warehouse optimization, supply-chain visibility, port scheduling, and smart-port maintenance. This paper presents an evidence-oriented operational review of studies published since 2024 on AI applications in logistics. Instead of classifying the literature by model type alone, the review organizes studies by logistics decision function and evaluates the evidence profile of each application: whether real operational data were used, whether the study relied on simulation or benchmark instances, whether the data scale was reported, and whether field validation was conducted. The analysis shows that recent AI logistics studies increasingly address concrete operational tasks, including demand forecasting, late-delivery prediction, route deviation prediction, dynamic vehicle routing, warehouse order picking, robotic fulfillment scheduling, vessel arrival prediction, berth allocation, quay crane scheduling, container dwell-time prediction, and predictive maintenance at smart ports. However, the evidence base remains uneven. Several studies use real operational or survey data, while many warehouse and routing studies rely on simulation, generated instances, or benchmark settings. Field validation remains rare. The paper argues that the next stage of logistics AI research should move beyond model accuracy and report operational evidence: data provenance, data scale, logistics KPIs, field validation, integration requirements, and human oversight.

Article
Public Health and Healthcare
Public, Environmental and Occupational Health

Mohamad Iqbal Mazeli

,

Nor Zam Azihan Mohd Hassan

,

Mohd Azahadi Omar

Abstract: Background: In 2016, the World Bank Group estimated that health costs related to PM2.5 pollution totalled approximately $5.7 trillion worldwide. Information on the estimated health costs from the environmental burden of disease caused by the ambient air pollutant PM2.5 in Malaysia is limited. Therefore, this study aimed to estimate the environmental health costs associated with PM2.5 for all-cause and respiratory mortality at the national level. Methods: The population-weighted exposure level (PWEL) of PM10 concentrations across all Malaysian states for 2000, 2008, and 2013 was calculated using publicly available remote sensing data, air quality data from the Department of Environment, and burden-of-disease mortality statistics from the Institute of Public Health. The PWEL was then converted to PM2.5 using the World Health Organization's ambient air conversion factors. The AirQ+ 2.2.4 software was used to calculate mortality proportions for all-cause mortality, chronic obstructive pulmonary disease (COPD), lung cancer (LC), and acute lower respiratory infection (ALRI) in children under five, based on the National Burden of Disease data from 2000, 2008, and 2013. Results: The cost per disability-adjusted life-year (DALY) ranged from one (low estimate) to three times (high estimate) the Gross Domestic Product (GDP) per capita. These costs were projected to 2022 prices using a GDP deflator. The estimated PWEL for PM2.5 in 2000, 2008, and 2013 was 22 µg/m³, 18 µg/m³, and 24 µg/m³, respectively. The mortality cost of all-cause deaths increased from MYR 3.3 billion in 2000 to MYR 5.1 billion in 2008, and then to MYR 12.8 billion in 2013, accounting for nearly 1% of Malaysia's 2013 GDP. Conclusions: These findings indicate a rise in disease burden and mortality costs attributable to ambient PM2.5 levels, and policymakers should remain vigilant.
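The valuation approach described in the abstract above reduces to simple arithmetic: attributable DALYs are monetized at one to three times GDP per capita, then projected to 2022 prices with a GDP deflator ratio. A sketch with entirely hypothetical figures (the study's actual GDP, deflator, and DALY values are not reproduced here):

```python
# Hypothetical inputs, for illustration only
gdp_per_capita = 30_000          # MYR, base-year GDP per capita
dalys = 50_000                   # attributable DALYs in the base year

low_cost  = dalys * 1 * gdp_per_capita   # low estimate: 1x GDP per capita per DALY
high_cost = dalys * 3 * gdp_per_capita   # high estimate: 3x GDP per capita per DALY

# Project base-year costs to 2022 prices using a GDP deflator ratio
deflator_base, deflator_2022 = 95.0, 110.0   # hypothetical index values
low_2022  = low_cost  * deflator_2022 / deflator_base
high_2022 = high_cost * deflator_2022 / deflator_base
print(f"2022 cost range: MYR {low_2022:,.0f} - MYR {high_2022:,.0f}")
```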

Article
Social Sciences
Psychology

Sofia Oliveira

,

Ricardo Pacheco

,

Luís Curral

,

Alexandra Marques-Pinto

Abstract: Transition to higher education represents a critical period marked by academic, emotional, and social challenges that can affect students’ well-being. Although social and emotional competence (SEC) and self-care practices have been identified as protective factors of well-being, there is a gap in understanding how these concepts intersect within higher education. In a two-phase mixed-methods study, we began by exploring the main challenges perceived by higher education students in adapting to university and which SEC and self-care practices they perceived as most relevant to promoting their personal and academic well-being. Building on these insights, we then investigated the mediating role of self-care practices in the relationship between students’ SEC and their well-being. In the first phase, 16 higher education students (81.3% female, M = 22.19 years) participated in semi-structured interviews; in the second phase, 204 higher education students (77.9% female, M = 22.10 years) responded to an online survey. Qualitative findings suggested that the most significant challenges in the adaptation to university were of a social and emotional nature, related to emotional challenges, interpersonal relationships, and personal organization. To overcome these, students primarily valued intrapersonal competencies such as self-awareness and self-regulation. Participants predominantly described using personal self-care practices, focusing on psychological and emotional care. Generalized linear model-based mediation analysis indicated that both personal and academic self-care practices mediated SEC effects on students’ personal well-being. However, only academic self-care practices mediated SEC effects on their academic well-being. Self-regulation competencies had the strongest effect on students’ personal and academic well-being.
This research contributes to a strengthened theoretical understanding of the interplay between higher education students’ SEC, self-care practices, and well-being, offering new empirical evidence on how these relate.
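The mediation logic in the abstract above follows the standard product-of-coefficients scheme: the indirect effect is the path from SEC to self-care (a) multiplied by the path from self-care to well-being controlling for SEC (b). A minimal sketch on simulated data (not the study's dataset or its exact GLM specification):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 204  # matching the survey sample size for illustration

# Simulated variables with assumed true paths a = 0.5, b = 0.4
sec = rng.normal(size=n)
selfcare = 0.5 * sec + rng.normal(scale=0.8, size=n)
wellbeing = 0.4 * selfcare + 0.2 * sec + rng.normal(scale=0.8, size=n)

def ols(y, predictors):
    """Least-squares coefficients with an intercept prepended."""
    X = np.column_stack([np.ones(len(y))] + list(predictors))
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    return beta

a = ols(selfcare, [sec])[1]              # path a: SEC -> self-care
b = ols(wellbeing, [selfcare, sec])[1]   # path b: self-care -> well-being | SEC
indirect = a * b                         # mediated (indirect) effect
print(f"a = {a:.3f}, b = {b:.3f}, indirect effect = {indirect:.3f}")
```

In practice the indirect effect would be tested with bootstrap confidence intervals rather than read off as a point estimate, and the study's GLM framework accommodates non-normal outcomes, which this ordinary-least-squares sketch does not.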

Article
Biology and Life Sciences
Animal Science, Veterinary Science and Zoology

David Martínez-Matamoros

,

Miriam Sánchez-Vivanco

,

Jessica Valdivieso-Tituana

,

Orlando Meneses-Quelal

Abstract: Canine periodontal disease is a highly prevalent chronic inflammatory condition with a multifactorial etiology, influenced by host factors and complex subgingival bacterial communities; however, evidence from populations in underrepresented regions remains limited. The objective of this study was to evaluate the association between host factors (age, diet, and cranial morphology) and the presence and severity of periodontal disease, as well as to characterize the subgingival bacterial profile using culture-based methods in an urban clinical population in Ecuador. A cross-sectional, analytical, observational study was conducted on 100 dogs treated at veterinary clinics in Loja. Periodontal status was classified according to AVDC criteria, defining the outcomes as the presence of periodontal disease (stages 1–4 vs. 0) and advanced periodontitis (stages 2–4 vs. 0–1). Subgingival samples were collected using sterile paper points and processed under aerobic and anaerobic conditions, with analyses performed individually. Periodontal disease was present in 68.0% of dogs, with 37.0% in advanced stages. Age was the only factor independently associated with both the presence (OR = 1.18; 95% CI: 1.02–1.36; p = 0.021) and severity (OR = 1.22; 95% CI: 1.05–1.41; p = 0.009), while diet, sex, and cranial morphology showed no significant associations (p > 0.05). The bacterial profile was polymicrobial (3.86 positive isolates per individual), and no taxon showed a significant association after FDR correction. Taken together, these results support a multifactorial and polymicrobial model, highlighting age as the main associated factor and emphasizing the need for molecular approaches in future studies.
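The per-year odds ratios reported above (e.g. OR = 1.18 for disease presence) compound multiplicatively with age under a logistic model, which is worth keeping in mind when judging their practical size. A quick illustration using the abstract's point estimate:

```python
# Per-year odds ratio for age (presence of periodontal disease),
# taken from the abstract's point estimate
or_per_year = 1.18

# Under a logistic model, ORs compound multiplicatively with the
# covariate, so a 5-year age difference implies:
or_5_years = or_per_year ** 5
print(f"Implied OR over 5 years of age: {or_5_years:.2f}")  # about 2.29
```

That is, a dog five years older has roughly 2.3-fold higher odds of periodontal disease at the reported point estimate, before accounting for the confidence interval around it.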

Review
Social Sciences
Education

Patrícia Albergaria-Almeida

Abstract: Questioning is widely recognised as a key dimension of learning in science education, yet learner questioning has often been treated as a secondary aspect of classroom participation rather than as a central pedagogical and epistemic practice. This article offers a conceptual examination of questioning in relation to science education for sustainability, informed by a critical interpretive engagement with literature on questioning, participation, classroom dialogue, engagement, and science education. It argues that science education for sustainability requires more than the transmission of scientific knowledge, calling instead for pedagogical spaces in which learners can engage with complexity, uncertainty, interpretation, and the ethical and social dimensions of socio-scientific issues. The article’s main contribution lies in repositioning learner questioning as a central condition of science education for sustainability and in showing that questioning is shaped not only by knowledge and motivation, but also by participation, hesitation, silence, and broader dynamics of voice, legitimacy, and power. In this perspective, fostering questioning becomes essential to more inclusive, dialogic, reflexive, and transformative approaches to science education for sustainability. The article further argues that fostering questioning in this way contributes directly to the educational ambitions embedded in SDG 4, SDG 13, and SDG 16, making questioning-centred pedagogy not merely a methodological choice, but a condition for more democratic, just, and transformative science education for sustainable development.


Preprints.org is a free preprint server supported by MDPI in Basel, Switzerland.
