Article
Public Health and Healthcare
Health Policy and Services

Ji-Soo Kim, Younghee Noh, Jong-Hwa Jang

Abstract: Background/Objectives: Adolescence is a critical period for establishing lifelong oral health behaviours; however, persistent oral health problems and limitations in conventional school-based oral health education (OHE) highlight the need for more engaging and scalable approaches. Emerging digital modalities, such as artificial intelligence (AI)-based virtual human (VH) education, offer a promising alternative but remain insufficiently evaluated in adolescent populations. This study aimed to evaluate the effectiveness of an AI-based virtual human oral health education (VOHE) program compared with conventional face-to-face oral health education (FOHE) among adolescents. Methods: A cluster randomised pretest–post-test intervention design was employed. Participants received either VOHE or FOHE, followed by assessment using a structured questionnaire based on the Knowledge–Attitude–Practice (KAP) model. A total of 268 middle school students were assessed for changes in oral health literacy (OHL) and oral health-related KAP. A linear mixed-effects model was applied to evaluate the effects of time, group (VOHE vs. FOHE), and their interaction, with participants treated as random effects to account for within-individual correlations. Results: Both groups demonstrated significant improvements in OHL and oral health-related KAP following the intervention (all p < 0.05). However, no significant group × time interaction effects were observed for any outcome variable (all p > 0.05), suggesting that VOHE achieved educational outcomes comparable to those of FOHE. These findings indicate that AI-based VH education may provide an effective and scalable approach for adolescent OHE. Conclusions: VOHE demonstrated effectiveness comparable to FOHE and may serve as a feasible alternative or complementary approach for adolescent OHE.
AI-based VH education also has potential applicability as an accessible digital health intervention for school- and community-based oral health promotion, particularly in digitally mediated or resource-limited educational settings.
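For readers who want a concrete handle on the group × time interaction reported above, a minimal sketch follows. The study used a linear mixed-effects model; the difference-in-differences permutation test below is only an illustrative stand-in, and the score vectors are invented.

```python
import random

def interaction_effect(pre_a, post_a, pre_b, post_b):
    """Difference-in-differences: mean pre-to-post change in group A
    minus the mean change in group B (the group x time interaction)."""
    change_a = sum(y - x for x, y in zip(pre_a, post_a)) / len(pre_a)
    change_b = sum(y - x for x, y in zip(pre_b, post_b)) / len(pre_b)
    return change_a - change_b

def permutation_p(pre_a, post_a, pre_b, post_b, n_perm=2000, seed=0):
    """Two-sided permutation p-value for the interaction, obtained by
    shuffling per-student change scores across the two groups."""
    rng = random.Random(seed)
    observed = interaction_effect(pre_a, post_a, pre_b, post_b)
    changes = [y - x for x, y in zip(pre_a, post_a)] + \
              [y - x for x, y in zip(pre_b, post_b)]
    n_a = len(pre_a)
    hits = 0
    for _ in range(n_perm):
        rng.shuffle(changes)
        diff = sum(changes[:n_a]) / n_a - \
               sum(changes[n_a:]) / (len(changes) - n_a)
        if abs(diff) >= abs(observed):
            hits += 1
    return hits / n_perm
```

A null result here (p > 0.05), as in the study, would mean the two education formats produced statistically indistinguishable score changes.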

Article
Biology and Life Sciences
Food Science and Technology

Chirasak Phoemchalard, Neungrutai Senarath, Patcharee Malila, Tanom Tathong, Ronnachai Prommachart

Abstract: Adulteration of beef (Bos indicus) with buffalo meat (Bubalus bubalis) is a common form of food fraud with economic and religious implications, but quantitatively detecting it in ground beef products is difficult. Ten replicates of each of six binary mixtures (100:0 to 0:100% w/w) of ground beef and buffalo meat were characterized using untargeted 1H NMR metabolomics (43 metabolites after QC filtering), physicochemical measurements (pH, CIE L*a*b* color, water activity, and electronic nose), and proximate composition. Fifteen pairwise OPLS-DA models and a 1000-fold permutation test were performed for discrimination and biomarker identification. PCA explained 54.2% of the total variance, and the adulteration groups separated along the PC1 axis. All OPLS-DA models were statistically valid (R2Y = 0.738–0.981; Q2 = 0.532–0.961; pQ2 < 0.001), with no evidence of overfitting. Three metabolites met all three criteria (VIP > 1.0, FDR < 0.05, and FC > 2 or FC < 0.5) and had AUC = 1.00 in the internal data set: betaine (−82.6% in buffalo vs. beef), glycerol (+154.7%), and malonate (+656%). No individual biomarker exceeded the multi-criterion threshold at buffalo substitution levels below 10%. Betaine, glycerol, and malonate are therefore proposed as external discovery-phase candidates for beef authentication using NMR.
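The three-criterion biomarker screen described above (VIP > 1.0, FDR < 0.05, fold change > 2 or < 0.5) can be sketched as a simple filter. The statistic values in the example below are illustrative stand-ins, not the study's actual table.

```python
def flag_biomarkers(stats, vip_min=1.0, fdr_max=0.05, fc_hi=2.0, fc_lo=0.5):
    """Return metabolites meeting all three screening criteria:
    VIP above vip_min, FDR-adjusted p below fdr_max, and
    fold change above fc_hi or below fc_lo."""
    return [name for name, (vip, fdr, fc) in stats.items()
            if vip > vip_min and fdr < fdr_max and (fc > fc_hi or fc < fc_lo)]

# Invented numbers roughly consistent with the abstract's percent changes:
# betaine about -82.6% (FC ~ 0.17), glycerol +154.7% (FC ~ 2.5),
# malonate +656% (FC ~ 7.6), plus one metabolite that fails the screen.
example = {
    "betaine":  (1.8, 0.001, 0.17),
    "glycerol": (1.5, 0.010, 2.55),
    "malonate": (2.0, 0.001, 7.56),
    "lactate":  (0.8, 0.200, 1.10),
}
```

With these invented inputs the filter keeps exactly the three metabolites named in the abstract.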

Review
Medicine and Pharmacology
Cardiac and Cardiovascular Systems

Ahmad Ibrahim Alshdaifat, Wamadeva Balachandran, Ziad Hunaiti

Abstract: Coronary Artery Disease (CAD) is the leading cause of death worldwide, highlighting the need for more reliable and efficient diagnostic tools beyond conventional methods. Artificial Intelligence (AI), particularly Machine Learning (ML) and Deep Learning (DL), has shown strong potential for detecting obstructive CAD by learning complex patterns from Electrocardiogram (ECG) and Coronary Computed Tomography Angiography (CCTA) data. This rapid systematic review assesses and compares the diagnostic performance and methodological quality of AI models built for CAD prediction using ECG and CCTA data. A systematic search following PRISMA 2020 guidelines was conducted for primary studies published between 2021 and 2025. Eleven studies were included, six using ECG data and five using CCTA data. Methodological quality was evaluated using the PROBAST+AI tool. ECG-based models achieved AUCs of 0.72–0.961; however, only 33% of these studies used external validation cohorts. CCTA-based models showed slightly stronger top-end performance, with AUCs of 0.77–0.97, and were more methodologically rigorous, with 80% applying external validation. Despite these strong results, PROBAST+AI assessment revealed a high risk of bias in 90.9% of the included studies, largely due to weaknesses in the analysis domain, including poor handling of missing data and the absence of model calibration reporting. AI models show strong diagnostic accuracy for CAD, with CCTA-based approaches demonstrating greater validation maturity. However, the widespread methodological bias means these tools should currently support clinical decision-making rather than replace standard diagnostic methods. Future studies should focus on prospective multi-centre validation and the use of multimodal data.

Article
Social Sciences
Media Studies

Andrés García-Umaña, Nelson Carrión-Bósquez, Jorge Bernal Peralta, Gabriel Estuardo Cevallos Uve, Évelyn Córdoba Pillajo

Abstract: Comparative research on digital social influence and sustainable food consumption has grown substantially; however, most transnational studies neither verify measurement invariance nor assess whether observed structural differences reflect genuine cultural variation or measurement artifacts. This study addresses this gap by applying the Stimulus–Organism–Response (SOR) model to examine whether Social Media Content (SMC) and Online Member Group Support (OMGS) influence Organic Product Purchasing Behavior (OPPB) through Environmental Attitude (EA) and Subjective Norms (SN) in Ecuador, Chile, and Peru. A cross-sectional quantitative design was implemented with 809 organic consumers, analyzed using PLS-SEM in two stages: assessment of compositional invariance via the MICOM procedure and multigroup analysis (MGA) based on permutations. Full compositional invariance was confirmed across the three national groups, validating transnational structural comparability. The SOR model held consistently, with EA emerging as a stable predictor of OPPB. Significant structural differences were identified: the SMC→SN path was significantly stronger in Chile (β = .671 vs. β = .558 in Peru; p < .01), whereas the OMGS→EA path was stronger in Peru (β = .284 vs. β = .211 in Chile; p < .05). These findings underscore the need to formally verify invariance before drawing transnational conclusions and highlight the cultural contingency of sustainable digital marketing strategies in Andean emerging markets.

Essay
Arts and Humanities
Philosophy

Alkis Gounaris, George Kosteletos

Abstract: This chapter examines the ontological assumptions, epistemological challenges, and ethical implications involved in using Agentic AI to assist, guide, or potentially replace human agents in the making of moral and legal decisions. It argues that different metaphysical assumptions regarding the ontology of ethics, justice, cognition, and AI decisively shape the framework within which such systems are evaluated. In this respect, the analysis distinguishes between two different levels of severity in the moral issues raised, corresponding to two distinct levels of AI autonomy: first, AI systems operating as advisory systems, with limited autonomy; and secondly, AI systems operating as regulatory systems, with full autonomy, that is, as entities entrusted with final decision-making authority. The text adopts a critical perspective on the use of Agentic AI in contexts of moral and legal judgement, highlighting both the conceptual fragility and the epistemological challenges that accompany proposals for such applications. At the same time, it considers the conditions under which such systems could genuinely contribute to human flourishing. Particular attention is given to the risk that ostensibly advisory systems may, in practice, become tacitly regulatory, especially under the pressure of widespread assumptions concerning AI objectivity and effectiveness. The chapter’s structure follows an algorithmic logic, in which a series of key questions serve as branching yes/no nodes, each possible answer leading to a distinct line of philosophical analysis.

Article
Computer Science and Mathematics
Computer Science

Dazeng Yuan, Xiheng Liu, Bin Liu

Abstract: Multi-server private information retrieval (PIR) based on function secret sharing (FSS) has emerged as a prominent paradigm for achieving sublinear communication. However, standard FSS constructions strictly require full server participation, making them highly vulnerable to single-node fail-stop faults. Existing fault-tolerant schemes mitigate this but inevitably inflate the downlink response overhead to scale with the database size N (e.g., O(√N)). To overcome this limitation, we propose a (t,p)-fault-tolerant PIR (FT-PIR) protocol grounded in a newly designed generalized (t,p)-fault-tolerant distributed point function (FT-DPF). By introducing a hierarchical recursive patching mechanism, our scheme transforms rigid all-party evaluations into flexible t-out-of-p reconstructions. This architecture completely decouples the response communication from N and ensures efficient client-side reconstruction via lightweight XOR aggregations, fundamentally bypassing heavy algebraic interpolations. Formal analysis proves that our strictly stateless protocol guarantees (t−1)-computational privacy under the semi-honest model. Asymptotic evaluations demonstrate that the proposed FT-PIR achieves an optimal downlink complexity bounded by O(poly(t,p) · log p), significantly outperforming existing robust baselines for large-scale datasets.
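The FT-DPF construction itself is not reproduced here, but the lightweight XOR reconstruction the abstract relies on can be illustrated with the classic two-server XOR PIR, a much simpler relative of DPF-based PIR. All names below are illustrative, not the authors' protocol.

```python
import secrets

def xor_bytes(a, b):
    """Bytewise XOR of two equal-length byte strings."""
    return bytes(x ^ y for x, y in zip(a, b))

def query(n, i):
    """Client: a uniformly random 0/1 mask for server 0, and the same
    mask with bit i flipped for server 1. Each mask alone is random,
    so neither server learns the queried index i."""
    mask0 = [secrets.randbelow(2) for _ in range(n)]
    mask1 = list(mask0)
    mask1[i] ^= 1
    return mask0, mask1

def answer(db, mask):
    """Server: XOR of all records selected by the mask."""
    acc = bytes(len(db[0]))
    for rec, bit in zip(db, mask):
        if bit:
            acc = xor_bytes(acc, rec)
    return acc

def reconstruct(a0, a1):
    """Client: XOR the two answers; everything cancels except record i."""
    return xor_bytes(a0, a1)
```

The reconstruction is a single XOR with no algebraic interpolation, which is the property the proposed FT-PIR generalises to t-out-of-p participation.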

Article
Public Health and Healthcare
Public Health and Health Services

Noura Khalid Alfhead, Sameerah Yasain Shaheen, Murid Javed, Hamad Alsufyan

Abstract: Objective: High sperm DNA fragmentation (SDF) results in more aneuploid embryos. Although sperm retrieved by testicular sperm extraction (TESE) has lower SDF than ejaculated sperm, there are few data comparing the prevalence of chromosomal abnormalities in the resulting embryos. The objective of this study was to compare rates of chromosomal abnormalities in embryos generated by TESE sperm or ejaculated sperm with increased SDF. Methods: The blastocysts were generated by ICSI. Preimplantation genetic testing for aneuploidy (PGT-A) was performed by next-generation sequencing (NGS). This study utilized 400 embryos, 200 in each group. Sperm DNA fragmentation was determined by the Halosperm G2 assay. The rates of euploid, aneuploid, and mosaic embryos were compared by multivariate logistic regression and mixed-effects models that accounted for female age and ovarian reserve. Results: The TESE-derived embryos showed a significantly higher percentage of euploid embryos (67.8%) compared with those derived from ejaculated sperm with high SDF (48.2%, p = 0.003). Multivariate logistic regression indicated that the sperm source (TESE vs. ejaculate) was an independent predictor of euploid embryos [OR = 1.85; 95% CI = 1.05–3.26; p = 0.034]. The probability of having euploid embryos decreased by 6% for every 1% increase in SDF. Increasing female age was a major negative predictor [OR = 0.92 per year; 95% CI = 0.88–0.96; p = 0.001]. Conclusion: TESE-derived sperm with low SDF resulted in a significantly higher proportion of euploid embryos compared with ejaculated sperm with increased SDF.
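The reported odds ratios can be translated into predicted probabilities with a small helper. Reading the "6% decrease per 1% SDF" as a per-unit odds ratio of roughly 0.94 is an assumption made here for illustration; the function is generic logistic-model arithmetic, not the study's fitted model.

```python
def predicted_prob(base_prob, or_per_unit, delta):
    """Predicted probability after shifting a predictor by `delta` units,
    given a baseline probability and a per-unit odds ratio.
    Logistic models act multiplicatively on the odds scale:
    odds_new = odds_old * OR**delta."""
    odds = base_prob / (1.0 - base_prob) * or_per_unit ** delta
    return odds / (1.0 + odds)
```

For example, with the abstract's age effect (OR = 0.92 per year), `predicted_prob(0.5, 0.92, 5)` gives the modelled euploidy probability for a patient five years older, all else equal.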

Article
Computer Science and Mathematics
Artificial Intelligence and Machine Learning

Yunguo Yu

Abstract: Clinical artificial intelligence systems are transitioning from predictive tools that generate diagnostic outputs for human interpretation to agentic systems capable of autonomous multistep action within clinical workflows, including ordering laboratory tests, initiating medication reconciliation, and updating patient records. Existing trust frameworks, designed for advisory systems and built on output verification and confidence calibration, do not address the governance requirements of autonomous action. We identify an agency gap: the structural mismatch between validated predictions and unvalidated action policies. Using a partially observable constrained decision process (PO-CDP) formalism, we establish the principle of agency nontransferability, demonstrating that trust calibrated at the diagnostic level does not imply safe or appropriate action policies under real-world clinical, institutional, and legal constraints. To address this gap, we propose a three-layer governance stack—epistemic soundness, policy safety, and institutional traceability—that provides verifiable guarantees at each stage of the agentic decision pipeline. This paper presents a theoretical governance framework; the phase-specific milestones in the backcasting roadmap define the empirical validation agenda for each deployment stage. A compositional risk analysis formally predicts that individually safe components can produce unsafe system-level behavior through nonlinear error propagation. An extended backcasting roadmap defines three empirically testable phases for the transition to governed agentic systems: sandboxed action proposals (2027–2029), credentialed policy systems (2030–2032), and supervised autonomy (2033–2035). The transition to agentic clinical AI constitutes a paradigm shift from prediction correctness to policy safety under constraint, requiring institutional design rather than technical improvement alone.

Article
Business, Economics and Management
Economics

Hai Phu Do

Abstract: Digital traceability has become a critical capability in international trade, yet existing research has not fully explained how institutional, technological, and coordination-related conditions combine to produce successful outcomes. This study applies fuzzy-set Qualitative Comparative Analysis (fsQCA) to 24 trade-corridor cases to identify the configurational drivers of Digital Traceability Success (DTS). The findings show that Digital Trade Readiness (DTR), Market Strictness (MKT), Digital Infrastructure (DIF), and Cross-border Coordination (COO) are necessary conditions for DTS, whereas Blockchain-enabled Traceability (BCT) is not. The sufficiency analysis identifies one dominant pathway, DTR * PRK * MKT * DIF * COO, with perfect consistency and substantial coverage. These findings demonstrate that digital traceability success is not driven by blockchain adoption alone, but by the joint alignment of institutional readiness, regulatory pressure, infrastructure, risk exposure, and inter-organizational coordination. The study makes two main contributions. Scientifically, it advances the literature on digital trade and supply-chain traceability by offering a configurational explanation grounded in conjunctural causation and causal asymmetry. Practically, it suggests that policymakers and firms should prioritize system-wide readiness, interoperable digital infrastructure, and cross-border governance rather than relying narrowly on blockchain solutions.
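The fsQCA sufficiency measures behind claims like "perfect consistency and substantial coverage" are standard set-theoretic ratios. A minimal sketch, with invented fuzzy membership scores rather than the study's 24 cases:

```python
def consistency(x, y):
    """Sufficiency consistency of condition membership x for outcome y:
    sum of min(x_i, y_i) over sum of x_i. A value of 1.0 means every
    case's condition membership is fully contained in its outcome."""
    return sum(min(a, b) for a, b in zip(x, y)) / sum(x)

def coverage(x, y):
    """Coverage: how much of the outcome the condition accounts for,
    sum of min(x_i, y_i) over sum of y_i."""
    return sum(min(a, b) for a, b in zip(x, y)) / sum(y)
```

With `x = [0.8, 0.6, 0.2]` and `y = [0.9, 0.7, 0.5]`, every x-score sits below its y-score, so consistency is a perfect 1.0 while coverage is below 1.
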

Article
Computer Science and Mathematics
Artificial Intelligence and Machine Learning

Gregor Herbert Wegener

Abstract: Advanced artificial intelligence systems increasingly exhibit behaviors that are not adequately captured by component-local metrics, benchmark scores, or layer-specific monitoring. These behaviors arise across coupling surfaces, control regimes, deployment boundaries, and emergent interaction patterns, indicating that the relevant analytical object is the composed system rather than the isolated component. This article introduces SORT-AI as a canonical domain architecture for the structural diagnosis of advanced AI systems. The framework organizes the AI domain along four axes: Domain as the problem space, Cluster as the structural problem class, Application as a recurrent structural problem form, and Structural Dimensions V1 to V4 as the diagnostic grammar linking observed phenomena to structural causes, effect spaces, and decision surfaces. The current AI domain comprises 52 applications distributed across five clusters: Coupling, Learning, Control, Emergence, and Evidence. To make the domain paper self-contained, a compact mathematical basis is provided using a closed set of 22 idempotent operators, a global consistency projector, a calibrated projection kernel, and a structured projection space in which AI systems are read as operator chains on structured execution states. Runtime Control Coherence, represented by AI.04, is used as the canonical example to illustrate how locally correct control mechanisms can generate globally incoherent behavior under scale. The paper further incorporates SORT-Sovereign as a meta-domain that projects technical structural findings into strategic, regulatory, and state decision spaces. In this form, SORT-AI is positioned as a reusable scientific foundation for subsequent domain-specific analyses and application-level studies across the AI domain.

Review
Medicine and Pharmacology
Oncology and Oncogenics

Birandra K. Sinha

Abstract: Ferroptosis is an iron-dependent, lipid peroxidation–driven form of regulated cell death that has emerged as a promising strategy to eliminate therapy-resistant cancers. However, both intrinsic and acquired resistance to ferroptosis-inducing agents (FINs) limit their clinical efficacy. From this perspective, an integrated model is proposed in which ferroptosis resistance emerges through coordinated redox, metabolic, and transport adaptations that collectively suppress lipid peroxidation and support tumor cell survival. Central to this defense is the cysteine–glutathione–glutathione peroxidase 4 (GPX4) axis, supported by parallel CoQ10-dependent antioxidant systems including ferroptosis suppressor protein 1 (FSP1), dihydroorotate dehydrogenase (DHODH), NAD(P)H quinone oxidoreductase 1 (NQO1), and the GCH1–tetrahydrobiopterin (BH4) pathway. These systems are further reinforced by Nrf2-mediated transcriptional programs, iron sequestration and export mechanisms, lipid remodeling that limits polyunsaturated fatty acid availability, and ATP-binding cassette (ABC) transporters that regulate drug and glutathione flux. Tumor heterogeneity—including differences in differentiation state, epithelial–mesenchymal plasticity, and metabolic reprogramming—generates subpopulations with distinct ferroptosis sensitivities and facilitates therapeutic escape. Emerging strategies that simultaneously target multiple resistance nodes, including GPX4 or FSP1 inhibition, combination chemotherapy, and nanoparticle-based delivery systems, may enhance ferroptosis-based therapies. A deeper understanding of oxidant–antioxidant networks governing ferroptosis resistance will enable the rational design of next-generation anticancer strategies to overcome drug resistance.

Article
Arts and Humanities
History

Beáta Pošteková, Vladimír Filip, Jaroslav Subiak

Abstract: This study examines the north-western access corridor to Žilina through the Kysuca valley (the Kysucká brána area) and the Budatín crossing during the revolutionary years 1848–1849. Using local archival excerpts, a regional chronicle manuscript and a cartographic reading of historical and present-day topography, we reconstruct the probable road alignment between Brodno, Budatín and the bridgehead towards Žilina and identify its recurrent military use by Imperial, Hungarian and Russian forces. The paper argues that the corridor’s strategic value stemmed from a combination of terrain constraints (narrow valley and floodplain), bridge dependence and the connectivity of the Jablunkov Pass trade route. We provide a chronology of troop movements in 1848–1849 and discuss source limitations, including internal inconsistencies in local narratives that require verification against primary military records. The article contributes a microhistorical case study to military geography of Upper Hungary and highlights the analytical potential of regional sources when integrated with critical source evaluation.

Article
Physical Sciences
Applied Physics

Joe Yazbeck, John B. Rundle

Abstract: Interpreting interferometric synthetic aperture radar (InSAR) imagery is a critical task in monitoring volcanic and seismic activity, yet the process usually requires expert knowledge and manual analysis. As the volume of satellite observations continues to increase, automated methods capable of describing and interpreting these images become increasingly important to assist geophysical monitoring efforts. In this work, we investigate the feasibility of automated image captioning for InSAR data using modern vision-language models. We utilize the Hephaestus dataset, a large collection of annotated interferograms focused on volcanic deformation, and apply a series of preprocessing steps to curate a balanced dataset of deforming and non-deforming images. Two generative image captioning architectures, the Generative Image-to-Text Transformer (GIT) and Bootstrapping Language-Image Pretraining (BLIP), are fine-tuned to output natural language descriptions of the InSAR images. In addition, we implement a retrieval-based model that aligns image and text representations within a shared embedding space and retrieves the most semantically similar caption. The performance of these approaches is evaluated using standard captioning metrics and qualitative inspection of generated descriptions. Our results suggest that pre-trained vision-language models can adapt to specialized scientific imagery despite being trained primarily on natural image datasets. This study represents an initial step towards automated interpretation systems capable of assisting researchers in large-scale InSAR monitoring applications.

Article
Environmental and Earth Sciences
Atmospheric Science and Meteorology

Manesh Chawla, Chander Shekhar, Amreek Singh

Abstract: Snowcover properties are known to change rapidly under the influence of weather and radiation, and detailed models mapping weather and radiation processes to the evolution of the snowpack have been developed. These models can accurately simulate the entire evolution of the snowpack at a specific point if a sufficiently detailed time-series of the weather and radiation parameters affecting that point is known. In this study we consider the reverse problem: finding the weather and radiation parameters that lead to observed changes in snowpack parameters. We used a simulation approach to study the feasibility of finding this reverse map, mapping a time-series of snowcover states to the corresponding time-series of weather and radiation states with a machine learning model. The snowcover-state data were generated using a well-known and rigorously validated snowcover simulation model (SNOWPACK). The results of our experiments show that the snow surface time-series contains important information about the meteorological time-series affecting it. We were able to recover the meteorological parameters from the simulated data under certain conditions, and we expect these results to generalize to actual data. These results may have important applications in the optimization of weather data collection systems, weather interpolation algorithms, and downscaling algorithms; combining snowpack data with weather observations can lead to improvements in these algorithms. This study is a preliminary feasibility study of the reverse problem; our results are positive and encourage further field work using actual data.

Article
Social Sciences
Other

Kiara Geoconda Peralta Jaramillo, Gina Sandy Tapia Montero

Abstract: Introduction: The educational promotion of inclusive recreational spaces remains a key challenge for public policies, particularly in contexts where gaps persist between regulation and implementation. Objective: This study analyzed the impact of public policies on the promotion of inclusive recreational spaces in the canton of Milagro, focusing on accessibility, pedagogical use, and social inclusion. Methodology: A mixed-methods approach was applied, with a non-experimental, cross-sectional, descriptive-correlational design and an explanatory component. The sample included teachers, administrators, technical staff, and community members. Data were collected through questionnaires and semi-structured interviews. Results: Findings showed a favorable regulatory framework but only partial local implementation. Significant relationships were identified between policy implementation and accessibility, educational use, and inclusion. Gaps were also observed in teacher training and adaptation to functional diversity. Discussion: Results align with previous studies emphasizing the need to integrate public policies, education, and territorial planning. Conclusions: Strengthening policy implementation and coordination is essential to consolidate inclusive recreational spaces with sustainable educational impact.

Article
Computer Science and Mathematics
Algebra and Number Theory

Frank Vega

Abstract: Robin's criterion equates the Riemann hypothesis with the inequality $\sigma(n) < e^{\gamma}\,n\,\log\log n$ for every $n > 5040$. By a theorem of Robin, if the Riemann hypothesis is false then infinitely many colossally abundant numbers fail this inequality, so the existence of a counterexample to Robin's criterion is equivalent to the existence of a colossally abundant counterexample. Combining structural properties of colossally abundant numbers with explicit estimates for the Chebyshev theta function and the Mertens product due to Aoudjit, Berkane, and Dusart, we prove an unconditional and effective upper bound on any such counterexample. Specifically, if $n > 5040$ is colossally abundant and violates Robin's inequality, then $n < N_{k}^{Y_{k}}$, where $p_{k}$ is the largest prime factor of $n$, $N_{k} = \prod_{i=1}^{k} p_{i}$ is the primorial of order $k$, and $Y_{k} \to 1^{+}$ is an explicit constant. Together with the lower bound $n \geq N_{k}$ that holds for any Hardy--Ramanujan integer, this confines every hypothetical colossally abundant counterexample to a narrow explicit window above the corresponding primorial. We deduce a conditional reformulation of the Riemann hypothesis localized to this window. This work refines the approach taken in the author's earlier article ``Robin's criterion on divisibility'', published in The Ramanujan Journal.
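Robin's inequality itself is easy to check numerically for any given $n$. A small sketch with a naive divisor sum (fine for illustration, far too slow for the colossally abundant range the paper studies):

```python
import math

EULER_GAMMA = 0.5772156649015329  # Euler-Mascheroni constant gamma

def sigma(n):
    """Sum of divisors of n, by trial division up to sqrt(n)."""
    total = 0
    d = 1
    while d * d <= n:
        if n % d == 0:
            total += d
            if d != n // d:
                total += n // d
        d += 1
    return total

def satisfies_robin(n):
    """Robin's inequality: sigma(n) < e^gamma * n * log log n (n > 1)."""
    return sigma(n) < math.exp(EULER_GAMMA) * n * math.log(math.log(n))
```

Here n = 5040 is the largest known violator: `satisfies_robin(5040)` is False, while, for example, `satisfies_robin(10080)` is True. The Riemann hypothesis is equivalent to the inequality holding for every n > 5040.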

Article
Biology and Life Sciences
Biology and Biotechnology

Nursen Senturk, Ozden Cobanoglu, Sena Ardicli

Abstract: Racing homing pigeons (Columba livia) have been selectively bred for centuries for superior flight capacity. Yet, the quantitative structure of flight performance traits and the extent to which sex influences these parameters remain poorly characterized, particularly in Turkish populations. This study aimed to evaluate flight performance in racing pigeons raised in the South Marmara region of Turkiye using three key kinematic traits (flight duration, speed, and distance) and to explore the multivariate structure and individual variation of these parameters through an integrative machine learning framework. Data were compiled from 166 individually registered pigeons (77 females, 89 males), totaling 781 race records used for pattern analysis. A composite Flight Performance Score (FPS) was constructed using min–max normalized component variables, and its internal consistency was assessed via Cronbach's alpha and principal component analysis. Univariate comparisons revealed no statistically significant sex-related differences in any of the three flight parameters (P > 0.05 for all traits). Principal component analysis confirmed substantial overlap between male and female individuals in multivariate trait space, and Random Forest classification failed to discriminate between sexes above chance level (accuracy = 0.490; ROC-AUC = 0.500), collectively indicating that sex is not a dominant determinant of flight performance in this population. Internal consistency analysis revealed that flight duration, speed, and distance are functionally independent dimensions (Cronbach's α = 0.135; r = −0.749 between duration and speed), with their variance structure being effectively two-dimensional (PC1: 60.1%; PC2: 39.7%), supporting the equal-weighting scheme applied in FPS construction.
Pattern analysis of race records identified four biologically distinct flight performance profiles, characterized by differential trade-offs among flight duration, speed, and distance, suggesting that individual-level performance strategy, rather than sex, is the primary axis of variation in this dataset. These findings challenge common breeder assumptions about sex-based differences in performance and highlight the multidimensional, individual-specific nature of flight performance in racing pigeons.
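The equal-weight, min–max normalized FPS construction described above can be sketched directly. The two-bird example is invented and the function names are not the paper's; it only shows the arithmetic of the composite score.

```python
def minmax(values):
    """Min-max normalize a list to [0, 1]; constant inputs map to 0."""
    lo, hi = min(values), max(values)
    if hi == lo:
        return [0.0] * len(values)
    return [(v - lo) / (hi - lo) for v in values]

def flight_performance_score(durations, speeds, distances):
    """Equal-weight composite of the three normalized kinematic traits.
    The equal weighting mirrors the scheme the abstract says was
    supported by the PCA variance structure."""
    nd, ns, nx = minmax(durations), minmax(speeds), minmax(distances)
    return [(a + b + c) / 3 for a, b, c in zip(nd, ns, nx)]
```

A bird at the minimum of all three traits scores 0, one at the maximum of all three scores 1, and trade-off profiles (fast but short-range, say) land in between.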

Review
Environmental and Earth Sciences
Waste Management and Disposal

Silvia González-Rojo, Alvaro Martínez-Sánchez, Xiomar Gómez

Abstract: The transition to a circular economy requires the safe management of sewage sludge through nutrient and energy recovery. However, the presence of pharmaceuticals and personal care products (PPCPs) poses a critical obstacle, as these compounds tend to accumulate in the sludge matrix through sorption processes, thereby shifting the environmental problem from the water stream to the sludge stream. This manuscript provides a comprehensive review of the scientific literature on technical alternatives for valorizing sewage sludge and removing emerging contaminants. The study evaluates the limitations of conventional biological methods, such as anaerobic digestion and composting, which exhibit variable efficacy and are often insufficient to degrade recalcitrant molecules, including some commonly used pharmaceuticals. On the contrary, thermal treatments (pyrolysis, gasification, and hydrothermal processes) are considered robust alternatives capable of achieving removals exceeding 90-99% thanks to the thermal degradation of contaminants. Furthermore, the article emphasizes the innovative potential of utilizing carbon-based byproducts (biochar and hydrochar) as adsorbents or catalysts to enhance the removal of PPCPs within the treatment infrastructure itself. The integration of advanced thermal technologies is essential to mitigate the risks of contaminant transfer to the food chain and ensure a safe and sustainable nutrient cycle.

Review
Medicine and Pharmacology
Oncology and Oncogenics

Alcides Chaux

Abstract: Background: Immune checkpoint inhibitors (ICI) targeting the PD-1/PD-L1 and CTLA-4 axes have transformed the treatment landscape of genitourinary (GU) malignancies, yielding durable responses in subsets of patients with urothelial carcinoma, renal cell carcinoma (RCC), and selected cases of metastatic castration-resistant prostate cancer (mCRPC). However, most patients exhibit primary resistance, and those who initially respond frequently develop acquired resistance, substantially limiting long-term clinical benefit. A systematic understanding of the biological mechanisms driving resistance, the identification of robust predictive biomarkers, and the design of rational combination strategies are essential to extend the therapeutic reach of ICI-based regimens. Methods: An integrative review was conducted following the Whittemore and Knafl (2005) framework. Systematic searches were performed in PubMed/MEDLINE, Cochrane Central Register of Controlled Trials, Embase, and Web of Science, covering the period January 2020 to April 2026. Studies were selected according to predefined inclusion criteria encompassing original research, clinical trials, systematic reviews, and narrative reviews reporting on resistance mechanisms, predictive biomarkers, or emerging therapeutic strategies for ICI in GU cancers. Methodological quality was assessed using RoB 2, ROBINS-I, and AMSTAR-2 as appropriate. Results: A total of 32 studies met eligibility criteria. 
Three interconnected resistance categories were identified: 1) tumor cell-intrinsic mechanisms, including low tumor mutational burden (TMB), loss of antigen presentation via major histocompatibility complex class I (MHC-I) downregulation, activation of the Wnt/β-catenin and JAK/STAT pathways, and epigenetic reprogramming; 2) tumor microenvironment (TME)-mediated immunosuppression, driven by myeloid-derived suppressor cells (MDSCs), regulatory T cells (Tregs), cancer-associated fibroblasts (CAFs), and immunosuppressive cytokines including TGF-β and VEGF; and 3) acquired post-treatment resistance involving T-cell exhaustion and upregulation of alternative immune checkpoints. Among validated biomarkers, PD-L1 expression demonstrated variable predictive utility across GU cancer types, while TMB-high status (≥10 mutations/megabase) predicted improved response to pembrolizumab across solid tumors. Emerging therapeutic strategies include ICI plus tyrosine kinase inhibitor (TKI) combinations, antibody-drug conjugates (ADCs), MDSC-targeted interventions, therapeutic vaccines, and radiotherapy sensitization. Conclusion: ICI resistance in GU cancers is a multidimensional phenomenon with distinct biological drivers across tumor subtypes. Precision combinations targeting both intrinsic tumor factors and the immunosuppressive TME represent the most promising avenue to overcome resistance. Standardization of composite biomarker platforms is urgently needed to individualize ICI selection in clinical practice.

Article
Computer Science and Mathematics
Artificial Intelligence and Machine Learning

Tatiana Petrova

Abstract: Megacity mobility research has long relied on aggregate statistical indicators (composite traffic-planning indices and accessibility surfaces) that capture the system at rest but are disconnected from the operational decisions shaping mobility minute to minute. In parallel, Agentic AI offers a reference paradigm of semantically interoperable autonomous agents that negotiate and coordinate across open networks. We propose a unifying Agentic AI reference architecture for urban transportation that maps any composite traffic index and any accessibility surface onto agent utility functions, negotiation-protocol primitives, and shared semantic ontologies. The architecture is instantiated in simulation only: no live agent-to-agent endpoints, no runtime large language models, no cross-organisation interoperability experiment; these are explicitly listed as next steps. A 12-borough London case study, evaluated over N = 30 seeds on an origin–destination matrix calibrated to the 2021 UK Census commuter-flow aggregates, benchmarks four regimes (historic, adaptive, MaxPressure, and our Agentic policy) across four scenarios covering equity, corridor prioritisation, incident response, and their combination. The Agentic regime reduces the accessibility-deficit Gini coefficient by 23–58%, the travel-time coefficient of variation by up to 41%, and mean travel time by 4–9% relative to the historic baseline; in the joint equity-plus-incident scenario it attains the per-column best within 95% confidence on travel-time coefficient of variation and mean travel time while improving on every metric over the historic baseline.
A microscopic testbed in the SUMO simulator on a 4×4 signalised grid (120 runs across three demand regimes and five peripheral-boost values) traces an explicit equity–efficiency Pareto frontier; in the saturation and over-saturation regimes the agentic policy matches or beats a SCOOT-style adaptive controller on mean travel time and throughput at every boost level, with travel-time variance reduced by up to a third.
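The two equity and reliability metrics reported in this abstract are standard and easy to reproduce. The sketch below, with hypothetical per-borough deficit and per-trip travel-time values (the paper's actual data and borough definitions are not given here), shows how an accessibility-deficit Gini coefficient and a travel-time coefficient of variation could be computed:

```python
import numpy as np

def gini(x):
    """Gini coefficient of a non-negative 1-D array (0 = perfect equality)."""
    x = np.sort(np.asarray(x, dtype=float))  # ascending sort required by the identity below
    n = x.size
    total = x.sum()
    if n == 0 or total == 0:
        return 0.0
    # Rank-weighted identity: G = (2 * sum(i * x_i) / sum(x) - (n + 1)) / n
    ranks = np.arange(1, n + 1)
    return (2.0 * np.sum(ranks * x) / total - (n + 1)) / n

def coeff_of_variation(t):
    """Travel-time coefficient of variation: population std dev / mean."""
    t = np.asarray(t, dtype=float)
    return t.std() / t.mean()

# Hypothetical per-borough accessibility deficits and per-trip travel times (minutes)
deficits = [0.1, 0.2, 0.3, 0.9]
travel_times = [20.0, 25.0, 22.0, 40.0]

print(f"Gini: {gini(deficits):.3f}")                      # higher = more unequal access
print(f"CoV:  {coeff_of_variation(travel_times):.3f}")    # higher = less reliable travel times
```

A lower post-intervention Gini on the deficit vector corresponds to the "23–58% reduction" claim, and a lower CoV to the travel-time reliability claim; the percentage change is simply 1 minus the ratio of the after/before metric values.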



Preprints.org is a free preprint server supported by MDPI in Basel, Switzerland.
