Article
Biology and Life Sciences
Agricultural Science and Agronomy

Nsalambi V. Nkongolo, Darceline Anangi Mokea, Maria Luisa Fernandez-Marcos

Abstract: Plant species can significantly influence soil nutrients. We assessed how soil micronutrients (B, Fe, Cu, Zn, Mn) and aluminum were affected by plant species in agricultural fields at Masako Forest Reserve. Soil samples were collected in June 2022 and 2023 at three depths (0–10, 10–20, and 20–30 cm) in fields grown to Costus lucanusianus, Manihot esculenta, Zea mays, Triumfetta cordifolia, and Xanthosoma sagittifolium. A completely randomized design was used with 3 soil depths (SD) × 5 plant species (PS), replicated 4 times. Soil samples were air-dried, sieved to 2 mm, and sent to Brookside Laboratories (OH, USA) for analyses. Results showed that in 2022, Fe, Mn, Cu, and Zn were significantly affected by soil depth (p < 0.05). Mn, Cu, and Zn concentrations were higher at 0–10 cm, while Fe dominated at 10–30 cm. Only Cu (p = 0.0001) was affected by plant species: soil under Xanthosoma sagittifolium (0.19 mg/kg) and Triumfetta cordifolia (0.47 mg/kg) had significantly the lowest Cu levels. In 2023, however, only Zn was significantly affected by SD (p = 0.0004), with its highest level (1.22 mg/kg) at 0–10 cm. PS significantly affected Fe, Mn, Cu, Zn, and aluminum (p < 0.05). Soil under Manihot esculenta had the highest Fe concentration (171.42 mg/kg), while Zn (1.03 mg/kg) was significantly higher in soil under Zea mays. The 0–10 cm layer held significantly more micronutrients, and plant species such as Manihot esculenta had a noticeable effect on soil micronutrients.
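The depth-effect p-values above come from F-tests on the replicated design. A minimal sketch of the one-way F-statistic for a depth effect on Zn, using hypothetical replicate values (not the study's raw data), is:

```python
import numpy as np

# Hypothetical Zn concentrations (mg/kg), 4 replicates per depth, chosen only
# to illustrate the F-test behind the reported depth effect (p = 0.0004).
rng = np.random.default_rng(1)
zn = {
    "0-10":  1.22 + 0.05 * rng.normal(size=4),
    "10-20": 0.80 + 0.05 * rng.normal(size=4),
    "20-30": 0.75 + 0.05 * rng.normal(size=4),
}
groups = list(zn.values())
k = len(groups)                          # number of depth levels
n = sum(len(g) for g in groups)          # total observations
grand = np.concatenate(groups).mean()
ss_between = sum(len(g) * (g.mean() - grand) ** 2 for g in groups)
ss_within = sum(((g - g.mean()) ** 2).sum() for g in groups)
F = (ss_between / (k - 1)) / (ss_within / (n - k))   # F with (2, 9) df
print(round(F, 1))   # a large F indicates a depth effect
```

With the study's full 3 × 5 factorial, the same sums-of-squares decomposition is extended to two factors and their interaction.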

Article
Public Health and Healthcare
Public Health and Health Services

Melody Moezzi, Bjørn M. Hofmann

Abstract: Background/Objectives: Artificial intelligence (AI) is increasingly integrated into dental diagnostics, promising improved detection, efficiency, and patient communication. While these developments offer potential clinical benefits, emerging commercial applications raise important ethical concerns. This study explores how providers of diagnostic AI systems frame their technologies in marketing materials, with particular attention to features designed to influence patient acceptance and increase revenue. Methods: An exploratory qualitative thematic analysis of publicly available promotional content from leading dental AI companies was conducted between September and October 2025. Materials were analyzed for recurring rhetorical strategies related to commercialization, persuasion, technological authority, and representations of objectivity. Ethical interpretation was guided by principlism, standard codes in professional ethics, and virtue-based perspectives. Results: AI is frequently marketed not only as a diagnostic aid but also as a tool for boosting case acceptance, return on investment, and practice growth. Visualizations and performance metrics are used rhetorically to position AI as authoritative and objective, encouraging patient compliance while downplaying uncertainty and potential harms. These practices risk undermining patient autonomy, promoting diagnostic inflation and overtreatment, and compromising professional integrity by shifting attention from patient welfare toward commercial outcomes. Conclusion: Pervasive marketing of persuasive diagnostic AI amplifies existing tensions between professional obligations and commercial incentives in dentistry. Without appropriate safeguards, AI risks reinforcing a transactional model of care in which patients are treated as consumers and diagnostics become instruments of persuasion.
To preserve trust and ethical practice, dentists and professional organizations must ensure that AI remains a supportive clinical tool rather than a commercial device, prioritizing transparency, informed consent, and patient-centered care.

Review
Biology and Life Sciences
Biophysics

Zeno Földes-Papp

Abstract: This article addresses a current point of contention in the field of single molecule/single particle tracking, as well as relevant literature, and supplements it with some published cell-based experiments to illustrate our conclusions and known theorems. We attempt to explain the controversy surrounding the differing biophysical and cell biological results of studies on the individual molecule and those “at the single-molecule level” as well as at the level of many molecules in such a way that even readers who are unfamiliar with the subject can understand it without having to read all the mathematical, physical, and biophysical references. Given this abundance of studies in the literature, it is obvious that genuine single-molecule studies are urgently needed, i.e., single-molecule studies that focus on increasing the sensitivity of the temporal resolution of single-molecule measurements and not just on spatial resolution.

Review
Medicine and Pharmacology
Oncology and Oncogenics

Hui Wang, Jiaming Cui, Caiyun Niu, Chengpeng Zhao, Lihui Guo, Weiyang Zhang, Zhicheng He, Zexing Song, Mengwei Tao

Abstract: Non-mass lesions in breast ultrasound refer to abnormalities that exhibit different echogenicity from surrounding tissues but lack a distinct mass shape. Malignant breast lesions may also present as non-mass lesions on ultrasound, making accurate detection and diagnosis crucial. Currently, “non-mass breast lesions” are not included in the ultrasound terminology of the 5th edition Breast Imaging Reporting and Data System (BI-RADS). Although multiple classification systems have been proposed in the literature, there remains no standardized ultrasound definition or malignancy risk grading for non-mass lesions. The ultrasound features of benign and malignant non-mass breast lesions are often subtle and partially overlapping, complicating differential diagnosis and impacting clinical evaluation and management. The authors reviewed definitions and classification systems for non-mass breast lesions in the literature, summarized their ultrasound features, and introduced the diagnostic applications and value of ultrasound technologies, aiming to enhance the diagnostic proficiency of sonographers in evaluating non-mass breast lesions. This paper reviews the ultrasound definitions and classifications of non-mass breast lesions, exploring the correlation between their ultrasound features and pathological histology as well as malignancy risk. It also discusses the diagnostic value of conventional ultrasound, automated breast ultrasound, ultrasound elastography, and contrast-enhanced ultrasound for non-mass breast lesions. Finally, it compares the diagnostic accuracy of various ultrasound-guided needle biopsy techniques for non-mass lesions. By deepening their understanding and mastery of non-mass breast lesions, sonographers can enhance diagnostic accuracy and improve their capabilities in image analysis and clinical interpretation.

Article
Physical Sciences
Quantum Science and Technology

Cheng Jinjun, Cheng Dian

Abstract: This paper deepens and extends the authors' 2019 publication A Hypothesis on the Spatial Motion Mode of Photons. It should be stated explicitly that this paper falls within the category of natural-philosophical thought experiments: its core value lies in constructing a unified physical image of the nature of light through rigorous logical deduction and in proposing verifiable theoretical hypotheses and experimental schemes; the validity of all conclusions must ultimately be verified by rigorous and extensive scientific experiments before being incorporated into the theoretical system of physics. As a foundational concept of quantum mechanics, the wave-particle duality of light has been accompanied by profound philosophical perplexities and theoretical tensions since its proposal, becoming a core bottleneck in the integration of classical and quantum physics. This paper systematically sorts out the logical incompleteness of the current quantum interpretation system, including the self-negation of the complementarity concept, the problem of photon localization, the fundamental opposition between the statistical and non-statistical interpretations of the wave function, and the philosophical controversy over the Heisenberg Uncertainty Principle, revealing the inherent contradictions of the traditional wave-particle duality framework. On this basis, adopting classical physical images and reductio ad absurdum reasoning, and starting from six axioms and six preparatory propositions, this paper puts forward a natural-philosophical hypothesis on the essence of photons: a photon is an energetic mass point with a diameter smaller than the Planck length, moving in a uniform spiral linear motion through space.
The paper deduces the core characteristics such as velocity, frequency, and wavelength of the photon's uniform spiral linear motion, and designs three operable, repeatable, and quantifiable physical experimental schemes to provide specific paths for the empirical verification of the hypothesis. The research deduces that the angular momentum of photon spatial motion (excluding photon spin motion) is always the reduced Planck constant ℏ, the energy E=mc² is naturally unified with E=hν (the standard formula for wave energy), and the standard expression of the Heisenberg Uncertainty Principle ΔxΔpₓ≥ℏ/2 can be given a classical physical interpretation from the perspective of superposition of measurement deviations. This paper systematically responds to potential questions regarding the origin of photon particle nature, wave nature, and compatibility with relativity, arguing that the hypothesis provides a logically consistent and clearly visualized path for understanding the nature of light, builds a new natural philosophical framework for the integration of quantum and classical theories of light, and also offers a new thinking perspective for the paradigm shift in the study of the nature of light.

Article
Physical Sciences
Space Science

Fatemeh Fazel Hesar, Mojtaba Raouf, Amirmohammad Chegeni, Peyman Soltani, Bernard Foing, Elias Chatzitheodoridis, Michiel J.A. de Dood, Fons J. Verbeek

Abstract: We present an innovative, cost-effective framework integrating laboratory Hyperspectral Imaging (HSI) of the BECHAR 010 lunar meteorite with ground-based lunar HSI and supervised Machine Learning (ML) to generate high-fidelity mineralogical maps. A 3 mm thin section of BECHAR 010 was imaged under a microscope with a 30 mm focal-length lens at a 150 mm working distance, using 6× binning to increase the signal-to-noise ratio, producing a data cube (X × Y × λ = 791 × 1024 × 224; 0.24 mm × 0.2 mm resolution) across 400–1000 nm (224 bands, 2.7 nm spectral sampling, 5.5 nm FWHM spectral resolution) with a Specim FX10 camera. Ground-based lunar HSI captured with a Celestron 8SE telescope (3 km/pixel) yielded a data cube of 371 × 1024 × 224. Solar calibration using a Spectralon reference (99% reflectance, <2% error) ensured accurate reflectance spectra. A Support Vector Machine (SVM) with a radial basis function kernel, trained on expert-labeled spectra, achieved 93.7% classification accuracy (5-fold cross-validation) for olivine (92% precision, 90% recall) and pyroxene (88% precision, 86% recall) in BECHAR 010. Local Interpretable Model-agnostic Explanations (LIME) identified key wavelengths (e.g., 485 nm, 22.4% for M3; 715 nm, 20.6% for M6) across 10 pre-selected regions (M1 to M10), indicating olivine-rich (Highland-like) and pyroxene-rich (Mare-like) compositions. Spectral Angle Mapper (SAM) analysis revealed angles from 0.26 rad to 0.66 rad, linking M3 and M9 to Highlands and M6 and M10 to Mares. K-means clustering of the lunar data identified 10 mineralogical clusters (88% accuracy), validated against Chandrayaan-1 Moon Mineralogy Mapper (M³) data (140 m/pixel, 10 nm spectral resolution). A novel push-broom HSI approach with the telescope achieves 0.8 arcsec resolution for lunar spectroscopy, inspiring full-sky multi-object spectral mapping.
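The SAM angles quoted above follow the standard Spectral Angle Mapper definition: the angle between two reflectance spectra treated as vectors, which is insensitive to overall illumination scaling. A minimal sketch with made-up spectra (not the meteorite data):

```python
import numpy as np

def spectral_angle(a, b):
    """Spectral Angle Mapper: angle in radians between two reflectance spectra."""
    a = np.asarray(a, dtype=float)
    b = np.asarray(b, dtype=float)
    cos = np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b))
    return float(np.arccos(np.clip(cos, -1.0, 1.0)))  # clip guards rounding

ref = np.array([0.2, 0.4, 0.6, 0.8])       # illustrative reference spectrum
scaled = 2.5 * ref                          # brighter copy of the same material
other = np.array([0.8, 0.6, 0.4, 0.2])     # dissimilar spectrum
print(spectral_angle(ref, scaled))          # ~0: SAM ignores brightness
print(round(spectral_angle(ref, other), 3))
```

Small angles (such as the 0.26 rad reported for M3/M9) indicate spectra close to the reference material; larger angles indicate compositional differences.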

Article
Biology and Life Sciences
Immunology and Microbiology

Igor D. Zlotnikov, Alexander A. Vinogradov, Elena V. Kudryashova

Abstract: The secondary immunomodulatory effects of conventional therapeutics, such as antibiotics and cytostatics, are frequently overlooked despite their significant clinical implications. Building on our previous findings that drugs like paclitaxel and doxorubicin heavily influence macrophage polarization—potentially driving metastasis or inflammation—this study systematically evaluates the secondary immune-modulating actions of standard drugs and natural adjuvants. Using patient-derived bronchoalveolar lavage (BAL) fluid (ex vivo alveolar macrophages), we developed an analytical platform using synthetic carbohydrate-functionalized fluorescent ligands targeting key receptors (CD206, CD209, CD280, CD301). Integrating ligand-binding profiles with Linear Discriminant Analysis (LDA) yielded quantitative immune-state vectors capable of differentiating between favorable prognostic signatures and imbalanced immune states. Profiling samples across heterogeneous respiratory conditions revealed highly context-dependent responses. While some treatments synergistically corrected imbalanced profiles, others provoked dysregulation. Notably, in pneumonia or bronchitis with an asthma-prone M2-dominant profile, specific antibiotic regimens are critical; doxycycline, for instance, may exacerbate patient deterioration by further driving M2a polarization. Crucially, we identified that natural adjuvants (e.g., curcumin, coumarins, polyphenols) exhibit potent properties capable of correcting these adverse secondary drug effects. Ultimately, this profiling platform highlights the necessity of evaluating patient-specific secondary drug effects, offering a functional blueprint for precision immunotherapy and the rational design of adjuvant-enhanced treatments.

Article
Public Health and Healthcare
Public Health and Health Services

Andri Hondir, Ferdi Antonio, Natalin Allorerung, Suriadi Huang

Abstract: This study aims to explore the implementation of integrated medical services and social compassion in charity-based patient care at Tzu Chi Hospital Jakarta. The study focuses on healthcare practices that emphasize empathy, humanitarian values, and social support for patients from vulnerable populations. A qualitative approach with a case study design was employed. Data were collected through document analysis, observation of healthcare professionals and patients, and in-depth interviews with the hospital management that provides charity-based care. The findings indicate that the integration of medical services and social compassion is realized through equal and non-discriminatory clinical care, empathetic communication, social assistance, and administrative as well as spiritual support for charity patients. This integrated approach enhances patients’ sense of safety, trust, and adherence to treatment. Moreover, compassion emerges as an institutional value and work culture that strengthens the professionalism and commitment of healthcare workers. This study provides insight that compassion-based healthcare offers a holistic and socially oriented model of care in private hospitals. The integration of medical and social dimensions can improve the quality of healthcare services, enhancing equity and access to care for vulnerable groups in hospitals.

Article
Chemistry and Materials Science
Organic Chemistry

Vladislav S. Polyakov, Yuri K. Grishin, Ekaterina S. Ivanova, Alexander A. Shtil, Elena K. Beloglazkina

Abstract: Aiming at p53-reactivating compounds, a convergent scheme for the preparation of conjugates bearing dispiro-indolinone-pyrrolidine-thioimidazolone and glutarimide moieties connected via a triazole-containing linker was proposed. Target conjugates were synthesized by azide-alkyne cycloaddition reactions between propargylthio-substituted dispiro-indolinone-pyrrolidine-imidazolones and an azido-glutarimide derivative. The starting compounds were available isothiocyanates, glycine, substituted benzaldehydes, chloroacetamide, and ethyl acrylate. The key azide-alkyne cycloaddition step was carried out using TBTA as a catalyst, achieving >70% product yields. The resulting bifunctional compounds contained a dispiroindolinone fragment (a p53-MDM2 interaction inhibitor) and glutarimide, a ubiquitin ligase ligand. The dispiroindolinone-glutarimide conjugate with 5-bromoisatin and 4-bromophenyl moieties showed potential for p53 reactivation, as determined by preferential cytotoxicity against HCT116 colon carcinoma cells (wild-type p53) compared to the isogenic HCT116 p53-/- subline.

Article
Medicine and Pharmacology
Oncology and Oncogenics

Thibault Gauduchon, Jérôme Fayette, Mona Amini-Adle, Eve-Marie Neidhart-Berard, Mehdi Brahmi, Armelle Dufresne, Margaux Dupont, Clelia Coutzac, Axel De Bernardi, Philippe Toussaint, +7 authors

Abstract: Immune checkpoint inhibitors such as anti–PD-1 antibodies are essential in cancer therapy. Emerging data suggest that lower doses may be effective and more economical, though further evidence is needed. We conducted a retrospective study at Centre Léon Bérard to assess the efficacy and safety of low-dose nivolumab (20 mg every three weeks) in patients with advanced cancer, mainly squamous cell carcinomas (SCC). Between 2023 and 2024, 53 patients were treated, with a median age of 74 years; 39.6% were over 80. Most were male (64%) and had ECOG ≥2 (69.9%). Primary tumor sites included cutaneous SCC (34%), head and neck SCC (32%), and soft tissue sarcoma (15%). After a median follow-up of 8.3 months, median overall survival was 7.5 months. The objective response rate (ORR) was 20.8% overall, rising to 35.3% in cutaneous SCC and 23.5% in head and neck SCC—comparable to standard-dose nivolumab. Toxicity was manageable: 18.7% experienced immune-related adverse events, with only 3.7% grade 3. Low-dose nivolumab demonstrates encouraging efficacy and tolerability in a frail population, supporting its potential role in resource-limited settings. Prospective trials are warranted to confirm these findings in broader populations.

Article
Computer Science and Mathematics
Artificial Intelligence and Machine Learning

Mohsen Mostafa

Abstract: We present HPST, a unified framework that integrates symbolic theorem proving, statistical physical analysis, and graph neural networks (GNNs) for robust and interpretable modeling of fluid flows. HPST combines an associative-commutative (AC) matching rewriting engine to verify algebraic identities, a data-driven module to compute physical invariants (e.g., Bernoulli’s principle and adaptive thresholds), and a GNN surrogate based on EdgeConv layers that learns velocity fields from scattered point clouds. Extensive experiments on a synthetic cylinder wake dataset (40,000 points) demonstrate that HPST successfully proves three fundamental theorems (commutativity, associativity, distributivity) and that the optimized GNN achieves a coefficient of determination up to R² = 0.208, with an average R² = 0.164 ± 0.03 over multiple runs, while also reducing the mean absolute divergence, a measure of physical consistency, to 0.27. Comparison with baseline models (k-nearest neighbors, linear regression) shows that HPST offers competitive accuracy while providing interpretability and a layer of mathematical verification. The framework’s modular design and robust performance make it directly applicable to industrial scenarios such as aerodynamic shape optimization, automotive drag prediction, and wind-farm layout planning. All experiments were conducted on a Kaggle environment with an NVIDIA P100 GPU (16 GB RAM), and the complete source code is publicly available.
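The two evaluation metrics quoted above are standard: the coefficient of determination R², and mean absolute divergence as a physical-consistency proxy. A minimal sketch on synthetic data (not the authors' code or dataset):

```python
import numpy as np

def r2(y_true, y_pred):
    """Coefficient of determination: 1 - SS_res / SS_tot."""
    ss_res = np.sum((y_true - y_pred) ** 2)
    ss_tot = np.sum((y_true - y_true.mean()) ** 2)
    return 1.0 - ss_res / ss_tot

def mean_abs_divergence(u, v, dx=1.0, dy=1.0):
    """|du/dx + dv/dy| averaged over a grid (central differences)."""
    div = np.gradient(u, dx, axis=1) + np.gradient(v, dy, axis=0)
    return float(np.abs(div).mean())

y_true = np.linspace(0.0, 1.0, 100)
y_pred = y_true + 0.1 * np.sin(10 * y_true)   # imperfect surrogate prediction
print(round(r2(y_true, y_pred), 3))

# A divergence-free field (u = y, v = x) scores ~0 on the consistency proxy.
X, Y = np.meshgrid(np.linspace(0, 1, 32), np.linspace(0, 1, 32))
print(mean_abs_divergence(Y, X))
```

For incompressible flow, lower mean absolute divergence means the predicted velocity field better respects mass conservation, which is why it is reported alongside R².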

Article
Chemistry and Materials Science
Biomaterials

Gulzeynep Begimova, Aishat Kuldanova, Irina Kuxina, Nazira Chinibekova

Abstract: This study focused on the development and characterization of bioactive polymeric patches based on agar–chitosan and gellan–chitosan matrices, with and without naringin, aiming to identify formulations with optimal physicochemical and biological performance. FTIR spectroscopy, thermogravimetric (TGA), and differential scanning calorimetry (DSC) analyses confirmed effective crosslinking, stable incorporation of the bioactive compound, and high thermal stability of the patches. Antimicrobial testing against Staphylococcus aureus ATCC 33591 demonstrated that naringin-loaded agar–chitosan films, particularly those with lower chitosan and glutaraldehyde content, exhibited significant activity (MIC = 12.5 mg/mL; inhibition zone 27.67 ± 0.58 mm). Biocompatibility studies, including local skin irritation in rabbits and 28-day topical application in mice, showed no adverse effects. Anti-inflammatory evaluation using the λ-carrageenan-induced paw edema model indicated modest activity of naringin under acute conditions. Overall, agar–chitosan films offered tunable properties and reproducible bioactive incorporation, while gellan–chitosan films provided mechanically robust matrices suitable for further optimization. The results highlight the potential of agar–chitosan patches as biocompatible, structurally stable, and antimicrobial platforms for topical and transdermal delivery of bioactive flavonoids.

Article
Medicine and Pharmacology
Ophthalmology

Noriko Toyokawa, Kaoru Araki-Sasaki, Hideya Kimura, Shinichiro Kuroda

Abstract: Background/Objectives: A disadvantage of Descemet stripping automated endothelial keratoplasty (DSAEK) in eyes with prior glaucoma filtration surgery is the difficulty in maintaining air tamponade during the procedure. Herein, we report the use of bleb compressive sutures to manage air tamponade in the anterior chamber during DSAEK in eyes with blebs following trabeculectomy. Methods: This retrospective case series included 34 eyes of 33 patients that developed bullous keratopathy following trabeculectomy. Bleb compression suturing was performed using a 10-0 nylon suture in eyes with an intraocular pressure (IOP) < 10 mmHg or a fragile ischemic bleb. Postoperative IOP, air ingress into the bleb, rebubbling, bleb leakage, and bleb damage were evaluated. Results: Of the 34 eyes, 13 underwent bleb compression suturing before DSAEK (suture group), whereas 21 did not (non-suture group). Mean preoperative IOP was lower in the suture group than in the non-suture group, whereas postoperative IOP at 2 h was similar. IOP increased from preoperative to 2 h postoperative by 18 ± 9.3 and 11.7 ± 3.1 mmHg in the suture and non-suture groups, respectively, with no significant difference. At 2 h postoperatively, two eyes in the suture group and one eye in the non-suture group exhibited an IOP spike (≥30 mmHg). One eye in the non-suture group required rebubbling owing to air ingress into the bleb. Postoperatively (1–2 weeks), mean IOP was 7.1 ± 3.2 and 9.4 ± 4.6 mmHg in the suture and non-suture groups, respectively. Preoperative and postoperative IOP did not differ significantly in either group. No suture-related complications were observed. Conclusion: In DSAEK for eyes with blebs, bleb compression suturing provides effective air tamponade during graft adhesion.

Article
Physical Sciences
Mathematical Physics

Yosef Akhtman

Abstract: We present a strictly finitist formulation of Schrödinger-type and Dirac-type dynamics in the Finite Ring Continuum, together with exact information counts for reversible and compressive shell maps. The construction uses a symmetry-complete prime field, its quadratic extension, and the Frobenius involution to define finite Hermitian state spaces and finitist Hamiltonians. On Euclidean shells, continuum time evolution is replaced by a Cayley update that preserves the Hermitian form exactly and therefore produces periodic trajectories. On the Lorentzian extension, we construct explicit gamma matrices, a finitist Dirac operator, its associated Klein-Gordon factorization, and a covariant lifted boost action. To connect the formalism with entropy and information theory without leaving strict finiteness, we measure finite maps by their image counts and exact loss factors. This separates reversible transformations, which preserve distinguishability exactly, from shell power maps, which merge states by a computable arithmetic factor. All results are finite, algebraic, and exact; no limits, differential calculus, or continuum structures are used.
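The paper's Cayley update acts over finite fields; as an illustrative continuum analogue (an assumption for demonstration, not the paper's construction), the complex Cayley transform of a Hermitian matrix shows the exact norm preservation the abstract describes:

```python
import numpy as np

# For Hermitian H, the Cayley transform U = (I - iH/2)^{-1} (I + iH/2)
# is exactly unitary, so <psi|psi> is conserved at every update step.
rng = np.random.default_rng(0)
n = 4
A = rng.normal(size=(n, n)) + 1j * rng.normal(size=(n, n))
H = (A + A.conj().T) / 2                       # Hermitian "Hamiltonian"
I = np.eye(n)
U = np.linalg.solve(I - 0.5j * H, I + 0.5j * H)  # Cayley transform of H

psi = rng.normal(size=n) + 1j * rng.normal(size=n)
norm0 = np.vdot(psi, psi).real
for _ in range(1000):                          # many update steps
    psi = U @ psi
drift = abs(np.vdot(psi, psi).real - norm0)
print(drift)                                   # tiny: only rounding error
```

In the finite-field setting the same algebraic identity holds exactly, with no rounding at all, which is what makes the trajectories periodic.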

Article
Computer Science and Mathematics
Computer Science

R Karthick

Abstract: The real-time nature of the digital world has outpaced cloud-based marketing analytics: latency and privacy issues hinder optimization of the customer journey. This paper presents an Edge AI conversion-modelling framework which deploys lightweight transformer-conformer hybrids on user devices for dynamically optimizing TOFU-to-BOFU funnel dynamics using an intent-based inference mechanism. The model combines text, voice, and behavioural data via on-device processing to predict conversion likelihood and recommend stage-specific actions, such as personal nurturing for MOFU leads or urgency tactics for BOFU closures. The problem is formulated as reinforcement learning over Markov decision processes, maximizing revenue and lifetime value by minimizing funnel drop-offs. The system achieves a 32% uplift in return on ad spend (ROAS) across a suite of simulated e-commerce and SaaS campaigns. Key innovations include quantized edge inference at under 50 ms latency, federated updates for scalability, and privacy-preserving synchronization. Evaluations on a 1M-session dataset show that the approach outperforms centralized baselines in accuracy (92% intent detection) and responsiveness, addressing critical gaps in intent-driven marketing. This work lays the groundwork for self-sufficient revenue engines.

Article
Computer Science and Mathematics
Computer Science

Jesse Van Griensven, Victor Oliveira Santos, Bahram Gharabaghi

Abstract: The literature indicates that the qubit requirements for factoring RSA-2048 remain on the order of 1 million under commonly assumed architectures and error-correction models, leaving a substantial gap between current resource estimates and near-term practical feasibility. Reducing this requirement to the low-thousands-qubit regime therefore remains an important open research objective. This work proposes a hybrid classical-quantum algorithm that combines a classical modular exponentiation subroutine with a Quantum Number Theoretic Transform (QNTT) circuit to increase speed and reduce the required quantum resources relative to Shor’s algorithm for integer factorization, which underpins cryptographic systems like RSA and ECC. We evaluate multiple semiprime moduli (products of two primes) in both simulation and on real quantum hardware, using IBM’s reference Shor implementation as the baseline. Because Shor and the proposed Jesse–Victor–Gharabaghi (JVG) algorithm use different register sizes for the same modulus N, the reported gate/depth reductions should be interpreted as end-to-end quantum-resource budgets for factoring the same N, rather than a per-qubit or transform-only efficiency claim. In simulation, the JVG algorithm achieved substantial practical reductions in computational resources, decreasing runtime from 174.1 s to 5.4 s, memory usage from 12.5 GB to 0.27 GB, and quantum gate counts by approximately 99%. On quantum hardware, JVG reduced runtime from 67.8 s to 2 s and quantum gate counts by over 98%. We showed that the proposed algorithm can address an RSA-1024-relevant scenario, establishing that the method can be validated at large scale. Furthermore, extrapolation to RSA-2048 indicates that the JVG algorithm significantly outperforms Shor’s approach, requiring a projected quantum runtime of 29 hours for ten thousand runs under identical scaling assumptions.
Overall, these results support JVG as a more hardware-compatible, noise-tolerant substitute for Shor’s framework, offering a viable research direction toward practical quantum integer factorization on near-term Noisy Intermediate-Scale Quantum (NISQ) devices.
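Both Shor's algorithm and hybrid variants like the one described share the same classical post-processing: once the multiplicative order r of a base a modulo N is known, the factors of a semiprime N follow from gcd computations. A minimal sketch on the textbook case N = 15 (not the JVG circuit itself):

```python
from math import gcd

def order(a, N):
    """Smallest r > 0 with a^r ≡ 1 (mod N); assumes gcd(a, N) == 1."""
    x, r = a % N, 1
    while x != 1:
        x = (x * a) % N
        r += 1
    return r

N, a = 15, 7
r = order(a, N)                     # order of 7 mod 15 is 4
p = gcd(pow(a, r // 2, N) - 1, N)   # gcd(48, 15) = 3
q = gcd(pow(a, r // 2, N) + 1, N)   # gcd(50, 15) = 5
print(r, p, q)                      # -> 4 3 5
```

The quantum part of any Shor-type method exists only to find r efficiently; this brute-force loop is exponential in the bit length of N, which is exactly the step the quantum circuit replaces.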

Article
Computer Science and Mathematics
Artificial Intelligence and Machine Learning

Qianxi Liu, Ye Zhang, Sheng Chen, Zhaocheng Liu, Yuqiu Xu, Hengguang Cui

Abstract: This paper addresses the issues of insufficient credibility in attribution inference and susceptibility to noise-induced fluctuations in budget allocation in multi-touchpoint marketing scenarios. It proposes a unified framework for marketing attribution inference and budget decision-making agents that incorporates uncertainty modeling. The method uses user interaction paths as sequence input, generating touchpoint weights through sequence encoding and importance modeling. Simultaneously, it outputs the expected incremental contribution and uncertainty characterization at the channel level, extending attribution results from single-point estimation to distributed signals usable for risk measurement. At the decision-making end, a risk-aware budget optimization objective is constructed, coupling contribution expectation and uncertainty penalty into the budget allocation process. Smoothing constraints are introduced to suppress frequent adjustments, forming a closed-loop update mechanism from data to attribution to budget, enabling the strategy to achieve a balance between revenue and stability under constraints. Multi-touchpoint path and cost characteristics are constructed based on publicly available programmatic advertising datasets. An evaluation system covering attribution error, probability calibration, and budget stability is designed. Comparative experiments verify the framework's comprehensive advantages in attribution reliability and budget decision quality, demonstrating the crucial role and engineering usability of uncertainty in the attribution-to-decision transmission process.
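The coupling of contribution expectation, uncertainty penalty, and smoothing constraint described above can be illustrated with a toy allocation step. The numbers are hypothetical and the softmax-plus-clipping rule is an illustrative stand-in, not the paper's optimizer:

```python
import numpy as np

def allocate(mu, sigma, prev, total=1.0, lam=0.5, max_step=0.1):
    """Risk-aware budget step: reward expected contribution (mu), penalize
    attribution uncertainty (sigma), and cap the change from the previous
    allocation to suppress noise-induced fluctuations."""
    score = np.asarray(mu) - lam * np.asarray(sigma)     # uncertainty-penalized value
    w = np.exp(score - score.max())
    target = total * w / w.sum()                         # softmax target allocation
    delta = np.clip(target - prev, -max_step, max_step)  # smoothing constraint
    out = prev + delta
    return out * (total / out.sum())                     # renormalize to the budget

mu = [0.30, 0.25, 0.10]            # expected incremental contribution per channel
sigma = [0.02, 0.20, 0.05]         # attribution uncertainty per channel
prev = np.array([1/3, 1/3, 1/3])   # current even split
new = allocate(mu, sigma, prev)
print(new.round(3))                # channel 2's high uncertainty pulls budget away
```

Channel 2 has a mean contribution close to channel 1's, but its wide uncertainty band is penalized, so the update favors the well-calibrated channel 1, which is the stability-versus-revenue trade-off the framework formalizes.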

Review
Medicine and Pharmacology
Clinical Medicine

Andreas Kind, Helena Pernice, Gina Barzen, Jan Gröschel, Aurelian Eroni Schumacher, Stefanie Werhahn, Paul Julius Wetzel, Frank Edelmann, Gerhard Hindricks, Katrin Hahn, +1 author

Abstract: Wearable sensors enable continuous recording of electrocardiographic, photoplethysmographic, and inertial signals and have accelerated the development of digital biomarkers in cardiovascular medicine. Transthyretin amyloidosis (ATTR) is a progressive multisystem disease characterized by arrhythmia, conduction disturbances, hemodynamic impairment, autonomic dysfunction, and gait abnormalities, making it theoretically suitable for multimodal wearable monitoring. This review summarizes current knowledge on wearable applications in ATTR, evaluates the plausibility of extrapolating signal-based biomarkers from related cardiovascular and neurological cohorts, and outlines methodological and implementation challenges. ATTR-specific data remain limited to small observational studies, mainly on long-term rhythm monitoring and supervised functional assessment. More comprehensive findings support the extraction of metrics such as atrial fibrillation burden, activity patterns, gait variability, and heart rate variability. However, ATTR-related structural remodeling and high arrhythmia burden may distort conventional digital biomarkers, necessitating disease-specific preprocessing and prospective validation. Wearable monitoring in ATTR is technically feasible and biologically plausible but remains investigational. Before routine integration into care pathways can be recommended, standardized, phenotype-stratified studies are needed that link wearable-derived characteristics to assessed clinical outcomes.

Review
Medicine and Pharmacology
Neuroscience and Neurology

Maria Pina Dore

,

Giuseppe Lasaracina

,

Giovanni Mario Pes

,

Paolo Solla

,

Elettra Merola

Abstract: Objective: Helicobacter pylori infects nearly half of the global population and has traditionally been viewed as a pathogen restricted to the gastric mucosa. Growing evidence, however, suggests that chronic infection may exert systemic effects extending to the central nervous system. This review critically examines the potential neurological implications of H. pylori infection within the emerging framework of the gut–brain axis. Methods: We performed a narrative, hypothesis-generating review of human observational and interventional studies complemented by mechanistic experimental research. The literature was evaluated with particular attention to study design, heterogeneity, and potential confounding in reported associations between H. pylori infection and neurological disorders. Results: Across multiple studies, H. pylori infection has been linked to a modestly increased prevalence of Parkinson’s disease and dementia, although findings remain heterogeneous. In Parkinson’s disease, infection may exacerbate motor fluctuations and reduce levodopa bioavailability, with partial clinical improvement reported following eradication in selected patients. Experimental studies further demonstrate that bacterial outer membrane vesicles can access the brain and promote neuroinflammatory and amyloidogenic processes, supporting biological plausibility. By contrast, several epidemiological studies report an inverse association with multiple sclerosis, suggesting potential immunomodulatory effects. Evidence relating H. pylori to migraine and mood disorders remains inconsistent. Conclusions: Current data do not support H. pylori as a primary cause of neurological disease. Instead, the infection may act as a context-dependent modifier within the complex inflammatory and immunometabolic networks of the gut–brain axis. Clarifying this relationship will require prospective studies integrating microbial strain profiling, biomarker-defined neurological phenotypes, and adequately powered interventional trials.

Article
Environmental and Earth Sciences
Environmental Science

Olha Biedunkova

,

Pavlo Kuznietsov

,

Oksana Tsos

,

Olha Karaim

Abstract: Sustainable development of regional water resources requires objective classification of lake systems according to dominant hydrochemical processes. The aim of the study was to develop a data-driven hydrochemical typology of natural lakes in Polissya based on the Self-Organizing Map (SOM) method to identify functionally distinct water quality regimes and justify management decisions within the basin approach. The study covered nine lakes of different genesis and trophic status. Key water quality indicators were analyzed: total nitrogen (TN), biochemical and chemical oxygen demand (BOD₅, COD), suspended solids (TSS), iron (Fe), and total dissolved solids (TDS). Descriptive statistics, correlation analysis, and neural network SOM modeling with subsequent clustering were applied. The results revealed strong positive correlations between TN, BOD₅, COD, and TSS, indicating joint control by biogenic and organic processes, while TDS showed negative correlations with organic indicators, reflecting mineralization control. SOM classification allowed us to identify three hydrochemical clusters: background systems with low anthropogenic load; organically enriched lakes with intense biogeochemical cycling; and mineralization-controlled water bodies dominated by geogenic factors. It has been established that spatial features of land use and morphometric characteristics (depth, recharge type, hydrological connectivity) determine the sensitivity of lakes to external loads and their location.
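The SOM-plus-clustering workflow this abstract describes can be illustrated with a minimal one-dimensional self-organizing map. Everything below is an illustrative sketch of the generic SOM algorithm, not the study's configuration: the grid size, learning-rate and neighborhood schedules, and the synthetic two-feature "lake" vectors are all assumptions.

```python
import numpy as np

def train_som(data, n_nodes=3, epochs=200, lr0=0.5, radius0=1.0, seed=0):
    """Fit a 1-D self-organizing map: each node holds a prototype
    vector; each sample pulls its best-matching node (and that node's
    grid neighbors) toward itself with decaying strength."""
    rng = np.random.default_rng(seed)
    nodes = rng.normal(size=(n_nodes, data.shape[1]))
    grid = np.arange(n_nodes)
    for t in range(epochs):
        lr = lr0 * (1.0 - t / epochs)                   # decaying learning rate
        radius = max(radius0 * (1.0 - t / epochs), 1e-9)  # shrinking neighborhood
        for x in data[rng.permutation(len(data))]:
            bmu = np.argmin(np.linalg.norm(nodes - x, axis=1))
            h = np.exp(-((grid - bmu) ** 2) / (2 * radius ** 2))
            nodes += lr * h[:, None] * (x - nodes)
    return nodes

def assign(data, nodes):
    """Label each sample with its best-matching node (cluster index)."""
    return np.argmin(np.linalg.norm(data[:, None] - nodes[None], axis=2), axis=1)
```

On standardized indicator vectors, samples from the same hydrochemical regime should map to the same node, which is the sense in which the SOM yields the study's three-cluster typology.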



Preprints.org is a free preprint server supported by MDPI in Basel, Switzerland.


© 2026 MDPI (Basel, Switzerland) unless otherwise stated