Article
Physical Sciences
Astronomy and Astrophysics

Jinwen Hu

Abstract: In this paper we used a new dispersion relation for particles and antiparticles to investigate the baryon asymmetry of the Universe. With this dispersion relation, we found that the number density of antiparticles is greater than that of particles. Since our observable world has a significantly higher number density of particles than of antiparticles, we followed previous work and introduced CPT-odd leptogenesis together with the sphaleron process, which can convert the asymmetry of leptons and antileptons into an asymmetry of baryons and antibaryons; that is, the surplus antileptons can be converted into surplus baryons. Finally, by comparing with the observed baryon asymmetry of today's universe, we obtained a concrete value for Q, the only undetermined parameter in the introduced Lorentz-violation model, and found that it does not exceed the upper bound on Q derived from various experimental results in previous work. However, because the value of Q obtained here is so small, detecting it will be a challenge for future experiments.

Article
Medicine and Pharmacology
Dentistry and Oral Surgery

Keisuke Seki, Minori Kashima, Taiki Akiyama, Atsushi Kobayashi, Ko Dezawa, Yoshimasa Takeuchi, Mika Furuchi, Atsushi Kamimoto

Abstract: The mandibular cortical index (MCI) is a valuable screening tool for osteoporosis on dental panoramic radiographs; however, inter-examiner variability remains a significant challenge. This study aimed to evaluate the diagnostic performance and reproducibility of a closed-type generative AI (NotebookLM, Google) compared with eight dentists of varying experience levels. One hundred radiographs were evaluated in two sessions with an interval of at least two weeks. The intra-examiner reliability for the AI was exceptionally high (κ = 0.987), and its processing speed was approximately six times faster than that of the dentists. However, the agreement between the AI and the dentists remained at "slight agreement" or lower (κ < 0.2), statistically rejecting the null hypothesis of diagnostic equivalence. Notably, a "two-level discrepancy" was observed, where the AI interchanged Class 1 (normal) and Class 3 (severe) in over 10% of cases. In contrast, dentists demonstrated a significant learning effect, with inter-examiner agreement improving between sessions. These results suggest that while generative AI offers superior speed and reproducibility, its current decision-making logic deviates fundamentally from human expert criteria. Future integration should focus on hybrid models where AI serves as a standardized feedback tool while dentists provide final confirmatory diagnoses.
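
The chance-corrected agreement statistic used throughout this study (Cohen's kappa) can be computed directly. A minimal sketch with hypothetical MCI class labels; the ratings below are illustrative, not the study's data:

```python
from collections import Counter

def cohens_kappa(a, b):
    """Chance-corrected agreement between two raters on the same items."""
    n = len(a)
    observed = sum(x == y for x, y in zip(a, b)) / n
    ca, cb = Counter(a), Counter(b)
    expected = sum(ca[l] * cb[l] for l in set(a) | set(b)) / (n * n)
    return (observed - expected) / (1 - expected)

# Hypothetical MCI classifications (Class 1/2/3) for ten radiographs
rater_1 = [1, 1, 2, 3, 1, 2, 2, 3, 1, 2]
rater_2 = [1, 1, 2, 1, 1, 2, 2, 3, 1, 2]   # one Class 1/Class 3 swap
print(round(cohens_kappa(rater_1, rater_2), 3))   # -> 0.839
```

Note that a single "two-level discrepancy" of the kind the study reports already costs several points of kappa; the formula is undefined when both raters assign one identical class to every item (expected agreement of 1).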

Review
Medicine and Pharmacology
Oncology and Oncogenics

Andreu Ivars, Blanca Paño, Josep Puig, María Fresno, Leonardo Rodriguez, Carmen Sebastià, Carlos Nicolau

Abstract: Renal cell carcinoma encompasses a heterogeneous group of kidney tumors with wide variations in biological behavior, histologic subtype, and clinical aggressiveness. Accurate preoperative characterization is essential for management; however, it remains challenging due to overlapping imaging features and tumor complexity. CT is the most widely used imaging technique for renal mass evaluation, providing broad availability, high spatial resolution, and multiphasic acquisition capabilities. However, its ability to distinguish histologic subtypes and predict tumor aggressiveness remains limited. This review provides an updated overview of renal cell carcinoma epidemiology and evidence supporting CT as an essential imaging modality. It outlines key radiologic features of main histologic subtypes, highlights markers of aggressive behavior, and discusses the relationship between CT findings and the International Society of Urological Pathology (ISUP) grading system. We explore radiomics, summarizing its methodological foundations and applications in characterizing solid renal masses, emphasizing the need for multicenter studies and standardized radiomic workflows to develop accurate, reproducible tools for improving diagnostic accuracy and risk stratification for renal cell carcinoma.

Review
Medicine and Pharmacology
Neuroscience and Neurology

Poulami Kar, Dipayan Roy, Bhoomika R. Kar

Abstract: Parkinson’s disease (PD) encompasses marked heterogeneity across motor, cognitive, and non-motor domains, reflecting variable balances of neurodegeneration and compensation across distributed brain circuits. Diffusion MRI tractography enables pathway-specific characterization of white matter alterations and offers a framework for linking clinical subtypes to patterns of degeneration and compensation along individual tracts that are often obscured by skeleton-based or connectomic averaging. Although several tract-specific correlational diffusion studies have linked individual pathways to clinical features and progression, much of the literature has relied on group-level skeleton or network representations, limiting generalizability and reproducibility across subtypes. Here, we synthesize tractography-based evidence across PD subtypes—including tremor-dominant, postural instability/gait difficulty, freezing of gait, and cognitive phenotypes—while situating these findings within a complementary multimodal imaging context. We review diffusion models ranging from diffusion tensor imaging to advanced free-water, neurite and fixel-based frameworks, highlighting how these approaches constrain and interpret tract-level findings and help distinguish degenerative processes from adaptive neuroplasticity. Emerging analytical approaches, including harmonization pipelines, radiomic tractometry (the extraction of along-tract microstructural and radiomic features), and machine learning classifiers, further enhance tract-level sensitivity and reproducibility. Cognitive subtypes illustrate how degeneration of posterior association and limbic tracts, in interaction with non-dopaminergic systems such as cholinergic and noradrenergic pathways, shapes clinical progression. Integrating tractography with molecular, genetic, and functional markers enables subtype-specific biomarkers for risk stratification, prognosis, and targeted therapeutic intervention. 
We propose a conceptual and methodological roadmap for leveraging tractography to refine PD subtype definitions and inform precision neuromodulation and rehabilitation strategies.

Article
Medicine and Pharmacology
Internal Medicine

Margot Evelin Bernedo-Itusaca, Judith Marie Merma-Valero, Tatiana Milagros Cruz-Riquelme, Rocio Milagros Ccorimanya-Suni, Maria Emilia Pancaya-Flores, Zhenia Milagros Guevara-Mamani, Doris Chambi-Rodrigo, Mahely Adriana Coa-Coila, Wilma Apaza-Cansaya, Mirian Milagros Apaza-Quispe, +6 authors

Abstract: Introduction: A major health issue for individuals living in high-altitude regions is an increase in the number of red blood cells (RBCs). This condition generates a series of physiological alterations, including in the nervous system, where damage can occur due to increased blood viscosity. This increased viscosity, in turn, could compromise oxygen uptake, potentially leading to a degree of cognitive impairment. Objective: To determine the association between exposure to chronic hypoxia and sleep quality and the degree of cognitive impairment in a young adult population residing at different altitude levels. Methodology: Two hundred apparently healthy subjects of both sexes, aged 21 to 26 years, permanently residing in four cities at different altitudes—Lima, Arequipa, Puno, and La Rinconada (50 participants per location)—were evaluated. Physiological variables such as oxygen saturation (SpO2), blood pressure (BP), heart rate (HR), and hemoglobin (Hb) and hematocrit (Hct) levels were measured. Cognitive impairment was assessed using the Montreal Cognitive Assessment (MoCA), and sleep quality was assessed using the Pittsburgh Sleep Quality Index (PSQI). ANOVA, chi-square tests, and linear regression models were used to analyze associations. Results: Hemoglobin (Hb) levels increased gradually with altitude, reaching a maximum of 19.47 ± 3.01 g/dL in La Rinconada, while SpO2 decreased to 81.64% at the same site. Moderate to severe cognitive impairment was found exclusively in the La Rinconada population (5100 m), where only 10% of subjects remained unaffected. Regression analysis showed that for each unit increase in Hb, the MoCA score decreased by 0.59 points, indicating that elevated Hb levels were associated with varying degrees of cognitive impairment. No association was found between sleep quality and the degree of cognitive impairment.
Conclusions: Chronic exposure to severe hypoxia (>5000 m) is associated with a greater presence of cognitive impairment, while sleep quality is not associated with any degree of cognitive impairment.
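
The reported regression effect (a 0.59-point MoCA drop per unit increase in Hb) is an ordinary least-squares slope, which can be computed from first principles. A minimal sketch with synthetic (Hb, MoCA) pairs shaped to mimic the reported trend; the data are illustrative, not the study's measurements:

```python
def ols_slope(x, y):
    """Ordinary least-squares slope of y regressed on x."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sxx = sum((a - mx) ** 2 for a in x)
    return sxy / sxx

# Synthetic (Hb g/dL, MoCA score) pairs mimicking the reported trend
hb   = [14.0, 15.5, 17.0, 18.5, 20.0]
moca = [28.0, 27.2, 26.0, 25.4, 24.5]
print(round(ols_slope(hb, moca), 2))   # -> -0.59
```

The slope's sign and magnitude, not the synthetic values, are the point: each additional g/dL of Hb is associated with roughly half a MoCA point lost.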

Review
Public Health and Healthcare
Physical Therapy, Sports Therapy and Rehabilitation

Ali Alali, Harman Bains, Bhavinbhai Patel, Deborah Falla, Andrew Soundy

Abstract: Background: Physical activity is a recommended first‑line treatment for the management of chronic low back pain, yet adherence to structured exercise often remains poor due to pain, fear, fatigue, and contextual barriers. Snacktivity™, which promotes brief, frequent bouts of movement embedded in daily routines, has emerged as a potentially feasible alternative. However, it remains unclear how, why, and for whom Snacktivity supports engagement in physical activity for people living with chronic low back pain. Objective: To develop and refine programme theories explaining how Snacktivity‑type interventions support physical activity engagement and related outcomes in adults with chronic low back pain. Methods: A realist review was conducted following RAMESES standards. Initial programme theories were developed and then tested and iteratively refined through synthesis of quantitative, qualitative, and mixed‑methods evidence from Snacktivity and related sedentary‑reduction interventions in low back pain and other adult populations. Evidence was analysed to identify context–mechanism–outcome configurations. Results: Four Snacktivity-type studies related to low back pain met the inclusion criteria and were used to develop and test the initial programme theories; a further 38 studies contributed evidence to programme theory refinement. Five refined programme theories were supported. Snacktivity appears to enable engagement by lowering perceived burden and threat rather than eliminating fear, generating mastery experiences that enhance self‑efficacy, and reducing symptom interference through brief, distributed activity. Education and coaching components supported meaning‑making by reframing movement as legitimate and achievable, while environmental cues and routines promoted habit formation.
Psychosocial outcomes (confidence, mood, vitality) and habit formation improved more consistently than performance‑based outcomes, and engagement was sustained even when pain or fatigue persisted. Conclusions: Snacktivity functions as a participation‑enabling intervention rather than traditional exercise prescription. Its effectiveness for chronic low back pain is explained by psychosocial and contextual mechanisms that support psychological safety, mastery, and habit formation. These findings support a shift from dose‑response exercise models toward interventions that prioritise feasibility, meaning, and sustained participation in daily life.

Article
Public Health and Healthcare
Public, Environmental and Occupational Health

Piero Zucchelli, Natalie Smith

Abstract: Fatigue is a leading contributor to maritime accidents, yet recreational sailors lack the regulatory frameworks and fatigue management tools available to commercial mariners. Peer-reviewed research published in Nature demonstrates that after 17 hours of sustained wakefulness, cognitive performance degrades to a level equivalent to a blood alcohol concentration of 0.05% — the legal driving limit in most countries (Dawson & Reid, 1997). After 24 hours, this rises to 0.10%, well past the threshold for legal intoxication. These findings have been independently replicated (Williamson & Feyer, 2000) and confirmed in field studies aboard racing yachts (Hurdiel et al., 2014). This paper synthesises more than three decades of peer-reviewed research spanning chronobiology, sleep medicine, occupational health, and maritime safety into a biomathematical fatigue model calibrated specifically for pleasure boat passage-making. The model integrates sleep-wake homeostasis, circadian rhythm modulation, sleep fragmentation effects, environmental sleep degradation from sea state, and cumulative multi-day sleep debt into a single framework that outputs impairment as a BAC (blood alcohol concentration) equivalence — an intuitive metric that any sailor can understand. Critically, the model is not merely theoretical. It has been implemented as a freely available, open-access passage fatigue calculator for mobile and web platforms, making it accessible to the widest possible population of recreational mariners. The application faithfully reproduces every formula, constant, and coefficient described in this paper, allowing sailors to simulate any passage plan — varying crew size, watch schedule, departure time, pre-departure sleep, and sea state — and see the predicted fatigue trajectory hour by hour.
The purpose is to bridge the gap between laboratory science and practical seamanship: to give pleasure boat crews the same evidence-based fatigue awareness that professional mariners receive through regulation.
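
The two replicated anchor points (17 h of wakefulness ≈ 0.05% BAC; 24 h ≈ 0.10%) already suggest a simple piecewise-linear mapping. This is a minimal sketch, not the paper's full biomathematical model: the 10-hour zero-impairment anchor and the linear segments are assumptions of this illustration.

```python
def bac_equivalent(hours_awake):
    """Map hours of sustained wakefulness to a BAC-equivalent impairment
    level by linear interpolation between published anchor points
    (17 h -> 0.05%, 24 h -> 0.10%). The 10 h zero-impairment anchor
    and linearity are assumptions of this sketch."""
    anchors = [(10.0, 0.0), (17.0, 0.05), (24.0, 0.10)]
    if hours_awake <= anchors[0][0]:
        return 0.0
    for (h0, b0), (h1, b1) in zip(anchors, anchors[1:]):
        if hours_awake <= h1:
            return b0 + (b1 - b0) * (hours_awake - h0) / (h1 - h0)
    h0, b0 = anchors[-1]                      # extrapolate beyond 24 h
    return b0 + (0.05 / 7.0) * (hours_awake - h0)

print(bac_equivalent(17.0))            # -> 0.05
print(round(bac_equivalent(20.5), 3))  # -> 0.075
```

A crew member awake since an 05:00 departure would cross the "legal driving limit" equivalence around 22:00 under this mapping, which is the kind of hour-by-hour trajectory the calculator described above exposes.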

Review
Biology and Life Sciences
Biochemistry and Molecular Biology

Béatrice Vibert, Faustine Henot, Oriane Frances, Jérôme Boisbouvier

Abstract: Monoclonal antibodies (mAbs) have been the subject of extensive study in recent years due to their recognition as highly promising therapeutic molecules offering high specificity and a low risk of side effects. Monitoring the structure of these molecules is crucial for developing new therapeutics, characterizing interactions with antigens or receptors, and explaining potential changes in activity between antibody production batches. However, commonly used biophysical approaches provide only low spatial resolution information, and conventional structural biology techniques such as crystallography and cryo-electron microscopy (cryo-EM) are difficult to apply to these highly dynamic proteins. Solution nuclear magnetic resonance (NMR) spectroscopy is the method of choice for structural studies of flexible proteins at atomic resolution; however, it has traditionally been limited to low-molecular-weight biological systems. In this review, we present recent advances in NMR spectroscopy and advanced isotopic labeling methods that have enabled the atomic-resolution study of both the crystallizable (Fc) and antigen-binding (Fab) fragments of antibodies. We show how NMR is becoming a powerful tool for investigating full-length mAbs at an atomic level, opening up new possibilities for the characterization and in-depth quality control of therapeutic antibodies in solution.

Article
Medicine and Pharmacology
Clinical Medicine

Roxana-Cristina Mehedinti, Dorin Ioan Cocoș, Ada Stefanescu, Madalina Nicoleta Matei, Gabriel Valeriu Popa, Dana Tutunaru

Abstract: Prolonged contact between oral mucosa and dental amalgam restorations may influence local epithelial homeostasis, although the immunohistochemical profile of clinically non-dysplastic mucosa exposed to long-term restorative materials remains insufficiently defined. This study investigated histopathological remodeling and the expression patterns of cytokeratin 19 (CK19), Ki67, and p53 in oral mucosal specimens adjacent to long-standing amalgam restorations. A total of 108 specimens were retrospectively analyzed, including 78 samples from mucosa in direct contact with amalgam restorations and 30 control specimens without amalgam exposure. Exposed cases were categorized according to contact duration: 5–10 years, 11–20 years, and ≥21 years. Epithelial and stromal changes were semi-quantitatively assessed, and immunohistochemical staining was evaluated using predefined scoring criteria. An exploratory Integrated Epithelial Remodeling Score (IERS), combining basal hyperplasia, inflammatory infiltrate, CK19 distribution, and Ki67 proliferative index, was used to estimate cumulative remodeling intensity. Longer amalgam exposure was significantly associated with increased inflammatory infiltrate, basal epithelial expansion, suprabasal CK19 expression, and higher Ki67 labeling indices (all p < 0.001). CK19 redistribution showed positive associations with both inflammatory intensity and epithelial proliferative activity. IERS values differed significantly among exposure groups (p < 0.001), with more pronounced remodeling in intermediate- and long-duration exposure categories. p53 expression showed statistically detectable but heterogeneous variation. No epithelial dysplasia was observed. These findings suggest that long-term contact with dental amalgam restorations is associated with a coordinated, non-dysplastic remodeling phenotype of the oral mucosa, characterized by inflammatory activation, CK19 redistribution, and reactive proliferative reinforcement.
In this context, suprabasal CK19 expression may reflect adaptive epithelial plasticity rather than preneoplastic transformation.

Article
Business, Economics and Management
Business and Management

Andrés Polo, Lina María Puentes Espejo, Fredy Cervera-Galindo, Marylin Beltran-Rodríguez

Abstract: This working paper develops a methodological approach for integrating mathematical optimization with a digital twin environment in the analysis of first-mile milk collection systems. The approach combines a mixed-integer linear programming (MILP) model for network design with a digital representation that enables the evaluation of system behavior under changing operating conditions. The optimization model determines the baseline configuration, including the location of collection points, capacity allocation, and producer assignments. This configuration is then embedded into the digital twin, where its performance is examined under a representative perturbation scenario involving a 20% reduction in milk supply. The analysis shows that the baseline configuration, while efficient under nominal conditions, is sensitive to variations in supply, leading to reduced utilization and higher unit costs. Allowing limited operational adjustments within the fixed network structure improves performance, although economic indicators do not fully return to baseline levels. The results also reveal uneven effects across performance dimensions, indicating the presence of trade-offs between economic, operational, and environmental outcomes. The contribution of this study lies in connecting optimization-based design with a digital evaluation environment that enables the assessment of network configurations beyond their initial formulation. The approach provides a structured way to examine how a given design responds to changing conditions without requiring immediate structural modifications. The analysis is illustrative and intended to demonstrate the integration mechanism. Future work will extend this approach through systematic scenario design, quantitative validation, and the incorporation of real-time data.
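
The network-design step described here (locating collection points, allocating capacity, assigning producers) can be illustrated on a toy instance. The paper's MILP formulation is not reproduced; the sketch below brute-forces the same decision structure, and all costs, capacities, and supplies are hypothetical:

```python
from itertools import combinations, product

def best_configuration(fixed_cost, assign_cost, capacity, supply):
    """Exhaustively search collection-point subsets and producer
    assignments, minimising opening plus transport cost subject to
    capacity (a brute-force stand-in for the paper's MILP)."""
    best_cost, best_plan = float("inf"), None
    n_points, n_prod = len(fixed_cost), len(supply)
    for r in range(1, n_points + 1):
        for open_set in combinations(range(n_points), r):
            opening = sum(fixed_cost[p] for p in open_set)
            for assignment in product(open_set, repeat=n_prod):
                load = {p: 0.0 for p in open_set}
                for i, p in enumerate(assignment):
                    load[p] += supply[i]
                if any(load[p] > capacity[p] for p in open_set):
                    continue                      # capacity violated
                cost = opening + sum(assign_cost[i][p] * supply[i]
                                     for i, p in enumerate(assignment))
                if cost < best_cost:
                    best_cost, best_plan = cost, (open_set, assignment)
    return best_cost, best_plan

fixed_cost  = [100.0, 80.0, 120.0]     # opening cost per candidate point
capacity    = [60.0, 40.0, 80.0]       # litres/day each point can handle
supply      = [30.0, 25.0, 20.0]       # each producer's daily milk
assign_cost = [[0.5, 1.2, 2.0],        # transport cost per litre,
               [1.5, 0.4, 1.0],        # producer x candidate point
               [2.0, 1.8, 0.3]]
cost, (opened, assigned) = best_configuration(fixed_cost, assign_cost,
                                              capacity, supply)
print(cost, opened, assigned)          # -> 211.0 (2,) (2, 2, 2)
```

Here the single high-capacity point wins despite its larger opening cost. Re-running the search after scaling `supply` down by 20% is exactly the kind of perturbation the digital twin layer is meant to evaluate against the fixed baseline design.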

Article
Medicine and Pharmacology
Emergency Medicine

Ameline Saouli, Ali AlRahma, Hadeel Farhan, Abu Omayer, Radwa Nour, Azza Yousif, Ives Hubloue, Nabil Zary

Abstract: The use of technology-enhanced training for prehospital mass-casualty incident (MCI) preparedness has grown quickly, but there has been no comprehensive overview of how these technologies operate throughout the training process or how competencies are evaluated. This scoping review, conducted as part of the MCIPHER (Mass-Casualty Incident Prehospital Emergency Response) project, followed the Arksey and O'Malley framework and PRISMA-ScR guidelines. We searched seven databases and additional sources, screened 2,105 records, and included 28 studies published from 2015 to 2025. Virtual reality was the most common method (43%), followed by hybrid approaches (29%) and screen-based simulations (21%). We identified five key analytical constructs. Three were derived from the data: the Technology Function Spectrum revealed that half of the studies used dual-purpose platforms for both training and performance assessment; the Data Capture Architecture linked embedded data collection to advanced learning outcomes (L2+); and the Pedagogical Transparency Gap showed that 75% of studies did not specify a training design framework. Two other constructs — the Immersion-Evaluation Paradox and the Scalability-Rigor Tension — suggest areas for future research. Using a modified Kirkpatrick framework with an L2+ (Applied Learning) sub-level, 56% of completed studies demonstrated applied learning through embedded performance assessments. Overall, these findings suggest that investments in MCI preparedness should focus more on measurement capabilities than immersion, incorporate assessment into training platforms, and work to reduce geographic and resource disparities.

Article
Engineering
Electrical and Electronic Engineering

Araavind Sridhar, David Steen, Le Anh Tuan

Abstract: The growing adoption of electric vehicles (EVs) and the rapid expansion of public charging infrastructure pose new challenges and opportunities for energy systems, particularly in urban settings. This study presents an optimization-based evaluation of different EV charging strategies, including direct charging, average-based methods, smart charging, and vehicle-to-grid (V2G), at public parking lots using real-world charging session data. The data-driven model optimizes public EV charging in Gothenburg, minimizing charging costs for operators without sacrificing the vehicles' energy requirements. Results indicate that direct charging scenarios lead to significantly higher peak loads (up to 1286 kW) and costs (around 370 k€), highlighting their inefficiency under unmanaged operation. In contrast, smart charging reduces peak loads by approximately 47% and overall costs by around 74%, showcasing its potential for cost-effective, grid-friendly operation. Two V2G scenarios were tested, differing in how discharged power is accounted for in peak costs; although V2G enables energy discharge back to the grid, its benefits remain modest under current assumptions due to tight operational constraints and limited incentives. The study emphasizes the value of smart optimization and appropriate market design in enhancing the flexibility and cost efficiency of public EV charging systems.
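
The intuition behind smart charging (shifting a session's energy into the cheapest feasible hours) can be sketched with a simple greedy heuristic. This is not the paper's optimization model; the prices, the 11 kW per-hour cap, and the session data are assumptions of this sketch:

```python
def greedy_schedule(energy_kwh, prices, max_kw=11.0):
    """Dispatch a charging session's energy into the cheapest hours
    first, up to a per-hour power cap (greedy stand-in for the paper's
    optimization; one-hour slots, so kW and kWh coincide per slot)."""
    schedule = [0.0] * len(prices)
    remaining = energy_kwh
    for hour in sorted(range(len(prices)), key=lambda h: prices[h]):
        if remaining <= 0:
            break
        schedule[hour] = min(max_kw, remaining)
        remaining -= schedule[hour]
    if remaining > 1e-9:
        raise ValueError("session too short for the energy requirement")
    return schedule

# Hypothetical day-ahead prices (EUR/kWh) over a 6-hour parking session
prices = [0.30, 0.12, 0.28, 0.10, 0.35, 0.15]
plan = greedy_schedule(22.0, prices)
print(plan)   # -> [0.0, 11.0, 0.0, 11.0, 0.0, 0.0]
print(round(sum(p * e for p, e in zip(prices, plan)), 2))   # -> 2.42
```

Direct charging would concentrate load at the arrival hour regardless of price; the managed plan meets the same 22 kWh requirement using only the two cheapest hours, which is the mechanism behind the peak and cost reductions reported above.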

Article
Medicine and Pharmacology
Veterinary Medicine

Fidel San Román-llorens, Alejandro Blanco, Fidel San Roman, Cristina Gonzalez, Alberto Climent, Julia Laliena, Manuel Alamán, Ana Whyte

Abstract: Cranial cruciate ligament (CrCL) rupture in cats is less common than in dogs, and its optimal treatment remains a subject of debate. The aim of this study was to evaluate the application of the cranial tibial wedge osteotomy (CTWO) technique as a dynamic stabilization technique in cats with CrCL rupture, describing the technical aspects and clinical outcomes obtained. Five cases with a confirmed diagnosis of CrCL rupture between 2020 and 2024 were included in this study. All patients were treated with CTWO using locking osteosynthesis plates specific to this technique in dogs and a complementary cerclage wire. Radiographic rechecks were performed at 8 and 12 weeks postoperatively, and clinical evaluations were performed at 24 hours, 8 weeks, 12 weeks, and six months postoperatively in every patient. Successful and complete bone healing of the tibial osteotomy was observed in every case. No intraoperative or postoperative complications related to implants or soft tissues were recorded. All cats achieved a complete functional recovery without lameness at the last recheck six months after surgery. The technique was performed without significant technical difficulties, providing adequate stability and favorable clinical outcomes in all cases. Although Hoot et al. reported the use of a cranial tibial wedge osteotomy in combination with a tibial plateau leveling osteotomy (TPLO) to treat a cruciate ligament rupture in a cat presenting a deformity of the proximal tibia, to the authors' knowledge the use of the cranial tibial wedge osteotomy as a single technique to treat CrCL rupture in cats has not been previously reported in the literature. These preliminary results support the use of CTWO as an effective surgical alternative for the treatment of CrCL rupture in cats. However, further studies with a larger number of cases and longer follow-up are required to better evaluate its clinical application, outcomes, and influence on osteoarthritis progression in the long term.

Article
Environmental and Earth Sciences
Geophysics and Geology

Jianchun Xu, Yanxu Liu, Baodi Wang, Xuanjie Zhang, Yanan Zhang, Xin Wang

Abstract: The Jiaduoling area is located in the northern segment of the Southwest Sanjiang Metallogenic Belt, a region characterized by complex geological structures and abundant mineral resources. This study systematically identifies the spatial correlation between subsurface magnetic bodies and tectonic structures by utilizing 1:50,000 high-precision aeromagnetic data. Advanced processing techniques—including upward continuation, vertical derivatives, total gradient modulus, and Euler deconvolution—were integrated to refine the structural framework and clarify the mechanisms of fault-controlled mineralization. The results indicate that the aeromagnetic anomaly pattern is predominantly governed by NW-trending faults. Specifically, the deep-seated major fault F1 (with a calculated depth exceeding 3 km) served as the primary migration channel for ore-forming fluids, while secondary faults created localized ore-hosting spaces. Physical property analysis reveals a significant magnetic contrast, where Mesozoic intermediate-acid magmatic rocks act as the essential source for mineralization, providing both material and thermal energy for the formation of porphyrite-type iron deposits. Based on these findings, a three-dimensional "aeromagnetic anomaly-structural framework-mineralization" correlation model was established. Finally, two high-potential metallogenic prospective zones (P1 and P2) were delineated, providing precise geophysical evidence and strategic guidance for regional mineral exploration and the targeting of concealed ore bodies.

Article
Engineering
Chemical Engineering

Muhamad Fouad

Abstract: The Zeta-Minimizer Theorem establishes a variational foundation for the Riemann zeta function by minimizing a phase functional derived from the compressibility factor. Starting from the classical virial expansion, the theorem performs an exact exponential resummation that yields the Euler product form of ζ(s) over a finite helical basis. In a symmetric measure space equipped with non-proper Archimedean conical helices, four geometric constraints—rational signed cosines, positive integer representation dimensions, non-zero integer differences, and prime-modulated exponential decays—force primes to emerge as indivisible cycles in the representation graph, via Hilbert’s irreducibility theorem and Maschke’s theorem. Corollaries include the deductive proof of the Riemann Hypothesis (non-trivial zeros spectrally centered on Re⁡(s)=1/2), stacked phases as stratified orbifolds, emergent layered geometries, bounded prime descent, and dimensional resistance. The three axioms abstract thermodynamic equilibrium conditions purely: strict concavity of entropy on measures, non-vanishing spectral Gibbs minima, and covariance with flux conservation. Number-theoretic structures, complex numbers, polynomials, and quantization itself appear as projected artifacts of the underlying variational optimization. Applications range from atomic stratification (quantized shells arising from phase jumps) and angular-momentum tensors to the fine-structure constant (emergent from cycle sums with β=5 leaps) and covariant mappings to arbitrary conjugate variables via category-theoretic functors and renormalization-group universality. By demoting elementary mathematical constructs to derived descriptions of thermodynamic optimization on the helical manifold, ZMT provides a unified deductive framework for analytic number theory, algebraic geometry, and spectral theory.

Article
Computer Science and Mathematics
Other

Huayou Si, Mengyang Li, Yuanyuan Qi, Neal N. Xiong, Wei Chen, Loc Nguyen The, Shichong Wang

Abstract: This paper proposes a decentralized data trading approach based on the Automated Market Maker (AMM) mechanism, aiming to overcome bottlenecks in data-trading efficiency and fairness by combining market-oriented pricing mechanisms with automated trading processes. We focus on constructing an automatic pricing and matching mechanism based on liquidity pools. Mathematical modeling and simulations then reveal how slippage arises in data liquidity pools under trading shocks and imbalances. To address these issues, a novel dual slippage optimization mechanism integrating dynamic trade splitting and alternating order sorting is proposed, which decomposes orders into sub-orders and reorganizes their timing, establishing a dynamic equilibrium model. Experiments show the method reduces average slippage amplitude from 2.1% to 0.5%, a 76.2% reduction, significantly enhancing price stability and market efficiency. This research explores the mechanism of applying AMMs to data asset trading and overcomes AMM limitations, providing a theoretical and empirical foundation for building low-cost, high-fairness data trading systems through mechanism innovation and technical optimization.
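
On a constant-product pool (the canonical AMM invariant, x·y = k), slippage grows roughly linearly with trade size relative to pool depth, which is what makes trade splitting attractive when counter-orders can rebalance the pool between sub-orders. A minimal sketch with hypothetical pool reserves; the paper's dual optimization mechanism is not reproduced here:

```python
def swap_out(x, y, dx, fee=0.003):
    """Output of y received for an input of dx of x on a
    constant-product (x * y = k) pool."""
    dx_eff = dx * (1 - fee)
    return y * dx_eff / (x + dx_eff)

def slippage(x, y, dx, fee=0.003):
    """Relative shortfall of the executed price versus the spot price y/x."""
    return 1 - swap_out(x, y, dx, fee) / (dx * y / x)

# Hypothetical pool: 100k data tokens against 100k payment tokens
print(round(slippage(100_000, 100_000, 1_000, fee=0.0), 4))   # -> 0.0099
print(round(slippage(100_000, 100_000, 10_000, fee=0.0), 4))  # -> 0.0909
```

A tenfold larger trade incurs roughly tenfold the slippage; splitting an order helps only if counter-flow (the role of alternating order sorting) restores the reserves between sub-orders, since back-to-back sub-orders against an untouched pool compound to the same price impact. The reported improvement corresponds to (2.1 - 0.5)/2.1 ≈ 76.2%.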

Review
Biology and Life Sciences
Biochemistry and Molecular Biology

Christianna Stanley, Abdullah Alje, Mohammad Mahmoudi, Aubrey Babb, Jianghong Qian, Marifah Albalawi, Jody Berry, Hazim Aljewari

Abstract: Controls are fundamental to ensuring accuracy and reliability in molecular diagnostics, yet their roles are often oversimplified or conflated with broader quality assurance frameworks. As molecular testing expands from centralized laboratories to point-of-care (POC) and over-the-counter (OTC) settings, the design, implementation, and interpretation of controls must evolve to address diverse operational environments and clinical risks. This review introduces a comprehensive framework for understanding control strategies in molecular diagnostics, integrating internal, external, and orthogonal controls within a tiered, risk-based testing model. We categorize diagnostic systems into three tiers—screening (OTC/POC), confirmatory laboratory testing, and reference-level or adjudication testing—and examine how control requirements scale with analytical complexity, user variability, and clinical impact. Across these tiers, controls serve distinct but complementary roles, including verification of assay functionality, mitigation of contamination, maintenance of cross-platform consistency, and resolution of diagnostic uncertainty. We further analyze common failure modes in molecular diagnostics, including sample-related errors, inhibition, contamination, and interpretation challenges, and map how specific control strategies mitigate these risks. Regulatory perspectives from the U.S. Food and Drug Administration (FDA), Clinical Laboratory Improvement Amendments (CLIA), International Organization for Standardization (ISO), and World Health Organization (WHO) guidelines are discussed, highlighting the shift toward risk-based and context-dependent control design rather than rigid, one-size-fits-all requirements. Importantly, we address the balance between control burden and clinical utility, emphasizing that excessive control implementation may increase system complexity without proportionate gains in diagnostic value, particularly in decentralized settings. Emerging trends, including artificial intelligence (AI)-assisted diagnostics and decentralized molecular platforms, are also explored as transformative approaches to enhancing control integration and result validation. We propose that a tier-adaptive, risk-based control framework is essential for next-generation molecular diagnostics, enabling accurate, scalable, and user-centered testing systems. This perspective supports the development of robust diagnostic platforms that maintain analytical integrity while improving accessibility and real-world performance.

Article
Public Health and Healthcare
Public Health and Health Services

Fangya Tan

,

Bowen Long

Abstract: Background: Missing data, particularly progression-driven dropout, introduces substantial bias in longitudinal oncology studies, directly impacting response classification based on RECIST criteria. While machine learning–based imputation methods are increasingly used, their performance is rarely evaluated in a clinically interpretable framework centered on patient-level endpoints such as Best Overall Response (BOR). Methods: We propose a clinically grounded evaluation framework based on RECIST 1.1 and focused on patient-level response classification. Longitudinal tumor trajectories were simulated for 270 patients (1:1 HER2+ and HER2−) across nine follow-up visits using both Gompertz and Stein–Fojo growth models. Realistic missingness was introduced through a combination of random mechanisms and progression-driven dropout. Two machine learning imputation methods, long short-term memory (LSTM) and MissForest, were evaluated under both direct (MAR-based) and non-responder imputation strategies. Performance was assessed using BOR classification metrics, including accuracy and Cohen’s kappa. Results: Across both simulation frameworks, imputation substantially improved BOR classification performance. Under the Gompertz model, accuracy increased from 0.83–0.87 with direct imputation to 0.93–0.98 with non-responder imputation, with corresponding kappa improvements from 0.71–0.79 to 0.89–0.97. Similar trends were observed under the Stein–Fojo model (accuracy: 0.82–0.84 vs. 0.91–0.96; kappa: 0.69–0.72 vs. 0.86–0.94). Among the evaluated methods, MissForest combined with non-responder imputation demonstrated the most stable and consistently high performance across simulation settings. In contrast, LSTM exhibited greater variability, particularly under complex missingness patterns. Conclusion: Imputation strategies aligned with clinical estimands, such as non-responder imputation, substantially improve patient-level response classification. This study establishes a clinically interpretable evaluation framework linking machine learning–based imputation to RECIST-based endpoints, supporting more robust and regulator-relevant handling of missing data at the patient level in oncology trials.
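The non-responder rule this abstract describes can be sketched as follows. This is an illustrative simplification, not the authors' code: the function names are hypothetical, progression is measured against baseline rather than the nadir, and RECIST 1.1 confirmation rules and the 5 mm absolute-increase criterion for PD are omitted. The core idea is that a missing follow-up visit is classified as progressive disease (PD) instead of being imputed from a missing-at-random model.

```python
# Hedged sketch of non-responder imputation for RECIST-style BOR
# classification. SLD = sum of longest diameters. All names and
# thresholds here are illustrative simplifications of RECIST 1.1.

def recist_response(baseline_sld, followup_sld):
    """Classify one follow-up visit against baseline.

    Returns None when the visit measurement is missing.
    (Real RECIST 1.1 assesses PD against the nadir and also
    requires a >= 5 mm absolute increase; omitted for brevity.)
    """
    if followup_sld is None:
        return None
    if followup_sld == 0:
        return "CR"                      # complete response
    change = (followup_sld - baseline_sld) / baseline_sld
    if change <= -0.30:
        return "PR"                      # >= 30% shrinkage
    if change >= 0.20:
        return "PD"                      # >= 20% growth
    return "SD"                          # stable disease

def best_overall_response(baseline_sld, followups, nonresponder_rule=True):
    """Best response across visits; under the non-responder rule,
    missing visits (e.g. progression-driven dropout) count as PD."""
    order = {"CR": 0, "PR": 1, "SD": 2, "PD": 3}
    responses = []
    for sld in followups:
        r = recist_response(baseline_sld, sld)
        if r is None and nonresponder_rule:
            r = "PD"                     # clinical estimand: dropout = non-responder
        if r is not None:
            responses.append(r)
    return min(responses, key=order.get) if responses else "NE"
```

A patient who shrinks 35% and then drops out still has BOR = PR under this rule, while a patient with no post-baseline data at all is classified PD rather than "not evaluable", which is what aligns the imputation with the clinical estimand.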

Case Report
Medicine and Pharmacology
Cardiac and Cardiovascular Systems

Oana Elena Branea

,

Mihaly Veres

,

Oana Frandeș

,

Matild Keresztes

,

Mihai Claudiu Pui

,

Ciprian Fișcă

,

Radu Bălău

,

Leonard Azamfirei

Abstract: Cardiogenic shock secondary to acute myocardial infarction complicated by mechanical failure remains associated with high mortality despite advances in cardiac surgery and mechanical circulatory support. We report the case of a 42-year-old patient with posterior papillary muscle rupture leading to severe mitral regurgitation, managed with emergency surgical intervention and extracorporeal membrane oxygenation. The patient, with a history of Type I Bipolar Disorder under long-term lithium therapy and chronic cannabis use, presented in critical condition with cardiogenic shock (Killip IV), acute pulmonary edema, and ST-segment elevation myocardial infarction in the infero-posterior territory. Coronary angiography revealed right coronary artery occlusion and involvement of an obtuse marginal branch. Emergency mitral valve replacement with a mechanical prosthesis and aortocoronary bypass were performed. Due to failure to wean from cardiopulmonary bypass, central veno-arterial ECMO was initiated. The postoperative course was complicated by hemodynamic instability and recurrent pericardial collections requiring repeated surgical interventions and conversion to peripheral ECMO. Multiorgan dysfunction developed, including hepato-renal failure requiring hemofiltration, neurological injury, respiratory impairment, and neuropsychiatric complications. Despite these challenges, progressive recovery was achieved under intensive multidisciplinary management. This case emphasizes the importance of early surgical correction and tailored ECMO support in managing post-infarction mechanical complications.

Review
Social Sciences
Safety Research

Cromwel Tepap Zemnou

Abstract: Artificial intelligence is expanding rapidly in clinical research and drug development. In response, a considerable governance literature has arisen, characterised by ambitious theoretical frameworks but persistent gaps in practical implementation. This critical analysis assesses the underlying assumptions, organisational constraints, and institutional flaws that undermine responsible AI governance in healthcare and clinical research. The analysis combines findings from AI ethics, organisational governance, computational toxicology, clinical trial methodology, and patient safety science. The core thesis is that, despite significant agreement among governments, corporations, and academia on stated objectives, the responsible AI field has persistently failed to bridge the gap between normative goals and organisational realities. In clinical settings, this failure has direct consequences for patient safety. The analysis is structured around four interconnected critiques: the conceptual inadequacy of the performance-centric evaluation paradigm, which conflates statistical reliability with clinical safety; the inadequacy of explainability methods as substitutes for genuine accountability; the practical unimplementability of principled administrative frameworks in most healthcare research institutions; and the characterisation of regulatory fragmentation as a political-economy problem. Drawing on a large body of research, the review suggests that closing the governance gap in clinical AI requires confronting more fundamental assumptions than current studies acknowledge.



Preprints.org is a free preprint server supported by MDPI in Basel, Switzerland.


© 2026 MDPI (Basel, Switzerland) unless otherwise stated