Sort by

Article
Computer Science and Mathematics
Computer Networks and Communications

Aiman Moldagulova

,

Zhuldyz Kalpeyeva

,

Raissa Uskenbayeva

,

Nurdaulet Tasmurzayev

,

Bibars Amangeldy

,

Yeldos Altay

Abstract: Low-cost air quality sensors enable dense monitoring networks but suffer from significant measurement noise and instability particularly in dynamic environments. Conventional fixed-window smoothing reduces noise but introduces a trade-off between signal stability and temporal responsiveness, often attenuating short-term pollution events. This paper proposes an adaptive filtering algorithm that dynamically adjusts the averaging window size based on short-term signal variability. The method relies on real-time variance estimation to balance noise suppression and sensitivity to rapid changes without increasing computational complexity. The approach is implemented within an IoT-based monitoring framework and evaluated using parallel measurements with a certified reference device. Comparative analysis against raw data and fixed-window filtering demonstrates improved statistical accuracy and stronger temporal correlation with reference measurements. In addition, this method enhances event detection stability in threshold-based monitoring scenarios. To support automated decision-making, the filtered signal integrated into an event-driven architecture with Robotic Process Automation (RPA), enabling reliable triggering of predefined workflows. The results show that proposed adaptive filtering provides an efficient and lightweight solution for real-time signal processing on resource-constrained devices, making it suitable for large-scale deployment in environmental monitoring systems.

Article
Computer Science and Mathematics
Artificial Intelligence and Machine Learning

Md Shakib Hasan

,

Mst Mosaddeka Naher Jabe

Abstract: As the world embarks on an artificial intelligence revolution, governments and supranational organizations are taking highly divergent approaches to regulation in an effort to regulate the effects of AI. Although there are new educational theories, which propose that AI might precipitate a paradigm shift in how knowledge is produced, which values human-AI co-creation [1], empirical studies on the actual way states will make the transition are in short supply. To fill this gap, this research paper applies a qualitative comparative policy review of 35 representative excerpts extracted from seven authoritative legislative and strategic documents across China, Singapore, and the European Union. We use a six-dimensional framework (inter-coder reliability κ = 1.00) to investigate the extent to which these policies are framed around optimization or restructuring: focusing on infrastructural scale and efficiency versus requiring systemic, pedagogical, and epistemic transformation. As findings indicate, there are radically different policy imaginaries. Relying solely on restructuring-based legal requirements, the EU compensates for high-risk algorithmic harms and implements tight ethical protection. China displays a characterized temporal development, as it alters macroeconomic optimization in 2017 to a hybrid system that requires interactive exploration and multimodal creation in 2025. Singapore, on the other hand, takes the calculated risk of a middle way, with massive reorganization of human-focused pedagogical functions and with optimization safely applied to scale up the infrastructures of the public services. Finally, this research paper proves that there is no single global AI educational governance. We state that negotiating this optimization-restructuring tension is the key to institutions that seek to develop authentic student agency without undermining ethical protection.

Article
Business, Economics and Management
Business and Management

Saran S Singh

Abstract: This study examines the relationship between capital structure and financial performance amongst firms constituting the Dow Jones Industrial Average (DJIA) over the period 2009 to 2020. Utilising secondary data from the Bloomberg database and employing standardised multiple linear regression, the study analyses the influence of the Total Debt to Total Equity (TDTE) ratio on two performance indicators: Earnings Per Share (EPS) and Return on Equity (ROE). Macroeconomic controls, including inflation, tax rate, GDP growth, and current account balance, were incorporated into both model specifications. The findings indicate that TDTE did not exert a statistically significant effect on either EPS or ROE. Firm size, however, emerged as a significant positive determinant of EPS, whilst the statutory tax rate exhibited a significant negative association. These results suggest that, in the post-crisis environment, operational scale and conservative financing practices were more consequential than leverage in shaping financial performance.

Article
Biology and Life Sciences
Biochemistry and Molecular Biology

Dmitry N. Shcherbakov

,

Ekaterina D. Mordvinova

,

Vadim O. Trufanov

,

Natalia V. Volkova

,

Yulia V. Meshkova

,

Maria K. Marenina

,

Anna V. Zaykovskaya

,

Ekaterina A. Volosnikova

,

Sophia S. Borisevich

,

Svetlana V. Belenkaya

Abstract: A cell-based screening system for viral protease inhibitors was developed using firefly luciferase fragment complementation and validated on the SARS-CoV-2 3CLpro model. The optimal luciferase variant incorporating the VLQSGF proteolytic site (Luc III) retained 88% of its native activity. A critical requirement for system performance was the use of an extended nsp4–nsp6 fragment of the viral polyprotein rather than the mature protease, underscoring the importance of the native context for 3CLpro activity. The bicistronic construct pCAG-Luc-III-IRES-nsp4-6 enables coordinated expression of the reporter and protease, thereby increasing assay reproducibility. IC50 values obtained in this system for nirmatrelvir and GC376 correlated with live-virus assay data but differed significantly from those of a cell-free FRET assay, reflecting the impact of cellular barriers. This approach combines simplicity, a standard substrate, and high reproducibility, making it promising for high-throughput screening in basic laboratory settings and adaptable to other viral proteases.

Review
Computer Science and Mathematics
Artificial Intelligence and Machine Learning

Md Nurul Absar Siddiky

Abstract: Large language models (LLMs) are now used in chatbots, search engines, writing assistants, coding tools, educational systems, and AI agents. At the same time, they are vulnerable to a wide range of attacks. Some attacks attempt to make the model ignore its rules and produce harmful or manipulated outputs, while others aim to extract private or sensitive information from the model or its training data. This paper presents a concept-level survey of major LLM attack methods in language that is simple enough for broad readers while remaining structured like a research paper. We organize the literature into two high-level groups: security attacks and privacy attacks. Under security attacks, we discuss prompt injection, jailbreaking, backdoor attacks, and data poisoning attacks. Under privacy attacks, we discuss gradient leak-age, membership inference, and personally identifiable information (PII) leakage. For each family, we explain the core idea, summarize representative methods from the literature, and provide descriptive toy examples that help readers understand the mechanism without requiring advanced background knowledge. The goal of this paper is pedagogical: to help new researchers, students, and general readers build a clear mental model of the LLM attack landscape.

Article
Social Sciences
Safety Research

Priyanshu Jain

Abstract: The post-World War II international order is undergoing simultaneous collapse on two fronts: a geopolitical fragmentation driven by twenty consecutive years of democratic decline, and an accelerating concentration of economic power driven by advances in artificial intelligence. This paper argues that the convergence of these two forces is producing a structural transformation unprecedented in human history, one that could stabilize into a neo-feudal equilibrium in which a vanishingly small class of infrastructure owners wields power comparable to pre-Enlightenment monarchs, while the vast majority of humanity loses both its labor value and its political leverage. Unlike previous feudal orders, this one may prove uniquely resistant to revolution, because the mechanisms of enforcement (autonomous weapons, AI surveillance, algorithmic propaganda) do not require human cooperation and therefore cannot be undermined by human dissent. The paper examines the historical parallels (and crucial disanalogies) between contemporary populist-authoritarian movements and their twentieth-century predecessors, models the emerging class structure under conditions of artificial general intelligence, evaluates Universal Basic Income through the lens of incentive structure, arguing that without the revolutionary threat that historically forced redistribution, UBI will default to a pacification mechanism rather than a genuine solution, examines the future of the nation-state under conditions where AI infrastructure owners command more wealth and capability than most governments, and argues that the effective altruism community's near-exclusive focus on existential risk from AI has created a dangerous blind spot around the political economy of who controls AI and who benefits from it.

Article
Physical Sciences
Theoretical Physics

George Davey

Abstract: We propose a finite, probabilistic, and algebraic formulation of quantum gravity based on the Chrono-Emergence (CE) emergence law and the Discrete-State Time Density (DTD) field ˆt. This framework reinterprets gravity as being driven by elastic strain in the ˆt field. We derive a finite critical time density, ˆtcrit = 2π/c, at which spacetime enters a dense-time phase, suppressing coherent wave transport and black body radiation. This mechanism defines the Chrono-Quantum Mirror (CQM), a finite phase boundary that replaces the classical event horizon and supplants the evaporation paradigm. We define the Timeon Lattice Potential (TLP) and the Chrono-Emergent Tunneling (CET) mechanism as the microscopic drivers for matter-confinement within this field. Cosmologically, we propose a multiverse lattice of interacting domains, where the Lattice Gravitational Strain (LGS) from causal bleed manifests as dark matter, and the expansionary momentum from a cyclical Chrono-Shear Event manifests as dark energy. This cycle culminates in a final, acausal deconfinement and reconvergence of matter-energy that seeds the next domain’s causal onset phase. We propose distinct, falsifiable experiments to test the theory. The result is a unitary, thermodynamically closed, and observationally-grounded ecosystem of solutions for foundational physics termed Timeon Lattice Multiverse Cosmology.

Article
Computer Science and Mathematics
Artificial Intelligence and Machine Learning

Anton Svystunov

,

Yaroslav Tereshchenko

Abstract: Rapid advancements in large language models with code generation abilities have enabled new paradigms in automated software development, positioning AI both as a coding assistant and an active actor within complex software ecosystems. Traditional code generation pipelines, mostly relying on tool calling via ReAct approach, require a complete code snippet to be generated and followed by validation and correction, often leading to significant latency and resource overhead due to sequential inference and execution processes. This research introduces a novel asynchronous inference algorithm that integrates context-free grammar parsing with real-time REPL-based execution, enabling early detection of syntax, semantic, and runtime errors without completing entire code snippets. We formally define the suitability criteria for LLMs in a target programming language, establish parse-tree-based identification of top-level statements, and present an incremental buffer-parsing mechanism that triggers execution upon recognition of complete statements. Implemented for Python 3 using the Lark parser and evaluated on a modified MBPP split ($N{=}113$ tasks; dataset and prompts in the Appendix) across six models---CodeAct--Mistral, GPT-OSS~20B, Gemma~3, Llama~3.2, Phi~4, and Qwen3-Coder~30B---our method is compared to a synchronous baseline using paired Wilcoxon tests with Bonferroni correction. Empirical results show significantly faster time-to-first-output for every model, large reductions in total latency where top-level script execution dominates (up to roughly an order of magnitude for CodeAct--Mistral), and no material change in pass or correctness rates, indicating that incremental execution improves responsiveness without altering task outcomes. With special prompting or finetuning, the method shows up to 4x reduction in latency for valid code generation. The benchmark results confirm that synchronous inference constraints can be alleviated through grammar-guided incremental execution, allowing more efficient and responsive agent-driven code execution workflows. Future research will explore predictive parsing techniques, deeper integration with agentic system architectures, security constraints, and formulating runtime requirements for scalable deployment of LLM-generated code execution environments.

Article
Computer Science and Mathematics
Computer Science

Aleksandar Ivanović

,

Miloš Radenković

,

Sergei Prokhorov

,

Aleksandra Labus

,

Božidar Radenković

Abstract: Several fundamental problems in software systems and AI remain without a unified formal solution. Deterministic reproducibility of execution, formal consistency between runtime state and historical record, and equivalence of governance and operational execution are unresolved across contemporary architectural paradigms. In AI systems, traceable decision processes and structurally enforced purpose-constrained autonomy remain open problems for the same reason. The common root is ontological: no formally defined execution substrate exists in which execution, governance, persistence, system evolution, and AI reasoning share a single causally ordered knowledge structure. This paper introduces the Zero Tier Execution Substrate (ZTES), an axiomatic execution model derived through formal synthesis of the Mesarović–Takahara system ontology, Lamport-consistent causal ordering, and the DEVS formalism. The Three-Phase execution kernel acts as semantic closure of this synthesis. The append-only historical knowledge base becomes the canonical computational medium in which governance and operational execution are formally equivalent transition processes over a single causally ordered structure. System execution is formally identified with the causal evolution of knowledge: Execution(Σ) ≡ Evolution(K). The substrate is universal for discrete processes: any discrete process admits execution within ZTES without loss of process identity, event ordering, or executable semantics. The scope of this work is foundational: the formal model establishes a stable foundation from which concrete realizations, empirical validations, and higher-level abstractions may be derived. ZTES does not introduce new computational primitives; it defines the minimal semantic discipline under which existing mechanisms — append-only persistence, causal ordering, and discrete-event transition semantics — are interpreted and composed as a structurally closed execution substrate. The formal model establishes deterministic event serialization, projection-defined runtime state, and compensa-tion-based correction without destructive mutation. Sixth Normal Form emerges as a natural ontological consequence of atomic event semantics rather than merely a storage design choice. A closure-based structural maturity model and benchmark for execution architectures are introduced as methodological contributions. These formal properties directly address the open problems identified above. ZTES therefore addresses several pre-viously unresolved structural problems: deterministic reproducibility of distributed execution, structural consistency between runtime state and historical record, and governance–execution equivalence within a single operational model. In AI systems, it establishes a substrate for historically consistent reasoning, traceable decision processes, and purpose-constrained autonomy as structural consequences of substrate closure. Software systems and AI infrastructures are therefore formally interpretable not as layered architectures but as causally evolving knowledge structures governed by formally defined execution semantics.

Article
Physical Sciences
Quantum Science and Technology

Daniel A Nolan

Abstract: We simulate the propagation of a W states through an optical fiber in the presence of mode coupling. We illustrate the propagating quantum state graphically on a group of higher order Poincaré spheres. The spheres show the propagation of the light in time and in distance as the transmission proceeds. Thus, the amplitudes and the relative phases of the modal propagation can be visualized throughout the transmission, which is novel and very useful to understand the propagation. At the fiber output we show how to recover the input quantum state using the simulated quantum state information displayed on the multiple spheres. The geometry of these states is an SU(N) quantum geometry. Applications include higher dimensional quantum communications, quantum cryptography, and quantum networks, and longer-term quantum optical computing.

Article
Medicine and Pharmacology
Hematology

Davide Negrini

,

Laura Pighi

,

Simone Mignolli

,

Gian Luca Salvagno

,

Giuseppe Lippi

Abstract: Background/Objectives: Iron deficiency remains a prevalent condition, needing specific laboratory tests for diagnosis. This study aimed to evaluate whether routine complete blood cell count (CBC) parameters can be used within a machine learning framework to predict iron deficiency, potentially optimizing laboratory test utilization. Methods: A ret-rospective dataset of outpatients (2023–2026) undergoing both CBC and iron testing was analyzed. Iron deficiency was defined using sex-specific thresholds for ferritin and trans-ferrin saturation. After cleaning data and excluding incomplete records, demographic variables and CBC indices were tested as potential predictors. The dataset was split into training and test sets with stratified sampling. Multiple supervised machine learning models, including logistic regression, decision tree, random forest, XGBoost, support vec-tor machine, k-nearest neighbors, and Naive Bayes, were trained. Hyperparameter tuning and model selection were performed using repeated stratified 10-fold cross-validation, op-timizing the area under the curve (AUC). Model performance was assessed by AUC, sen-sitivity, and specificity, and validated on an independent test set. Results: All models demonstrated predictive capability using CBC parameters alone. Ensemble methods, es-pecially random forest and XGBoost, reached the best performance (AUC values of 0.80–0.87 for ferritin and 0.85–0.96 for transferrin saturation). Sensitivity and specificity were balanced, supporting clinical screening applicability. Results were maintained across validation and confirmed in the test set. Prediction of transferrin saturation showed slightly higher accuracy than ferritin. Feature importance analysis identified MCV, MCH, and RDW as key predictors. Conclusions: CBC-based machine learning models can relia-bly identify subjects with iron deficiency, supporting subsequent, more targeted analyses.

Article
Medicine and Pharmacology
Immunology and Allergy

Anna Rybachuk

,

Christian Neuhof

,

Edmund Curtius

,

Cengizhan Acikel

,

Susann Fragel

,

Hacer Sahin

,

Nadine Katzke

,

Kija Shah-Hosseini

,

Silke Allekotte

,

Esther Raskopf

Abstract: Background/Objectives: Allergen immunotherapy (AIT) is the only disease-modifying treatment for grass pollen allergy. However, the proportion of patients interested in AIT who meet guideline-defined eligibility criteria remains unclear. This study aimed to characterise symptom burden, medication use, and AIT eligibility in adult patients with grass pollen allergy during the peak pollen season. Methods: In this multicentre, prospective, non-interventional epidemiological study, 479 adults with confirmed grass pollen allergy recorded daily nasal, ocular, and systemic symptoms, as well as anti-allergic medication use, via a validated electronic diary (CCC STUDY Diary) over a 30-day period in June/July 2025. A combined symptom-medication score (CSMS) was calculated daily, with a predefined threshold of ≥1.5 indicating clinically relevant symptom severity and potential eligibility for AIT. Both additive and weighted calculation approaches for CSMS and the daily medication score (dMS) were evaluated to assess methodological robustness and reproducibility. Results: The mean additive CSMS was 2.14, indicating moderate symptom burden. Overall, 63.3% of participants exceeded the CSMS threshold of 1.5 and were considered eligible for AIT. Sensitivity analyses demonstrated excellent concordance between additive and weighted CSMS/dMS calculations (Spearman’s ρ >0.98; p<0.001), and Bland–Altman analysis confirmed minimal bias (0.157) and narrow limits of agreement. Asthma was reported as a comorbidity in 36% of patients, generally associated with mild to moderate daily respiratory symptoms. Limitations included the self-reported nature of the data and a slightly reduced sample size; however, the results are representative of adult patients seeking care in specialised allergy centres in Germany. Conclusions: The CSMS also in its additive and therefore modernized form is a reliable, reproducible, and clinically meaningful tool for quantifying symptom severity and identifying patients suitable for AIT. Approximately two-thirds of adults interested in grass pollen AIT exhibited moderate to severe symptoms and were eligible for treatment according to current guideline recommendations.

Article
Engineering
Electrical and Electronic Engineering

John Alexander Taborda Giraldo

,

Cesar Enrique Polo Castro

,

Miguel E. Iglesias Martínez

Abstract: Just energy transitions in the Global South unfold under conditions of institutional fragmentation, fiscal constraints, and high socio-ecological turbulence, making governance capacity a critical bottleneck for effective decarbonization and climate justice. This study proposes the Cybernetic Environmental Hub (CEH) framework, which extends the Viable System Model (VSM) to sustainability governance by integrating AIoT-enabled environmental monitoring, Early Warning Systems, decentralized data governance, and justice-centered institutional design. Methodologically, the research adopts a hybrid conceptual–empirical approach combining theoretical development with participatory territorial diagnostics. Empirical validation is illustrated through a case study in the Caribbean Mining Corridor, where socio-ecological challenges were collected through participatory innovation workshops, thematically coded, and mapped onto the five VSM subsystems to identify systemic “variety gaps.” The analysis demonstrates that fragmented operational initiatives coexist with weak meta-systemic coordination, limiting adaptive capacity in energy transition processes. The CEH architecture addresses these deficiencies by embedding AIoT sensing, federated learning, blockchain-based coordination, and Early Warning Systems within recursive governance structures. Additionally, the study introduces a Territorial Governance Maturity Model (H1–H3) to diagnose systemic learning capacities and transition readiness across technological, institutional, data governance, and justice dimensions. The findings suggest that cybernetic environmental hubs can function as socio-technical infrastructures enabling coordinated, adaptive, and justice-centered energy transitions in the Global South.

Article
Computer Science and Mathematics
Computer Science

Melchor Gómez García

,

Derlis Cáceres Troche

,

Moussa Boumadan Hamed

,

Roberto Soto Varela

Abstract: The rapid expansion of Generative Artificial Intelligence (GAI) is transforming higher education systems, particularly public institutions seeking to advance toward smart governance models and digital transformation. In this context, digital teaching competence emerges as a strategic factor for the effective, ethical, and pedagogically sound adoption of these technologies. This study assesses the level of digital competence among public higher education faculty in Paraguay and examines its predictive capacity regarding the adoption of GAI tools using machine learning models. A nationwide quantitative study was conducted with a sample of 800 faculty members from public universities across Paraguay. Data were collected through a structured questionnaire based on international digital competence frameworks, incorporating additional variables such as attitudes toward GAI, technological experience, institutional infrastructure, and perceived organizational support. Data analysis involved the application of machine learning techniques, including Logistic Regression, Random Forest, and Gradient Boosting, to identify the variables with the strongest predictive power regarding faculty readiness and willingness to integrate GAI into teaching practices. Model performance was evaluated using metrics such as accuracy, F1-score, and AUC-ROC. The findings identify key predictors of technological readiness and structural gaps within Paraguay’s public higher education system. This research provides empirical evidence from Latin America on the factors influencing GAI adoption in public sector educational contexts and contributes to the design of educational policies aimed at fostering smart universities and digitally sustainable academic ecosystems.

Article
Computer Science and Mathematics
Computer Vision and Graphics

Jianhua Zhu

,

Changjiang Liu

,

Danling Liang

Abstract: Multi-modal remote sensing image registration is a challenging task due to differences in resolution, viewpoint, and intensity, which often leads to inaccurate and time-consuming results with existing algorithms. To address these issues, we propose an algorithm based on Curvature Scale Space Contour Point Features (CSSCPF). Our approach combines multi-scale Sobel edge detection, dominant direction determination, an improved curvature scale space corner detector, a new gradient definition, and enhanced SIFT descriptors. Test results on publicly available datasets show that our algorithm outperforms existing methods in overall performance. Our code will be released at https://github.com/JianhuaZhu-IR.

Article
Arts and Humanities
Philosophy

Gerd Leidig

Abstract: This article addresses the “Hard Problem” of consciousness not as an immutable ontological barrier of nature, but as an iatrogenic separation—a methodological artifact induced by the reductive third-person perspective (3P). By systematically and intentionally removing the subject from the world-description to achieve a veneer of objectivity, modern physicalism creates a restrictive “substance grammar” that subsequently struggles to locate the qualitative dimension of experience within its own datasets. Using Hans Holbein the Younger’s painting The Ambassadors (1533) as a primary epistemic model, we analyze the anamorphic “blot” as a representation of the Real that eludes frontal, mathematical domestication. We argue that the resolution of this parallax requires more than a simple shift in focus; it demands a “step to the side”—a transition from static representation to the processual performance of enactive inference. Integrating Karl Friston’s Free Energy Principle (FEP) and the Neurophenomenological Enactive System Schema (NESS), we define meaning not as an intrinsic property of objects, but as a temporal alignment and an energetic achievement of a system striving for coherence under the constant pressure of existential concern (Sorge). The paper concludes by proposing a “processual perspectivism” and the figure of the Sovereign Witness, suggesting that the Hard Problem is dissolved when subjectivity is understood as the active, embodied performance of the world-relation.

Article
Business, Economics and Management
Business and Management

Dejan Kelemina

,

Tjaša Štrukelj

,

Maja Rožman

Abstract: Sustainability has become a crucial strategic priority for firms operating in re-source-intensive industries such as food processing, where long-term competitiveness depends on responsible governance, strategic orientation, resource management and or-ganizational resilience. While prior research has established a general link between sus-tainability and organizational performance, less is known about how specific internal sustainability governance and management components contribute to firm financial per-formance. Drawing on the Resource-Based View and institutional theory, this study ex-amines the direct effects of a sustainability-oriented vision, business policy, organization-al culture, and strategies on firm financial performance. The study is based on survey data collected from 247 food processing firms operating in Slovenia, an EU member state. Ex-ploratory factor analysis and multiple regression analysis were used to test the proposed relationships. The results show that sustainability strategies have the strongest direct, sta-tistically significant positive effect on firm financial performance, followed by sustainabil-ity-oriented organizational culture. In contrast, sustainability vision and business policy exhibit a statistically significant, but negative, direct association, suggesting that formal sustainability commitments alone may not yield financial benefits without effective cul-tural support and strategic integration. These findings indicate that firm financial perfor-mance is directly driven primarily by sustainability strategies that are via practices opera-tionally embedded and supported by organizational capabilities such as organizational culture, rather than by normative symbolic commitments alone. This opens up possibili-ties for further research, based on the probability that the sustainable development vision and business policy serve as a catalyst for defining sustainable development strategies and implementing sustainable development practices, and that the normative commit-ments, in particular, indirectly influence financial performance. The study contributes to the sustainability governance and management literature by distinguishing the norma-tive, cultural, and strategic dimensions of sustainability and demonstrating their distinct direct implications for financial performance. The findings also provide practical insights for owners/governors and managers by highlighting the importance of integrating sus-tainability into organizational culture and core strategic processes to achieve long-term financial value.

Article
Computer Science and Mathematics
Applied Mathematics

Xianqi Zhang

,

Zewei Wang

,

Dan Xue

,

Zikang Han

Abstract: Servo motors typically utilize Field-Oriented Control (FOC). However, the conventional cascaded PI control framework is inherently constrained by its fixed-parameter design, making it highly susceptible to parameter variations and unmodeled disturbances. While intelligent control strategies—such as model predictive control (MPC)—provide a robust, multi-objective alternative, their intensive stepwise computational demand often degrades transient response. Motivated by the stochastic dynamics of motor operation, we propose a novel physics-informed control paradigm. Specifically, we formulate the FOC-based motor control as an online stochastic optimization problem, wherein the objective function is updated iteratively using stochastic gradient estimates, and the resulting time-varying subproblems are solved efficiently by the MSALM algorithm. Our approach significantly outperforms conventional PI controllers in environmental adaptability and disturbance rejection. Experimental results demonstrate that the proposed method achieves comparable high-precision tracking performance while significantly reducing computational time per iteration, ensuring rapid dynamic response and strict enforcement of physical constraints.

Article
Engineering
Bioengineering

Eva Góngora-Rodríguez

,

Irene Rivas-Blanco

,

Álvaro Galán-Cuenca

,

Carmen López-Casado

,

Isabel García-Morales

,

Víctor F. Muñoz

Abstract: Robotic assistance in minimally invasive surgery has significantly improved precision and dexterity; however, many supportive tasks, such as blood aspiration, still rely on manual operation. This work presents the design and implementation of an autonomous robotic aspirator capable of detecting and removing intraoperative bleeding without continuous human intervention. The proposed system integrates a perception module based on a convolutional neural network for real-time blood segmentation, a task planner for high-level actions execution, and a control strategy based on artificial potential fields for autonomous navigation. Additionally, a mixed-reality human–robot interaction interface is incorporated to enable system supervision and seamless transition to teleoperation when required. The system was experimentally validated with a set of in-vitro experiments under three representative bleeding scenarios, evaluating four suction strategies based on the computation method for the target selection. Results demonstrate fast reaction times (below 0.04 s) and high blood removal rates (above 80% in all cases). The comparative analysis reveals that the performance of the suction strategies is scenario-dependent and highlights a trade-off between suction efficiency and removed area. These findings support the feasibility of autonomous robotic aspiration and provide insights into the design of adaptive strategies for surgical assistance, contributing toward increased autonomy and improved workflow efficiency in minimally invasive procedures.

Article
Biology and Life Sciences
Virology

Balazs Sax

,

Adam Koppanyi

,

Katalin Kristóf

,

Akos Kiraly

,

Gyula Prinz

,

Istvan Hartyanszky

,

Gergely Gyorgy Nagy

,

Istvan Nemet

,

Fanni Temesvary-Kis

,

Balazs Kiss

+1 authors

Abstract: Percutaneous cable infection of left ventricular assist device (LVAD) patients is a significant source of morbidity, often caused by biofilm producing or multidrug resistant bacteria. We hypothesized that bacteriophage viruses can be identified from biological samples of patients with active driveline infection. Six patients with local percutaneous lead infections were enrolled. Microbiological samples were collected from the infected wound and other skin regions. The isolated viral strains and phages from wastewater samples were then tested against the pathogen bacterial cultures in vitro. Biofilm disruption assay and genetic analysis of the strains were also performed. Bacteriophages with lytic activity could be identified from samples of two patients. One patient contained four strains showing strong efficacy against his own Staphylococcus epidermidis. Furthermore, this bacterium was susceptible to phages identified from another patient and strains from wastewater samples. Genomic analysis suggested lysogenic lifestyle of the phages. However, none of them have shown any microbiological signs of lysogeny. In conclusion, we have been able to prove in vitro lytic activity of bacteriophages originating from the same LVAD patient. We also found effective phages in biological samples of other patients and wastewater samples, suggesting that patients implanted in the same center may share bacteriophage flora.

of 5,753

Prerpints.org logo

Preprints.org is a free preprint server supported by MDPI in Basel, Switzerland.

Subscribe

Disclaimer

Terms of Use

Privacy Policy

Privacy Settings

© 2026 MDPI (Basel, Switzerland) unless otherwise stated