
Article
Computer Science and Mathematics
Analysis

Mohsen Soltanifar

Abstract: This paper introduces the radius of integrability, a quantitative invariant that transforms the qualitative ϵ-δ formulation of Riemann integration into a measurable property of function spaces. For a Riemann integrable function and a prescribed accuracy ϵ, the radius identifies the largest partition mesh δ that guarantees every tagged Riemann sum approximates the integral within the specified error. The framework is developed for both compact domain intervals, via pointwise and uniform radii, and unbounded intervals, through the tail integrability radius which quantifies the necessary truncation window for improper integrals. Key theoretical results include the establishment of a bottleneck identity relating local and global mesh requirements and a structural theorem showing that for C1 integrands, the radius is asymptotically governed by the inverse of the function’s total variation. Furthermore, this work completes a hierarchical program of regularity radii—encompassing convergence, continuity, and differentiability—by revealing a dimensional progression of geometric anchors. We demonstrate that while continuity is anchored at a point and differentiability at a line, integrability is anchored at a two-dimensional region. The theory is illustrated through explicit computations for several classical functions, including the normal density, the stretched exponential, and the Thomae function, providing a new quantitative lens for classifying integrable functions based on their partition sensitivity and tail decay regimes.
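The abstract's ϵ-δ quantity can be made concrete with a small numeric sketch (not code from the paper; the function, the interval, and the restriction to uniform partitions are our illustrative assumptions):

```python
# Illustrative sketch (not from the paper): estimate the largest uniform
# mesh delta such that every tagged Riemann sum of f on [a, b] lies
# within eps of the integral. For a monotone f, the sup/inf on each
# subinterval are attained at the endpoints, so every tagged sum is
# bracketed by the upper and lower Darboux sums.

def uniform_radius(f, a, b, integral, eps, n_max=10**6):
    """Largest delta = (b - a)/n with worst-case tagged-sum error <= eps.

    Assumes f is monotone, so endpoint values give per-cell sup/inf.
    """
    for n in range(1, n_max + 1):
        dx = (b - a) / n
        xs = [a + i * dx for i in range(n + 1)]
        upper = sum(max(f(xs[i]), f(xs[i + 1])) * dx for i in range(n))
        lower = sum(min(f(xs[i]), f(xs[i + 1])) * dx for i in range(n))
        # every tagged sum lies in [lower, upper]
        if max(upper - integral, integral - lower) <= eps:
            return dx
    return None

# f(x) = x^2 on [0, 1], integral = 1/3, accuracy eps = 0.01:
delta = uniform_radius(lambda x: x * x, 0.0, 1.0, 1 / 3, 0.01)
print(delta)
```

For f(x) = x² on [0, 1] with ε = 0.01 the search returns δ = 1/51 ≈ 0.0196: any uniform partition at least that fine keeps every tagged sum within 0.01 of 1/3.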

Article
Medicine and Pharmacology
Hematology

Justine M. Grixti

,

Etheresia Pretorius

,

Douglas B. Kell

Abstract: Blood can clot into anomalous, fibrinolysis-resistant forms that arise from prothrombotic seeding areas, including damaged cellular debris and membrane-derived surfaces, giving rise to what we have termed fibrinaloid microclot complexes (colloquially: microclots). Their proteolytic resistance is due to the fact that they are amyloid in nature, and they can also entrap inhibitors of proteolysis. They consist of a variety of proteins besides the expected fibrin, and are highly enriched for other amyloidogenic proteins (in contrast to normal clots, whose proteome largely reflects the soluble plasma proteome). They also contain DNA in the form of neutrophil extracellular traps (NETs). Importantly, fibrinaloid microclot complexes are heterogeneous structures comprising multiple phenotypic forms, including those that nucleate and grow on cellular debris such as damaged membranes, microparticles, and immune-derived material. These debris-associated complexes act as catalytic scaffolds that recruit fibrin(ogen) and inflammatory molecules, thereby amplifying amyloidogenic transformation and prothrombotic activity. Fibrinaloid microclot complexes have been reported in a widening range of chronic inflammatory and thrombo-inflammatory diseases in which they have been sought, and are highly enriched for amyloidogenic proteins. Additionally, the thrombi extracted from ischaemic stroke also contain proteins in an amyloid form, implying that such macroclots can form via the accretion of microclots that already contain amyloid. We here show that these microclots exhibit a classical ‘apple-green’ birefringence when stained with the dye Congo red. The urgent task now is to find means of inhibiting the transition to amyloid forms during the clotting process.

Article
Engineering
Control and Systems Engineering

Mircea Ivanescu

,

Decebal Popescu

Abstract: Emerging technologies and cyber-physical systems have led to the development of complex mathematical models described by differential equations with multiple fractional orders. In this regard, this paper investigates the stability of control systems for this class of models, defined by state equations with multiple fractional orders ranging between 0 and 1. Matrix criteria and a comparison principle for linear and nonlinear autonomous systems of different fractional orders are developed based on generalized Lyapunov functions for differential equations with multi-order fractional exponents. The results are extended to non-autonomous systems of different fractional orders, whether linear or containing nonlinear components. The application of the Yakubovich-Kalman-Popov lemma, adapted for this class of systems, allows us to obtain new stability criteria presented as frequency criteria and represented graphically by familiar frequency plots similar to those of the Nyquist or Popov type. Numerical applications, such as models of complex human-machine systems described by state equations of multivariable fractional orders, illustrate these results. An analysis of the advantages of the proposed methods compared to procedures and techniques used in other papers on multi-order fractional exponent systems is presented. It is demonstrated that the proposed methods minimize the computational effort required to apply the stability criteria.
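The paper's multi-order matrix criteria are not reproduced here, but the classical single-order (commensurate) case gives a feel for fractional stability tests. The following sketch applies Matignon's condition, |arg λ(A)| > απ/2, with illustrative matrices of our own choosing:

```python
import numpy as np

def matignon_stable(A, alpha):
    """Classical commensurate-order test: the fractional system
    D^alpha x = A x (0 < alpha < 1) is asymptotically stable iff
    every eigenvalue lambda of A satisfies |arg(lambda)| > alpha*pi/2.
    """
    eigs = np.linalg.eigvals(np.asarray(A, dtype=float))
    return bool(np.all(np.abs(np.angle(eigs)) > alpha * np.pi / 2))

# Eigenvalues -1 and -2 (arg = pi): stable for any 0 < alpha < 1.
print(matignon_stable([[-1.0, 0.0], [0.0, -2.0]], 0.7))   # True

# Eigenvalues 0.2 +/- 1j have |arg| ~ 1.373, below 0.9*pi/2 ~ 1.414,
# so the system is unstable at alpha = 0.9 despite the complex pair.
print(matignon_stable([[0.2, -1.0], [1.0, 0.2]], 0.9))    # False
```

Note that the second matrix would also be unstable as an ordinary (α = 1) system; the interesting regime is small α, where even eigenvalues with positive real part can be admissible.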

Case Report
Medicine and Pharmacology
Cardiac and Cardiovascular Systems

Vojislav Parezanovic

,

Dusan Andric

,

Vladimir Chadikovski

,

Vedran Stojanovikj

,

Jordanka Madzoska

,

Vesna Trpkovska

,

Igor Stefanovic

Abstract: The association of a major aortopulmonary collateral artery (MAPCA) with simple transposition of the great arteries (TGA) is uncommon. Such high-flow lesions in the postoperative period following arterial switch operation (ASO) may lead to pulmonary hypertension, pulmonary hemorrhage, heart failure (HF), failure to thrive and prolonged mechanical ventilation. We report a neonate who developed pulmonary overcirculation and HF in the early postoperative period due to a hemodynamically significant MAPCA. Although the association of MAPCA with simple TGA is infrequent, such lesions should be considered in cases of unexplained cardiovascular compromise following ASO. Following transcatheter occlusion of the MAPCA with a vascular coil, rapid hemodynamic stabilization and subsequent extubation of the patient were achieved.

Review
Engineering
Civil Engineering

Almamoon Altawalba

,

Farid Ghazali

Abstract: In Jordan, the construction industry and businesses are burdened by the high prices of materials in terms of extraction, production, transportation, and purchasing, as well as the volatility of their market value. The environment is primarily affected by construction and demolition activity since the construction sector in Jordan is based on a linear economy model and does not rely on the circular economy (CE) by reusing or recycling building materials rather than discarding them. Therefore, this study aims to develop a CE framework for managing construction waste in residential buildings during the construction phase and facilitating the adoption of the proposed model within the construction sector in Jordan. To this end, a questionnaire was distributed to 31 experts, the results were analyzed, and the Delphi technique was then applied to validate the proposed framework and study findings. The findings indicate that the CE contributes to minimizing construction waste. The study also sought to identify the most significant challenges hindering the implementation of the CE; the most influential were low demand for reused or recycled materials, limited stakeholder awareness, and difficulties in disassembly. Furthermore, the results indicated that the use of visual management and 5S techniques, the use of BIM to map materials and components for circular lifecycle planning, and the offering of tax incentives and grants for using recycled materials are the most important strategies for minimizing construction waste. This study contributes to minimizing construction waste and advancing sustainable development, while also supporting Jordan’s Vision 2025 as outlined by the Jordanian government and the Ministry of Environment.

Article
Environmental and Earth Sciences
Environmental Science

Princewill Odum

,

Zirui Wang

Abstract: Urban coastal cities increasingly confront compounded flood hazards driven by sea-level rise, intense precipitation, and dense impervious surfaces. This study evaluates a cloud-native Python GIS framework for flood susceptibility mapping and critical facility exposure analysis in the City of Miami, Florida, one of the most flood-exposed coastal cities in the United States. Implemented entirely within a Google Colab cloud-native environment, the workflow integrates three open-source spatial indicators: (i) terrain elevation retrieved via the py3dep interface to the USGS 3D Elevation Program at 10 m resolution; (ii) Euclidean proximity to water bodies extracted from OpenStreetMap (OSM) using OSMnx; and (iii) building footprint density as a proxy for impervious surface cover, also sourced from OSM. These raster-based indicators were standardised, weighted using a Multi-Criteria Decision Analysis (MCDA) framework (water proximity: 0.40; elevation: 0.35; building density: 0.25), and combined via weighted overlay to produce a continuous flood risk index. The index was classified into low, medium, and high susceptibility zones using quantile thresholds at the 33rd and 66th percentiles. Results show that high-susceptibility areas cover 48.66 km² (34.0%) of the city, concentrated along coastal waterfronts and inland water corridors. Exposure analysis reveals that 9 of 16 hospitals (56.2%), 61 of 244 schools (25.0%), and 5 of 17 fire stations (29.4%) are situated in high-susceptibility zones. The framework provides a fully reproducible, cost-effective, and transferable decision-support methodology with low hardware requirements.
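The weighted-overlay step described above can be sketched as follows; the weights and quantile cut-offs come from the abstract, while the toy rasters and the `standardise` helper are invented for illustration:

```python
import numpy as np

# Toy rasters standing in for the three spatial indicators (invented data).
rng = np.random.default_rng(0)
elevation = rng.uniform(0, 10, (50, 50))      # m above sea level
water_dist = rng.uniform(0, 2000, (50, 50))   # m to nearest water body
bldg_density = rng.uniform(0, 1, (50, 50))    # building footprint fraction

def standardise(a, invert=False):
    """Min-max rescale to [0, 1]; invert when low raw values mean high risk."""
    s = (a - a.min()) / (a.max() - a.min())
    return 1 - s if invert else s

# MCDA weighted overlay with the abstract's weights.
risk = (0.40 * standardise(water_dist, invert=True)   # nearer water -> riskier
        + 0.35 * standardise(elevation, invert=True)  # lower ground -> riskier
        + 0.25 * standardise(bldg_density))           # denser -> riskier

# Classify at the 33rd / 66th percentiles: 0 = low, 1 = medium, 2 = high.
q33, q66 = np.percentile(risk, [33, 66])
classes = np.digitize(risk, [q33, q66])
```

By construction roughly a third of the cells fall into each class; in the real study the class shares differ (34.0% high) because the underlying index is not uniformly distributed.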

Article
Medicine and Pharmacology
Clinical Medicine

Hung Nguyen Van

,

Luu Vu Dang

,

Anh Nguyen The

,

Long Nguyen Cong

,

Khang Le Van

,

Trung Nguyen Ngoc

,

Minh Vu Le

,

Hoi Nguyen Ham

Abstract: Objectives: Histologic grade is an important prognostic factor in hepatocellular carcinoma (HCC). Gd-EOB-DTPA-enhanced MRI may provide noninvasive imaging markers related to tumour differentiation. This study aimed to evaluate the association of Gd-EOB-DTPA-enhanced MRI features, together with the albumin-bilirubin (ALBI) score and alpha-fetoprotein (AFP), with HCC histologic grade and to assess the performance of combined predictive models. Methods: In this prospective cross-sectional study, 75 patients (mean age, 56.4 years; 66 men) with 88 histopathologically confirmed HCC lesions were enrolled. Patients were classified into well-differentiated (grade I–II, n = 24) and poorly differentiated (grade III–IV, n = 51) groups according to the Edmondson–Steiner system. MRI was performed on a 1.5-T scanner and included T1-weighted in-phase/opposed-phase imaging, T2-weighted imaging, diffusion-weighted imaging, and dynamic Gd-EOB-DTPA-enhanced sequences, including arterial, portal venous, transitional, and 20-min hepatobiliary phases. Two radiologists, blinded to pathology, assessed predefined imaging features, and the lesion-to-liver ratio (LLR) was measured. Group comparisons were performed using Student’s t-test, the Mann–Whitney U test, and the chi-square or Fisher’s exact test, followed by multivariable logistic regression and ROC analysis with 500 bootstrap resamples. Results: Compared with well-differentiated HCC, poorly differentiated HCC showed a higher frequency of peritumoral hepatobiliary phase (HBP) hypointensity (62.7% vs. 4.2%, p < 0.001) and peritumoral arterial hyperintensity (39.2% vs. 0%, p < 0.001). In multivariable analysis, peritumoral HBP hypointensity remained independently associated with poorly differentiated HCC (OR = 30.89, p = 0.002). The 2-parameter MRI model, including peritumoral HBP hypointensity and HBP tumour signal, yielded an AUC of 0.84. The combined MRI + ALBI + AFP model showed the highest discriminative performance, with an AUC of 0.87 and an accuracy of 78.7%. Conclusions: Gd-EOB-DTPA-enhanced MRI features, particularly peritumoral HBP hypointensity, were associated with high histologic grade in HCC. In this cohort, combining MRI features with ALBI grade and AFP yielded higher discriminative performance than the MRI-only model. These findings may support preoperative histologic risk stratification, although external validation is required.
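The AUC values reported above come from multivariable models that cannot be reproduced from the abstract alone, but the underlying metric is easy to illustrate: the following sketch computes an AUC from arbitrary scores via the Mann-Whitney identity, using invented example data:

```python
def auc_mann_whitney(scores_pos, scores_neg):
    """AUC = P(score_pos > score_neg) + 0.5 * P(tie) (Mann-Whitney identity)."""
    wins = ties = 0
    for p in scores_pos:
        for n in scores_neg:
            if p > n:
                wins += 1
            elif p == n:
                ties += 1
    return (wins + 0.5 * ties) / (len(scores_pos) * len(scores_neg))

# Invented model scores: poorly vs. well differentiated lesions.
poor = [0.9, 0.8, 0.7, 0.4]
well = [0.6, 0.3, 0.2]
auc = auc_mann_whitney(poor, well)
print(auc)   # 11 of 12 pairs correctly ordered -> 11/12
```

The identity also explains why AUC is insensitive to monotone rescaling of the scores: only pairwise orderings matter.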

Article
Biology and Life Sciences
Agricultural Science and Agronomy

Mohammed Namoussa

,

Mohammed S. Nili

,

Mahfoud Babaousmail

,

Jean Diatta

,

Zbigniew Karolewski

,

Tomasz Rafalowicz

Abstract: Biochar amendment offers a promising strategy to enhance soil physicochemical performance and yield response in nutrient-poor sandy soils; however, its effectiveness depends strongly on feedstock type and application rate. This field study assessed the agronomic and soil responses of tomatoes grown in sandy soil to biochars derived from date palm, maize, and potato residues, applied at 0, 2, 4, 8, and 16 t·ha⁻¹ under desert conditions in southeastern Algeria. Biochars were characterized for physicochemical and structural properties, and their effects on soil carbon, nutrient availability, and tomato yield were evaluated. The results showed that biochar application significantly increased soil total organic carbon (TOC) and total yield, particularly at low application rates. Date palm biochar applied at 2 t·ha⁻¹ produced the highest yield improvement, whereas excessive application tended to suppress yield. In contrast, soil N, P, and K did not show statistically significant differences among treatments, although slight numerical increases were observed compared to the control at medium application rates (4–8 t·ha⁻¹). These findings highlight the importance of optimizing biochar application rates according to feedstock type to maximize agronomic benefits. Overall, moderate biochar application represents a promising strategy for improving soil organic carbon status and crop productivity in desert sandy soil agroecosystems.

Article
Medicine and Pharmacology
Pulmonary and Respiratory Medicine

Luisa Jiménez Reyes

,

José Javier Jareño Esteban

,

Lara Almudena Fernández Bermejo

,

Carlos Gutiérrez Ortega

,

Javier de Miguel Díez

Abstract: Background/Objectives: Recent trends show a rising incidence of venous thromboembolism (VTE) that does not correlate with increased mortality; however, population aging and the proliferation of comorbidities are fundamentally reshaping the VTE patient landscape. The aim of this study is to evaluate potential differences in clinical characteristics, comorbidities, and survival rates between patients diagnosed with pulmonary embolism (PE) during the pre-pandemic period (2018−2019) and those diagnosed during the pandemic era (2020−2022). Additionally, as a secondary objective, we analyze the clinical profiles, risk factors, and survival outcomes of patients with and without COVID-19 infection during the 2020−2022 period. Methods: A retrospective observational study was conducted to analyse survival and comorbidities in patients admitted for PE at the Hospital Central de la Defensa ‘Gómez Ulla’ between 2018 and 2022, comparing two periods (2018−2019 and 2020−2022). In addition, a sub-analysis was performed within the second period group comparing patients with and without COVID-19. Results: It was observed that the majority of patients in the first period were men, while in the second period, 55% were women. With regard to comorbidity and risk factors, thrombophilia and dementia were more prevalent in the first period, while immobilization, a history of asthma, autoimmune diseases and infections were more prevalent in the second period. No differences were found with regard to mortality. Conclusions: Significant differences were observed between the two periods of the study with regard to age, gender, and some comorbidities. Patients with COVID-19 showed a greater tendency toward immobilization and a higher prescription of thromboprophylaxis during hospitalisation.

Article
Engineering
Architecture, Building and Construction

Enmanuel Salazar-Ceciliano

,

Ileana Hernández-Salazar

,

Jose Pablo Bulgarelli-Bolaños

Abstract: Programmed conservation of heritage buildings requires assessment tools capable of identifying vulnerabilities in a systematic and decision-oriented manner. This study proposes and applies a methodology for calculating the vulnerability index of the National Theatre of Costa Rica, with the aim of establishing a technical baseline for monitoring, prioritizing interventions, and supporting long-term conservation management. The method structures vulnerability through four dimensions (systems, environment, use, and urban pressure), each subdivided into specific risk variables weighted through the Analytic Hierarchy Process (AHP) and pairwise comparison matrices. The building was assessed through 33 spaces grouped into 17 zones, based on two on-site evaluation campaigns, and the results were consolidated into a global assessment matrix. The findings indicate an overall low vulnerability index for the building (1.391), with similarly low values recorded for systems (1.549), use (1.450), environment (1.268), and urban pressure (1.198). However, the South Façade (1.824) and the Foyer (1.778) reached medium vulnerability levels, while several additional spaces were close to that threshold. The results suggest that use-related conditions exert the greatest influence on the overall index, whereas systems-related issues—particularly electrical installations—remain a relevant field for intervention. The study supports the applicability of the proposed method as an objective instrument for programmed conservation of built heritage.
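The AHP weighting step described above can be sketched as follows; the pairwise comparison matrix is invented for illustration, not taken from the study:

```python
import numpy as np

def ahp_weights(M):
    """AHP: weights = principal eigenvector of a reciprocal pairwise
    comparison matrix, plus Saaty's consistency ratio (CR)."""
    M = np.asarray(M, dtype=float)
    vals, vecs = np.linalg.eig(M)
    k = np.argmax(vals.real)                  # principal eigenpair
    w = np.abs(vecs[:, k].real)
    w /= w.sum()                              # normalize to sum to 1
    n = M.shape[0]
    ci = (vals[k].real - n) / (n - 1)         # consistency index
    ri = {3: 0.58, 4: 0.90, 5: 1.12}[n]       # Saaty's random index (n <= 5 here)
    return w, ci / ri                         # weights, consistency ratio

# Four dimensions compared pairwise (illustrative judgments on Saaty's scale):
M = [[1,   2,   3,   4],
     [1/2, 1,   2,   3],
     [1/3, 1/2, 1,   2],
     [1/4, 1/3, 1/2, 1]]
w, cr = ahp_weights(M)
print(w, cr)   # decreasing weights; CR well below the 0.1 acceptance threshold
```

A CR below 0.1 is the conventional acceptance threshold for judgment consistency; inconsistent matrices should be re-elicited rather than used.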

Review
Medicine and Pharmacology
Pulmonary and Respiratory Medicine

Caterina Antonaglia

,

Antonio Fabozzi

,

Alessia Steffanina

,

Sara Soave

,

Paola Confalonieri

,

Ambra Nicolai

,

Federica Olmati

,

Arianna Sanna

,

Nahaun Pena

,

Barbara Ruaro

+4 authors

Abstract: Background: Continuous positive airway pressure (CPAP) is the primary treatment for Obstructive Sleep Apnea (OSA). Despite improvements in CPAP technology and management, adherence to therapy remains one of the main issues to be addressed. Methods: We conducted a narrative review through PubMed (1995-2025). Studies were selected by clinical relevance, methodological quality and expert consensus. Results: OSA treatment outcomes are poor when CPAP adherence is defined as four hours per night. The first step in improving adherence is active patient involvement. This involves explaining what OSA is, its consequences, what PAP therapy is, and its potential benefits. The right mask should be chosen for each patient; a nasal mask should be the first choice according to the Starling resistor model. OSA endotype and phenotype traits could be used to predict adherence, guide adjunct therapy, or suggest titration. Problems during the first night and the first month are the main predictors of future adherence. Strategies such as cognitive behavioral therapy or motivational enhancement can improve adherence, especially during the initial period. Long-term adherence can be predicted by the initial one and maintained with scheduled follow-up. Group meetings, telephone calls and telemedicine interactions are also valid ways of improving adherence. Conclusions: Patients should initially be educated about how their symptoms are related to sleep apnoea and how CPAP treatment could resolve them. The key to improving CPAP adherence is to involve patients in personalised treatment with scheduled follow-up, particularly during the initial treatment period.

Article
Engineering
Electrical and Electronic Engineering

Minji Kim

,

Jiun Oh

,

Younghun Han

,

June-O Song

,

Joon Seop Kwak

Abstract: p-GaN gate enhancement-mode GaN HEMTs are promising normally-off power devices, but their high-temperature reliability is strongly affected by the gate contact scheme. This study compares Pd-ohmic and Ni-Schottky p-GaN gate HEMTs fabricated on the same GaN-on-Si epitaxial platform by combining temperature-dependent electrical characterization, post-temperature-dependent-test (TDT) room-temperature recovery analysis, and thermoreflectance thermal mapping. Electrical measurements were performed from room temperature to 500 °C using gate leakage, transfer, and output characteristics, while thermal maps were obtained before and after TDT under identical bias conditions. The Pd-ohmic devices exhibited higher initial current drive but larger operating gate-current penalty and stronger degradation of normalized on-state characteristics at elevated temperature. After TDT, reduced transconductance and maximum drain current were accompanied by weaker active-channel heating, indicating degradation-type cooling associated with reduced gate-channel modulation efficiency. In contrast, the Ni-Schottky devices showed lower gate-current penalty and better normalized output retention up to approximately 300 °C; however, post-TDT increases in transconductance and drain current occurred together with degraded subthreshold swing and persistent localized heating, indicating apparent on-state activation with weakened gate/depletion control. These results show that p-GaN gate reliability should be assessed through coupled electrical and thermal signatures rather than single electrical or thermal metrics.

Article
Engineering
Bioengineering

Orlando Meneses Quelal

,

David Pilamunga Hurtado

,

Marco Rubén Burbano-Pulles

Abstract: Food fraud is a persistent global threat estimated to cost the food industry over USD 30 billion annually. The integration of artificial intelligence (AI) with analytical instrumentation has generated significant research activity directed at developing detection systems capable of identifying adulteration, mislabeling, and substitution across diverse food matrices. This systematic review critically examines the extent to which AI-assisted instrumental technologies contribute to food fraud prevention, and identifies the structural limitations that constrain their real-world implementation. A systematic search of peer-reviewed literature published between 2021 and 2026 yielded 72 eligible studies after application of predefined inclusion criteria. Studies were required to report quantitative performance metrics (accuracy, R2, RMSE, AUC, sensitivity, specificity), describe methodological limitations, and mention laboratory or industrial implementation contexts. Data were extracted into a structured seven-sheet workbook covering study characteristics, instrumental technologies, AI architectures, performance metrics, industrial validation status, implementation evidence, and methodological quality. The corpus reveals a systematic pattern of high reported analytical accuracy—frequently exceeding 95% and in many cases reaching 100%—under controlled laboratory conditions. However, 75% of studies (54/72) conducted no external validation, 100% of studies reported no pilot-scale or routine monitoring application, and no study achieved inter-laboratory validation. The predominant technology was NIR spectroscopy (26/72 studies, 36%), followed by gas chromatography-based systems (14/72, 19%) and electronic noses (8/72, 11%). Classical machine learning—predominantly SVM, Random Forest, and ANN—dominated methodological approaches (43/72, 60%), with deep learning architectures accounting for 26% of studies. Technology Readiness Levels were unreported in 97% of studies. 
Methodological quality was predominantly moderate (42/72 studies scoring 3/5), with 19 studies scoring 2/5 and only one achieving the maximum score. This review identifies a structural gap between detection and prevention as the central finding: the scientific literature consistently demonstrates high analytical precision in laboratory settings while providing minimal evidence of real-world industrial deployment, regulatory integration, or measurable impact on the prevention of food fraud events. The findings demonstrate that the limitation is not primarily technological but systemic, highlighting the need for a paradigm shift from performance-driven research toward validation-driven, deployment-oriented frameworks.

Review
Public Health and Healthcare
Primary Health Care

James V. English

Abstract: Intimate partner violence-related brain injury is the most recent condition in a 150-year arc in which biological brain injury has been misattributed to psychological or moral causes before formal clinical recognition emerged. Earlier conditions in this pattern were each marked by decades of recognition lag before formal diagnostic frameworks emerged. In each prior case, that lag was driven by limits in available diagnostic technology. Intimate partner violence-related brain injury is the first condition in which diagnostic technology, including computed tomography, magnetic resonance imaging, diffusion tensor imaging, and neurocognitive assessment, has been continuously available throughout the recognition gap. The review identifies three structural barriers that sustain this recognition gap: a diagnostic barrier that leaves the injury without formal criteria, an administrative coding barrier that leaves it absent from ICD architecture, and a population surveillance barrier that leaves it indistinguishable from broader assault categories. Each barrier reinforces the others, limiting visibility, resource allocation, and access to care. Across these conditions, recognition lag has reflected an institutional imperative that has shaped which injured populations became clinically legible. Recent neuroimaging and cognitive studies make the biological imperative explicit. A cognitive entrapment framework reframes the reduced capacity to engage the cognitive and material resources that leaving requires as injury-driven rather than as ambivalence or motivational deficit. The framework explains mechanistically why brain injury disrupts the multistep planning that leaving demands. Intimate partner violence-related brain injury is not only underdiagnosed but structurally underserved; correcting the mechanisms of recognition failure is necessary for access to treatment and rehabilitation.

Article
Computer Science and Mathematics
Information Systems

Pedro A. Reis Sa Costa

,

Antonio Goncalves

,

Mario Monteiro Marques

Abstract: This case study examines an encryption failure incident involving the exposure of sensitive personal data within a governmental information system environment. The analysis is based on the well-documented data breach that occurred within the U.S. Department of Veterans Affairs, in which a government employee stored a large dataset containing veterans’ personal information on a portable laptop device that lacked adequate encryption protection. Following the theft of the device from the employee’s residence, the personal records of approximately 26.5 million individuals were placed at risk of unauthorized exposure. Rather than interpreting the incident as an isolated technical failure, this study analyzes it through the Swiss cheese model, proposed by James Reason, showing that the breach resulted from the alignment of weaknesses across multiple layers of defence. These weaknesses included the absence of full-disk encryption, insufficient enforcement of data handling policies, weak access control procedures, inadequate oversight of sensitive data transfers outside controlled environments, and excessive reliance on individual user compliance. Based on this analysis, the study proposes corrective and preventive measures, including mandatory strong encryption for portable devices, formal cryptographic key management procedures, strengthened data access and handling controls, and enhanced monitoring and auditing mechanisms. These measures are intended to reinforce multiple defensive layers, improve the protection of sensitive information, and reduce the likelihood of similar incidents in operational environments handling confidential data.

Article
Physical Sciences
Astronomy and Astrophysics

Espen Gaarder Haug

Abstract: A recently proposed CMB temperature relation, obtained from applying the Stefan–Boltzmann law to the Hubble sphere and from related Hawking–Planck–Hubble scale arguments, may be written in the compact form T_CMB(t) = T_P/(8π √(N_P(t))). Here N_P is the effective Poisson-shot count. In an R_H = ct cosmology, the normalization consistent with the Stefan–Boltzmann radiation density is N_P(t) = R_H(t)/(2l_P) = t/(2t_P) = M_c(t)/m_P, where M_c(t) = c²R_H(t)/(2G) is the critical Hubble mass. If instead one defines the doubled Hubble-sphere mass M_u(t) = c²R_H(t)/G, then M_u/m_P = 2N_P. The formula has the mathematical structure of a Poisson relative-fluctuation law, since σ_N/N = 1/√N for a Poisson count, and may equivalently be written T_CMB(t) = (T_P/(8π)) · (σ_N_P/N_P). We call this the Poisson-shot CMB formula. Substitution back into the Stefan–Boltzmann law gives u_γ ∝ T_CMB⁴ ∝ R_H⁻², matching the critical-density scaling in R_H = ct and yielding a constant photon radiation density parameter. This provides additional blackbody support for the formula and connects it to the observed near-perfect blackbody spectrum of the CMB. By contrast, in the standard ΛCDM framework the present CMB temperature is normally an observational input: the model predicts the redshift scaling T(z) = T_0(1 + z) once T_0 is supplied, but it does not derive the absolute present value T_0 from the Planck scale and the Hubble scale.
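The compact formula can be checked numerically; the sketch below uses CODATA-style constants and assumes H0 = 67 km/s/Mpc to fix R_H = c/H0 (the Hubble rate is our input, not part of the formula):

```python
import math

# Numerical check of the abstract's Poisson-shot formula
# T_CMB = T_P / (8*pi*sqrt(N_P)) with N_P = R_H / (2*l_P).
c    = 2.99792458e8        # speed of light, m/s
G    = 6.67430e-11         # gravitational constant, m^3 kg^-1 s^-2
hbar = 1.054571817e-34     # reduced Planck constant, J s
k_B  = 1.380649e-23        # Boltzmann constant, J/K

l_P = math.sqrt(hbar * G / c**3)          # Planck length
T_P = math.sqrt(hbar * c**5 / G) / k_B    # Planck temperature

H0  = 67e3 / 3.0857e22                    # assumed 67 km/s/Mpc, in 1/s
R_H = c / H0                              # Hubble radius
N_P = R_H / (2 * l_P)                     # Poisson-shot count

T_cmb = T_P / (8 * math.pi * math.sqrt(N_P))
print(f"T_CMB ~ {T_cmb:.3f} K")           # close to the observed 2.725 K
```

Since T_CMB ∝ 1/√R_H ∝ √H0 here, the result shifts by only ~0.7% per km/s/Mpc of H0, so any reasonable Hubble value lands near 2.7 K.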

Article
Engineering
Control and Systems Engineering

Carlos Gomez-Rosas

,

Rogelio de J. Portillo-Velez

,

Guillermo Fernandez-Anaya

,

J. Alejandro Vásquez-Santacruz

,

Luis F. Marín-Urías

Abstract: An approach to the control of linear systems with a single time delay is proposed. The main contribution is the inclusion of a symmetric-injection virtual reference trajectory in the controller to provide robust stability for single-delay linear control systems. The dynamics of the virtual trajectory are included in the closed-loop dynamics, allowing theoretical computation of the critical time delay before stability is lost. Moreover, an energy-based symmetry interpretation of the proposed approach is drawn. Numerical simulations considering stable and unstable linear systems are shown, and experiments controlling a DC motor with time-delayed measurements validate our proposal.
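The proposed virtual-reference controller is not reproduced here, but the notion of a critical time delay can be illustrated on the classical scalar test system ẋ(t) = -k·x(t - τ), whose critical delay is τ* = π/(2k); the simulation parameters are our own:

```python
import math
from collections import deque

def simulate(k, tau, dt=1e-3, t_end=60.0):
    """Euler-simulate x'(t) = -k * x(t - tau) with history x = 1 for t <= 0.
    Returns the peak |x| over the final 10 time units."""
    d = int(round(tau / dt))
    hist = deque([1.0] * (d + 1))     # hist[0] = x(t - tau), hist[-1] = x(t)
    xs = []
    for _ in range(int(t_end / dt)):
        x_new = hist[-1] + dt * (-k * hist[0])
        hist.append(x_new)
        hist.popleft()
        xs.append(x_new)
    return max(abs(x) for x in xs[-int(10.0 / dt):])

k = 1.0
tau_crit = math.pi / (2 * k)          # classical critical delay for this system
print(simulate(k, 0.5 * tau_crit))    # below tau*: oscillation decays
print(simulate(k, 1.5 * tau_crit))    # above tau*: oscillation grows
```

Below τ* the late-time amplitude is negligible, while above it the response grows without bound, which is the qualitative boundary the paper's criteria locate analytically for multi-state systems.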

Review
Computer Science and Mathematics
Artificial Intelligence and Machine Learning

Byunghyun Ban

Abstract: Plant factories have evolved from automated cultivation facilities into data-driven crop production systems. Over the last decade, artificial intelligence has been applied to non-destructive crop monitoring, sensor correction, nutrient-solution diagnosis, growth prediction, environmental control, digital twins, and product-level inspection. This review summarizes AI technologies for plant factories, focusing on machine vision, deep learning, nutrient-solution intelligence, reinforcement learning, and digital-twin interfaces. The main argument is that plant-factory AI should not be understood only as image-based phenotyping; practical systems require an integrated intelligence stack connecting visual perception, sensor calibration, nutrient modeling, control, remote operation, and industrial inspection. Remaining challenges include dataset scarcity, model generalization, sensor drift, explainability, energy-aware control, and closed-loop decision-making.

Article
Environmental and Earth Sciences
Environmental Science

Marco Esposito

,

Sara Raggiunto

,

Francesca Sini

,

Paola Pierleoni

,

Natalino Barbizzi

,

Gaia Galassi

Abstract: Floods are among the most damaging natural hazards, threatening human safety and causing substantial economic losses. Their risk results from the interaction between hazard, exposure, and vulnerability, and has been increasing due to the rising frequency and intensity of extreme hydrometeorological events. This issue is particularly relevant in Mediterranean regions, where floods often affect small, densely populated, and highly urbanised basins. This study applies a comprehensive climate risk assessment to the Foglia River basin (Marche, Italy) using the framework and tools developed within the Horizon Europe CLIMAAX project. Locally developed flood hazard maps were integrated with exposure and vulnerability data, focusing on the city of Pesaro at the river mouth. Risk was quantified in terms of building damage and population exposure for different return periods. To further investigate changes in flood hazard, projected river discharge under climate scenarios was analysed. The results indicate a relative increase in flood recurrence exceeding 20% for the 5-, 10-, and 50-year return periods, suggesting a significant intensification of flood risk. The study provides spatially explicit estimates of potential economic losses and supports the refinement of regional climate adaptation strategies, offering valuable insights for the integration of climate risk considerations into urban and territorial planning.
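As background to the return periods discussed above, the standard hydrological identity linking a T-year return period to exceedance probability can be sketched (this is textbook material, not the study's hazard model):

```python
# Annual exceedance probability for a T-year return period is p = 1/T;
# the chance of at least one exceedance in n years is 1 - (1 - p)**n.
def prob_at_least_one(T, n):
    return 1 - (1 - 1 / T) ** n

# For the return periods used in the study, over a 30-year horizon:
for T in (5, 10, 50):
    print(f"{T}-year event: {prob_at_least_one(T, 30):.1%} chance in 30 years")
```

Even a 50-year event has roughly a 45% chance of occurring at least once in 30 years, which is why a >20% increase in recurrence translates into materially higher planning-horizon risk.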

Article
Business, Economics and Management
Econometrics and Statistics

Marlon Fritz

,

Thomas Gries

,

Yuanhua Feng

Abstract: The most widely used method for trend estimation in economics is the Hodrick-Prescott (HP) filter. The HP filter has various disadvantages, such as the arbitrary and frequency-dependent choice of the smoothing parameter λ, boundary problems, and difficult interpretation when linking to economic theory. We suggest an alternative method that mitigates some of these disadvantages by using a purely data-driven, endogenous nonparametric trend estimation. A simulation study and different applications demonstrate the advantages of the nonparametric trend compared to the HP filter. We identify optimal time windows supporting the momentary growth trend. Within this window, economic fundamentals change smoothly and drive the trend.
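For reference, the standard HP filter that the paper argues against can be implemented in a few lines; this is the textbook penalized least-squares form, not the authors' nonparametric estimator:

```python
import numpy as np

# Textbook HP filter: the trend tau solves
#   min_tau  sum (y - tau)^2 + lam * sum (Delta^2 tau)^2,
# i.e. tau = (I + lam * D'D)^{-1} y with D the second-difference matrix.
def hp_filter(y, lam=1600.0):
    y = np.asarray(y, dtype=float)
    n = len(y)
    D = np.zeros((n - 2, n))
    for i in range(n - 2):
        D[i, i:i + 3] = [1.0, -2.0, 1.0]      # second-difference stencil
    trend = np.linalg.solve(np.eye(n) + lam * (D.T @ D), y)
    return trend, y - trend                   # trend and cyclical component

# Sanity check: a purely linear series is left untouched, because its
# second differences vanish and the penalty term is already zero.
t = np.arange(40, dtype=float)
trend, cycle = hp_filter(2.0 + 0.5 * t)
```

The choice λ = 1600 is the conventional value for quarterly data; the frequency-dependence of that choice is precisely one of the disadvantages the abstract cites.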


Preprints.org is a free preprint server supported by MDPI in Basel, Switzerland.


© 2026 MDPI (Basel, Switzerland) unless otherwise stated