Engineering

Article
Engineering
Bioengineering

Gianluca Borghini, Khadija Latrach, Gianluca Di Flumeri, Pietro Aricò, Vincenzo Ronca, Andrea Giorgi, Rossella Capotorto, Alessia Ricci, Stefano Bonelli, Vanessa Arrigoni +5 authors

Abstract: Background/Objectives: The Human Performance Envelope (HPE) is a multidimensional model that represents the range in which an individual operator's performance is acceptable or begins to become dangerous. Although several alternative models have been proposed, the HPE currently remains primarily a theoretical concept. The goal of the study was therefore to translate this theoretical concept into practical applications, seeking to characterise and measure how the HPE manifests itself in real-world contexts. Methods: Multivariate Autoregressive (MVAR) models have been used in the analysis of complex systems in which variables are interdependent and mutually influence their dynamics over time. Professional Air Traffic Controllers (ATCOs) were involved in the study and asked to deal with realistic traffic scenarios while their behavioural, subjective and neurophysiological data were collected. A Partial Information Decomposition–Least Absolute Shrinkage and Selection Operator (PID–LASSO) model was then employed to estimate the interactions among the ATCOs' Human Factors (HFs) and identify the most appropriate characterisation of the HPE. Results: The results showed high and significant correlations between each ATCO's performance and the corresponding neurophysiological-based HPE values. Furthermore, high-performance (Best) conditions were characterised by significantly higher HPE values and denser inter-HF connections compared to low-performance (Worst) states, suggesting that a densely interconnected network of HFs is a prerequisite for operational resilience. Conclusions: The study provides the first application of a neurophysiological framework to model the causal interactions between HFs, translating the theoretical HPE into a quantifiable model validated against operator performance.
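
The MVAR backbone of such an analysis can be illustrated with a minimal sketch: fitting a first-order vector-autoregressive model x_t = A x_{t-1} by least squares and recovering the interaction matrix A. The two-variable system and its coefficients below are hypothetical, not values from the study.

```python
# Minimal VAR(1) fit: recover the interaction matrix A from a simulated
# two-variable time series x_t = A x_{t-1} (noise-free for clarity).

def inv2(M):
    """Closed-form inverse of a 2x2 matrix."""
    (a, b), (c, d) = M
    det = a * d - b * c
    return [[d / det, -b / det], [-c / det, a / det]]

A = [[0.9, 0.1], [-0.2, 0.8]]   # hypothetical interaction matrix
x = [[1.0, 0.5]]                # initial state
for _ in range(60):
    p = x[-1]
    x.append([A[0][0] * p[0] + A[0][1] * p[1],
              A[1][0] * p[0] + A[1][1] * p[1]])

# Least squares via normal equations: A_hat = Sxy @ inv(Sxx)
Sxx = [[0.0, 0.0], [0.0, 0.0]]  # sum over t of x_{t-1} x_{t-1}^T
Sxy = [[0.0, 0.0], [0.0, 0.0]]  # sum over t of x_t x_{t-1}^T
for t in range(1, len(x)):
    for i in range(2):
        for j in range(2):
            Sxx[i][j] += x[t - 1][i] * x[t - 1][j]
            Sxy[i][j] += x[t][i] * x[t - 1][j]
Sxx_inv = inv2(Sxx)
A_hat = [[sum(Sxy[i][k] * Sxx_inv[k][j] for k in range(2))
          for j in range(2)] for i in range(2)]
```

On noise-free data the fit recovers A to machine precision; the PID–LASSO step in the study additionally sparsifies and decomposes such interaction estimates.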

Article
Engineering
Civil Engineering

George Taranu

Abstract: This paper presents a nonlinear time-history re-assessment of an existing reinforced concrete (RC) frame building designed in 2007 according to the Romanian seismic code P100-1/2006 and re-checked against current seismic demand. Two three-dimensional solid finite-element models were developed in ANSYS: a bare RC frame and an RC frame with masonry infill panels. A distinctive feature is the explicit representation of longitudinal and transverse reinforcement embedded in the concrete solids, enabling direct tracking of steel stress demand and post-cracking load transfer. The models were subjected to bidirectional ground motions from the Vrancea 1977 and 1990 earthquakes and the Türkiye 2023 earthquake, scaled to match the P100-1/2013 target spectrum for the investigated site (a_g=0.40g). Modal analysis shows a clear stiffness increase due to infills, with the fundamental frequency rising from 4.4669 Hz (RC) to 5.8680 Hz (RC+M). Under the scaled records, infills substantially reduce global deformation demand: peak roof displacements in the transversal direction decrease from 9.87–14.26 mm (RC) to 2.74–3.38 mm (RC+M), and peak interstorey drift increments decrease from 3.35–4.94 mm to 0.92–1.16 mm, with drift ratios remaining well below conservative serviceability thresholds. Roof peak accelerations also decrease, reaching 0.490 g for RC versus 0.211 g for RC+M in the governing VN90 case. Base-reaction resultants and F_y–roof displacement loops confirm a stiffer global response with reduced displacement excursions for the infilled configuration. Local fields indicate that, in the bare frame, plastic strain concentrates at perimeter column bases and beam ends, while in the infilled model inelastic indicators shift toward masonry discontinuities around openings and panel corners; reinforcement demand peaks at beam ends, column bases, and the staircase region, consistent with torsional participation. 
The results highlight that masonry infills can strongly govern stiffness and drift demand at current design-level intensity, while introducing localised concentration zones that are relevant for performance assessment of existing buildings.

Review
Engineering
Telecommunications

Evelio Astaiza Hoyos, Héctor Fabio Bermúdez-Orozco, Nasly Cristina Rodriguez-Idrobo

Abstract: The evolution of future Internet and sixth-generation (6G) networks is driving a paradigm shift from classical bit-centric communication toward meaning-aware and task-oriented communication models. Traditional information theory, while fundamental for ensuring reliable symbol transmission, does not account for semantic relevance or task effectiveness, which are critical for emerging applications such as autonomous systems, immersive services, and ultra-low-latency communications. This article presents a comprehensive review of Semantic Communications (SemCom) from a future Internet perspective. The review systematically analyses representative extensions of classical information theory aimed at quantifying semantic information, including semantic information measures, semantic channel capacity, and semantic rate–distortion formulations. In addition, the main mathematical and computational frameworks enabling practical semantic communication systems are examined, including the Information Bottleneck principle, learning-based end-to-end communication architectures, and reinforcement learning approaches for task-oriented optimization under network constraints. The review further discusses the role of semantic metrics, contextual modelling, and task-driven performance evaluation in the design of semantic-aware communication systems. The analysis identifies key open challenges, particularly the lack of a unified theoretical framework, the need for robust and context-aware semantic performance metrics, and the integration of semantic awareness into network-level design. Overall, this review highlights Semantic Communications as a promising paradigm for future Internet and 6G networks, where communication efficiency is increasingly determined by semantic relevance and task effectiveness rather than bit-level fidelity alone.

Article
Engineering
Energy and Fuel Technology

Apostolos Spyridonidis, Katerina Stamatelatou

Abstract: This study explores the use of bentonite to enhance biological biogas upgrading in a bubble reactor (BR) operated under mesophilic conditions (39 ± 1 °C). The experimental setup consisted of a 2 L vertically oriented BR (height to diameter ratio 16:1) fed with a synthetic gas mixture (60% H2, 15% CO2, 25% CH4, v/v) at a gas recirculation rate of 4 L LR⁻¹ h⁻¹. The aim was to overcome hydrogen’s low gas-liquid mass transfer rate while avoiding the operational challenges typically associated with trickle-bed reactors (TBRs). Bentonite increases the density and hydrostatic pressure of the liquid medium, and likely alters its rheology, thereby extending the gas-liquid contact time without requiring elevated pressures or intensive gas recirculation. Additionally, bentonite is expected to provide microstructural support that promotes the formation of biofilm-like communities, creating favorable microenvironments for hydrogenotrophic methanogens. As a clay-based additive, bentonite may also contribute to improved process stability through adsorption of inhibitory compounds, enhanced biomass retention, and pH buffering. Under mesophilic conditions, the bentonite-modified BR achieved a methane production rate of 2.17 ± 0.06 L CH4 LR⁻¹ d⁻¹ at a gas retention time of 1.49 h, with methane purity reaching 96.25%. In comparison, a previously reported mesophilic BR operated under identical reactor configuration and operating conditions but without bentonite exhibited substantially lower methane production rates, supporting the beneficial role of bentonite in biological methanation. The findings highlight bentonite’s potential dual role, physical and biological, in improving process efficiency and stability in biological methanation.

Article
Engineering
Control and Systems Engineering

Lorenzo Albanese

Abstract: Hydrodynamic cavitation in process plants is often generated by static devices designed for nominal operating conditions. In real processes, however, the properties of the process fluid, including temperature, viscosity, and gas and solids content, may vary. Consequently, maintaining the cavitation regime within a target operating window over extended periods is challenging. The Dynamic Circular Venturi (DCVA) is introduced as a circular Venturi with an internal geometry that can be reconfigured during operation. The external body and connections are preserved, while the internal configuration, particularly the throat section, can be adjusted. A formalism based on equivalent geometric parameters is proposed to describe the set of admissible configurations. A dedicated design space is also defined to identify, for a given architecture, the subset that is practically accessible. Two implementations are presented: a single degree-of-freedom layout for throat-opening modulation and a multiparametric layout based on axial stations, enabling the generation of a family of internal profiles. An estimated operating indicator is introduced and formulated using variables typically measured in process plants, supporting configuration selection and the specification of operating settings. This conceptual framework can support the optimization of sustainable food-processing operations enabled by hydrodynamic cavitation, such as green extraction and food by-product valorization, with potential gains in resource efficiency and waste minimization.
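
One widely used operating indicator for cavitating Venturi devices, not necessarily the specific indicator proposed in the paper, is the cavitation number σ, computed from quantities routinely measured in plants (pressures, flow rate, throat geometry). A hedged sketch with hypothetical water-service values:

```python
import math

def cavitation_number(p_downstream, p_vapor, rho, q, d_throat):
    """sigma = (p_down - p_vap) / (0.5 * rho * v_throat^2).

    Pressures in Pa, rho in kg/m^3, q volumetric flow in m^3/s,
    d_throat throat diameter in m. Lower sigma -> stronger cavitation.
    """
    area = math.pi * d_throat ** 2 / 4.0
    v = q / area                     # mean throat velocity
    return (p_downstream - p_vapor) / (0.5 * rho * v ** 2)

# Hypothetical operating point: water at 20 C, 2 bar recovery pressure,
# 2 L/s through a 10 mm throat
sigma = cavitation_number(2.0e5, 2.3e3, 998.0, 2.0e-3, 0.01)
```

In a reconfigurable throat such as the DCVA, adjusting the throat area changes v and hence σ, which is one way a target cavitation window could be tracked as fluid properties drift.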

Article
Engineering
Electrical and Electronic Engineering

Ibrahim Jahan, Khoa Nguyen Dang Dinh, Vojtěch Blažek, Vaclav Snasel, Stanislav Mišak, Ivo Pergl, Faisal Mohamed, Abdesselam Mechali

Abstract: To mitigate environmental impact, specifically the CO2 emissions associated with conventional thermal and nuclear facilities, renewable energy sources are increasingly being adopted as primary alternatives. However, integrating these renewable sources into the utility grid poses a significant challenge, primarily due to the stochastic and nonlinear nature of weather. Consequently, it is imperative that power systems operate under an intelligent control model to ensure energy output meets strict power quality standards. In this context, accurate forecasting is a cornerstone of smart power management, particularly in off-grid architectures, where predicting Power Quality Parameters (PQPs) is fundamental for system optimisation and error correction. This study conducts a comprehensive comparative evaluation of nine different predictive architectures for estimating PQPs. The algorithms analyzed include LSTM, GRU, DNN, CNN1D-LSTM, BiLSTM, attention mechanisms, DT, SVM, and XGBoost. The central objective is to develop a reliable basis for the automated regulation and enhancement of electrical quality in isolated systems. The specific parameters investigated are voltage (U), voltage total harmonic distortion (THDu), current total harmonic distortion (THDi), and short-term flicker severity (Pst). Data for this investigation were acquired from an experimental off-grid setup at VSB-Technical University of Ostrava (VSB-TUO), Czech Republic. To assess model performance, we utilised Root Mean Square Error (RMSE) as the primary accuracy metric, while simultaneously evaluating computational efficiency in terms of processing speed and memory consumption during testing.
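
RMSE, the accuracy metric used in the comparison, is straightforward to compute; a minimal sketch with an illustrative voltage series (values are not from the study's dataset):

```python
import math

def rmse(y_true, y_pred):
    """Root Mean Square Error between measured and predicted PQP series."""
    if len(y_true) != len(y_pred):
        raise ValueError("series must be the same length")
    return math.sqrt(sum((t - p) ** 2 for t, p in zip(y_true, y_pred)) / len(y_true))

# Toy example: a short voltage (U) forecast in volts
err_v = rmse([230.1, 229.8, 230.5], [230.0, 230.0, 230.2])
```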

Review
Engineering
Bioengineering

Britney S Force, Xueqin Gao, Johnny Huard

Abstract: Musculoskeletal disorders and injuries are highly prevalent and encompass a broad range of conditions, including bone fractures and segmental defects, tendinopathies and tendon injury, and cartilage disorders such as osteoarthritis, cartilage defects, and intervertebral disc disease. These conditions can arise from diverse causes including trauma and injury, tumor resection, congenital abnormalities, and age-related degeneration. In the past decades, administration of chemically modified mRNA (cmRNA) encoding growth factors and transcriptional regulators has demonstrated effectiveness in repairing musculoskeletal tissues in preclinical studies. This review summarizes recent advancements in bone, tendon, cartilage, intervertebral disc, and muscle regeneration achieved through the localized delivery of protein-encoding mRNAs to express therapeutic target proteins. Delivery of cmRNA encoding growth factors such as BMP-2, BMP-9, VEGF, FGF-18, and IGF-1, or transcriptional regulators including Runx1 to various animal models has shown beneficial effects on bone, tendon, cartilage, and muscle injury repair in preclinical models. Alongside this progress, the advantages and disadvantages of applying chemically modified mRNA for musculoskeletal tissue regeneration are also discussed. While studies show the promise of cmRNA for therapeutic applications in orthopaedic tissue regeneration, more research is required to optimize growth factors and delivery methods, as well as validate long-term safety and efficacy prior to successful translation into new therapies to benefit patients.

Article
Engineering
Electrical and Electronic Engineering

Michal Kozlok, Marek Balsky, Petr Zak

Abstract: Spatial light field metrics such as mean cylindrical illuminance provide essential information for qualitative lighting evaluation, yet they remain far less common in practice than horizontal illuminance. To address this gap, we present a multi-sensor prototype that simultaneously measures horizontal illuminance Eh and approximates mean cylindrical illuminance Ez from a set of vertical illuminances uniformly spaced around a cylindrical surface. The device uses a flexible PCB wrapped around a support barrel and an inertial and magnetic measurement unit for orientation tracking. The measurements enable direct calculation of the modelling factor defined in the technical standard EN 12464 and visualization of the directional light distribution using polar plots and the illuminance solid. Results show that the prototype approximates mean cylindrical illuminance with high accuracy while preserving directional information, allowing the illuminance solid to be decomposed into vector and symmetric components. Compared with conventional approximation methods, the proposed multi-sensor approach reduces spatial error and yields richer data for lighting analysis. These findings indicate that multi-sensor systems can bridge the gap between theoretical spatial metrics and practical photometry and support improved modelling evaluation and the integration of qualitative lighting parameters into routine workflows.

Article
Engineering
Electrical and Electronic Engineering

Ricardo Adonis Caraccioli Abrego

Abstract: We derive an exact, practical method to update Thévenin parameters (open-circuit voltage and equivalent resistance) of a linear network under a single internal branch modification (open/short/resistance change), without recomputing the full nodal solution from scratch. The change is modeled as a rank-one perturbation of the nodal admittance matrix, and the Sherman–Morrison identity yields closed-form port updates in terms of three physically interpretable scalars: local self-coupling, port–branch coupling, and state projection across the modified branch. We discuss limiting cases (open and short), include a brief note on complex admittances (phasors/Laplace), and provide a reproducible Python check.
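
The rank-one update the abstract describes can be sketched numerically. Changing a branch conductance between nodes i and j by Δg perturbs the nodal admittance matrix by Δg·aaᵀ with a = e_i − e_j, and Sherman–Morrison updates the impedance matrix (whose diagonal carries the Thévenin resistances) without re-inversion. The two-node network values below are hypothetical, and this is an illustrative check, not the paper's code:

```python
def inv2(M):
    """Closed-form inverse of a 2x2 matrix."""
    (a, b), (c, d) = M
    det = a * d - b * c
    return [[d / det, -b / det], [-c / det, a / det]]

def matvec(M, x):
    return [sum(M[i][k] * x[k] for k in range(len(x))) for i in range(len(M))]

# Hypothetical two-node network: shunt conductances g1, g2 and coupling g12
g1, g2, g12 = 1.0, 2.0, 0.5
Y = [[g1 + g12, -g12], [-g12, g2 + g12]]   # nodal admittance matrix
Z = inv2(Y)                                # Z[i][i] = Thevenin resistance at node i

# Branch change g12 -> g12 + dg is the rank-one perturbation dg * a a^T, a = e1 - e2
dg = 0.3
a = [1.0, -1.0]
Za = matvec(Z, a)                          # Z symmetric, so a^T Z = (Z a)^T
denom = 1.0 + dg * sum(a[k] * Za[k] for k in range(2))
Z_upd = [[Z[i][j] - dg * Za[i] * Za[j] / denom for j in range(2)] for i in range(2)]

# Check against brute-force re-inversion of the modified admittance matrix
Yp = [[Y[i][j] + dg * a[i] * a[j] for j in range(2)] for i in range(2)]
Z_direct = inv2(Yp)
err = max(abs(Z_upd[i][j] - Z_direct[i][j]) for i in range(2) for j in range(2))
```

The denominator 1 + Δg·aᵀZa is the "local self-coupling" scalar; when it approaches zero the update degenerates, which corresponds to the open/short limiting cases the abstract discusses.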

Article
Engineering
Civil Engineering

Mariusz Pecio

Abstract: Building law allows the use of a building that is non-compliant with fire safety regulations, provided that enhanced fire exit strategies are implemented to mitigate the negative impact of this non-compliance on fire safety. This article demonstrates the potential of using a probabilistic fire risk analysis method—multisimulation—to increase the efficiency of selecting fire exit strategies. Multisimulation is a quantitative risk analysis method that utilizes, among other things, computer models of fire development and evacuation, as well as modern mathematics and computer science. The main aim of multisimulation is to perform multiple computer simulations (hence the name) for as many fire scenarios as possible in a given building. This article demonstrates the potential of using this method in a practical approach to ensuring fire safety. For this purpose, an existing auditorium building was analyzed, in which numerous non-compliances with applicable regulations were identified. The analysis included 1000 fire and evacuation simulations in a theater auditorium equipped with two emergency exits and 1000 fire and evacuation simulations in a theater auditorium equipped with three emergency exits. In the simulations of both scenarios, the duration of a performance conducted with a full audience and people performing on stage was modelled. The results clearly demonstrated a significant improvement in safety when three emergency exits were available. In terms of both the required safe egress time (RSET) and risk analyses, when three emergency exits were available (instead of the required two), the possibility of having only one functioning exit, which may occur due to a human error, was eliminated. Therefore, it was undoubtedly confirmed that the use of a third emergency exit is justified as an optimal fire exit strategy or a future legislative requirement.
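
The acceptance test behind each simulated scenario is the comparison of available vs. required safe egress time (ASET vs. RSET). A Monte Carlo sketch with hypothetical distributions and an exit-loss penalty (placeholders, not the article's fire and evacuation models):

```python
import random

def failure_probability(n_runs, exit_blocked_prob, seed=1):
    """Estimate P(RSET > ASET) over random fire/evacuation scenarios.

    ASET/RSET distributions and the exit-blocking penalty are
    illustrative assumptions only.
    """
    rng = random.Random(seed)
    failures = 0
    for _ in range(n_runs):
        aset = rng.gauss(300.0, 40.0)         # s until conditions become untenable
        rset = rng.gauss(180.0, 25.0)         # s needed to evacuate
        if rng.random() < exit_blocked_prob:  # losing one exit inflates egress time
            rset *= 1.8
        failures += rset > aset
    return failures / n_runs

# A third exit reduces the chance that losing one exit leaves a single usable route
p_two_exits = failure_probability(10_000, exit_blocked_prob=0.10)
p_three_exits = failure_probability(10_000, exit_blocked_prob=0.02)
```

The multisimulation approach in the article does this at full fidelity, replacing the sampled scalars with CFD fire and agent-based evacuation runs per scenario.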

Article
Engineering
Civil Engineering

Sebastian Schilling, Christian Clemen

Abstract: The integration of building information modeling (BIM) and geographic information systems (GIS) is an important area of research aimed at improving interoperability between these domains. The two domains often use different concepts for semantics, so that non-interoperable vocabularies, schemas, metamodels and, in general, non-interoperable IT architectures are used to publish semantic concepts. This study investigates the use of BIM data dictionaries for the semantic classification of vector-based geospatial data in GIS, aiming to enable the use of common dictionaries and concepts to describe objects in both domains. The study addresses a particular problem: the fact that the domains use different metaconcepts to describe conceptual information and have different classification methods. The research focuses on identifying significant standards, comparing their metamodels to find similarities, and exploring the practical use of BIM data dictionaries for the semantic enrichment of GIS features. As a proof of concept, three approaches for the classification of features are developed and validated through implementation in the QGIS software. The results demonstrate that BIM data dictionaries can be used to semantically enrich geospatial data in GIS, with the buildingSMART Data Dictionary (bSDD) serving as a practical example. The conclusions drawn from the study are that, although there are limitations and challenges, the integration of BIM data dictionaries into GIS is possible and beneficial for improving interoperability, particularly when cross-domain concepts are employed.

Article
Engineering
Bioengineering

Yutaka Yoshida, Kiyoko Yokoyama

Abstract: Sample-wise detection of P-, R-, and T-peaks in electrocardiograms (ECGs) is challenging because each peak type is sparsely represented (≈1:500 samples in a typical 10-s, 500-Hz ECG at 60 bpm), such that even a small number of false-positives (FPs) can markedly degrade positive predictive value (PPV) and limit the practicality of classifier-only approaches. This study proposes a lightweight ECG peak detection framework that combines binary classifiers with physiological temporal constraints (PTC) to address extreme sample-level class imbalance. Local morphological features are first evaluated using lightweight machine-learning models, among which XGBoost (XGB) exhibited the most stable score-ranking performance. Rather than directly thresholding classifier outputs, prediction scores are interpreted within the framework, which encodes physiological timing relationships. R-peaks are detected using score ranking combined with a refractory-period constraint, and the detected R-peaks serve as temporal landmarks for subsequent P- and T-peak detection within physiologically plausible time windows reflecting the P–QRS–T sequence. Quantitative evaluation was conducted using the Lobachevsky University Electrocardiography Database, hereafter referred to as LUDB. With a temporal tolerance of ±20 ms, the XGB-based system achieved an F1-score of 0.87 for R-peak detection (sensitivity 0.96, PPV 0.79), corresponding to approximately 9–10 true R-peaks with only 2–3 FP samples per 10-s segment. For P- and T-peaks, F1-scores of 0.70 and 0.69 were obtained, respectively. Additional evaluation on arrhythmic LUDB records demonstrated robust R-peak detection across rhythm types. In AF-related rhythms, where organized P waves are physiologically absent, the framework appropriately suppressed P-peak detections, with false-positive rates remaining below 0.31%. 
Qualitative application to ECG recordings from the PTB-XL database further demonstrated physiologically consistent behavior. These results indicate that reliable and interpretable ECG peak detection under extreme class imbalance can be achieved by integrating lightweight classifiers within the proposed framework, without reliance on complex deep learning architectures.
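
The score-ranking-plus-refractory step can be sketched as a greedy pick of high-score samples that suppresses any candidate inside the refractory window of an already accepted peak. The threshold and window values below are illustrative, not the paper's tuned parameters:

```python
def detect_r_peaks(scores, fs, refractory_s=0.2, min_score=0.5):
    """Greedy R-peak selection from per-sample classifier scores.

    Candidates are visited in descending score order; a candidate is kept
    only if it lies outside the refractory window of every accepted peak.
    """
    refractory = int(refractory_s * fs)
    accepted = []
    for i in sorted(range(len(scores)), key=scores.__getitem__, reverse=True):
        if scores[i] < min_score:
            break
        if all(abs(i - p) > refractory for p in accepted):
            accepted.append(i)
    return sorted(accepted)

# 2-s toy trace at fs = 500 Hz: true peaks at samples 100 and 600, plus a
# near-duplicate false positive at sample 110 that the refractory rule rejects
scores = [0.0] * 1000
scores[100], scores[110], scores[600] = 0.95, 0.90, 0.92
print(detect_r_peaks(scores, fs=500))  # → [100, 600]
```

Accepted R-peaks then serve as landmarks: P- and T-candidates are searched only inside physiologically plausible windows before and after each R-peak.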

Article
Engineering
Marine Engineering

Teresa Abramowicz-Gerigk

Abstract: The paper presents an analysis of the risk of failure of port structures in a modern seaport due to vessel impacts. The analysis addresses potential damage related to port maneuvers of self-maneuvering vessels and possible risk reduction options that can be applied to enhance port resilience. The proposed system model—including ship, port infrastructure and environment—enabled the observation of both implemented and anticipated future risk reduction measures. The analysis was carried out using the ferry terminal in the large Polish Port of Gdynia as a case study. A Bayesian influence diagram—including decisions related to the implementation of risk reduction options—was used to determine the total risk associated with ro-pax ferry port calls. Sustainable risk management led to the implementation of a cloud-based monitoring system and, subsequently, to the design of a new terminal in line with the green port concept. A comparative risk assessment for the two locations demonstrated improved safety and reduced environmental pollution in the new Public Ferry Terminal, primarily due to reduced spatial risk and the implementation of cold-ironing technology in the new terminal. The potential future implementation of an automated mooring system was also discussed.

Article
Engineering
Marine Engineering

Hyunju Lee, Hyerim Bae

Abstract: This study presents a large-scale empirical comparison of operational efficiency metrics derived from the IMO Data Collection System (DCS) and the EU Monitoring, Reporting and Verification (MRV) framework. Using a matched dataset of 15,755 dual-reported vessels and over 50,000 ship-year observations from 2019 to 2024, paired non-parametric tests, effect size estimation, and agreement diagnostics were applied to assess consistency across monitoring systems. Results indicate that although statistically significant differences are detected (p < 0.001), practical differences are negligible (Cohen’s d < 0.025), with MRV-based values averaging approximately 1.4% lower Annual Efficiency Ratio (AER) and fuel intensity than DCS values. Distributional analysis confirms substantial overlap between datasets, and temporal trends show progressive convergence following the implementation of the Carbon Intensity Indicator (CII) regulation. However, pronounced vessel-type heterogeneity is observed. Flexible cargo vessels exhibit consistent efficiency improvements in EU-related voyages, whereas container ships show minimal variation and LNG carriers demonstrate indicator-dependent patterns. Overall, the findings indicate that DCS and MRV provide broadly comparable representations of operational efficiency, with observed differences primarily reflecting vessel-type-specific operational characteristics rather than structural inconsistencies in reporting systems. The study contributes a scalable statistical validation framework for cross-regulatory monitoring assessment.
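
The effect-size criterion (paired Cohen's d on dual-reported values) can be sketched as follows; for paired data one common convention divides the mean difference by the standard deviation of the differences. The AER values below are illustrative, not drawn from the dataset:

```python
import statistics

def paired_cohens_d(x, y):
    """Cohen's d for paired observations: mean(diff) / stdev(diff)."""
    diffs = [a - b for a, b in zip(x, y)]
    return statistics.mean(diffs) / statistics.stdev(diffs)

# Illustrative AER values for five ships as reported to the two systems
dcs = [4.10, 5.30, 3.95, 6.20, 4.80]
mrv = [4.05, 5.36, 3.93, 6.23, 4.76]
d = paired_cohens_d(dcs, mrv)   # near-zero mean difference vs. its spread -> small d
```

With large samples such as the 50,000 ship-year observations here, even a negligible d can be statistically significant, which is exactly the p < 0.001 vs. d < 0.025 pattern the abstract reports.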

Article
Engineering
Chemical Engineering

Kuan-Hsun Huang, Chin-Chung Tseng, Chia-Chun Lee, Cheng-Xue Yu, Lung-Ming Fu

Abstract: Chronic kidney disease (CKD) is a progressively worsening condition that erodes renal function over time, reduces quality of life, and can ultimately culminate in kidney failure with far-reaching systemic complications. In addition to reduced filtration, worsening kidney function disrupts mineral homeostasis and leads to CKD–mineral and bone disorder (CKD-MBD). Dysregulated calcium handling and maladaptive endocrine responses contribute to bone pathology and increase cardiovascular calcification risk; therefore, serial calcium monitoring remains clinically relevant for longitudinal CKD management. Conventional calcium measurements are typically obtained with centralized analyzers or laboratory assays (e.g., colorimetry and electrode/optical readouts). Despite high accuracy, the required instrumentation, controlled operating conditions, and pretreatment steps complicate rapid point-of-care deployment, especially when only microliter-scale biofluids are available. Accordingly, this study develops a finger-actuated microfluidic colorimetric platform capable of determining calcium ion concentrations in human biofluids, such as whole blood, serum, and urine. The platform integrates a three-dimensional PMMA/paper microchip with a compact reader that maintains stable temperature control while enabling CMOS-based optical detection. With just 6 μL of sample, a brief finger press propels the biofluid across an internal filtration layer, generating serum or cleaned urine that subsequently reacts with a pre-deposited murexide reagent. Under optimized conditions (1.6% reagent, 50°C, 3 min), the signal follows a strong logarithmic relationship with calcium concentration (Y = 47.273 ln X + 28.890; R² = 0.9905), supporting quantification over 1–40 mg/dL and a detection limit of 0.2 mg/dL. Across 80 clinical CKD specimens spanning serum, whole blood, and urine, results aligned closely with the NM-BAPTA reference assay, with R² values exceeding 0.97.
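
Given the reported calibration Y = 47.273 ln X + 28.890, an unknown calcium concentration is recovered by inverting the fit; a minimal round-trip sketch using the coefficients from the abstract:

```python
import math

A, B = 47.273, 28.890   # calibration Y = A * ln(X) + B, from the abstract

def signal_from_conc(x_mg_dl):
    """Forward calibration: colorimetric signal for a concentration in mg/dL."""
    return A * math.log(x_mg_dl) + B

def conc_from_signal(y):
    """Inverted calibration: X = exp((Y - B) / A), in mg/dL."""
    return math.exp((y - B) / A)

# Round trip at 10 mg/dL, inside the 1-40 mg/dL quantification range
y = signal_from_conc(10.0)
x = conc_from_signal(y)
```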

Article
Engineering
Automotive Engineering

Volodymyr Shramenko, Bernd Lüdemann-Ravit

Abstract: Vibrations of thin sheet-metal parts during robotic manipulation on a production line create a number of serious challenges for production process planning. Modeling the behavior of an elastic plate or shell as a function of the robot manipulator trajectory is typically performed using the finite element method (FEM) and requires significant computational effort. The time factor remains a key limitation for integrating operations involving flexible parts into the virtual commissioning process. In this work, a methodology is proposed that enables accurate real-time reproduction of the behavior of an elastic part during linear robotic manipulation. The approach is based on modeling the response of an elastic part to a prescribed base excitation using the FEM and on the development of a reduced model compliant with the FMI/FMU standard. This reduced model computes, in real time, the convolution of the precomputed base response with the acceleration profile corresponding to the robot TCP trajectory. This makes it possible to determine the total cycle duration, which consists of the part transfer time and the time required for vibration decay at the end of the trajectory down to an acceptable threshold, as well as to perform collision checking while accounting for the deformation of the flexible part. As a result, operations involving elastic parts can be integrated into the virtual commissioning process.
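
The reduced model's core operation, convolving a precomputed base (impulse) response with the TCP acceleration profile, can be sketched as a discrete convolution. The impulse response and acceleration pulse below are hypothetical samples, not FEM output:

```python
import math

def tip_deflection(h, accel, dt):
    """Discrete convolution y[i] = dt * sum_k h[k] * accel[i-k].

    h: sampled deflection response to a unit base-acceleration impulse;
    accel: base (robot TCP) acceleration profile; dt: shared sample interval.
    """
    return [dt * sum(h[k] * accel[i - k] for k in range(min(i + 1, len(h))))
            for i in range(len(accel))]

# Hypothetical damped-oscillation impulse response and a short acceleration pulse
dt = 0.01
h = [math.exp(-2.0 * n * dt) * math.sin(20.0 * n * dt) for n in range(200)]
accel = [1.0 if n < 10 else 0.0 for n in range(300)]
y = tip_deflection(h, accel, dt)
```

Because the convolution touches only precomputed samples, it runs in real time inside an FMU, allowing decay time and collision envelopes to be evaluated during virtual commissioning.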

Article
Engineering
Bioengineering

Isabella C. S. Nascimento, Andressa M. Souza, Andrea P. Parente, Edna M. M. Oliveira, Andrea Valdman, Rossana O. M. Folly, Andrea M. Salgado

Abstract: A quartz crystal microbalance-based biosensor for the specific detection of the first transgenic common bean (Phaseolus vulgaris L.) cultivar (BRS FC401 RMD) with resistance to bean golden mosaic virus (BGMV) was developed. The immobilization chemistry relies on the strong bond between the thiolated probe and the gold electrode surface. The probe sequence is internal to a region of the BGMV rep gene that was introduced into the common bean genome. The sensor's analytical performance was determined using synthetic oligonucleotides. Real samples of transgenic and wild-type bean seeds were also tested. Sample pretreatment consisted only of enzymatic fragmentation, followed by a thermal denaturation step combined with blocking oligonucleotides. Different biosensor regeneration approaches were studied. Immobilization showed good reproducibility (CV of 5.8%). The biosensor proved specific for both synthetic oligonucleotides and non-amplified genomic DNA. A linear detection range of 0–1.4 ng/µL was observed, with a detection limit of 0.18 ng/µL. Three sequential detections were performed without loss of surface activity. The results demonstrate the biosensor's potential for direct, real-time, label-free detection of DNA samples for field screening of transgenic common bean cultivars.

Review
Engineering
Mechanical Engineering

M. Amir Siddiq

Abstract: Physics-based constitutive modelling remains a cornerstone for predicting ductile damage and fracture in metallic materials, particularly where microstructural mechanisms govern macroscopic response. Over the past two decades, a wide range of crystal plasticity, porous plasticity, and void-based fracture models have been proposed to capture deformation localisation, void growth, and coalescence under complex loading paths. However, these developments are often presented in isolation, obscuring their shared physical assumptions and limiting their transferability across material systems and length scales. This article provides a microstructure-sensitive perspective on constitutive modelling of ductile damage and fracture, with particular emphasis on crystal plasticity-based frameworks, void growth and coalescence mechanisms, and interface-driven fracture. Rather than attempting an exhaustive review, this review highlights unifying concepts, modelling trade-offs, and recurring challenges related to parameter identifiability, scale bridging, and predictive robustness. It further clarifies how physics-based constitutive descriptions can be systematically integrated into modern fatigue and fracture assessments and situates these developments relative to emerging data-assisted and machine-learning-enhanced modelling strategies. By reframing established constitutive models within a coherent physical narrative, this perspective aims to support more transparent model selection, improve interpretability, and guide future developments in multiscale damage and fracture modelling of metallic materials.

Article
Engineering
Safety, Risk, Reliability and Quality

Veselina Dimitrova, Ventsislav Dimitrov, Georgi Tonkov, Konstantin Raykov, Sylvester Bozherikov, Rumen Yankov, Gergana Tonkova

Abstract: This paper presents a reliability-oriented analytical framework for the quantitative assessment of fragment-resistant multilayer protective equipment subjected to impulsive fragment loading. The study is motivated by the stochastic nature of fragment generation and impact conditions in industrial and occupational accident scenarios, where deterministic penetration criteria are insufficient to describe protective performance. Fragment interactions are modelled as stochastic spatial events, with impact locations and kinematic characteristics treated as random variables and mapped onto a predefined protected region. System failure is formulated using an energy-based limit-state criterion defined by comparison between the absorbed energy demand induced by fragment impact and a critical admissible energy threshold. The fragment–PPE interaction is described using a reduced-order dynamic formulation with concentrated parameters, capturing the dominant normal deformation response under short-duration impulsive loading. Closed-form analytical expressions are derived that relate fragment mass and velocity to impact impulse and absorbed energy. The resulting formulation establishes a direct link between impulse-driven dynamic response, progressive multilayer engagement, and failure probability under single and repeated impact events. Application of the proposed framework to a representative multilayer protective configuration demonstrates physically consistent reliability trends and confirms its computational efficiency. The framework provides a practical tool for reliability-informed assessment and preliminary design of fragment-resistant multilayer protective equipment.
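
Per impact, the energy-based limit state reduces to comparing the absorbed kinetic-energy demand E = ½mv² against an admissible threshold. A Monte Carlo sketch with hypothetical mass/velocity distributions and absorption fraction (illustrative assumptions, not the paper's parameters):

```python
import random

def kinetic_energy(mass_kg, velocity_ms):
    """Fragment kinetic energy E = 0.5 * m * v^2, in joules."""
    return 0.5 * mass_kg * velocity_ms ** 2

def failure_probability(e_crit, absorb_frac=0.8, n=20_000, seed=7):
    """P(absorbed energy > admissible threshold) over random fragments.

    Mass and velocity distributions and the absorbed-energy fraction are
    illustrative placeholders.
    """
    rng = random.Random(seed)
    fails = 0
    for _ in range(n):
        m = rng.uniform(0.001, 0.02)   # fragment mass, kg
        v = rng.gauss(400.0, 80.0)     # impact velocity, m/s
        fails += absorb_frac * kinetic_energy(m, v) > e_crit
    return fails / n

p = failure_probability(e_crit=600.0)
```

Raising the admissible energy threshold (e.g. thicker or additional layers) monotonically lowers the estimated failure probability, matching the trend the framework is meant to quantify.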

Article
Engineering
Electrical and Electronic Engineering

Yuzhou Ma, Haolong Qian, Wei Li

Abstract: The digital preservation of batik, a world intangible cultural heritage, is hindered by the difficulty in performing accurate semantic segmentation on its complex patterns with limited annotated samples. To address this few-shot learning challenge, we constructed a few-shot batik pattern dataset, and proposed a novel network architecture centered on attention weighting and hierarchical decoding. Our method leverages a pre-trained ResNet101 backbone for transfer learning to establish a strong feature foundation. It incorporates a dual-attention module that combines spatial and channel attention to dynamically highlight semantically rich regions and intricate texture boundaries specific to batik. For multi-scale context aggregation, a lightweight module utilizing parallel dilated convolutions is introduced to efficiently capture features from varying receptive fields. Finally, a hierarchical decoder progressively integrates these enhanced, multi-scale features with high-resolution shallow features to reconstruct precise segmentation maps. Comprehensive evaluations on a dedicated batik dataset show that our model achieves state-of-the-art performance, with a mean Intersection over Union (mIoU) of 79.22% and a Pixel Accuracy (PA) of 92.47%. It notably improves over the strong DeepLabV3+ baseline by 3.3% in mIoU and 0.95% in PA, demonstrating its effectiveness for the task of batik pattern segmentation under data-scarce conditions.
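
Both reported metrics follow directly from a class confusion matrix; a minimal sketch on a toy two-class matrix (not the batik results):

```python
def miou_and_pixel_accuracy(conf):
    """conf[i][j] = number of pixels of true class i predicted as class j."""
    n = len(conf)
    total = sum(sum(row) for row in conf)
    pa = sum(conf[i][i] for i in range(n)) / total   # Pixel Accuracy
    ious = []
    for c in range(n):
        tp = conf[c][c]
        fp = sum(conf[r][c] for r in range(n)) - tp  # predicted c, true other
        fn = sum(conf[c]) - tp                       # true c, predicted other
        if tp + fp + fn:
            ious.append(tp / (tp + fp + fn))         # IoU = TP / (TP + FP + FN)
    return sum(ious) / len(ious), pa                 # mean IoU, Pixel Accuracy

miou, pa = miou_and_pixel_accuracy([[3, 1], [1, 5]])
```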

Preprints.org is a free preprint server supported by MDPI in Basel, Switzerland.

© 2026 MDPI (Basel, Switzerland) unless otherwise stated