Engineering

Review
Engineering
Other

Zhengyu Shu, Xiangning Yuan, Sien Li

Abstract: Accurate diagnosis of crop water demand is a core challenge in alleviating agricultural water scarcity. Traditional diagnostic methods, which rely mainly on soil moisture sensor monitoring or empirical models based on meteorological data, suffer from limitations such as insufficient spatiotemporal representativeness and an inability to reflect crop physiological status in real time, leading to an annual water waste of 10–30%. Therefore, developing technologies that enable real-time, non-destructive, and precise monitoring of crop water status is crucial. In recent years, the rapid advancement of high-throughput phenotyping technology has provided revolutionary tools to address this challenge. By integrating multi-source sensors (e.g., thermal infrared and hyperspectral imaging), multi-dimensional response characteristics of crops under water stress can be rapidly acquired. This paper systematically reviews research progress in using high-throughput phenotyping to obtain water-sensitive phenotypic traits and construct crop water demand diagnosis models. It focuses on: (1) the definition and acquisition techniques of key water-sensitive phenotypic indicators, such as canopy temperature, spectral indices, and chlorophyll fluorescence; (2) the advantages, limitations, and fusion strategies of multi-platform data acquisition systems, including unmanned aerial vehicles (UAVs), ground mobile platforms, and satellite remote sensing; and (3) the construction methods, performance evaluation, and practical application cases of diagnostic models based on machine learning (e.g., Random Forest, XGBoost), deep learning (e.g., CNN, LSTM), and mechanism-coupled models. The innovation of this review lies in its systematic integration of the entire technological chain of "phenotype acquisition → model construction → decision-making", while identifying current research challenges, including field environmental complexity, model generalization capability, data barriers, and interpretability. Future development pathways are proposed, focusing on low-cost sensing, explainable AI, multi-source data fusion, and cloud-edge collaborative decision systems. This review aims to provide a systematic theoretical and practical reference for water management in precision irrigation and smart agriculture.
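
As a concrete illustration of the canopy-temperature indicators this review surveys, the Crop Water Stress Index (CWSI) is the classic formulation; the sketch below is a minimal version with hypothetical wet- and dry-reference temperatures, not values prescribed by the review.

```python
# Minimal sketch of the Crop Water Stress Index (CWSI), a standard
# canopy-temperature indicator of the kind surveyed in this review.
# The T_wet / T_dry reference values here are hypothetical placeholders.

def cwsi(t_canopy: float, t_wet: float, t_dry: float) -> float:
    """CWSI = (Tc - Twet) / (Tdry - Twet); 0 = well watered, 1 = fully stressed."""
    return (t_canopy - t_wet) / (t_dry - t_wet)

print(cwsi(t_canopy=28.4, t_wet=24.0, t_dry=33.0))  # ~0.49 -> moderate stress
```
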
Article
Engineering
Other

Dorothy Onchagwa, Felix Mutua

Abstract: Rapid urbanization in African cities has increased demand for safe and reliable energy infrastructure, with Liquefied Petroleum Gas (LPG) emerging as a leading option for clean cooking. In Nairobi, Kenya, the siting of LPG refill stations is critical to minimizing safety risks and protecting public health. This study applied a GIS-based Multi-Criteria Decision Analysis (MCDA) to identify suitable areas for LPG stations. The analysis integrated land use, elevation, slope, geology, and soil data with regulatory and planning constraints. Two approaches were compared: Boolean analysis, which produced a strict exclusion mask identifying 33 restricted zones, and weighted overlay (with the exclusion mask applied), which yielded 2439 feasible locations across varying levels of suitability. Peri-urban neighborhoods in Embakasi ward consistently emerged as the most favorable. Validation using high-resolution imagery confirmed the contextual appropriateness and regulatory compliance of selected sites, with weighted overlay achieving 87.1% accuracy and Boolean analysis 85.7%. The findings show that weighted overlay combined with an exclusion mask provides a more flexible and comprehensive framework than rigid Boolean methods, balancing safety with regulatory requirements. The study provides evidence-based guidance for expanding LPG infrastructure in rapidly urbanizing cities, supporting strategic urban planning while reducing environmental, social, and safety risks.
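
To make the two siting approaches concrete, the following minimal numpy sketch contrasts a Boolean exclusion mask with a weighted overlay; the criterion layers and weights are invented for illustration and are not the study's data.

```python
import numpy as np

# Illustrative 0-1 suitability rasters for three criteria (values invented).
land_use = np.array([[1.0, 0.6], [0.2, 0.9]])
slope    = np.array([[0.8, 0.4], [0.5, 1.0]])
geology  = np.array([[0.9, 0.7], [0.3, 0.8]])

# Boolean analysis: every criterion must pass a hard threshold.
exclusion_mask = (land_use > 0.5) & (slope > 0.5) & (geology > 0.5)

# Weighted overlay: weighted sum of criteria, with the exclusion mask applied.
weights = {"land_use": 0.5, "slope": 0.3, "geology": 0.2}  # assumed weights
suitability = (weights["land_use"] * land_use
               + weights["slope"] * slope
               + weights["geology"] * geology)
suitability[~exclusion_mask] = 0.0  # restricted zones get zero suitability

print(exclusion_mask)  # strict pass/fail map (Boolean approach)
print(suitability)     # graded suitability map (weighted overlay)
```
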
Article
Engineering
Other

Byron Ricardo Zapata, Jaime Rolando Heredia, Víctor Ruiz-Díez, Jose Luis Sánchez-Rojas

Abstract: This article presents the design, fabrication, and experimental validation of a centimeter-scale autonomous robot that achieves bidirectional locomotion and trajectory control through 3D-printed resonators actuated by piezoelectricity and integrated with miniature legs. Building on previous works that employed piezoelectric bimorphs, the proposed system replaces them with custom-designed 3D-printed resonant plates that exploit the excitation of standing waves (SW) to generate motion. Each resonator is equipped with strategically positioned passive legs that convert vibratory energy into effective thrust, enabling both linear and rotational movement. A differential drive configuration, implemented through two independently actuated resonators, allows precise guidance and the execution of complex trajectories. The robot integrates onboard control electronics consisting of a microcontroller and inertial sensors, which enable closed-loop trajectory correction via a PD controller and allow autonomous navigation. The experimental results demonstrate high-precision motion control, achieving linear displacement speeds of 8.87 mm/s and a maximum angular velocity of 37.88°/s, while maintaining low power consumption and a compact form factor. Furthermore, the evaluation using the mean absolute error (MAE) yielded a value of 0.83° in trajectory tracking. This work advances the field of robotics and automatic control at the insect scale by integrating efficient piezoelectric actuation, additive manufacturing, and embedded sensing into a single autonomous platform capable of agile and programmable locomotion.
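
The closed-loop correction described above amounts, in its simplest form, to a PD law on heading error that unbalances the two resonators; the sketch below is a minimal illustration with assumed gains, scaling, and drive interface, not the authors' firmware.

```python
# Minimal PD heading controller for a differential pair of resonators.
# Gains, the 0.01 steering scale, and the drive-amplitude interface are
# hypothetical, chosen for illustration rather than taken from the paper.

KP, KD = 2.0, 0.4   # assumed proportional and derivative gains

def pd_heading_control(heading_deg, target_deg, prev_error_deg, dt):
    """Return (left, right) drive amplitudes plus the current heading error."""
    error = target_deg - heading_deg
    derivative = (error - prev_error_deg) / dt
    correction = KP * error + KD * derivative
    base = 1.0                          # nominal forward drive amplitude
    left = base - 0.01 * correction     # steer by unbalancing the two resonators
    right = base + 0.01 * correction
    return left, right, error

# One control step: robot heading 5 degrees off target, 10 ms loop period.
print(pd_heading_control(heading_deg=95.0, target_deg=90.0,
                         prev_error_deg=-4.0, dt=0.01))
```
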
Article
Engineering
Other

Tong Wang, Xin Du, Shufa Chen, Qixin SUN, Yue Jiang, Hengjie Dong

Abstract: This study conducts systematic experimental and numerical investigations to address the parameter calibration issue in the discrete element model of seashells, aiming to establish a high-fidelity numerical model that accurately characterizes their macroscopic mechanical behavior, thereby providing a basis for optimizing parameters of seashell crushing equipment. First, intrinsic parameters of seashells were determined through physical experiments: density of 2.2×10³ kg/m³, Poisson's ratio of 0.26, shear modulus of 1.57×10⁸ Pa, and elastic modulus of 6.5×10¹⁰ Pa. Subsequently, contact parameters between seashells and between seashells and 304 stainless steel, including the static friction coefficient, rolling friction coefficient, and coefficient of restitution, were obtained via the inclined plane method and impact tests. The reliability of these contact parameters was validated by the angle of repose test, with a relative error of 5.1% between simulated and measured results. On this basis, using ultimate load as the response indicator, a Plackett-Burman experimental design was employed to identify normal stiffness per unit area and tangential stiffness per unit area as the primary influencing parameters. The Bonding model parameters were then precisely calibrated through the steepest ascent test and a Box-Behnken design, resulting in an optimal parameter set. The error between simulation results and physical experiments was only 3.8%, demonstrating the high reliability and accuracy of the established model and parameter calibration methodology.
Article
Engineering
Other

Zaryab Rahman

Abstract: Current paradigms in Self-Supervised Learning (SSL) achieve state-of-the-art results through complex, heuristic-driven pretext tasks such as contrastive learning or masked image modeling. This work proposes a departure from these heuristics by reframing SSL through the fundamental principle of Minimum Description Length (MDL). We introduce the MDL-Autoencoder (MDL-AE), a framework that learns visual representations by optimizing a VQ-VAE-based objective to find the most efficient, discrete compression of visual data. We conduct a rigorous series of experiments on CIFAR-10, demonstrating that this compression-driven objective successfully learns a rich vocabulary of local visual concepts. However, our investigation uncovers a critical and non-obvious architectural insight: despite learning a visibly superior and higher-fidelity vocabulary of visual concepts, a more powerful tokenizer fails to improve downstream performance, revealing that the nature of the learned representation dictates the optimal downstream architecture. We show that our MDL-AE learns a vocabulary of holistic object parts rather than generic, composable primitives. Consequently, we find that a sophisticated Vision Transformer (ViT) head, a state-of-the-art tool for understanding token relationships, consistently fails to outperform a simple linear probe on the flattened feature map. This architectural mismatch reveals that the most powerful downstream aggregator is not always the most effective. To validate this, we demonstrate that a dedicated self-supervised alignment task, based on Masked Autoencoding of the discrete tokens, resolves this mismatch and dramatically improves performance, bridging the gap between generative fidelity and discriminative utility. Our work provides a compelling end-to-end case study on the importance of co-designing objectives and their downstream architectures, showing that token-specific pre-training is crucial for unlocking the potential of powerful aggregators.
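
The discrete compression underlying the MDL-AE rests on the standard VQ-VAE quantization step; the following minimal sketch shows the nearest-codeword lookup that produces the token vocabulary, with codebook size and dimensionality assumed for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)
codebook = rng.normal(size=(512, 64))  # 512 codewords, 64-dim (assumed sizes)
z_e = rng.normal(size=(16, 64))        # encoder outputs for 16 spatial positions

# Nearest-neighbour quantization: each encoder vector snaps to its closest codeword.
d = ((z_e[:, None, :] - codebook[None, :, :]) ** 2).sum(-1)  # (16, 512) distances
tokens = d.argmin(axis=1)              # discrete token ids -- the "vocabulary"
z_q = codebook[tokens]                 # quantized representation fed downstream

print(tokens[:8])  # the discrete code a downstream head must aggregate
```
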
Article
Engineering
Other

Diego Camino-Treviño, Ricardo I. López-García, Luis F. Luque-Vega, Jorge A. Lizarraga, Marcela E. Mata-Romero, Miriam A. Carlos-Mancilla

Abstract: Understanding resonance and frequency behavior is fundamental in engineering acoustics and in technology-supported music-learning environments. This work presents an Educational Acoustics Audio System (EAAS) designed as a sensor-based hardware–software toolkit that enables experiential learning of acoustic resonance through listening, spectrogram visualization, and analytical modeling. The system integrates a bass-reflex loudspeaker with interchangeable vent configurations, a microphone-based sensing module, automated spatial sampling, and a MATLAB interface for generating logarithmic sweeps, recording responses, and computing high-resolution spectrograms. The instructional design is grounded in the Educational Acoustics Conceptual Framework (EACF), originally proposed by the authors, which structures learning through concrete, graphical, and abstract levels. Learners first explore perceptual changes in low-frequency amplification, then interpret time–frequency patterns using spectrograms, and finally compute Helmholtz-based resonance frequencies based on physical parameters. Experimental measurements collected at multiple microphone distances reveal stable resonance peaks at approximately 546 Hz (full vent) and 265 Hz (half vent), alongside consistent amplitude differences between vent configurations. By integrating auditory perception, sensor-based acquisition, and mathematical modeling in a unified and low-cost system, the EAAS provides an effective technological platform for hands-on exploration of resonance and frequency response. This approach strengthens conceptual understanding in engineering acoustics while supporting its application in related educational contexts.
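
The analytical step of the EACF relies on the textbook Helmholtz resonator formula; the sketch below is a minimal version with hypothetical vent dimensions and an assumed end correction, not the paper's measured geometry.

```python
import math

def helmholtz_frequency(area_m2, volume_m3, length_m, radius_m, c=343.0):
    """f = (c / 2*pi) * sqrt(A / (V * L_eff)), with an assumed end correction."""
    l_eff = length_m + 1.7 * radius_m  # common flanged-end correction (assumption)
    return (c / (2 * math.pi)) * math.sqrt(area_m2 / (volume_m3 * l_eff))

# Hypothetical vent: 2 cm radius port, 5 cm long, on a 10 L enclosure.
r = 0.02
print(round(helmholtz_frequency(math.pi * r**2, 0.010, 0.05, r), 1), "Hz")
```
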
Article
Engineering
Other

Mario Jiménez Benítez, Fabio López Pires, Eustaquio Martínez Jara

Abstract: The expansion of Artificial Intelligence (AI) research has generated a massive and complex scientific ecosystem that requires systematic characterization, yet no comprehensive studies have analyzed its applications in engineering. This work conducts one of the most extensive scientometric analyses to date, encompassing 159,139 publications of the specialized literature indexed in the Web of Science (2005–2024). Using data cleaning, citation normalization (NCII), institutional productivity measures, and keyword mining algorithms, the study maps the global evolution of AI research. Results reveal the dominance of the Engineering and Computer Science disciplines, with China and the United States leading scientific output. High-impact open-access journals, such as IEEE Access, serve as the main dissemination channels. Emerging topics such as ChatGPT, Big Data, the Internet of Things (IoT), and Digital Twins define the current research frontiers. The study provides a macroscopic, evidence-based framework for understanding the dynamics of AI research for engineering problems and identifies future directions such as sentiment-based analytics, predictive modeling, and the evaluation of Large Language Models (LLMs) in scientific production. Overall, the main findings highlight AI's growing role as a multidisciplinary driver of innovation across global research ecosystems.
Article
Engineering
Other

Yashpreet Malhotra

Abstract: The exponential growth of astronomical time-series data from missions such as Kepler, TESS, and LSST has created an urgent need for statistical frameworks capable of providing both scalability and interpretability. Gaussian Processes (GPs) have emerged as a powerful tool for probabilistic modeling due to their ability to capture correlated structures and quantify uncertainty. However, their computational complexity, which scales cubically with dataset size, has limited their applicability to large-scale astronomical datasets. This paper introduces a novel Gaussian Process framework, termed celerite, which achieves exact and efficient inference for one-dimensional time-series data. The proposed method exploits the semiseparable structure of covariance matrices derived from mixtures of exponential kernels, reducing computational complexity from O(N³) to O(N). Unlike traditional sparse or approximate GP methods, celerite maintains full model fidelity while enabling rapid processing of datasets containing millions of observations. Experimental evaluations on both simulated and real-world stellar light curves demonstrate that the proposed model accurately captures quasi-periodic and oscillatory variability with minimal loss of precision. The framework's physical interpretability, numerical stability, and linear scalability make it highly suitable for modern astronomical pipelines and time-domain analyses. Beyond astrophysics, the principles of the celerite approach hold promise for other domains requiring fast, interpretable, and probabilistic time-series modeling.
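
In code, the method described is exposed today through the celerite2 package; below is a minimal usage sketch on synthetic data, assuming celerite2's documented SHOTerm and GaussianProcess interface.

```python
import numpy as np
import celerite2
from celerite2 import terms

# Synthetic, irregularly sampled light curve (illustrative only).
rng = np.random.default_rng(42)
t = np.sort(rng.uniform(0, 10, 500))
y = np.sin(2 * np.pi * t / 3.0) + 0.1 * rng.normal(size=t.size)
yerr = 0.1 * np.ones_like(t)

# A stochastically driven harmonic oscillator term: an exponential-mixture
# kernel whose covariance matrix is semiseparable, giving O(N) inference.
kernel = terms.SHOTerm(sigma=1.0, rho=3.0, Q=5.0)
gp = celerite2.GaussianProcess(kernel, mean=0.0)
gp.compute(t, yerr=yerr)        # O(N) factorization of the covariance
print(gp.log_likelihood(y))     # exact GP likelihood in linear time
```
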
Article
Engineering
Other

Yashpreet Malhotra

Abstract: Ensuring adequate statistical power is paramount in longitudinal clinical trials evaluating pharmaceutical interventions. Underpowered studies can lead to unreliable conclusions regarding drug efficacy. This paper introduces a computational framework, implemented as an R package and a user-friendly web application, to facilitate robust sample size and power calculations specifically for longitudinal data arising in pharmacological research. The methodology encompasses various statistical models commonly employed in analyzing repeated measures in treatment versus control settings. Utilizing illustrative examples relevant to pharmaceutical outcomes, such as disease progression in neurodegenerative conditions and changes in physiological markers under drug administration, we demonstrate the utility of this software in optimizing study design parameters. Furthermore, the application allows researchers to incorporate pilot data, potentially derived from large-scale initiatives like the Alzheimer’s Disease Neuroimaging Initiative (ADNI), to enhance the precision of these crucial computations, thereby improving the rigor and ethical conduct of pharmaceutical trials.
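
Although the paper's tool is an R package, the underlying power computation can be approximated by Monte Carlo simulation; the Python sketch below illustrates the idea for a treatment-vs-control difference in per-subject slopes, with effect size and variances assumed purely for illustration.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)

def simulated_power(n_per_arm=60, slope_diff=0.5, sd_slope=1.0,
                    n_sims=2000, alpha=0.05):
    """Monte Carlo power for a treatment-vs-control difference in slopes,
    with each subject summarized by one per-subject slope (assumed normal)."""
    hits = 0
    for _ in range(n_sims):
        ctrl = rng.normal(0.0, sd_slope, n_per_arm)        # control-arm slopes
        trt = rng.normal(slope_diff, sd_slope, n_per_arm)  # treated-arm slopes
        _, p = stats.ttest_ind(trt, ctrl)
        hits += p < alpha
    return hits / n_sims

print(simulated_power())  # fraction of simulated trials detecting the effect
```
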
Concept Paper
Engineering
Other

Sai Praneeth Reddy Dhadi, Amulya Biradar, Manikanth Reddy Maram, Sandeep Gundu, Dhatri Mididuddi, Shreya Burra

Abstract: Artificial General Intelligence (AGI) has remained largely theoretical due to vague definitions, non-measurable criteria, and architectures that cannot be implemented in practice. Existing interpretations of AGI, from cognitive theories to universal intelligence models, provide valuable insights but do not offer a concrete pathway for building or evaluating an actual general intelligence system. This paper introduces a new, measurable, and operational definition of AGI that emphasizes autonomous knowledge acquisition, reasoning across diverse and clearly defined domains, cross-domain transfer, adaptive self-improvement, and alignment with human goals. To support this definition, we propose a modular cognitive framework designed specifically for practical implementation. A working prototype is developed to demonstrate the feasibility of this approach. The system is capable of learning new knowledge, storing it in an adaptive memory, applying multi-step reasoning, transferring understanding across unrelated domains, and improving its performance through user feedback. Built using currently available technologies such as the Gemini API and structured memory mechanisms, the prototype shows that AGI can be demonstrated meaningfully even with today's tools. The paper also presents a standardized evaluation suite that measures generalization, transfer, reasoning accuracy, learning efficiency, memory retention, adaptability, and alignment stability. Together, the definition, architecture, and prototype form a complete foundation for practical AGI research and represent a significant step toward realizing general-purpose intelligence.
Article
Engineering
Other

Szymon Dobrowolski, Waldemar Bauer

Abstract: Efficient access to similar legal cases is a crucial requirement for lawyers, judges, and researchers. Traditional text-based search systems often fail to capture both the semantic similarity and the relational context of legal documents. To address this challenge, we present LEGRA, a novel graph-based dataset of Polish court rulings designed for Retrieval-Augmented Generation (RAG) and legal research support. LEGRA is automatically constructed through an end-to-end pipeline: rulings are collected from public sources, converted and cleaned, chunked into passages, and enriched with TF-IDF vectors and embedding representations. The data is stored in a Neo4j graph database where documents, chunks, embeddings, judges, courts, and cited laws are modeled as nodes connected through explicit relations. This structure enables hybrid retrieval that combines semantic similarity with structural queries, allowing legal professionals to quickly identify not only textually related cases but also those linked through judges, locations, or legal references. We discuss the construction pipeline, the graph schema, and potential applications for legal practitioners. LEGRA demonstrates how graph-based datasets can open new directions for AI-powered legal research.
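
The chunk-and-enrich stage of the pipeline can be illustrated with scikit-learn; the minimal sketch below computes TF-IDF vectors over invented ruling passages standing in for the cleaned, chunked corpus.

```python
from sklearn.feature_extraction.text import TfidfVectorizer

# Illustrative ruling passages standing in for the cleaned, chunked corpus.
chunks = [
    "The court finds the contract void under Article 58.",
    "The appellate court upheld the ruling of the district court.",
    "Damages were awarded pursuant to Article 415 of the Civil Code.",
]

vectorizer = TfidfVectorizer(lowercase=True, ngram_range=(1, 2))
tfidf = vectorizer.fit_transform(chunks)  # sparse (n_chunks, n_terms) matrix

# Each row would be stored on a chunk node in Neo4j alongside its embedding,
# enabling hybrid lexical + semantic + structural retrieval.
print(tfidf.shape)
```
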
Article
Engineering
Other

Jainagesh Akkaraju Sekhar

Abstract: Self-organization is common in complex systems, especially those in a metastable state at critical tipping points. This article examines the concept of energy storage and sub-region formation during self-organization pathways within a specific region of interest known as the control volume. Spontaneous self-organization typically occurs in response to events that trigger a need to respond to an imbalance in the rate of entropy production within the control volume. All entropy-producing processes generate anti-work, enabling the creation of storable power and new sub-boundary patterns. A new concept, called anti-work (with its related anti-power), is introduced to explain the stability of sub-regions. Examples of triggers include exceeding a local warming rate, earthquakes, uniquely shaped bird flight formations aimed at energy efficiency, and supercooling, the primary example studied in this article to illustrate self-organization. The main conclusion is that the product of temperature and the entropy density production rate remains constant during an entropy-producing process. Self-organization leads to a new order and new sub-boundaries. The resilience of this new order is examined in the context of energy partitioning and sub-boundary formation rates. It is observed that self-organization often follows sigmoidal (S-curve) patterns when selecting process pathways. This feature enables the maximization of the entropy generation rate and the corresponding availability anti-power. This seemingly universal mechanism for pathway selection in entropy-producing transformations is closely related to the main findings of this article, namely the maximization of the entropy generation rate coupled with energy storage, which helps establish connections across discrete events.
Article
Engineering
Other

Alejandro González Barberá, Raheem Nabi, Aina Macias, Guillem Monrós-Andreu, Sergio Chiva

Abstract: Airborne particulate emissions originating from bulk-material handling operations constitute an increasingly critical environmental and public health issue in port-industrial areas located near residential areas. This study introduces a novel hybrid framework integrating high-fidelity Computational Fluid Dynamics (CFD) with surrogate Machine Learning (ML) techniques for the rapid assessment of particle dispersion in port-industrial environments. Focusing on the Port of El Grao in Castellón de la Plana (Spain), the study employs detailed geometric reconstructions derived from LiDAR data and cadastral maps to build an accurate three-dimensional digital model of the area. The turbulent atmospheric boundary layer and particle dispersion dynamics were simulated using two different OpenFOAM solvers within a circular computational domain designed to reproduce realistic wind conditions. The ML surrogate model, based on a decoder-style Multilayer Perceptron (MLP) architecture, processes two-dimensional slices of dispersion fields across particle diameter classes, enabling predictions in milliseconds with an acceleration factor of approximately 8×10⁶ over traditional CFD while preserving high fidelity, as validated by error metrics such as the F1 score and Precision, with values exceeding 0.8 and 0.76, respectively. This approach not only addresses computational inefficiencies but also lays the groundwork for real-time air quality monitoring and sustainable urban planning, potentially integrating with digital twins fed by live weather data.
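
A decoder-style MLP of the kind described maps a small vector of flow conditions to a full two-dimensional dispersion slice; the PyTorch sketch below is a minimal stand-in with assumed input features and grid size, not the authors' exact architecture.

```python
import torch
from torch import nn

class DispersionDecoderMLP(nn.Module):
    """Maps (wind speed, wind direction, particle diameter, ...) to a 2-D slice."""
    def __init__(self, n_inputs=4, grid=(64, 64)):
        super().__init__()
        self.grid = grid
        self.net = nn.Sequential(
            nn.Linear(n_inputs, 256), nn.ReLU(),
            nn.Linear(256, 1024), nn.ReLU(),
            nn.Linear(1024, grid[0] * grid[1]),   # one output per grid cell
        )

    def forward(self, x):
        return self.net(x).view(-1, *self.grid)  # reshape to the 2-D field

model = DispersionDecoderMLP()
conditions = torch.randn(8, 4)  # 8 hypothetical wind/diameter scenarios
print(model(conditions).shape)  # torch.Size([8, 64, 64]) -- ms-scale inference
```
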
Article
Engineering
Other

Khabbab Zakaria

Abstract: Optimal Order Execution is a well-established problem in finance that concerns executing a trade (buy or sell) of a given volume as favorably as possible within a specified time frame. This problem revolves around optimizing returns while minimizing risk, yet recent research predominantly focuses on only one aspect of this challenge. In this paper, we introduce an innovative approach to Optimal Order Execution within the US market, leveraging Deep Reinforcement Learning (DRL) to address this optimization problem holistically. Our study assesses the performance of our model in comparison to two widely employed execution strategies: Volume Weighted Average Price (VWAP) and Time Weighted Average Price (TWAP). Our experimental findings clearly demonstrate that our DRL-based approach outperforms both VWAP and TWAP in terms of return on investment and risk management. The model's ability to adapt dynamically to market conditions, even during periods of market stress, underscores its promise as a robust solution.
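
The two benchmarks against which the DRL agent is evaluated are simple averages; the sketch below computes both from invented intraday trade data.

```python
import numpy as np

# Illustrative intraday prices and traded volumes over an execution window.
prices  = np.array([100.0, 100.4, 99.8, 100.2, 100.6])
volumes = np.array([500, 300, 700, 400, 600])

vwap = (prices * volumes).sum() / volumes.sum()  # volume-weighted average price
twap = prices.mean()                             # time-weighted average price

print(f"VWAP benchmark: {vwap:.3f}")  # execution quality is measured against these
print(f"TWAP benchmark: {twap:.3f}")
```
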
Article
Engineering
Other

Nadia Ramona Cretescu, Mircea Neagoe

Abstract: Clearances in robot kinematic joints, which are functionally required, can have a significant influence on dynamic behavior and, implicitly, on optimal design. Consequently, this paper aims to highlight the influence of clearances in the upper and lower spherical joints of a Delta parallel robot on trajectory accuracy and motor torques, under dynamic conditions with high speeds and accelerations. The employed method involves CAD modeling of the Delta robot's mechanical system in CATIA software and simulating its dynamic behavior in ADAMS software, using a representative trajectory where the end-effector's speed and acceleration reach the robot's maximum admissible values. Considering different clearance values and various scenarios (clearances only in the upper spherical joints, only in the lower ones, and cumulatively), the simulation results highlighted a major kinematic and dynamic influence of the clearances in the upper spherical joints (with errors up to 0.05% for speeds, 0.06% for accelerations, and up to 0.19% for torques) and a relatively insignificant influence of those in the lower joints (with errors up to 0.00018% for speeds, 0.00041% for accelerations, and up to 0.172% for torques). The cumulative effect of these clearances was also analyzed, concluding that only a slight non-linear coupling effect was identified.
Article
Engineering
Other

Alongkorn Pirayawaraporn, Nachaya Chindakham, Pokkrong Vongkoon, Chaowanan Jamroen

Abstract: After arecanuts (or betel nuts) are harvested and dried, processing involves cracking the shell and separating the kernel seed from the shell. In this process, traditional machines require the full nuts to be separated by hand or by additional machines. For this reason, this study developed a low-cost machine that husks and separates full nuts, broken nuts, and shells simultaneously. In addition, this study also aimed to decrease the number of nuts broken during husking by reducing the power of the husking motor. Dry betel nuts are fed through the hopper into the spaces between the concave sieves and ATV (All-Terrain Vehicle) wheels. The shells are peeled by the shearing force between the tires' surfaces and the sieves before the nuts pass through the separating hopper, which sorts out full nuts, broken nuts, and shells. In tests with 500 betel nuts, the machine simultaneously husked and separated 80.2% full nuts, 13.8% unhusked nuts, and 6% broken nuts at the optimum setting (20 psi tire pressure, a 20 mm gap between the tires' surfaces and the sieves, 442 rpm for the husking tires, and 337 rpm for the separating system).
Review
Engineering
Other

Lubov Petrichenko, Anna Mutule, Sergejs Hlusovs, Reinis Zarins, Pavels Novosads, Illia Diahovchenko

Abstract: This study proposes a comprehensive key performance indicator–multi-criteria decision analysis framework to assess tools supporting renewable energy communities across six critical dimensions. Twenty-three specific sub-criteria were defined and scored for each tool, and a weighted sum model was applied to aggregate performance. To ensure robust comparison, criteria weights were derived using both expert judgment (ranking and analytic hierarchy process pairwise comparisons) and objective data-driven methods (entropy-based weighting and the criteria importance through intercriteria correlation (CRITIC) weighting). The framework was applied to a diverse sample of contemporary renewable energy community tools, including open-source, commercial, and European Union project tools. Key findings indicate that several tools showed noticeable rank shifts between expert-weighted and data-weighted evaluations, reflecting that expert opinions emphasize technical and operational features while objective variability elevates environmental and economic criteria. This assessment enables stakeholders to compare energy community tools based on structured criteria, offering practical guidance for tool selection and highlighting areas for future improvement.
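
The data-driven leg of the weighting scheme can be made concrete; the sketch below derives entropy-based weights and a weighted-sum-model ranking from an invented tool-by-criterion score matrix.

```python
import numpy as np

# Invented scores: 4 tools x 3 criteria (rows = tools), each scored 0-10.
scores = np.array([[8.0, 5.0, 9.0],
                   [6.0, 7.0, 4.0],
                   [9.0, 6.0, 7.0],
                   [5.0, 9.0, 6.0]])

# Entropy weighting: criteria with more dispersion across tools get more weight.
p = scores / scores.sum(axis=0)                     # normalize each criterion
entropy = -(p * np.log(p)).sum(axis=0) / np.log(len(scores))
weights = (1 - entropy) / (1 - entropy).sum()

wsm = scores @ weights                              # weighted sum model score
print("entropy weights:", weights.round(3))
print("tool ranking:", wsm.argsort()[::-1])         # best tool first
```
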
Article
Engineering
Other

Shaohong Yan, Zikun Tian, Yanbo Zhang, Xulong Yao, Zhigang Tao, Shuai Wang

Abstract: The accurate prediction of the spatiotemporal evolution of rock damage zones is crucial for evaluating the stability of mine rock mass engineering. However, existing data-driven approaches often face challenges in effectively representing spatial heterogeneity and incorporating physical constraints, which can lead to predictions that deviate from mechanical laws. In this study, a physics-constrained spatio-temporal convolutional long short-term memory (STConvLSTM) network is proposed to address these limitations. First, a bidirectional point cloud–voxel adaptive conversion mechanism is designed to dynamically adjust voxel granularity according to point cloud density, thereby enhancing spatial detail preservation and computational efficiency. Second, a composite loss function combining structure-aware loss and physics-based constraints is constructed to ensure the predicted results satisfy mechanical continuity, smoothness, and boundary consistency. The proposed multi-level STConvLSTM integrates 3D convolution, CBAM3D attention, and residual connections to strengthen spatiotemporal feature extraction. Experimental results based on uniaxial compression–acoustic emission data demonstrate that the proposed method achieves an accuracy of 92.6% and an F1-score of 0.947, outperforming ConvLSTM and UNet3D models by 2.3% and 9.7%, respectively. These findings validate the effectiveness of the proposed framework in improving the physical reliability and predictive precision of rock damage evolution modeling.
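
A composite loss of the kind described, a data term plus smoothness and temporal-continuity penalties, can be sketched in PyTorch; the penalty forms and weights below are assumptions for illustration, not the paper's exact formulation.

```python
import torch
import torch.nn.functional as F

def composite_loss(pred, target, lam_smooth=0.1, lam_cont=0.1):
    """Data term plus assumed physics-style penalties on a (B, T, D, H, W) field."""
    data_term = F.mse_loss(pred, target)

    # Spatial smoothness: penalize large jumps between neighbouring voxels.
    smooth = (pred[..., 1:, :, :] - pred[..., :-1, :, :]).abs().mean() \
           + (pred[..., :, 1:, :] - pred[..., :, :-1, :]).abs().mean() \
           + (pred[..., :, :, 1:] - pred[..., :, :, :-1]).abs().mean()

    # Temporal continuity: damage fields should evolve gradually between steps.
    continuity = (pred[:, 1:] - pred[:, :-1]).abs().mean()

    return data_term + lam_smooth * smooth + lam_cont * continuity

pred = torch.rand(2, 5, 16, 16, 16, requires_grad=True)  # hypothetical shapes
target = torch.rand(2, 5, 16, 16, 16)
print(composite_loss(pred, target))
```
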
Article
Engineering
Other

Dina Kon, Sage Ngoie, Shu Jisen, Yadah Mbuyu, Dave Mbako

Abstract: An accurate prediction of the shear strength in rock discontinuities requires accounting for surface roughness, which is neglected in the classical Mohr–Coulomb theory. This study presents a fractal-enhanced modification of the criterion by directly incorporating the surface fractal dimension as a state-dependent parameter that governs cohesion and the internal friction angle. Fractal dimensions were reliably quantified using dual methods, box-counting and power spectral density, with strong agreement (R² = 0.98, mean deviation < 0.02), ensuring an objective and scale-invariant input. The results indicated that both the cohesion and friction angle increased nonlinearly with the fractal dimension, leading to a dynamically adjustable failure envelope. The modified model predicts up to 25–40% higher strength for rough joints than classical estimates, aligning closely with the experimental data. The criterion, constructed within the principal stress space and integrated into a fractional-order damage law, effectively characterizes the history-dependent failure process of the structure. The robustness of the solution was verified through Lyapunov stability analysis. The application of the Banach fixed-point theorem ensures the continuity and uniqueness of the solutions. Furthermore, using the fractal dimension to track dynamic changes in fractality during shearing confirms that the fractal dimension acts as a mechanical state variable. This fractal-based framework successfully links microscale topography to macroscale strength, providing a physically grounded approach for the predictive stability assessment in geomechanics.
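
The box-counting estimate used above fits a power law to box counts across scales; the sketch below applies the idea to a synthetic one-dimensional roughness profile (not the study's data).

```python
import numpy as np

rng = np.random.default_rng(7)
profile = np.cumsum(rng.normal(size=1024))   # synthetic rough joint profile

def box_count_dimension(z, scales=(2, 4, 8, 16, 32, 64)):
    """Fractal dimension from the slope of log N(s) versus log(1/s)."""
    z = (z - z.min()) / (z.max() - z.min())  # normalize heights to [0, 1]
    counts = []
    for s in scales:
        h = s / len(z)                       # square boxes of side s samples
        boxes = 0
        for i in range(len(z) // s):
            seg = z[i * s:(i + 1) * s]
            boxes += int(np.ceil(seg.max() / h) - np.floor(seg.min() / h)) or 1
        counts.append(boxes)
    slope, _ = np.polyfit(np.log(1.0 / np.array(scales)), np.log(counts), 1)
    return slope

print(round(box_count_dimension(profile), 3))  # ~1.5 for Brownian-like profiles
```
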
Article
Engineering
Other

Xuemei Jian, Meng Chen, Xiangjun Liu, Guofeng Yang, Hongfa Ye, Yaning Zhao

Abstract: Multi-arm caliper logging tools provide a high-precision technology for monitoring tubing and casing integrity. However, due to the wellbore structure of horizontal wells and the self-weight of the instrument, multi-arm caliper logging data from horizontal wells typically exhibit obvious eccentricity, resulting in a high misjudgment probability in the quantitative evaluation of tubing and casing damage. In this manuscript, simulation experiments were conducted using a 5.5-inch casing under laboratory conditions. The experiments clarified the response characteristics of forty-arm caliper logging tools in centered and eccentric states within intact and damaged casing. An equivalent model combining the instrument and the horizontal pipe structure was established, and a new method for correcting instrument eccentricity based on forty-arm caliper logging data in horizontal wells was developed through theoretical analysis. This model efficiently overcomes the limitation of existing eccentricity correction models, which weaken damage detection under casing damage conditions. When compared with experimental data, the relative error between the intact well section after eccentricity correction and the centered measurement ranged from -4.93% to -0.72%, a reduction of 0.03% to 0.53% compared to the ellipse-fitting algorithm. The relative error between the response value of the damaged well section after eccentricity correction and the centered measurement ranged from -5.02% to -0.45%, a reduction of 0.92% to 1.16% compared to the ellipse-fitting algorithm. This effectively improves the correction coincidence rate for the eccentricity influence of multi-arm caliper logging tools under various conditions in horizontal wells. Applying the established method to the processing and interpretation of forty-arm caliper logging data in horizontal wells provides powerful technical support for high-precision quantitative evaluation, damage location identification, and repair of horizontal well casing.
