Preprint
Article

This version is not peer-reviewed.

A Flexible Sensor-Enabled Multi-Parameter Collaborative Monitoring System for Precision Agriculture with Field Validation

Submitted: 18 April 2026

Posted: 20 April 2026


Abstract
Real-time and accurate monitoring of farmland environmental parameters and crop growth status is essential for precision agriculture and intelligent irrigation management. However, conventional agricultural monitoring approaches remain limited in spatial coverage, sensor adaptability, and intelligent data analysis. To address these limitations, this study proposes a multi-parameter collaborative monitoring system for precision agriculture that integrates flexible sensing, LoRa-based wireless communication, and deep learning-based data analysis. Specifically, a flexible capacitive humidity sensor based on graphene-PDMS composites was designed and fabricated for farmland environmental monitoring, and a distributed LoRa sensor network was developed to enable large-scale multi-parameter data acquisition and remote transmission. In addition, a convolutional neural network (CNN) was established for feature extraction and crop disease identification using multimodal sensor data. Experimental results showed that the flexible sensor exhibited a response time of 2.3 s and good mechanical stability, while the proposed model achieved an accuracy of 97.1% for crop disease identification on the test set. Field experiments conducted in 12 test fields across Hebei, Shandong, and Henan provinces showed that the proposed system achieved an average water-saving rate of 32.8% and an average crop yield increase of 10.6%. These results demonstrate that the proposed system can effectively improve farmland monitoring accuracy and support intelligent irrigation decision-making, highlighting its application potential in smart agriculture.

1. Introduction

With the continuous growth of the global population and the increasing scarcity of agricultural production resources such as arable land and water, traditional agricultural management methods that rely heavily on experience are no longer able to meet the requirements of modern agriculture for high yield, high efficiency, and sustainable development [1,2,3]. Precision agriculture emphasizes real-time sensing and precise regulation of soil environments, crop growth conditions, and disease risks, and has become an important pathway for transforming agricultural production from experience-driven to data-driven approaches [4,5,6,7].
Currently, rigid sensors widely used in agricultural applications exhibit certain limitations when monitoring complex ground surfaces and plant structures. For instance, they often suffer from insufficient mechanical compliance, difficulty in achieving intimate contact with irregular monitoring objects, and relatively poor comfort and stability during long-term deployment. In addition, traditional monitoring methods that rely on manual inspection are inefficient and highly subjective, making it difficult to meet the requirements of precision agriculture for real-time, automated, and intelligent monitoring. In contrast, flexible sensors, owing to their excellent flexibility, stretchability, and conformal contact capability, demonstrate significant potential in sensing plant growth conditions, monitoring farmland environments, and acquiring microenvironmental information [8,9,10]. Meanwhile, the rapid development of artificial intelligence technologies, particularly deep learning methods, enables efficient fusion of multi-source sensing data, crop disease identification, and intelligent agricultural decision support. These advances provide a technological foundation for constructing next-generation intelligent agricultural sensing systems [11,12,13,14].
In recent years, extensive research has been conducted by scholars worldwide on flexible sensors and intelligent agricultural monitoring. On the one hand, significant progress has been achieved in the design of flexible sensing materials, optimization of device structures, and applications in plant health monitoring, which has promoted the transition of flexible sensors from laboratory devices to practical applications in agricultural scenarios. On the other hand, several studies have attempted to integrate multiple sensors, such as temperature, humidity, and strain sensors, to enable comprehensive monitoring of plant physiological conditions and farmland environmental parameters [15,16]. In addition, the introduction of Internet of Things (IoT) communication technologies and intelligent analytical models into agricultural monitoring has provided new approaches for the remote acquisition, transmission, and automated analysis of farmland information. Overall, the integration of flexible sensing, multi-parameter monitoring, and intelligent analysis has become an important development direction in precision agriculture [17,18,19,20,21].
Nevertheless, several limitations remain in existing studies. First, most research still focuses on the design of a single sensor or a single monitoring task, lacking collaborative sensing and integrated analysis of environmental parameters, crop physiological states, and disease information. Second, many studies remain at the stage of small-scale experiments or localized validation, and systematic integration of flexible sensors, wireless communication networks, and intelligent algorithms is still insufficient. Furthermore, long-term deployment, stable operation, and performance verification under large-scale real farmland environments are relatively limited, which restricts the practical promotion of existing research outcomes in precision irrigation and intelligent agricultural decision-making. Therefore, developing a multi-parameter collaborative monitoring system that integrates flexible sensing, remote communication, and intelligent analysis remains an important challenge to be addressed in the field of precision agriculture [22,23,24,25,26,27].
To address the aforementioned challenges, this paper proposes a multi-parameter collaborative monitoring system for precision agriculture based on a flexible sensor array [28,29,30]. The system integrates flexible sensing, LoRa-based wireless communication, and deep learning-based data analysis to enable real-time acquisition, remote transmission, intelligent recognition, and decision support for farmland environmental information [31,32,33,34]. Specifically, a graphene–PDMS flexible humidity sensor suited to agricultural environmental monitoring is designed and fabricated to improve conformal sensing capability and responsiveness in complex agricultural scenarios, and a distributed LoRa sensing network is constructed for low-power, long-distance acquisition and transmission of multi-parameter farmland data. Deep learning methods are further employed for multimodal sensing data analysis, improving the accuracy of crop disease identification and irrigation decision-making. Field experiments conducted at 12 experimental farmland sites across Hebei, Shandong, and Henan provinces validate the practical performance of the proposed system and demonstrate its effectiveness for water-saving irrigation and crop yield improvement [35,36,37,38].

2. Methods

2.1. Design and Manufacturing Process of Flexible Sensors

To meet the demand for real-time monitoring of soil moisture in farmland, this study designs a capacitive flexible sensor based on a graphene–polydimethylsiloxane (PDMS) composite structure. The sensing material is fabricated using a chemical vapor deposition (CVD) process to form a monolayer graphene thin film [39,40]. The graphene layer exhibits a carrier mobility of up to 15,000 cm²/(V·s), which enables a rapid response to water-molecule adsorption. The dielectric layer consists of a PDMS thin film with a thickness of 50 μm. Its relative dielectric constant ε_r varies with humidity according to the following relationship:
$$\varepsilon_r(RH) = \varepsilon_0 + \alpha \cdot RH + \beta \cdot RH^2$$
Here, ε₀ represents the reference dielectric constant, with a value of 2.65; α and β are fitting coefficients of 0.032 and 0.00018, respectively; and RH denotes the relative humidity expressed as a percentage. The electrode pattern adopts an interdigitated structure, fabricated on a 125 μm-thick polyimide substrate using laser direct writing. The electrode finger width is 200 μm with a spacing of 150 μm, forming an effective sensing area of 12 mm × 8 mm.
During fabrication, all parameters were strictly controlled. The graphene transfer temperature was maintained at 80 °C under a pressure of 0.2 MPa for 15 minutes to ensure an interfacial contact resistance below 5 Ω·cm [41,42]. The PDMS prepolymer and curing agent were mixed at a mass ratio of 10:1, and the mixture was degassed under vacuum (0.09 MPa) for 20 minutes to remove air bubbles. The PDMS layer was then spin-coated onto the electrode surface at 1500 rpm for 30 seconds to obtain a uniform dielectric layer. A stepwise heating scheme was adopted for curing: the sample was pre-cured at 65 °C for 2 hours, followed by complete crosslinking at 120 °C for 4 hours. Table 1 shows the effects of different manufacturing parameters on sensor performance:
The device was encapsulated with a 10 μm-thick polyethylene terephthalate (PET) film by thermal lamination at 90 °C, with a pressure of 0.15 MPa applied for 10 minutes. The water vapor transmission rate (WVTR) was controlled below 0.8 g/(m²·day). For quality control, the capacitance was measured using an impedance analyzer at a frequency of 1 kHz, with an acceptable tolerance of ±2%. A bending test of 5000 cycles was conducted at a curvature radius of 5 mm, and the capacitance drift was required to be less than 3% of the initial value. Figure 1 shows the fabrication process of the flexible sensor. The sensitivity of the sensor is defined as the ratio of the capacitance variation to the change in relative humidity:
$$S = \frac{\Delta C}{\Delta RH} = \frac{C_{RH_2} - C_{RH_1}}{RH_2 - RH_1}$$
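As an illustrative sketch (not the authors' code), the dielectric model and the sensitivity definition above can be implemented directly; the linear mapping from ε_r to capacitance via a geometry factor k is an assumption added for the example.

```python
def eps_r(rh: float) -> float:
    """Relative dielectric constant of the PDMS layer vs. relative humidity (%),
    using the fitted coefficients quoted in Sec. 2.1."""
    eps0, alpha, beta = 2.65, 0.032, 0.00018
    return eps0 + alpha * rh + beta * rh ** 2

def sensitivity(c_rh1: float, c_rh2: float, rh1: float, rh2: float) -> float:
    """S = (C_RH2 - C_RH1) / (RH2 - RH1), the defined capacitive sensitivity."""
    return (c_rh2 - c_rh1) / (rh2 - rh1)

# Example: assume capacitance scales linearly with eps_r (C = k * eps_r).
k = 10.0  # pF per unit eps_r -- hypothetical geometry factor, not from the paper
c30, c80 = k * eps_r(30.0), k * eps_r(80.0)
print(round(sensitivity(c30, c80, 30.0, 80.0), 4))  # 0.518 (pF per %RH)
```

The quadratic humidity term makes the sensitivity itself humidity-dependent, which is why the definition uses a finite difference between two humidity points rather than a single slope.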

2.2. Multimodal Sensing Data Acquisition System

The system adopted a distributed sensing-node architecture, in which each multimodal node integrated three types of flexible sensors, namely humidity, temperature, and strain sensors. Each node was designed as an independent data acquisition unit for real-time monitoring of local crop and microenvironmental conditions. An STM32F407 microcontroller operating at 168 MHz with a 12-bit ADC module was used to achieve multi-channel synchronous sampling. The output capacitance signal of the humidity sensor was conditioned using an LTC1563-2 active filter with a cutoff frequency of 50 Hz to suppress environmental interference. The temperature sensor employed a flexible platinum resistance probe. A schematic diagram of the data acquisition system is shown in Figure 2.
The resistance–temperature conversion relationship can be expressed as follows:
$$R_T = R_0 \left[ 1 + \alpha (T - T_0) + \beta (T - T_0)^2 \right]$$
Here, T₀ = 273.15 K is the reference temperature, at which the reference resistance R₀ was 100 Ω; the temperature coefficients were α = 3.908 × 10⁻³ K⁻¹ and β = −5.775 × 10⁻⁷ K⁻². The strain sensor was based on a carbon nanotube–PDMS composite material. Its relative change in resistance was linearly related to strain, with a gauge factor of GF = 12.6. Table 2 compares the performance of the multimodal sensing nodes:
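A minimal sketch of the resistance–temperature conversion and its inversion, assuming the standard platinum-RTD sign convention for β (the extracted text appears to have dropped the minus sign):

```python
def pt_resistance(t_c: float) -> float:
    """Pt resistance vs. temperature above 0 degC (Callendar-Van Dusen form).

    R0, alpha, beta follow Sec. 2.2; beta is used with the negative sign of
    the standard Pt100 convention -- an assumption about the dropped sign.
    """
    r0 = 100.0        # reference resistance (ohm)
    a = 3.908e-3      # K^-1
    b = -5.775e-7     # K^-2
    return r0 * (1.0 + a * t_c + b * t_c ** 2)

def pt_temperature(r_ohm: float) -> float:
    """Recover temperature (degC) from resistance by solving the quadratic."""
    r0, a, b = 100.0, 3.908e-3, -5.775e-7
    # Solve b*t^2 + a*t + (1 - r/r0) = 0 and take the physical root.
    c = 1.0 - r_ohm / r0
    disc = a * a - 4.0 * b * c
    return (-a + disc ** 0.5) / (2.0 * b)

print(round(pt_resistance(25.0), 3))  # resistance at 25 degC
```

Because the quadratic is inverted exactly, the round trip temperature → resistance → temperature is lossless, which is convenient for node-side calibration checks.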
LoRa modulation technology was used for wireless transmission, with a center frequency of 470 MHz, a spreading factor of SF = 9 , a bandwidth of 125 kHz, and a coding rate of CR = 4 / 5 . The information frame consisted of an 8-byte timestamp, a 12-byte sensor-data payload, and a 4-byte checksum. The field gateway was deployed at a central position within the monitored area, with an effective receiving radius of 800 m. Time synchronization was implemented based on the Precise Time Protocol (PTP), in which each node adjusted its local clock every 600 s via a GPS module, ensuring synchronization accuracy within 5 ms [43,44]. In addition, a Time Division Multiple Access (TDMA) scheme was adopted to reduce packet collisions, and the slot width was set to 100 ms.
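The 24-byte frame layout (8-byte timestamp, 12-byte payload, 4-byte checksum) can be sketched as follows; the internal payload layout (three little-endian float32 values) and the use of CRC32 as the checksum are illustrative assumptions, since the paper specifies only the section sizes.

```python
import struct
import zlib

def pack_frame(timestamp_us: int, humidity: float,
               temperature: float, strain: float) -> bytes:
    """Pack one uplink frame: 8-byte timestamp + 12-byte payload + 4-byte CRC32."""
    body = struct.pack("<Q", timestamp_us) + struct.pack(
        "<fff", humidity, temperature, strain)
    return body + struct.pack("<I", zlib.crc32(body))

def unpack_frame(frame: bytes):
    """Verify the checksum and return (timestamp, humidity, temperature, strain)."""
    body, (crc,) = frame[:-4], struct.unpack("<I", frame[-4:])
    if zlib.crc32(body) != crc:
        raise ValueError("checksum mismatch")
    ts = struct.unpack("<Q", body[:8])[0]
    return (ts,) + struct.unpack("<fff", body[8:])

frame = pack_frame(1_700_000_000_000_000, 42.5, 23.1, 0.004)
print(len(frame))  # 24 bytes: 8 + 12 + 4
```

Keeping the frame at a fixed 24 bytes fits comfortably within a single LoRa payload at SF9/125 kHz and makes the 100 ms TDMA slot budgeting straightforward.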
The sampling frequencies of the sensors were configured according to the variation characteristics of the monitored parameters, namely 10 Hz for humidity sensors, 5 Hz for temperature sensors, and 20 Hz for strain sensors. A weighted average algorithm was employed to fuse the multi-channel data, where the weighting factor w i was defined as the inverse of the sensor measurement uncertainty:
$$\bar{x} = \frac{\sum_{i=1}^{n} w_i x_i}{\sum_{i=1}^{n} w_i}, \qquad w_i = \frac{1}{\sigma_i^2}$$
Here, σ_i represents the standard deviation of the i-th sensor. The system was equipped with a 512 KB circular buffer for raw-data storage; in the event of a temporary network interruption, the local cache could maintain data recording for up to 72 h. The power supply consisted of a 5 W monocrystalline silicon solar panel combined with a 3.7 V/5000 mAh lithium battery, providing up to 14 d of autonomous operation under cloudy or rainy conditions.
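The inverse-variance weighted fusion above reduces to a few lines; this sketch simply restates the formula, with example values chosen for illustration.

```python
def fuse(values, sigmas):
    """Inverse-variance weighted mean: w_i = 1 / sigma_i^2 (Sec. 2.2)."""
    weights = [1.0 / s ** 2 for s in sigmas]
    return sum(w * x for w, x in zip(weights, values)) / sum(weights)

# Two precise channels agree at 40 %RH; a noisier third channel
# (sigma four times larger, so 1/16 the weight) shifts the mean only slightly.
print(round(fuse([40.0, 40.0, 46.0], [0.5, 0.5, 2.0]), 3))  # 40.182
```

Weighting by 1/σ² means a channel that is twice as noisy contributes only a quarter as much, which is exactly the behavior wanted when mixing heterogeneous flexible sensors.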

2.3. AI Algorithm Model Construction and Training Strategy

Data preprocessing employs a sliding window mechanism to extract feature segments from continuous time-series signals. The window length is set to 128 sampling points with a step size of 32 points, giving a 75% overlap that preserves feature continuity. The raw sensor data are standardized using the Z-score method to remove differences in scale between channels. The transformation is as follows:
$$x_{\text{norm}} = \frac{x - \mu}{\sigma}$$
Here, μ and σ represent the mean and standard deviation of the training set, respectively. To address baseline drift in the humidity sensor signals, a five-level wavelet decomposition based on the db4 basis function was applied. During signal reconstruction, the low-frequency approximation coefficient a₅ was removed, while the detail coefficients d₁–d₅ were retained to effectively suppress trend-related interference [45,46].
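The windowing and standardization steps can be sketched with NumPy as below (the wavelet detrending step is omitted here, since it would require a wavelet library such as PyWavelets; this sketch is not the authors' code).

```python
import numpy as np

def make_windows(signal: np.ndarray, length: int = 128,
                 step: int = 32) -> np.ndarray:
    """Slice a 1-D signal into overlapping segments.

    With length=128 and step=32, consecutive windows share 96 of 128
    points, i.e. the 75% overlap described in Sec. 2.3."""
    n = (len(signal) - length) // step + 1
    return np.stack([signal[i * step: i * step + length] for i in range(n)])

def zscore(x: np.ndarray, mu: float, sigma: float) -> np.ndarray:
    """Z-score standardization using training-set statistics."""
    return (x - mu) / sigma

sig = np.arange(256, dtype=float)   # stand-in for a sensor trace
wins = make_windows(sig)
print(wins.shape)  # (5, 128)
```

Note that μ and σ are fixed from the training set rather than recomputed per window, so the test-time transformation matches what the CNN saw during training.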
The CNN framework consisted of four convolutional layers and two fully connected layers. The first convolutional layer employed 64 kernels of size 3 × 3 with a stride of 1, same padding, and ReLU activation. In the second convolutional layer, the number of kernels was increased to 128, and two max-pooling layers were introduced for dimensionality reduction [47,48]. The third and fourth convolutional layers used 256 and 512 kernels, respectively, and each convolutional layer was followed by batch normalization to accelerate convergence. The fully connected layers contained 512 and 128 neurons, respectively, and a dropout rate of 0.5 was adopted to reduce overfitting. The Softmax activation function was used in the output layer to predict the probability distribution of five crop disease states.
To address class imbalance, the training set was augmented using the Synthetic Minority Oversampling Technique (SMOTE). The initial dataset contained 3200 healthy samples, 1800 mildly diseased samples, 950 moderately diseased samples, 620 severely diseased samples, and 430 withered samples. Synthetic samples for the minority classes were generated using the SMOTE algorithm with k = 5 nearest neighbors in the feature space, resulting in a balanced distribution of 3200 samples for each class.
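The SMOTE-style balancing step can be sketched as follows; this is a minimal re-implementation of the interpolation idea (each synthetic sample lies between a minority sample and one of its k nearest minority-class neighbours), not the exact library routine used by the authors, and the feature dimensionality here is a stand-in.

```python
import numpy as np

def smote_like(minority: np.ndarray, n_new: int,
               k: int = 5, seed: int = 0) -> np.ndarray:
    """Generate n_new synthetic samples by interpolating between each chosen
    minority sample and one of its k nearest minority-class neighbours."""
    rng = np.random.default_rng(seed)
    out = []
    for _ in range(n_new):
        i = rng.integers(len(minority))
        # Distances from sample i to all minority samples (self is excluded
        # below by skipping the closest match, which is sample i itself).
        d = np.linalg.norm(minority - minority[i], axis=1)
        neighbours = np.argsort(d)[1: k + 1]
        j = rng.choice(neighbours)
        lam = rng.random()  # interpolation factor in [0, 1)
        out.append(minority[i] + lam * (minority[j] - minority[i]))
    return np.array(out)

# Balance the 430-sample "withered" class up to 3200 samples (stand-in features).
withered = np.random.default_rng(1).normal(size=(430, 8))
synth = smote_like(withered, n_new=3200 - 430)
print(synth.shape)  # (2770, 8)
```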
Three temporal data augmentation strategies were applied: time stretching (0.8–1.2× speed), amplitude scaling (0.9–1.1×), and Gaussian noise injection (SNR = 25 dB), yielding 48,000 training samples after augmentation [49,50]. To further mitigate residual class imbalance during training, the focal loss was adopted, defined as follows:
$$FL(p_t) = -\alpha_t (1 - p_t)^{\gamma} \log(p_t)$$
The modulation factor was set to γ = 2, and the class weights α_t were calculated using inverse frequency weighting. AdamW was employed as the optimizer, with an initial learning rate of 0.001 and a weight decay coefficient of 0.0001. Cosine annealing was adopted for learning rate scheduling, with a period of 50 epochs and a minimum learning rate of 1 × 10⁻⁶. Model training was performed on an NVIDIA RTX 3090 GPU with a batch size of 64 for a total of 200 epochs. An early stopping strategy terminated training when the validation loss showed no improvement for 15 consecutive epochs. Hyperparameter optimization was conducted using Bayesian optimization over a search space including the number of convolutional kernels {32, 64, 128}, the learning rate [0.0001, 0.01], and the dropout rate [0.3, 0.7]; the optimal configuration was obtained after 50 iterations. Model performance was evaluated on an independent test set containing 12,000 samples, and five-fold stratified cross-validation was further performed to assess generalization ability.
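The focal loss with γ = 2 can be sketched in NumPy as below; the example class weights of 1.0 are placeholders for the inverse-frequency weights computed during training.

```python
import numpy as np

def focal_loss(p_t: np.ndarray, alpha_t: np.ndarray,
               gamma: float = 2.0) -> np.ndarray:
    """Per-sample focal loss: FL(p_t) = -alpha_t * (1 - p_t)^gamma * log(p_t).

    p_t is the model's predicted probability for the true class; with
    gamma = 0 this reduces to weighted cross-entropy."""
    return -alpha_t * (1.0 - p_t) ** gamma * np.log(p_t)

# Well-classified samples (p_t near 1) are strongly down-weighted:
p = np.array([0.9, 0.5, 0.1])
a = np.ones(3)  # placeholder class weights
print([round(float(v), 4) for v in focal_loss(p, a)])  # [0.0011, 0.1733, 1.8651]
```

The (1 − p_t)^γ factor is what lets the model keep learning from the hard minority-class samples after the easy healthy samples are already well fitted.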

2.4. System Integration and Field Deployment Scheme

A grid-based deployment strategy was adopted for field-scale sensor installation. In a representative wheat experimental field covering 12 ha, the monitored area was subdivided into grid cells of approximately 11 m × 11 m, corresponding to roughly 120 m² per node. One core multimodal sensing node was deployed near the center of each grid cell, forming a monitoring network of 980 core nodes. In addition, 48 boundary reference nodes were installed along the field perimeter to correct edge effects and provide calibration references for long-term measurement stability. The deployment layout is illustrated schematically in Figure 3.
To adapt to dynamic crop growth conditions, the vertical positions of the sensing elements were adjusted during the growing season. Soil-moisture sensors were buried at a depth of 15 cm, temperature sensors were fastened at the midpoint of the plant stem, and strain sensors were attached to the abaxial side of the leaf at one-third of the distance from the leaf tip [51,52]. The installation tilt angle of each node was controlled within ±3° to ensure measurement consistency.
The mobile monitoring platform was developed based on the React Native framework and was compatible with Android 8.0 and iOS 12 or later. The interface provided a heatmap for displaying the spatial distribution of multiple parameters, and the visualization color scale was automatically adjusted according to historical data. Three warning levels were defined: a yellow alert when humidity fell below 35%, an orange alert when humidity fell below 25%, and a red alert when humidity fell below 15%, with notifications sent to the agricultural technician terminal. The data were updated every 5 s. The server implemented the WebSocket protocol to maintain persistent connectivity, and the message queue length was limited to 200 messages to prevent memory overflow.
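The three-level humidity alert logic above maps directly to a threshold cascade; this is a sketch of that mapping, with the "normal" label added for readings above the yellow threshold.

```python
def alert_level(humidity_pct: float) -> str:
    """Map a soil humidity reading (%) to the warning level of Sec. 2.4:
    red below 15%, orange below 25%, yellow below 35%, otherwise normal."""
    if humidity_pct < 15.0:
        return "red"
    if humidity_pct < 25.0:
        return "orange"
    if humidity_pct < 35.0:
        return "yellow"
    return "normal"

print([alert_level(h) for h in (40, 30, 20, 10)])
# ['normal', 'yellow', 'orange', 'red']
```

Evaluating the most severe threshold first keeps the boundaries unambiguous: a reading of exactly 15% is orange, not red, matching the "below 15%" wording.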
An adaptive calibration algorithm was introduced to compensate for long-term sensor drift. Each boundary reference node was compared with a portable standard device every 24 h, and the correction coefficient k t was calculated as follows:
$$k_t = \frac{x_{\text{ref}}}{x_{\text{sensor}}} \cdot k_{t-1}$$
Here, x_ref denotes the reading of the standard device, x_sensor the original sensor value, and k_{t−1} the correction coefficient from the previous calibration cycle. The corrected measurement was then smoothed using a first-order low-pass filter:
$$y_t = \alpha \cdot x_t \cdot k_t + (1 - \alpha) \cdot y_{t-1}$$
The filter coefficient was set to α = 0.3 to balance response speed and stability. Temperature compensation was implemented using a lookup-table method, in which calibration parameters were pre-stored at 1 °C intervals over the range from 10 °C to 50 °C. Linear interpolation was further applied to improve computational accuracy [53,54].
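The two update rules above (multiplicative drift correction, then first-order low-pass smoothing) can be sketched as follows; the numeric scenario of a sensor reading 5% low is invented for illustration.

```python
def update_correction(x_ref: float, x_sensor: float, k_prev: float) -> float:
    """k_t = (x_ref / x_sensor) * k_{t-1}: daily drift correction (Sec. 2.4)."""
    return (x_ref / x_sensor) * k_prev

def low_pass(x_t: float, k_t: float, y_prev: float,
             alpha: float = 0.3) -> float:
    """y_t = alpha * x_t * k_t + (1 - alpha) * y_{t-1}."""
    return alpha * x_t * k_t + (1.0 - alpha) * y_prev

# A sensor reading 38 against a 40 reference gets a >1 correction factor;
# the filtered output then converges to the corrected value 38 * k = 40.
k = update_correction(x_ref=40.0, x_sensor=38.0, k_prev=1.0)
y = 38.0
for _ in range(20):
    y = low_pass(38.0, k, y)
print(round(k, 4), round(y, 2))  # 1.0526 40.0
```

With α = 0.3, each step closes 30% of the remaining gap, so the filter settles within a few percent of the corrected value after roughly ten updates.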
The decision output process consisted of four stages: data acquisition, edge preprocessing, cloud inference, and instruction issuance. The edge gateway performed data aggregation and anomaly detection, removing outliers beyond the ±3σ range. The trained CNN model was deployed on the cloud server, and the inference latency was controlled within 120 ms. Irrigation decisions were made based on a comprehensive assessment of the disease probability and soil moisture output by the model. When the health probability exceeded 0.85 and the moisture content was below 40%, the drip irrigation system was activated, and the opening of the flow-regulating valve was adjusted within a range of 0–100% according to the degree of water shortage.
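The activation rule can be sketched as below. The 0.85 health-probability and 40% moisture thresholds come from the text; the linear mapping from water deficit to valve opening is an illustrative assumption, since the paper states only that the opening is adjusted within 0–100% according to the degree of water shortage.

```python
def valve_opening(health_prob: float, moisture_pct: float,
                  target_pct: float = 40.0) -> float:
    """Irrigation decision sketch (Sec. 2.4): irrigate only when the plant is
    judged healthy (p > 0.85) and soil moisture is below the 40% threshold,
    opening the valve in proportion to the relative water deficit.

    The linear deficit-to-opening mapping is a hypothetical choice."""
    if health_prob <= 0.85 or moisture_pct >= target_pct:
        return 0.0
    deficit = (target_pct - moisture_pct) / target_pct  # 0..1
    return min(100.0, 100.0 * deficit)

print(valve_opening(0.95, 20.0))  # 50.0 -- half the target moisture is missing
print(valve_opening(0.80, 20.0))  # 0.0  -- disease risk blocks auto-irrigation
```

Gating irrigation on the health probability prevents the system from watering a likely diseased plot, where added moisture could accelerate fungal spread.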

3. Results and Discussion

3.1. Performance Test Results of Flexible Sensors

Response time tests were conducted in a constant temperature and humidity chamber under a gradual increase in humidity from 30% RH to 80% RH. An oscilloscope was used to record the capacitance change over time, and the time required to reach 90% of the steady-state value was defined as the response time. To evaluate measurement accuracy, standard humidity environments were generated using the saturated salt solution method: standard points of 11%, 33%, 75%, and 85% RH were prepared using saturated LiCl, MgCl₂, NaCl, and KCl solutions, respectively. A total of 100 consecutive samples were collected to calculate the standard deviation.
Bending tests were performed using an electric bending stage with a curvature radius of 5 mm and a frequency of 1 Hz. The baseline capacitance and sensitivity variation were measured after every 1000 bending cycles. Temperature drift tests were conducted by increasing the temperature from 10 °C to 60 °C in steps of 10 °C, maintaining each temperature point for 20 min, and recording the capacitance shift. To provide a benchmark for performance comparison, Sensirion SHT35 and Honeywell HIH-4000 commercial sensors were selected as reference devices, and all tests were carried out under identical conditions to ensure the consistency and comparability of the results. Figure 4 compares the response performance of different sensors, while Figure 5 presents the durability and stability test results.
Figure 4. Response Performance Comparison.
Figure 5. Durability and Stability Test Results.
The results showed that the proposed sensor exhibited a response time of 2.3 s, corresponding to 71.3% and 84.7% of the response times of the Sensirion and Honeywell sensors, respectively, and it also demonstrated a clear advantage in recovery time. In addition, the sensor showed good linearity, with a coefficient of determination of R² = 0.9947, and a repeatability of 98.6%, indicating stable measurement performance. After 5000 bending cycles, the capacitance drift and sensitivity degradation were only 2.8% and 4.1%, respectively, satisfying the long-term stability requirements for agricultural field applications. The contact resistance increased gradually with the number of bending cycles, reaching 9.1 Ω after 10,000 cycles, still far below the failure threshold of 50 Ω. The temperature coefficient remained within 0.08%/°C to 0.18%/°C, indicating that the temperature compensation algorithm effectively suppressed the influence of ambient temperature on measurement accuracy and confirming the reliability of the sensor over the temperature range from 10 °C to 60 °C.

3.2. AI Model Recognition Accuracy Evaluation

The evaluation dataset included three crops, namely wheat, corn, and rice, collected from March to October 2024 under three typical climatic conditions, namely rainy spring, hot summer, and dry autumn. The disease recognition test set contained 2400 samples across five categories: healthy, powdery mildew, rust, sheath blight, and wilt. For the growth prediction task, sensor data were used to forecast crop growth status over the following 7 days, including normal growth, slow growth, and stagnant growth, with a total of 9000 test samples. Model inference was performed on an NVIDIA A100 GPU with a batch size of 128.
The confusion matrix was used to analyze the relationship between actual and predicted categories, where the diagonal elements represented the numbers of correctly classified samples. The ROC curve was constructed using the one-vs-rest strategy, with the true positive rate (TPR) and false positive rate (FPR) calculated for each class under different thresholds. The thresholds ranged from 0.01 to 0.99 with a step size of 0.01, and classification performance was evaluated using the area under the curve (AUC). Figure 6 shows the confusion matrix, while Figure 7 presents the ROC curve.
Figure 6. Confusion Matrix.
Figure 7. ROC Curve.
The confusion matrix showed that the overall accuracy of disease identification reached 0.971, with Fusarium wilt achieving the highest class-specific accuracy of 0.983, owing to its distinctive sensor signal characteristics. In contrast, powdery mildew exhibited a relatively lower recall of 0.958. Further analysis indicated that the early-stage humidity difference between powdery mildew and healthy samples was only 2–3% RH, which was close to the sensor resolution limit and resulted in 31 false-negative misclassifications. The ROC analysis showed that the AUC values for all categories exceeded 0.97, with the highest AUC of 0.9941 observed for Fusarium wilt, indicating strong discriminative capability for this disease. Generalization analysis across crop varieties showed that the average identification accuracies were 96.3% for wheat, 96.2% for maize, and 96.9% for rice, with inter-crop differences below 1.0%, confirming good cross-variety adaptability of the model. The highest accuracy observed for rice was attributed to the larger humidity fluctuations in the rice-growing environment, which produced more distinguishable sensor signal variations. In the growth prediction task, the F1-scores for all categories exceeded 0.94. The best performance was obtained for the normal growth condition of rice, with an F1-score of 0.971, whereas slow growth was the most difficult condition to identify, with the lowest F1-score of 0.935 observed for wheat. This result was mainly attributed to the overlap between the sensor patterns of slow growth and normal growth, which led to less distinct feature boundaries. These results can be partly explained by the microclimatic conditions associated with crop disease development. 
Sustained increases in local humidity and leaf-surface wetness create favorable conditions for fungal spore germination and pathogen propagation, which in turn induce measurable changes in transpiration behavior, tissue strain, and temperature-related physiological responses. As a result, multimodal sensor signals capture not only environmental variations but also coupled plant physiological stress responses. The higher recognition accuracy observed for Fusarium wilt may therefore be attributed to its more pronounced and stable multimodal signal characteristics, whereas powdery mildew in the early stage produces only subtle humidity differences from healthy plants, leading to overlapping feature distributions.

3.3. Field Trial Application Results

In Hebei, Shandong, and Henan provinces, 12 experimental fields were established, with field sizes ranging from 3.3 to 8.0 ha (approximately 50–120 mu), cultivating wheat, corn, and rice. In the control group, irrigation was scheduled manually, typically 2–3 times per week, with the irrigation amount determined empirically. In the experimental group, a smart sensor network was deployed with approximately one node per 120 m² to measure soil moisture content, crop transpiration, and other meteorological parameters in real time; this density is consistent with the grid-based layout described for the representative field in Figure 3. The system dynamically adjusted irrigation thresholds according to crop growth stages: when the soil moisture content dropped below a predefined threshold, precision irrigation was automatically initiated, and the irrigation volume was determined from the soil water deficit. The trial covered the entire growing season from March to October 2024. Data were recorded on total irrigation volume, number of irrigation events, crop yield, and water use efficiency, and on-site yield measurements conducted by local agricultural departments were used to validate the yield data. The water-saving rate was calculated as (Traditional Irrigation Amount − Smart Irrigation Amount) / Traditional Irrigation Amount × 100%. Table 3 presents the results of the field trials.
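The water-saving metric reduces to a one-line calculation; the seasonal volumes below are hypothetical values chosen only to illustrate the reported 32.8% average.

```python
def water_saving_rate(traditional_m3: float, smart_m3: float) -> float:
    """Water-saving rate (%) = (traditional - smart) / traditional * 100."""
    return (traditional_m3 - smart_m3) / traditional_m3 * 100.0

# Hypothetical seasonal irrigation totals for one field (illustration only):
print(round(water_saving_rate(4500.0, 3024.0), 1))  # 32.8
```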
The average water-saving rate reached 32.8%, with wheat showing the highest water-saving effect of 33.6%. This was mainly attributed to the precise regulation of water demand during critical growth stages, which avoided the excessive irrigation commonly observed under conventional practices. Although rice required the highest water input, its water-saving rate was relatively lower (31.2%) because rice cultivation requires the maintenance of a shallow water layer, limiting irrigation flexibility. Maize exhibited the greatest yield improvement, with an average increase of 10.6%, suggesting that timely water replenishment provided by the intelligent system during the tasseling stage effectively reduced water stress. Wheat and rice showed average yield increases of 10.3% and 9.2%, respectively. The irrigation frequency decreased by an average of 39.6%, indicating a substantial reduction in labor input; in Shandong province, where conventional conservative practices had previously led to excessive irrigation, the reduction reached approximately 40%. The most representative case was a wheat experimental field in Henan province, where favorable soil water-retention capacity combined with intelligent irrigation scheduling matched local soil characteristics well and promoted synergistic improvements in water and fertilizer use efficiency.

3.4. Sensor Drift Issues

Three primary factors were identified as contributors to sensor drift. First, contamination of the electrochemical sensor electrode surface by salts and organic matter in the soil led to the formation of a passivation layer, resulting in a gradual loss of sensitivity over time. After six months of deployment, humidity sensor readings were found to be 3–8% lower than normal. Second, temperature cycling accelerated the aging of the encapsulation material. When the diurnal temperature difference exceeded 20°C, microcracks could form in the silicone sealing ring, allowing moisture ingress. This process led to circuit board corrosion and caused a positive baseline voltage drift of 0.05–0.15 V. Third, contamination of the optical sensor window also contributed to measurement drift. When the infrared window of the leaf transpiration sensor was blocked by dust and fungal spores, the transmittance decreased by 12–18%, resulting in a systematic overestimation of the transpiration rate. To mitigate these effects, a weekly automatic calibration mechanism was implemented using natural environmental conditions with humidity close to 100% and morning dew-point conditions. In addition, the optical window was manually cleaned once per month, and the electrochemical sensor was inspected using a standard solution. Table 4 presents the drift analysis and calibration performance of the sensors.
The results showed that the soil moisture sensor exhibited negative drift characteristics, with an error of -12.3% after 12 months of deployment. After automatic calibration, this error was reduced to -2.8%, corresponding to a reduction of 77.2%. In contrast, the leaf transpiration sensor exhibited a larger positive drift, with a cumulative drift of +26.7% within 12 months; however, this value was reduced to +4.3% after calibration. The temperature sensor showed the smallest drift, at only +1.6°C over 12 months. Given a calibration cycle of 30 days, this level of drift could be maintained within acceptable limits. The automatic calibration process was particularly effective for optical sensors, for which the error could be reduced by more than 83%. In contrast, the calibration effect was relatively limited for electrochemical sensors because of irreversible electrode passivation.
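The "Measurement Error Reduction" figures quoted above follow directly from the drift magnitudes before and after auto-calibration in Table 4, as the short check below illustrates (values transcribed from Table 4).

```python
# Re-deriving the "Measurement Error Reduction" column of Table 4 from the
# signed drift values before and after auto-calibration (12-month rows).
rows = [
    # (sensor, drift without calibration, drift with auto-calibration)
    ("Soil Moisture Sensor", -12.3, -2.8),
    ("Leaf Transpiration Sensor", +26.7, +4.3),
    ("Temperature Sensor", +1.6, +0.32),
]
for sensor, uncal, cal in rows:
    reduction = 100.0 * (abs(uncal) - abs(cal)) / abs(uncal)
    print(f"{sensor}: {reduction:.1f}% error reduction")
```

This reproduces the tabulated 77.2%, 83.9%, and 80.0% reductions, confirming the column is internally consistent with the drift data.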

4. Conclusions

In this paper, a precision agriculture monitoring system integrating flexible sensing technology, wireless communication, and deep learning was developed for real-time crop growth monitoring and intelligent irrigation management. The proposed system combines multimodal sensing, LoRa-based data transmission, CNN-based disease recognition, and field-scale deployment to enable dynamic monitoring of soil, crop, and environmental conditions under practical agricultural scenarios.
Experimental results demonstrated that the flexible sensing system exhibited favorable response speed, stability, and durability for field applications. The proposed disease recognition model achieved an overall accuracy of 0.971, with AUC values above 0.97 for all categories, indicating strong classification and generalization performance across different crop varieties. Field trials further confirmed the practical value of the system, with an average water-saving rate of 32.8%, an average yield increase of 10.6%, and a reduction of 39.6% in irrigation frequency, demonstrating that the proposed approach can effectively improve water-use efficiency and support precision irrigation decision-making.
Despite these promising results, several challenges remain, including long-term sensor drift, performance degradation under complex field environments, and the need to further improve model robustness under heterogeneous agricultural conditions. Future research should therefore focus on improving sensor stability and calibration strategies, developing lightweight edge intelligence models for real-time decision-making, and enhancing the integration of multimodal sensing with large-scale agricultural monitoring platforms.

Author Contributions

Conceptualization, G.S. and X.W.; methodology, M.X. and G.S.; software, G.S.; validation, M.X. and G.S.; formal analysis, M.X.; investigation, M.X.; resources, X.W.; data curation, M.X.; writing—original draft preparation, M.X.; writing—review and editing, G.S. and X.W.; visualization, M.X.; supervision, X.W. and G.S.; project administration, X.W.; funding acquisition, X.W. All authors have read and agreed to the published version of the manuscript.

Funding

This research was funded by the National Key Research and Development Program of China, grant number 2023YFD2000601-02.

Institutional Review Board Statement

Not applicable.

Data Availability Statement

The data presented in this study are available on request from the corresponding author.

Conflicts of Interest

The authors declare no conflicts of interest. The funders had no role in the design of the study; in the collection, analyses, or interpretation of data; in the writing of the manuscript; or in the decision to publish the results.

References

1. Devlet, A. Modern agriculture and challenges. Frontiers in Life Sciences and Related Technologies 2021, 2, 21–29.
2. Raj, E.F.I.; Appadurai, M.; Athiappan, K. Precision farming in modern agriculture. In Smart Agriculture Automation Using Advanced Technologies: Data Analytics and Machine Learning, Cloud Architecture, Automation and IoT; Springer: Berlin, Germany, 2022; pp. 61–87.
3. Misra, S.; Ghosh, A. Agriculture paradigm shift: a journey from traditional to modern agriculture. In Biodiversity and Bioeconomy; Elsevier: Amsterdam, The Netherlands, 2024; pp. 113–141.
4. Gebbers, R.; Adamchuk, V.I. Precision agriculture and food security. Science 2010, 327, 828–831.
5. Bhakta, I.; Phadikar, S.; Majumder, K. State-of-the-art technologies in precision agriculture: a systematic review. Journal of the Science of Food and Agriculture 2019, 99, 4878–4888.
6. Cisternas, I.; Velásquez, I.; Caro, A.; Rodríguez, A. Systematic literature review of implementations of precision agriculture. Computers and Electronics in Agriculture 2020, 176, 105626.
7. Sharma, A.; Jain, A.; Gupta, P.; Chowdary, V. Machine learning applications for precision agriculture: A comprehensive review. IEEE Access 2020, 9, 4843–4873.
8. Luo, Y.; Abidian, M.R.; Ahn, J.-H.; Akinwande, D.; Andrews, A.M.; Antonietti, M.; Bao, Z.; Berggren, M.; Berkey, C.A.; Bettinger, C.J.; et al. Technology roadmap for flexible sensors. ACS Nano 2023, 17, 5211–5295.
9. Yan, B.; Zhang, F.; Wang, M.; Zhang, Y.; Fu, S. Flexible wearable sensors for crop monitoring: A review. Frontiers in Plant Science 2024, 15, 1406074.
10. Lu, Y.; Xu, K.; Zhang, L.; Deguchi, M.; Shishido, H.; Arie, T.; Pan, R.; Hayashi, A.; Shen, L.; Akita, S.; et al. Multimodal plant healthcare flexible sensor system. ACS Nano 2020, 14, 10966–10975.
11. Attri, I.; Awasthi, L.K.; Sharma, T.P.; Rathee, P. A review of deep learning techniques used in agriculture. Ecological Informatics 2023, 77, 102217.
12. Saleem, M.H.; Potgieter, J.; Arif, K.M. Automation in agriculture by machine and deep learning techniques: A review of recent developments. Precision Agriculture 2021, 22, 2053–2091.
13. Torres, A.B.B.; Da Rocha, A.R.; Da Silva, T.L.C.; De Souza, J.N.; Gondim, R.S. Multilevel data fusion for the internet of things in smart agriculture. Computers and Electronics in Agriculture 2020, 171, 105309.
14. Wang, Z.; Jia, C.; He, W.; Feng, H.; et al. Integrated multi-modal flexible sensors and AI-driven fusion modeling for internal and external quality detection of agricultural products. Trends in Food Science & Technology 2025, 105401.
15. Miao, F.; Han, Y.; Shi, J.; Tao, B.; Zhang, P.; Chu, P.K. Design of graphene-based multi-parameter sensors. Journal of Materials Research and Technology 2023, 22, 3156–3169.
16. Li, X.-H.; Li, M.-Z.; Li, J.-Y.; Gao, Y.-Y.; Liu, C.-R.; Hao, G.-F. Wearable sensor supports in-situ and continuous monitoring of plant health in precision agriculture era. Plant Biotechnology Journal 2024, 22, 1516–1535.
17. Khanna, A.; Kaur, S. Evolution of Internet of Things (IoT) and its significant impact in the field of Precision Agriculture. Computers and Electronics in Agriculture 2019, 157, 218–231.
18. Aggarwal, K.; Reddy, G.S.; Makala, R.; Srihari, T.; Sharma, N.; Singh, C. Studies on energy efficient techniques for agricultural monitoring by wireless sensor networks. Computers and Electrical Engineering 2024, 113, 109052.
19. Banđur, Đ.; Jakšić, B.; Banđur, M.; Jović, S. An analysis of energy efficiency in Wireless Sensor Networks (WSNs) applied in smart agriculture. Computers and Electronics in Agriculture 2019, 156, 500–507.
20. Khattab, A.; Habib, S.E.D.; Ismail, H.; Zayan, S.; Fahmy, Y.; Khairy, M.M. An IoT-based cognitive monitoring system for early plant disease forecast. Computers and Electronics in Agriculture 2019, 166, 105028.
21. Bhat, S.A.; Huang, N.-F. Big data and AI revolution in precision agriculture: Survey and challenges. IEEE Access 2021, 9, 110209–110222.
22. Torky, M.; Hassanein, A.E. Integrating blockchain and the internet of things in precision agriculture: Analysis, opportunities, and challenges. Computers and Electronics in Agriculture 2020, 178, 105476.
23. Wu, D.; Liu, A.; Ma, L.; Guo, J.; Ma, F.; Han, Z.; Wang, L. Multi-parameter cooperative optimization and solution method for regional integrated energy system. Sustainable Cities and Society 2023, 95, 104622.
24. Paccoia, V.D.; Bonacci, F.; Clementi, G.; Cottone, F.; Neri, I.; Mattarelli, M. Toward field deployment: Tackling the energy challenge in environmental sensors. Sensors 2025, 25, 5618.
25. Wang, Y.; Wang, Y.; Xue, Y.; Li, X.; Geng, Y.; Zhao, J.; Ge, L.; He, H.; Li, F.; Liu, X. Portable and flexible hydrogel sensor for on-site atrazine assay on agricultural products. Analytical Chemistry 2024, 96, 7772–7779.
26. Zhao, J.; Liu, D.; Huang, R. A review of climate-smart agriculture: Recent advancements, challenges, and future directions. Sustainability 2023, 15, 3404.
27. Yu, P.; Teng, F.; Zhu, W.; Shen, C.; Chen, Z.; Song, J. Cloud–edge–device collaborative computing in smart agriculture: architectures, applications, and future perspectives. Frontiers in Plant Science 2025, 16, 1668545.
28. Yang, R.; Zhang, W.; Tiwari, N.; Yan, H.; Li, T.; Cheng, H. Multimodal sensors with decoupled sensing mechanisms. Advanced Science 2022, 9, 2202470.
29. Li, J.; Bao, R.; Tao, J.; Peng, Y.; Pan, C. Recent progress in flexible pressure sensor arrays: from design to applications. Journal of Materials Chemistry C 2018, 6, 11878–11892.
30. Lv, M.; Wei, H.; Fu, X.; Wang, W.; Zhou, D. A loosely coupled extended Kalman filter algorithm for agricultural scene-based multi-sensor fusion. Frontiers in Plant Science 2022, 13, 849260.
31. Devalal, S.; Karthikeyan, A. LoRa technology-an overview. In Proceedings of the 2018 Second International Conference on Electronics, Communication and Aerospace Technology (ICECA), Coimbatore, India, 29–31 March 2018; pp. 284–290.
32. Bor, M.C.; Vidler, J.E.; Roedig, U. LoRa for the Internet of Things. In Proceedings of the 13th International Conference on Embedded Wireless Systems and Networks (EWSN), Graz, Austria, 15–17 February 2016; pp. 361–366.
33. Li, Z.; Liu, F.; Yang, W.; Peng, S.; Zhou, J. A survey of convolutional neural networks: analysis, applications, and prospects. IEEE Transactions on Neural Networks and Learning Systems 2021, 33, 6999–7019.
34. O’Shea, K.; Nash, R. An introduction to convolutional neural networks. arXiv 2015, arXiv:1511.08458.
35. Saggi, M.K.; Jain, S. A survey towards decision support system on smart irrigation scheduling using machine learning approaches. Archives of Computational Methods in Engineering 2022, 29, 4455–4478.
36. Simionesei, L.; Ramos, T.B.; Palma, J.; Oliveira, A.R.; Neves, R. IrrigaSys: A web-based irrigation decision support system based on open source data and technology. Computers and Electronics in Agriculture 2020, 178, 105822.
37. Lu, J.; Tan, L.; Jiang, H. Review on convolutional neural network (CNN) applied to plant leaf disease classification. Agriculture 2021, 11, 707.
38. Jafar, A.; Bibi, N.; Naqvi, R.A.; Sadeghi-Niaraki, A.; Jeong, D. Revolutionizing agriculture with artificial intelligence: plant disease detection methods, applications, and their limitations. Frontiers in Plant Science 2024, 15, 1356260.
39. Giaretta, J.E.; Duan, H.; Oveissi, F.; Farajikhah, S.; Dehghani, F.; Naficy, S. Flexible sensors for hydrogen peroxide detection: A critical review. ACS Applied Materials & Interfaces 2022, 14, 20491–20505.
40. Lee, H.C.; Liu, W.-W.; Chai, S.-P.; Mohamed, A.R.; Lai, C.W.; Khe, C.-S.; Voon, C.H.; Hashim, U.; Hidayah, N.M.S. Synthesis of single-layer graphene: A review of recent development. Procedia Chemistry 2016, 19, 916–921.
41. Qin, T.; Liao, W.; Yu, L.; Zhu, J.; Wu, M.; Peng, Q.; Han, L.; Zeng, H. Recent progress in conductive self-healing hydrogels for flexible sensors. Journal of Polymer Science 2022, 60, 2607–2634.
42. Xiao, W.; Cai, X.; Jadoon, A.; Zhou, Y.; Gou, Q.; Tang, J.; Ma, X.; Wang, W.; Cai, J. High-performance graphene flexible sensors for pulse monitoring and human–machine interaction. ACS Applied Materials & Interfaces 2024, 16, 32445–32455.
43. Dubey, A.; Ahmed, A.; Singh, R.; Singh, A.; Sundramoorthy, A.K.; Arya, S. Role of flexible sensors for the electrochemical detection of organophosphate-based chemical warfare agents. International Journal of Smart and Nano Materials 2024, 15, 502–533.
44. Salehi, B.; Reus-Muns, G.; Roy, D.; Wang, Z.; Jian, T.; Dy, J.; Ioannidis, S.; Chowdhury, K. Deep learning on multimodal sensor data at the wireless edge for vehicular network. IEEE Transactions on Vehicular Technology 2022, 71, 7639–7655.
45. Ooi, K.-B.; Tan, G.W.-H.; Al-Emran, M.; Al-Sharafi, M.A.; Capatina, A.; Chakraborty, A.; Dwivedi, Y.K.; Huang, T.-L.; Kar, A.K.; Lee, V.-H.; et al. The potential of generative artificial intelligence across disciplines: Perspectives and future directions. Journal of Computer Information Systems 2025, 65, 76–107.
46. Vakil, A.; Liu, J.; Zulch, P.; Blasch, E.; Ewing, R.; Li, J. A survey of multimodal sensor fusion for passive RF and EO information integration. IEEE Aerospace and Electronic Systems Magazine 2021, 36, 44–61.
47. Zhao, F.; Zhang, C.; Geng, B. Deep multimodal data fusion. ACM Computing Surveys 2024, 56, 1–36.
48. Zhao, X.; Zhang, M.; Tao, R.; Li, W.; Liao, W.; Tian, L.; Philips, W. Fractional Fourier image transformer for multimodal remote sensing data classification. IEEE Transactions on Neural Networks and Learning Systems 2022, 35, 2314–2326.
49. Purwono; Ma’arif, A.; Rahmaniar, W.; Fathurrahman, H.I.K.; Frisky, A.Z.K.; ul Haq, Q.M. Understanding of convolutional neural network (CNN): A review. International Journal of Robotics and Control Systems 2022, 2, 739–748.
50. Xie, X.; Cheng, G.; Wang, J.; Li, K.; Yao, X.; Han, J. Oriented R-CNN and beyond. International Journal of Computer Vision 2024, 132, 2420–2442.
51. Bhosle, K.; Musande, V. Evaluation of deep learning CNN model for recognition of Devanagari digit. Artificial Intelligence and Applications 2023, 1, 98–102.
52. Arkin, E.; Yadikar, N.; Xu, X.; Aysa, A.; Ubul, K. A survey: object detection methods from CNN to transformer. Multimedia Tools and Applications 2023, 82, 21353–21383.
53. Shang, S.; Shan, Z.; Liu, G.; et al. Resdiff: Combining CNN and diffusion model for image super-resolution. Proceedings of the AAAI Conference on Artificial Intelligence 2024, 38, 8975–8983.
54. Khan, A.; Rauf, Z.; Sohail, A.; Khan, A.R.; Asif, H.; Asif, A.; Farooq, U. A survey of the vision transformers and their CNN-transformer based variants. Artificial Intelligence Review 2023, 56, 2917–2970.
Figure 1. Fabrication process of the flexible sensor.
Figure 2. Schematic diagram of the collection system.
Figure 3. Schematic deployment layout of the multimodal sensor network in a representative 12 ha wheat experimental field. The field was subdivided into approximately 11 m × 11 m grid cells (≈120 m² per cell). A total of 980 core sensing nodes were positioned near the centers of the grid cells, while 48 boundary reference nodes were arranged along the field perimeter for edge-effect correction and calibration support. The figure is intended as a schematic illustration and is not drawn to exact scale.
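The deployment numbers in the Figure 3 caption can be sanity-checked with simple arithmetic, as the sketch below illustrates (the node-density figure is derived here for illustration and is not stated in the caption).

```python
# Rough consistency check of the Figure 3 deployment figures:
# a 12 ha field tiled with ~11 m x 11 m cells yields on the order of a
# thousand cells, consistent with 980 core nodes near cell centers.
field_area_m2 = 12 * 10_000      # 12 ha in square meters
cell_area_m2 = 11 * 11           # 121 m2 per grid cell (~120 m2 in the caption)
approx_cells = field_area_m2 / cell_area_m2
print(f"~{approx_cells:.0f} grid cells")

# Overall node density, counting the 48 boundary reference nodes as well.
density = (980 + 48) / 12        # nodes per hectare
print(f"~{density:.0f} nodes/ha")
```

The cell count (~992) slightly exceeds the 980 core nodes, consistent with boundary cells being served by the perimeter reference nodes instead.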
Table 1. Effect of different fabrication parameters on sensor performance.
Graphene Layers PDMS Thickness (μm) Curing Temperature (°C) Baseline Capacitance (pF) Sensitivity (pF/%RH) Response Time (s) Hysteresis Error (%)
Monolayer 50 120 18.3 0.42 2.3 1.8
Bilayer 50 120 21.7 0.38 3.1 2.2
Monolayer 80 120 12.5 0.31 3.7 2.5
Monolayer 50 100 17.9 0.39 2.6 3.1
Monolayer 50 140 19.1 0.44 2.1 1.5
Trilayer 50 120 25.4 0.35 4.2 2.9
Monolayer 30 120 26.8 0.48 1.9 2.3
Table 2. Performance Comparison of Multimodal Sensing Nodes.
Node ID Sampling Rate (Hz) Power Consumption (mW) Data Latency (ms) Packet Loss Rate (%) Synchronization Error (ms)
N01 10 85.3 42 0.31 3.2
N02 10 87.1 38 0.28 2.9
N03 20 124.6 29 0.45 4.1
N04 10 83.9 45 0.33 3.5
N05 5 62.7 68 0.18 5.3
N06 20 128.2 31 0.52 3.8
N07 10 86.5 40 0.29 3.1
Table 3. Field Trial Results.
Location Crop Area (ha) Trad. Irr. (m³/ha) Smart Irr. (m³/ha) Water Save (%) Trad. Yield (kg/ha) Smart Yield (kg/ha) Yield Inc. (%) Irr. Freq. Red. (%)
Hebei-1 Wheat 5.67 42750 28800 32.6 72750 79200 8.9 38.5
Hebei-2 Corn 6.33 48000 32100 33.0 96300 105750 9.8 41.2
Hebei-3 Rice 6.80 68400 47700 30.2 115200 124650 8.2 35.7
Shandong-1 Wheat 7.33 43800 27750 36.6 73800 81750 10.8 42.9
Shandong-2 Corn 5.20 50100 33150 33.8 98250 108600 10.5 39.6
Shandong-3 Rice 5.87 70800 48750 31.1 117300 128850 9.8 37.3
Henan-1 Wheat 6.80 41700 28200 32.4 71700 79650 11.1 40.8
Henan-2 Corn 4.33 47250 33150 30.0 95700 106650 11.4 40.5
Henan-3 Rice 6.40 69300 47000 32.2 122100 132000 8.1 34.6
Hebei-4 Wheat 6.80 43350 29100 32.9 73200 80550 10.0 41.8
Shandong-4 Corn 5.47 49200 32700 33.5 97350 107700 10.6 36.8
Henan-4 Rice 5.00 70200 48150 31.4 116850 127800 9.4 35.0
Table 4. Sensor Drift Analysis and Calibration Performance.
Sensor Type Deployment Duration (months) Drift without Calibration (%; °C for temperature) Drift with Auto-Calibration (%; °C for temperature) Calibration Frequency (days) Measurement Error Reduction (%)
Soil Moisture Sensor 6 -6.8 -1.2 7 82.4
Soil Moisture Sensor 12 -12.3 -2.8 7 77.2
Leaf Transpiration Sensor 6 +14.5 +2.1 7 85.5
Leaf Transpiration Sensor 12 +26.7 +4.3 7 83.9
Temperature Sensor 6 +0.8 +0.15 30 81.3
Temperature Sensor 12 +1.6 +0.32 30 80.0
Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content.
Copyright: This open access article is published under a Creative Commons CC BY 4.0 license, which permits free download, distribution, and reuse, provided that the author and preprint are cited in any reuse.