Submitted: 12 November 2024
Posted: 13 November 2024
Abstract
Keywords:
1. Introduction
2. Related work
3. Materials and Methods
- Weed detection: Locating weeds within crops to enable targeted removal practices.
- Disease detection: Early identification of crop diseases through image analysis to minimize damage.
- Crop classification: Classifying different crop types for better field management.
- Water management: Monitoring water and moisture levels and optimizing irrigation practices.
- Yield prediction: Using visual and environmental data to predict crop yield more accurately.
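All five of these tasks rest on the same convolutional building blocks. As a minimal, illustrative sketch (the patch size, kernel, and values below are toy assumptions, not taken from any surveyed work), the pure-Python code shows the convolution, activation, and pooling steps a CNN applies to an image patch:

```python
# Minimal sketch of the CNN building blocks (convolution, ReLU, max-pooling).
# Sizes and values are illustrative only.

def conv2d(img, kernel):
    """Valid 2-D convolution (no padding, stride 1) on nested lists."""
    kh, kw = len(kernel), len(kernel[0])
    out_h = len(img) - kh + 1
    out_w = len(img[0]) - kw + 1
    return [[sum(img[i + a][j + b] * kernel[a][b]
                 for a in range(kh) for b in range(kw))
             for j in range(out_w)]
            for i in range(out_h)]

def relu(fmap):
    """Element-wise rectified linear activation."""
    return [[max(0.0, v) for v in row] for row in fmap]

def maxpool(fmap, size=2):
    """Non-overlapping max-pooling over size x size windows."""
    return [[max(fmap[i + a][j + b]
                 for a in range(size) for b in range(size))
             for j in range(0, len(fmap[0]) - size + 1, size)]
            for i in range(0, len(fmap) - size + 1, size)]

# A toy 6x6 single-channel "patch" and a 3x3 kernel of ones.
patch = [[1.0] * 6 for _ in range(6)]
kernel = [[1.0] * 3 for _ in range(3)]

feature = maxpool(relu(conv2d(patch, kernel)))
print(len(feature), len(feature[0]))  # 2 2
```

Stacking such convolution/activation/pooling stages, followed by fully connected layers, yields the classifiers and detectors discussed in the application sections below.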
4. Background
5. Convolutional Neural Network Applications in Smart Agriculture
5.1. Weed Detection
5.2. Disease Detection
5.3. Crop Classification
5.4. Water Management
5.5. Yield Prediction
6. Cross-Application Discussion
6.1. Data Acquisition
6.2. Data Types
6.3. CNN Relevance in Smart Agriculture
6.4. Potential and Future Directions
7. Conclusions
Funding
Abbreviations
| Abbreviation | Meaning |
| --- | --- |
| AG5.0 | Agriculture 5.0 |
| AI | Artificial Intelligence |
| CNN | Convolutional Neural Network |
| DT | Decision Tree |
| DVI | Difference Vegetation Index |
| EVI | Enhanced Vegetation Index |
| IoT | Internet of Things |
| IoU | Intersection over Union |
| KNN | K-Nearest Neighbors |
| LLM | Large Language Model |
| LSTM | Long Short-Term Memory |
| MLP | Multilayer Perceptron |
| MSAVI | Modified Soil Adjusted Vegetation Index |
| NDVI | Normalized Difference Vegetation Index |
| NIR | Near Infrared |
| RF | Random Forest |
| RGB | Red Green Blue |
| RNN | Recurrent Neural Network |
| SAR | Synthetic Aperture Radar |
| SVM | Support Vector Machine |
| SVR | Support Vector Regression |
| UAV | Unmanned Aerial Vehicle |
| UGV | Unmanned Ground Vehicle |
| ViT | Vision Transformer |
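Several of the abbreviations are vegetation indices computed per pixel from surface reflectances (NIR, red, blue bands). The standard formulations are sketched below; the sample reflectance values are illustrative assumptions:

```python
import math

# Standard vegetation-index formulas (per-pixel surface reflectances).
# Sample reflectance values below are illustrative only.

def ndvi(nir, red):
    """Normalized Difference Vegetation Index: (NIR - R) / (NIR + R)."""
    return (nir - red) / (nir + red)

def dvi(nir, red):
    """Difference Vegetation Index: NIR - R."""
    return nir - red

def evi(nir, red, blue, G=2.5, C1=6.0, C2=7.5, L=1.0):
    """Enhanced Vegetation Index with the commonly used MODIS coefficients."""
    return G * (nir - red) / (nir + C1 * red - C2 * blue + L)

def msavi(nir, red):
    """Modified Soil Adjusted Vegetation Index (self-adjusting L term)."""
    return (2 * nir + 1 - math.sqrt((2 * nir + 1) ** 2 - 8 * (nir - red))) / 2

nir, red, blue = 0.8, 0.2, 0.1
print(round(ndvi(nir, red), 3))   # 0.6
print(round(msavi(nir, red), 3))  # 0.6
```

Index maps such as these are a common input channel, alongside raw RGB or multispectral bands, for the CNN models surveyed in the water-management and yield-prediction sections.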
Conflicts of Interest
References
- Ivanovici, M.; Olteanu, G.; Florea, C.; Coliban, R.M.; Ștefan, M.; Marandskiy, K. Digital Transformation in Agriculture. In Digital Transformation: Exploring the Impact of Digital Transformation on Organizational Processes; Springer, 2024; pp. 157–191.
- Ragazou, K.; Garefalakis, A.; Zafeiriou, E.; Passas, I. Agriculture 5.0: A New Strategic Management Mode for a Cut Cost and an Energy Efficient Agriculture Sector. Energies 2022, 15.
- Ahmad, L.; Nabi, F. Agriculture 5.0: Artificial Intelligence, IoT and Machine Learning; CRC Press, 2021.
- FAO. Agricultural production statistics 2000–2020, 2022.
- Lee, U.; Chang, S.; Putra, G.A.; Kim, H.; Kim, D.H. An automated, high-throughput plant phenotyping system using machine learning-based plant segmentation and image analysis. PLoS ONE 2018, 13, e0196615.
- Chai, J.; Zeng, H.; Li, A.; Ngai, E.W. Deep learning in computer vision: A critical review of emerging techniques and application scenarios. Machine Learning with Applications 2021, 6, 100134.
- Khan, S.; Rahmani, H.; Shah, S.A.A.; Bennamoun, M.; Medioni, G.; Dickinson, S. A guide to convolutional neural networks for computer vision; Springer, 2018.
- Krizhevsky, A.; Sutskever, I.; Hinton, G.E. ImageNet classification with deep convolutional neural networks. Advances in neural information processing systems 2012, 25.
- El Sakka, M.; Mothe, J.; Ivanovici, M. Images and CNN applications in smart agriculture. European Journal of Remote Sensing 2024, 57, 2352386.
- Sun, G.; Yang, W.; Ma, L. BCAV: A generative AI author verification model based on the integration of BERT and CNN. Working Notes of CLEF 2024.
- Liu, Z.; Mao, H.; Wu, C.Y.; Feichtenhofer, C.; Darrell, T.; Xie, S. A ConvNet for the 2020s. Proceedings of the IEEE/CVF conference on computer vision and pattern recognition, 2022, pp. 11976–11986.
- Gu, J.; Wang, Z.; Kuen, J.; Ma, L.; Shahroudy, A.; Shuai, B.; Liu, T.; Wang, X.; Wang, G.; Cai, J.; others. Recent advances in convolutional neural networks. Pattern recognition 2018, 77, 354–377.
- Hossain, M.A.; Sajib, M.S.A. Classification of image using convolutional neural network (CNN). Global Journal of Computer Science and Technology 2019, 19, 13–14.
- Niu, S.; Liu, Y.; Wang, J.; Song, H. A decade survey of transfer learning (2010–2020). IEEE Transactions on Artificial Intelligence 2020, 1, 151–166.
- Ma, Y.; Chen, S.; Ermon, S.; Lobell, D.B. Transfer learning in environmental remote sensing. Remote Sensing of Environment 2024, 301, 113924.
- Wujek, B.; Hall, P.; Günes, F. Best practices for machine learning applications. SAS Institute Inc 2016, p. 3.
- D’Aniello, M.; Zampella, M.; Dosi, A.; Rownok, A.; Delli Veneri, M.; Ettari, A.; Cavuoti, S.; Sannino, L.; Brescia, M.; Donadio, C.; Longo, G. RiverZoo: A Machine Learning Framework for Terrestrial and Extraterrestrial Drainage Networks Classification Using Clustering Techniques and Fuzzy Reasoning. 2024.
- Adams, S.; Friedland, C.; Levitan, M. Unmanned aerial vehicle data acquisition for damage assessment in hurricane events. Proceedings of the 8th international workshop on remote sensing for disaster management, Tokyo, Japan, 2010, Vol. 30.
- Ouchra, H.; Belangour, A. Satellite image classification methods and techniques: A survey. 2021 IEEE International Conference on Imaging Systems and Techniques (IST). IEEE, 2021, pp. 1–6.
- Awad, M.M. A New Winter Wheat Crop Segmentation Method Based on a New Fast-UNet Model and Multi-Temporal Sentinel-2 Images. Agronomy 2024, 14, 2337.
- Liakos, K.G.; Busato, P.; Moshou, D.; Pearson, S.; Bochtis, D. Machine learning in agriculture: A review. Sensors 2018, 18, 2674.
- Kok, Z.H.; Shariff, A.R.M.; Alfatni, M.S.M.; Khairunniza-Bejo, S. Support vector machine in precision agriculture: a review. Computers and Electronics in Agriculture 2021, 191, 106546.
- Kamilaris, A.; Prenafeta-Boldú, F.X. A review of the use of convolutional neural networks in agriculture. The Journal of Agricultural Science 2018, 156, 312–322.
- Kamilaris, A.; Prenafeta-Boldú, F.X. Deep learning in agriculture: A survey. Computers and electronics in agriculture 2018, 147, 70–90.
- Liu, J.; Wang, X. Plant diseases and pests detection based on deep learning: a review. Plant Methods 2021, 17, 1–18.
- Saleem, M.H.; Potgieter, J.; Arif, K.M. Plant disease detection and classification by deep learning. Plants 2019, 8, 468.
- Kamarudin, M.H.; Ismail, Z.H.; Saidi, N.B. Deep learning sensor fusion in plant water stress assessment: A comprehensive review. Applied Sciences 2021, 11, 1403.
- Hasan, A.M.; Sohel, F.; Diepeveen, D.; Laga, H.; Jones, M.G. A survey of deep learning techniques for weed detection from images. Computers and electronics in agriculture 2021, 184, 106067.
- Wu, Z.; Chen, Y.; Zhao, B.; Kang, X.; Ding, Y. Review of weed detection methods based on computer vision. Sensors 2021, 21, 3647.
- Hu, K.; Wang, Z.; Coleman, G.; Bender, A.; Yao, T.; Zeng, S.; Song, D.; Schumann, A.; Walsh, M. Deep Learning Techniques for In-Crop Weed Identification: A Review. arXiv preprint arXiv:2103.14872, 2021.
- Zhao, X.; Wang, L.; Zhang, Y.; Han, X.; Deveci, M.; Parmar, M. A review of convolutional neural networks in computer vision. Artificial Intelligence Review 2024, 57, 99.
- Krichen, M. Convolutional neural networks: A survey. Computers 2023, 12, 151.
- Naidu, G.; Zuva, T.; Sibanda, E.M. A review of evaluation metrics in machine learning algorithms. Computer Science On-line Conference. Springer, 2023, pp. 15–25.
- Rainio, O.; Teuho, J.; Klén, R. Evaluation metrics and statistical tests for machine learning. Scientific Reports 2024, 14, 6086.
- He, K.; Zhang, X.; Ren, S.; Sun, J. Deep residual learning for image recognition. Proceedings of the IEEE conference on computer vision and pattern recognition, 2016, pp. 770–778.
- Zhang, X.; Cui, J.; Liu, H.; Han, Y.; Ai, H.; Dong, C.; Zhang, J.; Chu, Y. Weed identification in soybean seedling stage based on optimized Faster R-CNN algorithm. Agriculture 2023, 13, 175.
- Hendrawan, Y.; Damayanti, R.; Al Riza, D.F.; Hermanto, M.B. Classification of water stress in cultured Sunagoke moss using deep learning. Telkomnika (Telecommunication Computing Electronics and Control) 2021, 19, 1594–1604.
- Yang, W.; Nigon, T.; Hao, Z.; Paiao, G.D.; Fernández, F.G.; Mulla, D.; Yang, C. Estimation of corn yield based on hyperspectral imagery and convolutional neural network. Computers and Electronics in Agriculture 2021, 184, 106092.
- Quan, L.; Feng, H.; Lv, Y.; Wang, Q.; Zhang, C.; Liu, J.; Yuan, Z. Maize seedling detection under different growth stages and complex field environments based on an improved Faster R–CNN. Biosystems Engineering 2019, 184, 1–23.
- Lottes, P.; Behley, J.; Milioto, A.; Stachniss, C. Fully convolutional networks with sequential information for robust crop and weed detection in precision farming. IEEE Robotics and Automation Letters 2018, 3, 2870–2877.
- Osorio, K.; Puerto, A.; Pedraza, C.; Jamaica, D.; Rodríguez, L. A deep learning approach for weed detection in lettuce crops using multispectral images. AgriEngineering 2020, 2, 471–488.
- Sangaiah, A.K.; Yu, F.N.; Lin, Y.B.; Shen, W.C.; Sharma, A. UAV T-YOLO-Rice: An Enhanced Tiny Yolo Networks for Rice Leaves Diseases Detection in Paddy Agronomy. IEEE Transactions on Network Science and Engineering 2024.
- Maji, A.K.; Marwaha, S.; Kumar, S.; Arora, A.; Chinnusamy, V.; Islam, S. SlypNet: Spikelet-based yield prediction of wheat using advanced plant phenotyping and computer vision techniques. Frontiers in plant science 2022, 13, 889853.
- Costa, L.d.F. Further generalizations of the Jaccard index. arXiv preprint arXiv:2110.09619, 2021.
- Yao, Z.; Zhu, X.; Zeng, Y.; Qiu, X. Extracting Tea Plantations from Multitemporal Sentinel-2 Images Based on Deep Learning Networks. Agriculture 2022, 13, 10.
- Ilyas, T.; Kim, H. A deep learning based approach for strawberry yield prediction via semantic graphics. 2021 21st International Conference on Control, Automation and Systems (ICCAS). IEEE, 2021, pp. 1835–1841.
- Kamath, R.; Balachandra, M.; Vardhan, A.; Maheshwari, U. Classification of paddy crop and weeds using semantic segmentation. Cogent engineering 2022, 9, 2018791.
- Asad, M.H.; Bais, A. Weed detection in canola fields using maximum likelihood classification and deep convolutional neural network. Information Processing in Agriculture 2020, 7, 535–545.
- Wu, Y.; Yang, H.; Mao, Y. Detection of the Pine Wilt Disease Using a Joint Deep Object Detection Model Based on Drone Remote Sensing Data. Forests 2024, 15, 869.
- Suh, H.K.; Ijsselmuiden, J.; Hofstee, J.W.; van Henten, E.J. Transfer learning for the classification of sugar beet and volunteer potato under field conditions. Biosystems engineering 2018, 174, 50–65.
- Pajjuri, N.; Kumar, U.; Thottolil, R. Comparative evaluation of the convolutional neural network based transfer learning models for classification of plant disease. 2022 IEEE International Conference on Electronics, Computing and Communication Technologies (CONECCT). IEEE, 2022, pp. 1–6.
- Pandey, A.; Jain, K. An intelligent system for crop identification and classification from UAV images using conjugated dense convolutional neural network. Computers and Electronics in Agriculture 2022, 192, 106543.
- Liang, D.; Liu, W.; Zhao, L.; Zong, S.; Luo, Y. An improved convolutional neural network for plant disease detection using unmanned aerial vehicle images. Nature Environment and Pollution Technology 2022, 21, 899–908.
- Duan, K.; Bai, S.; Xie, L.; Qi, H.; Huang, Q.; Tian, Q. CenterNet: Keypoint triplets for object detection. Proceedings of the IEEE/CVF international conference on computer vision, 2019, pp. 6569–6578.
- Huang, G.; Liu, Z.; Van Der Maaten, L.; Weinberger, K.Q. Densely connected convolutional networks. Proceedings of the IEEE conference on computer vision and pattern recognition, 2017, pp. 4700–4708.
- Ahad, M.T.; Li, Y.; Song, B.; Bhuiyan, T. Comparison of CNN-based deep learning architectures for rice diseases classification. Artificial Intelligence in Agriculture 2023, 9, 22–35.
- Bhadra, S.; Sagan, V.; Skobalski, J.; Grignola, F.; Sarkar, S.; Vilbig, J. End-to-end 3D CNN for plot-scale soybean yield prediction using multitemporal UAV-based RGB images. Precision Agriculture 2024, 25, 834–864.
- Wu, Y.; Kirillov, A.; Massa, F.; Lo, W.Y.; Girshick, R. Detectron2. https://github.com/facebookresearch/detectron2, 2019.
- Jabir, B.; Falih, N.; Rahmani, K. Accuracy and efficiency comparison of object detection open-source models. International Journal of Online & Biomedical Engineering 2021, 17.
- Tan, M.; Pang, R.; Le, Q.V. EfficientDet: Scalable and efficient object detection. Proceedings of the IEEE/CVF conference on computer vision and pattern recognition, 2020, pp. 10781–10790.
- Kamarudin, M.; Ismail, Z.H. Lightweight deep CNN models for identifying drought stressed plant. IOP Conference Series: Earth and Environmental Science. IOP Publishing, 2022, Vol. 1091, p. 012043.
- Tan, M.; Le, Q. EfficientNet: Rethinking model scaling for convolutional neural networks. International conference on machine learning. PMLR, 2019, pp. 6105–6114.
- Ren, S.; He, K.; Girshick, R.; Sun, J. Faster R-CNN: Towards real-time object detection with region proposal networks. IEEE transactions on pattern analysis and machine intelligence 2016, 39, 1137–1149.
- Szegedy, C.; Liu, W.; Jia, Y.; Sermanet, P.; Reed, S.; Anguelov, D.; Erhan, D.; Vanhoucke, V.; Rabinovich, A. Going deeper with convolutions. Proceedings of the IEEE conference on computer vision and pattern recognition, 2015, pp. 1–9.
- Szegedy, C.; Vanhoucke, V.; Ioffe, S.; Shlens, J.; Wojna, Z. Rethinking the inception architecture for computer vision. Proceedings of the IEEE conference on computer vision and pattern recognition, 2016, pp. 2818–2826.
- Kumar, A.; Shreeshan, S.; Tejasri, N.; Rajalakshmi, P.; Guo, W.; Naik, B.; Marathi, B.; Desai, U. Identification of water-stressed area in maize crop using UAV based remote sensing. 2020 IEEE India geoscience and remote sensing symposium (InGARSS). IEEE, 2020, pp. 146–149.
- He, K.; Gkioxari, G.; Dollár, P.; Girshick, R. Mask R-CNN. Proceedings of the IEEE international conference on computer vision, 2017, pp. 2961–2969.
- Prashanth, K.; Harsha, J.S.; Kumar, S.A.; Srilekha, J. Towards Accurate Disease Segmentation in Plant Images: A Comprehensive Dataset Creation and Network Evaluation. Proceedings of the IEEE/CVF Winter Conference on Applications of Computer Vision, 2024, pp. 7086–7094.
- Howard, A.G. MobileNets: Efficient convolutional neural networks for mobile vision applications. arXiv preprint arXiv:1704.04861, 2017.
- Sandler, M.; Howard, A.; Zhu, M.; Zhmoginov, A.; Chen, L.C. MobileNetV2: Inverted residuals and linear bottlenecks. Proceedings of the IEEE conference on computer vision and pattern recognition, 2018, pp. 4510–4520.
- Zhao, H.; Shi, J.; Qi, X.; Wang, X.; Jia, J. Pyramid scene parsing network. Proceedings of the IEEE conference on computer vision and pattern recognition, 2017, pp. 2881–2890.
- Gupta, E.; Azimi, S.; Gandhi, T.K. Characterizing Water Deficiency induced stress in Plants using Gabor filter based CNN. 2022 IEEE IAS Global Conference on Emerging Technologies (GlobConET). IEEE, 2022, pp. 91–95.
- Xie, S.; Girshick, R.; Dollár, P.; Tu, Z.; He, K. Aggregated residual transformations for deep neural networks. Proceedings of the IEEE conference on computer vision and pattern recognition, 2017, pp. 1492–1500.
- Hu, J.; Shen, L.; Sun, G. Squeeze-and-excitation networks. Proceedings of the IEEE conference on computer vision and pattern recognition, 2018, pp. 7132–7141.
- Liu, W.; Anguelov, D.; Erhan, D.; Szegedy, C.; Reed, S.; Fu, C.Y.; Berg, A.C. SSD: Single shot multibox detector. Computer Vision–ECCV 2016: 14th European Conference, Amsterdam, The Netherlands, October 11–14, 2016, Proceedings, Part I 14. Springer, 2016, pp. 21–37.
- Chen, J.; Wang, H.; Zhang, H.; Luo, T.; Wei, D.; Long, T.; Wang, Z. Weed detection in sesame fields using a YOLO model with an enhanced attention mechanism and feature fusion. Computers and Electronics in Agriculture 2022, 202, 107412.
- Badrinarayanan, V.; Kendall, A.; Cipolla, R. SegNet: A deep convolutional encoder-decoder architecture for image segmentation. IEEE transactions on pattern analysis and machine intelligence 2017, 39, 2481–2495.
- Espejo-Garcia, B.; Mylonas, N.; Athanasakos, L.; Fountas, S.; Vasilakoglou, I. Towards weeds identification assistance through transfer learning. Computers and Electronics in Agriculture 2020, 171, 105306.
- Li, F.; Bai, J.; Zhang, M.; Zhang, R. Yield estimation of high-density cotton fields using low-altitude UAV imaging and deep learning. Plant Methods 2022, 18, 55.
- Iandola, F.N. SqueezeNet: AlexNet-level accuracy with 50x fewer parameters and <0.5 MB model size. arXiv preprint arXiv:1602.07360, 2016.
- Ronneberger, O.; Fischer, P.; Brox, T. U-Net: Convolutional networks for biomedical image segmentation. Medical image computing and computer-assisted intervention–MICCAI 2015: 18th international conference, Munich, Germany, October 5–9, 2015, proceedings, part III 18. Springer, 2015, pp. 234–241.
- Bannari, A.; Morin, D.; Bonn, F.; Huete, A. A review of vegetation indices. Remote sensing reviews 1995, 13, 95–120.
- Shoaib, M.; Hussain, T.; Shah, B.; Ullah, I.; Shah, S.M.; Ali, F.; Park, S.H. Deep learning-based segmentation and classification of leaf images for detection of tomato plant disease. Frontiers in plant science 2022, 13, 1031748.
- Simonyan, K.; Zisserman, A. Very deep convolutional networks for large-scale image recognition. arXiv preprint arXiv:1409.1556, 2014.
- Thakur, P.S.; Sheorey, T.; Ojha, A. VGG-ICNN: A Lightweight CNN model for crop disease identification. Multimedia Tools and Applications 2023, 82, 497–520.
- Vijayakumar, A.; Vairavasundaram, S. YOLO-based object detection models: A review and its applications. Multimedia Tools and Applications 2024, 1–40.
- Bhatt, D.; Patel, C.; Talsania, H.; Patel, J.; Vaghela, R.; Pandya, S.; Modi, K.; Ghayvat, H. CNN variants for computer vision: History, architecture, application, challenges and future scope. Electronics 2021, 10, 2470.
- Andreasen, C.; Scholle, K.; Saberi, M. Laser weeding with small autonomous vehicles: Friends or foes? Frontiers in Agronomy 2022, 4, 841086.
- Wu, H.; Wang, Y.; Zhao, P.; Qian, M. Small-target weed-detection model based on YOLO-V4 with improved backbone and neck structures. Precision Agriculture 2023, 24, 2149–2170.
- Gao, J.; French, A.P.; Pound, M.P.; He, Y.; Pridmore, T.P.; Pieters, J.G. Deep convolutional neural networks for image-based Convolvulus sepium detection in sugar beet fields. Plant methods 2020, 16, 1–12.
- Yang, J.; Wang, Y.; Chen, Y.; Yu, J. Detection of weeds growing in Alfalfa using convolutional neural networks. Agronomy 2022, 12, 1459.
- Khanam, R.; Hussain, M. YOLOv11: An Overview of the Key Architectural Enhancements. arXiv preprint arXiv:2410.17725, 2024.
- Redmon, J. You only look once: Unified, real-time object detection. Proceedings of the IEEE conference on computer vision and pattern recognition, 2016.
- Hussain, M. YOLOv1 to v8: Unveiling each variant–a comprehensive review of YOLO. IEEE Access 2024, 12, 42816–42833.
- Farooq, A.; Jia, X.; Hu, J.; Zhou, J. Transferable convolutional neural network for weed mapping with multisensor imagery. IEEE Transactions on Geoscience and Remote Sensing 2021, 60, 1–16.
- Sahin, H.M.; Miftahushudur, T.; Grieve, B.; Yin, H. Segmentation of weeds and crops using multispectral imaging and CRF-enhanced U-Net. Computers and Electronics in Agriculture 2023, 211, 107956.
- Moazzam, S.I.; Khan, U.S.; Qureshi, W.S.; Tiwana, M.I.; Rashid, N.; Alasmary, W.S.; Iqbal, J.; Hamza, A. A patch-image based classification approach for detection of weeds in sugar beet crop. IEEE Access 2021, 9, 121698–121715.
- Ramirez, W.; Achanccaray, P.; Mendoza, L.; Pacheco, M. Deep convolutional neural networks for weed detection in agricultural crops using optical aerial images. 2020 IEEE Latin American GRSS & ISPRS Remote Sensing Conference (LAGIRS). IEEE, 2020, pp. 133–137.
- Xu, B.; Fan, J.; Chao, J.; Arsenijevic, N.; Werle, R.; Zhang, Z. Instance segmentation method for weed detection using UAV imagery in soybean fields. Computers and Electronics in Agriculture 2023, 211, 107994.
- Ong, P.; Teo, K.S.; Sia, C.K. UAV-based weed detection in Chinese cabbage using deep learning. Smart Agricultural Technology 2023, 4, 100181.
- Gallo, I.; Rehman, A.U.; Dehkordi, R.H.; Landro, N.; La Grassa, R.; Boschetti, M. Deep object detection of crop weeds: Performance of YOLOv7 on a real case dataset from UAV images. Remote Sensing 2023, 15, 539.
- Haq, M.A. CNN based automated weed detection system using UAV imagery. Computer Systems Science & Engineering 2022, 42.
- Smith, L.N.; Byrne, A.; Hansen, M.F.; Zhang, W.; Smith, M.L. Weed classification in grasslands using convolutional neural networks. Applications of Machine Learning. SPIE, 2019, Vol. 11139, pp. 334–344.
- Rasti, P.; Ahmad, A.; Samiei, S.; Belin, E.; Rousseau, D. Supervised image classification by scattering transform with application to weed detection in culture crops of high density. Remote Sensing 2019, 11, 249.
- Rahman, A.; Lu, Y.; Wang, H. Performance evaluation of deep learning object detectors for weed detection for cotton. Smart Agricultural Technology 2023, 3, 100126.
- Jin, X.; Liu, T.; McCullough, P.E.; Chen, Y.; Yu, J. Evaluation of convolutional neural networks for herbicide susceptibility-based weed detection in turf. Frontiers in Plant Science 2023, 14, 1096802.
- Chen, D.; Lu, Y.; Li, Z.; Young, S. Performance evaluation of deep transfer learning on multi-class identification of common weed species in cotton production systems. Computers and Electronics in Agriculture 2022, 198, 107091.
- Kalbande, K.; Patil, W.V. The convolutional neural network for plant disease detection using hierarchical mixed pooling technique with smoothing to sharpening approach. International Journal of Computing and Digital Systems 2023, 14, 1–1.
- Panshul, G.S.; Pushadapu, D.; Reddy, G.E.K.K.; Abhishek, S.; Anjali, T. DeepTuber: Sequential CNN-based disease detection in potato plants for enhanced crop management. 2023 5th International Conference on Inventive Research in Computing Applications (ICIRCA). IEEE, 2023, pp. 380–386.
- Zhong, Y.; Teng, Z.; Tong, M. The Convolutional Neural Network for Plant Disease Detection Using Hierarchical Mixed Pooling Technique with Smoothing to Sharpening Approach. Frontiers in Plant Science 2023, 14, 1166296.
- Kaya, Y.; Gürsoy, E. A novel multi-head CNN design to identify plant diseases using the fusion of RGB images. Ecological Informatics 2023, 75, 101998.
- Sunitha, G.; Madhavi, K.R.; Avanija, J.; Reddy, S.T.K.; Vittal, R.H.S. Modeling convolutional neural network for detection of plant leaf spot diseases. 2022 3rd International Conference on Electronics and Sustainable Communication Systems (ICESC). IEEE, 2022, pp. 1187–1192.
- Pandian, J.A.; Kanchanadevi, K.; Kumar, V.D.; Jasińska, E.; Goňo, R.; Leonowicz, Z.; Jasiński, M. A five convolutional layer deep convolutional neural network for plant leaf disease detection. Electronics 2022, 11, 1266.
- Narayanan, K.L.; Krishnan, R.S.; Robinson, Y.H.; Julie, E.G.; Vimal, S.; Saravanan, V.; Kaliappan, M. Banana plant disease classification using hybrid convolutional neural network. Computational Intelligence and Neuroscience 2022, 2022, 9153699.
- Sharmila, R.; Kamalitta, R.; Singh, D.P.; Chauhan, A.; Acharjee, P.B.; others. Weighted Mask Recurrent-Convolutional Neural Network based Plant Disease Detection using Leaf Images. 2023 7th International Conference on Intelligent Computing and Control Systems (ICICCS). IEEE, 2023, pp. 681–687.
- Kaur, P.; Harnal, S.; Gautam, V.; Singh, M.P.; Singh, S.P. Performance analysis of segmentation models to detect leaf diseases in tomato plant. Multimedia Tools and Applications 2024, 83, 16019–16043.
- Sharma, T.; Sethi, G.K. Improving Wheat Leaf Disease Image Classification with Point Rend Segmentation Technique. SN Computer Science 2024, 5, 244.
- Duan, Z.; Li, H.; Li, C.; Zhang, J.; Zhang, D.; Fan, X.; Chen, X. A CNN model for early detection of pepper Phytophthora blight using multispectral imaging, integrating spectral and textural information. Plant Methods 2024, 20, 115.
- De Silva, M.; Brown, D. Tomato Disease Detection Using Multispectral Imaging with Deep Learning Models. 2024 International Conference on Artificial Intelligence, Big Data, Computing and Data Communication Systems (icABCD). IEEE, 2024, pp. 1–9.
- Reyes-Hung, L.; Soto, I.; Majumdar, A.K. Neural Network-Based Stress Detection in Crop Multispectral Imagery for Precision Agriculture. 2024 14th International Symposium on Communication Systems, Networks and Digital Signal Processing (CSNDSP). IEEE, 2024, pp. 551–556.
- Bansal, P.; Kumar, R.; Kumar, S. Disease detection in apple leaves using deep convolutional neural network. Agriculture 2021, 11, 617.
- Guan, X. A novel method of plant leaf disease detection based on deep learning and convolutional neural network. 2021 6th International conference on intelligent computing and signal processing (ICSP). IEEE, 2021, pp. 816–819.
- Gill, H.S.; Bath, B.S.; Singh, R.; Riar, A.S. Wheat crop classification using deep learning. Multimedia Tools and Applications 2024, 1–17.
- Kaya, A.; Keceli, A.S.; Catal, C.; Yalic, H.Y.; Temucin, H.; Tekinerdogan, B. Analysis of transfer learning for deep neural network based plant classification models. Computers and electronics in agriculture 2019, 158, 20–29.
- Lu, S.; Lu, Z.; Aok, S.; Graham, L. Fruit classification based on six layer convolutional neural network. 2018 IEEE 23rd International conference on digital signal processing (DSP). IEEE, 2018, pp. 1–5.
- Rasheed, M.U.; Mahmood, S.A. A framework based on deep neural network (DNN) for land use land cover (LULC) and rice crop classification without using survey data. Climate Dynamics 2023, 61, 5629–5652.
- Kou, W.; Shen, Z.; Liu, D.; Liu, Z.; Li, J.; Chang, W.; Wang, H.; Huang, L.; Jiao, S.; Lei, Y.; others. Crop classification methods and influencing factors of reusing historical samples based on 2D-CNN. International Journal of Remote Sensing 2023, 44, 3278–3305.
- Farmonov, N.; Amankulova, K.; Szatmári, J.; Sharifi, A.; Abbasi-Moghadam, D.; Nejad, S.M.M.; Mucsi, L. Crop type classification by DESIS hyperspectral imagery and machine learning algorithms. IEEE Journal of selected topics in applied earth observations and remote sensing 2023, 16, 1576–1588.
- Seydi, S.T.; Arefi, H.; Hasanlou, M. Crop-Net: A Novel Deep Learning Framework for Crop Classification using Time-series Sentinel-1 Imagery by Google Earth Engine. 2023.
- Zhao, H.; Chen, Z.; Jiang, H.; Jing, W.; Sun, L.; Feng, M. Evaluation of three deep learning models for early crop classification using Sentinel-1A imagery time series—A case study in Zhanjiang, China. Remote Sensing 2019, 11, 2673.
- Yin, Q.; Lin, Z.; Hu, W.; López-Martínez, C.; Ni, J.; Zhang, F. Crop classification of multitemporal PolSAR based on 3-D attention module with ViT. IEEE Geoscience and Remote Sensing Letters 2023, 20, 1–5.
- Li, H.; Tian, Y.; Zhang, C.; Zhang, S.; Atkinson, P.M. Temporal Sequence Object-based CNN (TS-OCNN) for crop classification from fine resolution remote sensing image time-series. The Crop Journal 2022, 10, 1507–1516.
- Chamundeeswari, G.; Srinivasan, S.; Bharathi, S.P.; Priya, P.; Kannammal, G.R.; Rajendran, S. Optimal deep convolutional neural network based crop classification model on multispectral remote sensing images. Microprocessors and Microsystems 2022, 94, 104626.
- Galodha, A.; Vashisht, R.; Nidamanuri, R.R.; Ramiya, A.M. Convolutional Neural Network (CNN) for Crop-Classification of Drone Acquired Hyperspectral Imagery. IGARSS 2022-2022 IEEE International Geoscience and Remote Sensing Symposium. IEEE, 2022, pp. 7741–7744.
- Kwak, G.H.; Park, C.W.; Lee, K.D.; Na, S.I.; Ahn, H.Y.; Park, N.W. Potential of hybrid CNN-RF model for early crop mapping with limited input data. Remote Sensing 2021, 13, 1629.
- Kamarudin, M.H.; Ismail, Z.H.; Saidi, N.B.; Hanada, K. An augmented attention-based lightweight CNN model for plant water stress detection. Applied Intelligence 2023, 53, 20828–20843.
- Azimi, S.; Wadhawan, R.; Gandhi, T.K. Intelligent monitoring of stress induced by water deficiency in plants using deep learning. IEEE Transactions on Instrumentation and Measurement 2021, 70, 1–13.
- Zhuang, S.; Wang, P.; Jiang, B.; Li, M. Learned features of leaf phenotype to monitor maize water status in the fields. Computers and electronics in agriculture 2020, 172, 105347.
- Kuo, C.E.; Tu, Y.K.; Fang, S.L.; Huang, Y.R.; Chen, H.W.; Yao, M.H.; Kuo, B.J. Early detection of drought stress in tomato from spectroscopic data: A novel convolutional neural network with feature selection. Chemometrics and Intelligent Laboratory Systems 2023, 239, 104869.
- Spišić, J.; Šimić, D.; Balen, J.; Jambrović, A.; Galić, V. Machine learning in the analysis of multispectral reads in maize canopies responding to increased temperatures and water deficit. Remote Sensing 2022, 14, 2596.
- Zhang, W.; Zhang, W.; Yang, Y.; Hu, G.; Ge, D.; Liu, H.; Cao, H.; others. A cloud computing-based approach using the visible near-infrared spectrum to classify greenhouse tomato plants under water stress. Computers and Electronics in Agriculture 2021, 181, 105966.
- Li, M.W.; Chan, Y.K.; Yu, S.S. Use of CNN for Water Stress Identification in Rice Fields Using Thermal Imagery. Applied Sciences 2023, 13, 5423.
- Sobayo, R.; Wu, H.H.; Ray, R.; Qian, L. Integration of convolutional neural network and thermal images into soil moisture estimation. 2018 1st International Conference on Data Intelligence and Security (ICDIS). IEEE, 2018, pp. 207–210.
- Nagappan, M.; Gopalakrishnan, V.; Alagappan, M. Prediction of reference evapotranspiration for irrigation scheduling using machine learning. Hydrological Sciences Journal 2020, 65, 2669–2677.
- Afzaal, H.; Farooque, A.A.; Abbas, F.; Acharya, B.; Esau, T. Groundwater estimation from major physical hydrology components using artificial neural networks and deep learning. Water 2019, 12, 5.
- Liu, J.; Xu, Y.; Li, H.; Guo, J. Soil moisture retrieval in farmland areas with Sentinel multi-source data based on regression convolutional neural networks. Sensors 2021, 21, 877.
- Ge, L.; Hang, R.; Liu, Y.; Liu, Q. Comparing the performance of neural network and deep convolutional neural network in estimating soil moisture from satellite observations. Remote Sensing 2018, 10, 1327.
- Chaudhari, S.; Sardar, V.; Rahul, D.; Chandan, M.; Shivakale, M.S.; Harini, K. Performance analysis of CNN, AlexNet and VGGNet models for drought prediction using satellite images. 2021 Asian Conference on Innovation in Technology (ASIANCON). IEEE, 2021, pp. 1–6.
- Bazzi, H.; Baghdadi, N.; Ienco, D.; El Hajj, M.; Zribi, M.; Belhouchette, H.; Escorihuela, M.J.; Demarez, V. Mapping irrigated areas using Sentinel-1 time series in Catalonia, Spain. Remote Sensing 2019, 11, 1836.
- Hu, Z.; Xu, L.; Yu, B. Soil moisture retrieval using convolutional neural networks: Application to passive microwave remote sensing. The International Archives of the Photogrammetry, Remote Sensing and Spatial Information Sciences 2018, 42, 583–586.
- Xue, M.; Hang, R.; Liu, Q.; Yuan, X.T.; Lu, X. CNN-based near-real-time precipitation estimation from Fengyun-2 satellite over Xinjiang, China. Atmospheric Research 2021, 250, 105337.
- Sankararao, A.U.; Priyanka, G.; Rajalakshmi, P.; Choudhary, S. CNN based water stress detection in chickpea using UAV based hyperspectral imaging. 2021 IEEE International India Geoscience and Remote Sensing Symposium (InGARSS). IEEE, 2021, pp. 145–148.
- Wu, Z.; Cui, N.; Zhang, W.; Yang, Y.; Gong, D.; Liu, Q.; Zhao, L.; Xing, L.; He, Q.; Zhu, S.; et al. Estimation of soil moisture in drip-irrigated citrus orchards using multi-modal UAV remote sensing. Agricultural Water Management 2024, 302, 108972. [Google Scholar] [CrossRef]
- Mia, M.S.; Tanabe, R.; Habibi, L.N.; Hashimoto, N.; Homma, K.; Maki, M.; Matsui, T.; Tanaka, T.S. Multimodal deep learning for rice yield prediction using UAV-based multispectral imagery and weather data. Remote Sensing 2023, 15, 2511. [Google Scholar] [CrossRef]
- Morales, G.; Sheppard, J.W.; Hegedus, P.B.; Maxwell, B.D. Improved yield prediction of winter wheat using a novel two-dimensional deep regression neural network trained via remote sensing. Sensors 2023, 23, 489. [Google Scholar] [CrossRef] [PubMed]
- Terliksiz, A.S.; Altilar, D.T. A Simple and Efficient Deep Learning Architecture for Corn Yield Prediction. 2023 11th International Conference on Agro-Geoinformatics (Agro-Geoinformatics). IEEE, 2023, pp. 1–6.
- Tanabe, R.; Matsui, T.; Tanaka, T.S. Winter wheat yield prediction using convolutional neural networks and UAV-based multispectral imagery. Field Crops Research 2023, 291, 108786. [Google Scholar] [CrossRef]
- Zhou, S.; Xu, L.; Chen, N. Rice yield prediction in hubei province based on deep learning and the effect of spatial heterogeneity. Remote Sensing 2023, 15, 1361. [Google Scholar] [CrossRef]
- Saini, P.; Nagpal, B.; Garg, P.; Kumar, S. CNN-BI-LSTM-CYP: A deep learning approach for sugarcane yield prediction. Sustainable Energy Technologies and Assessments 2023, 57, 103263. [Google Scholar] [CrossRef]
- Jiang, Z.; Huete, A.R.; Didan, K.; Miura, T. Development of a two-band enhanced vegetation index without a blue band. Remote Sensing of Environment 2008, 112, 3833–3845. [Google Scholar] [CrossRef]
- MacEachern, C.B.; Esau, T.J.; Schumann, A.W.; Hennessy, P.J.; Zaman, Q.U. Detection of fruit maturity stage and yield estimation in wild blueberry using deep learning convolutional neural networks. Smart Agricultural Technology 2023, 3, 100099. [Google Scholar] [CrossRef]
- Chen, Y.; Lee, W.S.; Gan, H.; Peres, N.; Fraisse, C.; Zhang, Y.; He, Y. Strawberry yield prediction based on a deep neural network using high-resolution aerial orthoimages. Remote Sensing 2019, 11, 1584. [Google Scholar] [CrossRef]
- Sun, J.; Yang, K.; Chen, C.; Shen, J.; Yang, Y.; Wu, X.; Norton, T. Wheat head counting in the wild by an augmented feature pyramid networks-based convolutional neural network. Computers and Electronics in Agriculture 2022, 193, 106705. [Google Scholar] [CrossRef]
- Tedesco-Oliveira, D.; da Silva, R.P.; Maldonado Jr, W.; Zerbato, C. Convolutional neural networks in predicting cotton yield from images of commercial fields. Computers and Electronics in Agriculture 2020, 171, 105307. [Google Scholar] [CrossRef]
- Häni, N.; Roy, P.; Isler, V. Apple counting using convolutional neural networks. 2018 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS). IEEE, 2018, pp. 2559–2565.
- Yu, D.; Zha, Y.; Sun, Z.; Li, J.; Jin, X.; Zhu, W.; Bian, J.; Ma, L.; Zeng, Y.; Su, Z. Deep convolutional neural networks for estimating maize above-ground biomass using multi-source UAV images: A comparison with traditional machine learning algorithms. Precision Agriculture 2023, 24, 92–113. [Google Scholar] [CrossRef]
- Huber, F.; Yushchenko, A.; Stratmann, B.; Steinhage, V. Extreme Gradient Boosting for yield estimation compared with Deep Learning approaches. Computers and Electronics in Agriculture 2022, 202, 107346. [Google Scholar] [CrossRef]
- Sagan, V.; Maimaitijiang, M.; Bhadra, S.; Maimaitiyiming, M.; Brown, D.R.; Sidike, P.; Fritschi, F.B. Field-scale crop yield prediction using multi-temporal WorldView-3 and PlanetScope satellite data and deep learning. ISPRS Journal of Photogrammetry and Remote Sensing 2021, 174, 265–281. [Google Scholar] [CrossRef]
- Khaki, S.; Pham, H.; Wang, L. Simultaneous corn and soybean yield prediction from remote sensing data using deep transfer learning. Scientific Reports 2021, 11, 11132. [Google Scholar] [CrossRef] [PubMed]
- Fernandez-Beltran, R.; Baidar, T.; Kang, J.; Pla, F. Rice-yield prediction with multi-temporal Sentinel-2 data and 3D CNN: A case study in Nepal. Remote Sensing 2021, 13, 1391. [Google Scholar] [CrossRef]
- Gastli, M.S.; Nassar, L.; Karray, F. Satellite images and deep learning tools for crop yield prediction and price forecasting. 2021 International Joint Conference on Neural Networks (IJCNN). IEEE, 2021, pp. 1–8.
- Qiao, M.; He, X.; Cheng, X.; Li, P.; Luo, H.; Tian, Z.; Guo, H. Exploiting hierarchical features for crop yield prediction based on 3-D convolutional neural networks and multikernel Gaussian process. IEEE Journal of Selected Topics in Applied Earth Observations and Remote Sensing 2021, 14, 4476–4489. [Google Scholar] [CrossRef]
- Qiao, M.; He, X.; Cheng, X.; Li, P.; Luo, H.; Zhang, L.; Tian, Z. Crop yield prediction from multi-spectral, multi-temporal remotely sensed imagery using recurrent 3D convolutional neural networks. International Journal of Applied Earth Observation and Geoinformation 2021, 102, 102436. [Google Scholar] [CrossRef]
- Kang, Y.; Ozdogan, M.; Zhu, X.; Ye, Z.; Hain, C.; Anderson, M. Comparative assessment of environmental variables and machine learning algorithms for maize yield prediction in the US Midwest. Environmental Research Letters 2020, 15, 064005. [Google Scholar] [CrossRef]
- Terliksiz, A.S.; Altilar, D.T. Use of deep neural networks for crop yield prediction: A case study of soybean yield in Lauderdale County, Alabama, USA. 2019 8th International Conference on Agro-Geoinformatics (Agro-Geoinformatics). IEEE, 2019, pp. 1–4.
- Sun, J.; Di, L.; Sun, Z.; Shen, Y.; Lai, Z. County-level soybean yield prediction using deep CNN-LSTM model. Sensors 2019, 19, 4363. [Google Scholar] [CrossRef]
- Tiwari, P.; Shukla, P. Crop yield prediction by modified convolutional neural network and geographical indexes. International Journal of Computer Sciences and Engineering 2018, 6, 503–513. [Google Scholar] [CrossRef]
- Nevavuori, P.; Narra, N.; Linna, P.; Lipping, T. Crop yield prediction using multitemporal UAV data and spatio-temporal deep learning models. Remote Sensing 2020, 12, 4000. [Google Scholar] [CrossRef]
- Xu, H. Modification of normalised difference water index (NDWI) to enhance open water features in remotely sensed imagery. International Journal of Remote Sensing 2006, 27, 3025–3033. [Google Scholar] [CrossRef]
- Vaswani, A.; Shazeer, N.; Parmar, N.; Uszkoreit, J.; Jones, L.; Gomez, A.N.; Kaiser, Ł.; Polosukhin, I. Attention is all you need. Advances in Neural Information Processing Systems 2017, 30. [Google Scholar]
- Hammami, E.; Boughanem, M.; Faiz, R.; Dkaki, T. Intermediate Hidden Layers for Legal Case Retrieval Representation. International Conference on Database and Expert Systems Applications. Springer, 2024, pp. 306–319.
- Neptune, N.; Mothe, J. Automatic annotation of change detection images. Sensors 2021, 21, 1110. [Google Scholar] [CrossRef]



| Application | About | Number of papers |
|---|---|---|
| Weed detection | Identifying unwanted plants within target crops | 26 |
| Disease detection | Diagnosing and assessing plant diseases to prevent their spread | 23 |
| Crop classification | Categorizing crop and plantation varieties | 15 |
| Water management | Managing water resources or detecting water scarcity | 22 |
| Yield prediction | Estimating future crop production levels | 29 |
| Metric | Formula | Description |
|---|---|---|
| Accuracy [36,37,38] | $\frac{TP + TN}{TP + TN + FP + FN}$ | Ratio of correct predictions to total predictions. |
| Precision [39] | $\frac{TP}{TP + FP}$ | True positives over predicted positives; accuracy of positive predictions. |
| Recall [40] | $\frac{TP}{TP + FN}$ | True positives over actual positives; ability to identify relevant instances. |
| F1-Score [40,41] | $2 \cdot \frac{Precision \cdot Recall}{Precision + Recall}$ | Harmonic mean of precision and recall; balances both metrics. |
| Mean Average Precision [42,43] | $\frac{1}{N}\sum_{i=1}^{N} AP_i$ | Average of AP scores across classes. |
| Intersection over Union (Jaccard index) [44,45] | $\frac{\lvert A \cap B \rvert}{\lvert A \cup B \rvert}$ | Ratio of overlap area to union area; used in image segmentation and object detection. |
| Mean Intersection over Union [46] | $\frac{1}{N}\sum_{i=1}^{N} IoU_i$ | Average IoU across classes for multi-class evaluation. |
| Weighted Mean Intersection over Union [47,48] | $\sum_{i=1}^{N} w_i \cdot IoU_i$ | mIoU with per-class weights to emphasize class importance. |
| Processing Time [49] | — | Total time for the model to process data and produce predictions. |
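As a concrete illustration, the binary-classification metrics above all derive from the four confusion-matrix counts. The following Python sketch (illustrative only, not taken from any of the surveyed papers) evaluates a predicted weed mask against a ground-truth mask:

```python
def confusion_counts(y_true, y_pred):
    """Binary confusion-matrix counts (TP, TN, FP, FN) from 0/1 labels."""
    tp = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 1)
    tn = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 0)
    fp = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 1)
    fn = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 0)
    return tp, tn, fp, fn

def metrics(y_true, y_pred):
    """Accuracy, precision, recall, F1 and IoU for the positive class."""
    tp, tn, fp, fn = confusion_counts(y_true, y_pred)
    precision = tp / (tp + fp)
    recall = tp / (tp + fn)
    return {
        "accuracy": (tp + tn) / (tp + tn + fp + fn),
        "precision": precision,
        "recall": recall,
        "f1": 2 * precision * recall / (precision + recall),
        "iou": tp / (tp + fp + fn),  # Jaccard index, positive class
    }

# Toy per-pixel weed masks (1 = weed, 0 = background)
print(metrics([1, 1, 0, 0, 1, 0, 1, 0], [1, 0, 0, 1, 1, 0, 1, 0]))
```

With this toy input (TP = 3, TN = 3, FP = 1, FN = 1), accuracy, precision, recall, and F1 all equal 0.75, while IoU is 0.6 — a reminder that IoU penalizes errors more sharply than the other metrics.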
| Category | Approaches | Purpose |
|---|---|---|
| Detection Techniques | Image segmentation (e.g. PSPNet, SegNet, UNet) | Classifies each pixel as weed or non-weed (e.g. crop, background) [40,41,47,48,50,96,97,98,99] |
| | Object detection (e.g. YOLO, Faster-RCNN, Mask-RCNN) | Efficient at detecting and locating zones that contain weeds in an image [36,39,41,76,91,101,105] |
| Input Data Type | RGB | Most common data type in weed detection, as the distinctive shapes of weeds make them identifiable in visible imagery [36,39,47,48,50,76,78,89,91,99,100,101,103,104,105,106,107] |
| | Multispectral | Improves weed detection performance [40,41,95,96,97,98] |
| | Vegetation indexes (e.g. NIR, NDVI) | Assists in distinguishing vegetation from non-vegetation [41,78,96,97,98,99] |
| Data Acquisition | UAV | Useful for aerial weed detection at different altitudes (1–65 m) [41,89,98,99,100,101,102] |
| | UGV | Efficient for close-range weed detection [40,50,96,103,104,107] |
| | Handheld devices (e.g. cameras, mobile phones) | Suitable for small-scale weed detection [36,39,47,48,76,78,91,105,106] |
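The vegetation indexes listed above reduce to simple per-pixel band arithmetic: NDVI, for instance, is (NIR − Red) / (NIR + Red), and thresholding it separates vegetation from soil or background. A minimal Python sketch, where the 0.3 threshold and the reflectance values are illustrative assumptions (usable thresholds are crop- and sensor-dependent):

```python
def ndvi(nir, red, eps=1e-9):
    """Normalized Difference Vegetation Index for one pixel:
    (NIR - Red) / (NIR + Red). eps guards against division by zero."""
    return (nir - red) / (nir + red + eps)

def vegetation_mask(nir_band, red_band, threshold=0.3):
    """Flag pixels whose NDVI exceeds the threshold as vegetation."""
    return [ndvi(n, r) > threshold for n, r in zip(nir_band, red_band)]

# Vegetation reflects strongly in NIR; bare soil reflects NIR and red similarly
print(vegetation_mask([0.50, 0.30, 0.45], [0.08, 0.25, 0.40]))
# → [True, False, False]
```

In practice the same arithmetic is applied to whole raster bands (e.g. with NumPy arrays) and the resulting mask, or the raw index map, is fed to the CNN alongside the RGB channels.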
| Category | Approaches | Purpose |
|---|---|---|
| Detection Techniques | Image classification (e.g. VGG, Inception, DenseNet, ResNet) | Efficient at identifying disease types in leaf images [51,56,85,108,109,110,111,112,113,114] |
| | Object detection (e.g. YOLO, CenterNet) | Used to detect areas in plants or fields that show disease symptoms [42,49,53] |
| | Image segmentation (e.g. Mask R-CNN) | Helpful for classifying diseased pixels in crops and leaves [68,115,116,117] |
| Input Data Type | RGB | Most commonly used for detecting visible symptoms [51,53,56,85,108,109,110,111,112,113,114,121,122] |
| | Multispectral images | High potential for early detection of diseases, before symptoms become visible [118,119,120] |
| Data Acquisition | UAV | Efficient for large-scale monitoring and real-time disease detection [42,49,53] |
| | Handheld devices | Used for close-range, on-ground imaging. Useful for quick data collection in fields and in controlled laboratory environments [51,56,85,108,109,110,111,112,113,114,121,122] |
| Category | Approaches | Purpose |
|---|---|---|
| Detection Techniques | Image classification (e.g. CNN, CNN-RNN-LSTM) | Efficient at classifying images of leaves, plants, or fruits [123,124,125] |
| | Image segmentation (e.g. 1D-CNN, 3D-CNN, ViT, Recurrent CNN, HRNet) | High performance in classifying each pixel into the corresponding crop type [45,128,129,132] |
| Input Data Type | RGB images | Mostly used for leaf and plant classification [52,124,125,135] |
| | Multispectral and hyperspectral images | Captures crop-specific spectral signatures, assisting land-cover crop classification [45,127,128,132,133] |
| | SAR data | Acquires detailed surface information. Useful for capturing crop structure while being unaffected by weather conditions (e.g. clouds) [129,130,132] |
| Data Acquisition | Satellites (e.g. Sentinel-1, Sentinel-2, RADARSAT-2) | Provides historical and periodic data for large-scale crop classification (e.g. land cover) [45,126,127,128,129,130,131,132,133] |
| | UAV | Captures high-resolution aerial images that can be combined with satellite images to improve crop classification [52,131,132,133,134,135] |
| | Handheld devices | Close-range imaging for small-scale classification [124,125] |
| Category | Approaches | Purpose |
|---|---|---|
| Detection Techniques | Image classification | High accuracy in detecting water stress, predicting droughts, and classifying different irrigation treatments [37,61,72,136,137,138,139,140,141] |
| | Regression | Accurately estimates soil moisture content, evapotranspiration, and groundwater content [143,144,146,150,151] |
| Input Data Type | RGB images | Effective for detecting visible changes (e.g. color, curvature) in plants under water stress [37,72,136,137,138] |
| | Multispectral and hyperspectral | Useful for early detection of water stress, even before visible symptoms [61,139,140,141] |
| | Vegetation indexes (e.g. NDVI, MSAVI) | Commonly used to assist in drought prediction and soil moisture estimation [140,146,147,148] |
| | Thermal | Helps in detecting water stress and estimating soil moisture [142,143] |
| | SAR | Beneficial for soil moisture estimation [146,147,149] |
| | Weather and in-situ data | Effective in estimating evapotranspiration, groundwater, and soil moisture content [144,145,147,150] |
| Data Acquisition | Satellites (e.g. Sentinel-1, Sentinel-2, RADARSAT-2) | Provides high spatio-temporal resolution data, useful for soil moisture and irrigation mapping [146,147,149,150,151,152] |
| | UAV | Captures high-resolution imagery, mostly used in water stress detection and soil moisture estimation [66,153] |
| | Handheld devices | Allows ground-level data acquisition for close-range water stress detection [37,137,138] |
| | Other sensors (e.g. tensiometers, thermometers) | Gathers data for better water management and ground-truth labels [61,139,140,141,142,143,146,147,152] |
| Category | Approaches | Purpose |
|---|---|---|
| Detection Techniques | Image classification | Used to identify crop growth stages, which correlate with yield [38,154,155,159,167,168,169,172,173,174,176,177] |
| | Image segmentation (instance and semantic segmentation) | Used for crop segmentation and maturity classification, which helps in crop counting and yield estimation [43,165] |
| | Regression | Most commonly used technique for yield prediction [154,155,157,158,159,167,168,169,170,172,173,174,176,177] |
| | Object detection | Applied to detecting individual crop heads, fruits, or plants [161,162,163,164,165] |
| Input Data Type | RGB images | Effective for identifying crop growth stages from visible traits [43,161,162,163,164,165,168] |
| | Multispectral and hyperspectral images | Useful for assessing crop health and predicting yield [38,154,157,167,168,169,170,172,173,174,176] |
| | Vegetation indexes (e.g. NDVI, SAVI, EVI) | Helps in biomass estimation [157,158,168,170,172,174,177] |
| | Thermal | Improves yield prediction performance when combined with spectral data [167,172,174,176] |
| | Weather and in-situ data | Provides additional features for yield prediction [154,170,174] |
| Data Acquisition | Satellites (e.g. MODIS, Sentinel-1, Sentinel-2) | Used for large-scale yield prediction based on multitemporal and historical data [155,156,158,167,168,169,170,171,172,173,174,175,176,177] |
| | UAV | Provides high-resolution imagery for yield estimation [38,57,79,154,157,162,163,166] |
| | Handheld devices | Offers localized data for specific crops; limited coverage but effective for small farms [43,161,163,164,165] |
| | Surveys and land cover | Mostly used as ground-truth labels [38,154,155,157,158,159,167,168,169,170,172,173,176] |
| | Other sensors (e.g. in-situ sensors) | Provides weather and soil measurements that assist yield prediction [38,155,158,159,170,174] |
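Several of the yield-prediction studies above rely on EVI, whose two-band variant EVI2 (Jiang et al., cited in the reference list) drops the blue band entirely: EVI2 = 2.5 (NIR − Red) / (NIR + 2.4 Red + 1). A minimal Python sketch, with illustrative reflectance values:

```python
def evi2(nir, red):
    """Two-band Enhanced Vegetation Index (EVI2):
    2.5 * (NIR - Red) / (NIR + 2.4 * Red + 1).
    Tracks EVI without requiring a blue band, easing use of
    two-band sensors and reducing atmospheric-noise sensitivity."""
    return 2.5 * (nir - red) / (nir + 2.4 * red + 1.0)

# Dense canopy vs. sparse cover (illustrative reflectances):
# a healthy canopy yields a markedly higher index value
print(evi2(0.50, 0.08), evi2(0.20, 0.15))
```

Index maps like this, stacked over the growing season, are a typical multitemporal input to the CNN-based yield regressors summarized in the table.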
Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content.
© 2024 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (http://creativecommons.org/licenses/by/4.0/).