1. Introduction
Food security, a critical global concern, is intricately linked to the United Nations’ Sustainable Development Goals (SDGs), particularly SDG 2 (Zero Hunger). As the world population continues to grow and is projected to reach 9.7 billion by 2050, ensuring food security and maintaining food quality have become increasingly challenging [1,2]. Forecasts suggest a potential strain on food systems, with climate change and resource scarcity further complicating these issues. In developing countries, agricultural industries play a pivotal role in addressing food security challenges while simultaneously contributing to national income [3,4]. These industries not only enhance local food production but also create employment opportunities and stimulate economic growth. However, achieving sustainable food security requires a multifaceted approach that integrates innovative farming techniques, such as autonomous robots and agricultural drones, to enhance agricultural production, environmental conservation, and economic development [5,6].
Food security issues are closely intertwined with sustainable and smart agricultural practices. Sustainable agriculture focuses on producing food in an environmentally friendly, economically viable, and socially responsible manner. Smart agriculture, a subset of sustainable agriculture, leverages technological and data-driven approaches to optimize the farming process [7]. The elements of smart agriculture include: (a) precision farming techniques, (b) Internet of Things (IoT) sensors and devices, (c) data analytics and artificial intelligence (AI), (d) automated irrigation systems, (e) drone technology for crop monitoring, (f) climate-smart practices, and (g) soil and crop management tools [8,9]. Given the broad challenges of food security and smart agriculture, this study considers the transformative potential of drone technology as a means of addressing some of these challenges. This study proposes a solution to agricultural problems using micro-unmanned aerial vehicle (UAV) technology with dimensions of less than 100 mm × 100 mm, which takes advantage of the ability to navigate tight spaces such as vertical farms. Previous drone-based pollination systems utilized larger platforms [10,11]. The proposed method and solution lay the foundation for research on utilizing micro-UAVs as smart agricultural tools.
The novelty of this paper lies in the deployment of a convolutional neural network (CNN) on a micro-sized, self-operating drone designed to function as a pollinator. The main task of the drone is to find flowers and fly toward them from a specific starting point. Before deployment, the drone was trained to recognize flowers using a CNN. During operation, the pollinator drone takes off and examines its surroundings; if it spots a flower nearby, it flies close to it. The goal of this study was to improve the navigation and flower-detection capabilities of micro drones using CNN technology, potentially aiding pollination tasks in the future.
In Section 2, we provide an overview of drone use in agriculture, the potential of drone pollinators, AI classifiers, and flower recognition systems, and discuss the open problems. In Section 3, we present the details of the experimental methods, which consist of a binary classification technique using a ResNet-18 CNN to detect flowers. We also elaborate on the integration of AI during real-time drone flights. In Section 4, we discuss the experimental results and the effectiveness of drone AI detection versus distance. Finally, we conclude the paper and provide directions for future research in Section 5.
2. Related Works
2.1. Agricultural Micro-UAVs
Micro-UAVs, also called drones, offer numerous benefits for smart agriculture. Off-the-shelf agricultural drone packages are equipped with onboard monitoring sensors that provide real-time data [12]. A typical onboard sensor package includes a camera that can be utilized for additional functionalities. Current solutions employing onboard cameras monitor crop conditions, detect pest infestations, and perform other tasks [13]. Another promising area of research is the use of drones as agricultural pollinators. Bees are the primary pollinators of many crops; however, in greenhouses or enclosed agricultural settings, bees’ access is limited, posing challenges to pollination [14]. Given this limitation, pollinator drones are a potential solution. With the help of onboard cameras, pollinator drones can identify the location of flower buds for subsequent pollination [15]. Drones can simultaneously collect data on crop health and pollination efficacy while executing pollination tasks. Automated drone pollination has the potential to reduce the labor costs associated with manual pollination methods. However, the use of drones as pollinators is still in the experimental phase and faces challenges such as battery-life limitations, potential impacts on local ecosystems, and the need for further technological refinement.
2.2. Potential of Autonomous Pollinator Drones
Although bees naturally serve as pollinators, sustainable smart agriculture often employs closed environments, such as greenhouses or indoor vertical farms, which restrict bees’ access for natural pollination [16,17,18]. To address pollination challenges in controlled settings, the use of drones as pollinators has emerged as an innovative solution [15,19,20]. Despite their long and excellent track record as pollinators, traditional pollinators such as bees encounter difficulties in reaching enclosed agricultural areas [21,22]. This has led researchers to investigate alternative pollination methods for such crops. In recent years, various solutions have been proposed, including mobile robots with robotic hands and soft actuators with high potential as pollination mechanisms. However, a significant limitation is the initial investment cost, as mobile robots and soft actuators must be developed to meet specific environmental requirements [23,24,25,26]. In contrast, off-the-shelf micro-UAVs are highly programmable, making them well suited for autonomous pollination tasks [27].
Recent reports have indicated that pollinator drones equipped with specialized attachments can replicate the pollination process. These drones, categorized under the vertical takeoff and landing (VTOL) configuration, offer precise control and the ability to operate in confined spaces such as greenhouses and indoor farming environments [19]. This technology has the potential to sustain crop productivity. Empirical studies have confirmed the efficacy of drone pollination in various crops, including strawberries, peppers, tomatoes, and kiwifruits [15,19,28,29]. The benefits of drone pollinators include their consistent operation regardless of weather conditions, programmability for optimal pollination timing, and potential to reduce labor costs [30,31]. Nonetheless, challenges persist, particularly the need for further refinement of drone sensor payloads, especially navigation and imaging sensors, to enable autonomous navigation to flowers and match the efficiency of natural pollinators. Additionally, concerns remain regarding the long-term ecological impact of deploying multiple drones simultaneously in greenhouse agricultural settings. Although short-term deployment may enhance agricultural efficiency, the long-term ecological consequences require further investigation [32,33]. One notable negative impact is the buzzing noise produced by drones, which may disturb local bird and insect populations [34]. As research in this domain advances, drone pollination technology continues to evolve, potentially serving as a viable complement to natural pollinators in ensuring food security and promoting sustainable agricultural practices.
Determining the appropriate size and suitability of pollinator drones for robotic pollination in greenhouse environments is crucial for optimizing their effectiveness and efficiency [15]. The drone must be micro-sized to navigate confined spaces while carrying essential pollination equipment. Micro drones can easily maneuver between plant rows and surrounding structures [35]. However, accommodating a sufficient pollen payload along with the necessary sensors or cameras presents significant challenges. The ideal size balances agility and functionality, which is the primary reason this study selected a micro-UAV configuration as the main drone platform.
2.3. Simultaneous Navigation and Flower Recognition
Flower detection and navigation are crucial components for the development of autonomous pollination systems using drones. Recent studies have focused on improving the accuracy and efficiency of these processes to enhance drone performance in agricultural settings [36]. AI techniques for computer vision, particularly deep-learning algorithms, have shown promising results in flower detection. CNNs have been successfully applied to identify and locate flowers in complex greenhouse environments, achieving high accuracy rates. These models can distinguish between different flower species and their growth stages, thereby enabling targeted pollination [37,38].
The enclosed nature of greenhouses limits the use of traditional global positioning systems (GPS), necessitating alternative navigation methods [39,40]. Researchers have explored various approaches to overcoming the limitations of GPS in enclosed spaces. Visual Simultaneous Localization and Mapping (VSLAM) techniques have been implemented to allow drones to create and update maps of their environments in real time [41]. Within such systems, cameras and computer-vision algorithms help drones maintain their position and navigate through greenhouses [39,42]. Sensor fusion, which combines data from cameras, infrared sensors, and inertial measurement units, has improved the precision of drone movement around plants [43]. Some studies have investigated the use of artificial landmarks or QR codes within greenhouses to aid drone localization [44,45].
The importance of navigation in greenhouse settings cannot be overstated. Accurate navigation ensures that drones systematically cover all plants that require pollination without missing areas or over-pollinating others. It also prevents collisions with greenhouse structures or plants, which can damage both the drones and the crops [46]. Infrared or ultrasonic sensors can be employed for obstacle avoidance and precise maneuvering around plants [47]. Efficient navigation contributes to energy conservation, allowing drones to operate for longer periods and cover larger areas. Additionally, precise navigation enables the collection of valuable data on plant health and growth patterns, contributing to overall greenhouse management and crop optimization [48]. The integration of flower detection and navigation systems has led to more efficient pollination strategies, optimized flight paths, and reduced energy consumption.
3. Experimental Setup
This section provides an overview of the proposed drone pollination system, which comprises two main parts. The first part involves the use of the ResNet-18 CNN algorithm for the binary classification of flowers, as illustrated in Figure 1. This algorithm, commonly used for image analysis, categorizes objects into two classes: `flower’ or `not flower’. We developed an AI-based flower identification system that allows the drone to hover toward detected flowers without requiring human intervention. Building on this flower classification using the ResNet-18 algorithm, the second part of our study focuses on autonomous navigation for real-time field-test flower recognition. A drone platform with an onboard camera is the most crucial device. An indoor laboratory was chosen for the real-time field-test drone flight and flower-recognition evaluation.
3.1. AI Binary Classification for Flower Recognition
Before the real-time field test, an important step is AI binary classification for flower recognition using the ResNet-18 CNN. As shown in Figure 1, this classification is a key part of the process for handling the raw visual data of the flowers. The process began with the collection of flower images, as shown in Figure 2. We then fine-tuned a pre-trained ResNet-18 model on our dataset, which was split into separate sets for training, validation, and testing. These data subsets were fed into the ResNet-18 framework, which learns to recognize specific patterns and features of the flowers. Finally, the ResNet-18 framework performs a binary classification task to determine whether the input image shows a `flower’ or `not flower’. We compared the classification performance of the drone’s onboard camera and a standard webcam, focusing on the time taken to accumulate a given number of detections, measured in seconds. The number of detections was set to 10, 20, 30, 40, 50, and 60.
3.2. Micro-UAV Specification
The micro-UAV pollinator navigation platform requires careful selection of a lightweight, compact drone capable of stable flight within the confined and controlled environment of a greenhouse. The DJI Tello Robomaster TT was selected for its key specifications, including a payload capacity that accommodates integrated sensors and cameras, extended battery life for prolonged operation, and precise maneuverability for navigating narrow aisles and avoiding obstacles. The hardware setup prioritizes modularity and reliability, ensuring that the UAV can be easily maintained and adapted to different greenhouse layouts and crop types.
Table 1 summarizes the specifications of the DJI Tello Robomaster TT used in this study. The onboard camera was central to the drone’s flower-detection functionality. A high-resolution 5 MP camera enabled accurate identification and monitoring of flower positions. Sunflowers were used in this project because of their unique visual characteristics, such as bright yellow petals and distinct shapes, which make them distinguishable from the surrounding environment. An indoor laboratory was selected as the experimental field to mimic an indoor greenhouse scenario, as shown in Figure 3.
3.3. Integration of AI Detection Capability on the Micro-UAV
In the next part of our research, we incorporated the drone into the real-time field experimental setup, as depicted in Figure 3. In our laboratory tests, we arranged four identical flowers in a `cross’ configuration, maintaining specific distances between the flowers and the drone, as illustrated in Figure 4. The drone was initially positioned at the center. We systematically adjusted the distance between the drone and the flowers for each subexperiment, conducting tests at 15.5 cm, 30.5 cm, 60.5 cm, 91.5 cm, and 116.5 cm, as shown in Figure 4. At each distance, we documented the efficacy of the drone in flower detection. Upon takeoff readiness, the drone established a connection with the workstation via Wi-Fi. Following a vertical ascent to approximately 1.5 m above ground level, the drone commenced its search for flowers.
The flight path was pre-programmed, as shown in Figure 5. Initially, the drone ascended vertically after successfully connecting to the local Wi-Fi. Once it reached approximately 1.5 m above ground level, the drone rotated in 10° increments until a flower was detected in its video frame. The process of flower recognition during flight is illustrated in Figure 6. The drone advances when a flower is detected; if no flower is identified in the video stream, it continues to rotate in 10° increments until a flower is located. Once a flower is detected during rotation, the drone moves toward it. The drone lands when its battery level falls below 20%.
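As a concrete illustration of this flight logic, the following sketch implements the connect, ascend, rotate, approach, and land loop, assuming the djitellopy SDK for the DJI Tello platform; `detect_flower` is a hypothetical stand-in for the ResNet-18 classifier of Section 3.1, and the ascent and forward-step distances are assumed values.

```python
# Minimal sketch of the search-and-approach loop described above, assuming
# the djitellopy SDK for the DJI Tello platform. detect_flower is a
# hypothetical stand-in for the ResNet-18 classifier of Section 3.1.
from djitellopy import Tello

def detect_flower(frame) -> bool:
    """Placeholder: run the ResNet-18 binary classifier on one video frame."""
    ...

tello = Tello()
tello.connect()                      # connect to the drone over local Wi-Fi
tello.streamon()
tello.takeoff()
tello.move_up(70)                    # ascend toward roughly 1.5 m above ground (assumed offset)

frames = tello.get_frame_read()
while tello.get_battery() > 20:      # land once battery falls below 20%
    if detect_flower(frames.frame):
        tello.move_forward(30)       # advance toward the detected flower (assumed step)
    else:
        tello.rotate_clockwise(10)   # otherwise rotate in 10-degree increments

tello.land()
```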
4. Results and Discussion
Two databases, designated `flower’ and `non-flower’, were collected for this study. The training process was conducted over 64 epochs, with 14 iterations. As illustrated in Figure 7, the model achieved a training accuracy of 1.0, indicating that it correctly classified all instances in the training data without error. This suggests that the model effectively learned the training data and accurately predicted the correct label for each input.
The preliminary experiment assessed the classification algorithm’s ability to recognize a flower, with performance measured in seconds. A comparison was made between the drone’s onboard camera and a standard webcam. Table 2 presents the experimental data for detection time, recorded at detection counts of 10, 20, 30, 40, 50, and 60. Both the webcam and the drone camera operated in real-time mode. As the number of detections increased, the time required by both the webcam and the drone camera also increased. Notably, the webcam required more time for detection than the drone’s camera. This discrepancy can be attributed to several factors, primarily hardware specifications. The DJI Tello Robomaster TT drone is equipped with specialized image-processing hardware and dedicated processors optimized for real-time image-analysis tasks. These components facilitate parallel computation and expedite the detection process. In contrast, webcams typically rely on a computer’s CPU, which may have fewer computational resources and slower processing speeds than specialized hardware.
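For reproducibility, the timing comparison can be expressed as a simple harness that measures the wall-clock time needed to accumulate a given number of detections from a video stream. The sketch below assumes OpenCV for frame capture; `classify_frame` is a hypothetical wrapper around the trained ResNet-18 model and is not the authors’ exact measurement code.

```python
# Illustrative harness for the Table 2 comparison: elapsed wall-clock time
# to accumulate N detections from a live stream (OpenCV assumed).
# classify_frame is a hypothetical wrapper around the trained model.
import time
import cv2

def classify_frame(frame) -> bool:
    """Placeholder: return True when the model labels the frame `flower'."""
    ...

def time_n_detections(capture: cv2.VideoCapture, n: int) -> float:
    detections = 0
    start = time.perf_counter()
    while detections < n:
        ok, frame = capture.read()           # grab the next frame
        if ok and classify_frame(frame):
            detections += 1
    return time.perf_counter() - start       # seconds to reach n detections

webcam = cv2.VideoCapture(0)                 # index 0: default webcam
for n in (10, 20, 30, 40, 50, 60):
    print(f"{n} detections: {time_n_detections(webcam, n):.2f} s")
```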
In the subsequent phase of the experiment, the drone was programmed to navigate in a `cross’-shaped pattern, as illustrated in Figure 8. The drone’s rotational movement was recorded, and the graph indicates that the drone’s camera identified four flowers during its rotation. The first detection occurred between 32.2 s and 38.6 s, at angles ranging from 49° to 113.6°. The second detection took place from 46.5 s to 49.9 s, at angles from -147.2° to -95°. The third detection was recorded between 103.7 s and 108.8 s, at angles from 69.9° to 133.1°. The final detection occurred from 143.4 s to 149.1 s, at angles from 82.5° to 145.2°. At certain points, the angle on the line graph dips in the negative direction, indicating a counterclockwise rotation and a descent or lowering of the drone.
According to Table 3, if the distance between the drone and the flower was too small (i.e., 15.5 cm), the drone failed to detect the flower. Conversely, if the distance was too large (i.e., 116.5 cm), the drone also struggled to locate it. When the drone is too close, its sensors and cameras may not have a sufficiently wide field of view to capture the entire object or provide accurate measurements, making effective detection and tracking difficult. When the drone is too far away, the flower’s size in the camera’s field of view diminishes, making it difficult for the drone’s computer-vision algorithms to identify the flower accurately, especially against other objects or cluttered backgrounds. Additionally, lighting conditions affect object detection: if the flower is poorly lit or the lighting is too harsh, the drone’s sensors and cameras may struggle to distinguish the flower from its surroundings.
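The close/far trade-off can be made concrete with a back-of-the-envelope pinhole-camera calculation of the flower’s apparent size in the frame. The field of view, resolution, and flower width below are assumed illustrative values, not measured parameters of this study.

```python
# Back-of-the-envelope check of the distance effect described above:
# apparent on-sensor width of a flower versus camera distance, using the
# pinhole model. FOV, resolution, and flower width are assumed values.
import math

H_FOV_DEG = 80.0      # assumed horizontal field of view
H_RES_PX = 960        # assumed horizontal stream resolution
FLOWER_W_CM = 18.0    # assumed flower-head width

# Pinhole model: focal length in pixels from FOV and resolution.
focal_px = (H_RES_PX / 2) / math.tan(math.radians(H_FOV_DEG / 2))

for dist_cm in (15.5, 30.5, 60.5, 91.5, 116.5):
    width_px = focal_px * FLOWER_W_CM / dist_cm   # projected width
    frac = width_px / H_RES_PX
    print(f"{dist_cm:6.1f} cm -> {width_px:6.0f} px ({frac:.0%} of frame)")
```

Under these assumptions, the flower spans roughly 69% of the frame width at 15.5 cm (so small offsets push it partly out of view) but shrinks to about 9% at 116.5 cm, consistent with the detection failures observed at both extremes.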
5. Conclusion
This study demonstrates the implementation of a micro-UAV equipped with a CNN for autonomous flower detection and navigation within a controlled greenhouse-like environment. The drone successfully identified sunflower targets at optimal distances, with detection failing only when the drone was positioned excessively close to or far from the flowers, emphasizing the necessity of maintaining an effective operational range. The real-time detection capabilities of the drone’s specialized camera surpassed those of a conventional webcam, underscoring the benefits of dedicated onboard hardware. The `cross’-shaped flight pattern confirmed the drone’s ability to navigate and accurately identify multiple flowers. These findings highlight the potential of micro-UAVs as efficient pollination agents in enclosed agricultural settings and offer a promising solution to the challenges encountered by natural pollinators in greenhouses. Future research should focus on enhancing sensor capabilities, extending operational ranges, and assessing ecological impacts to advance the practical deployment of autonomous drone pollinators.
Author Contributions
Conceptualization, M.I.Y., F.N.M.Y., A.G.R., N.K.P.; methodology, M.I.Y., F.N.M.Y., A.G.R., N.K.P.; software, M.I.Y., F.N.M.Y., A.Z., M.A.A.S.; validation, M.I.Y. and F.N.M.Y.; investigation, M.I.Y., F.N.M.Y., S.H.S., A.Z.; resources, M.I.Y., F.N.M.Y., A.G.R., N.K.P.; data curation, M.I.Y., F.N.M.Y., A.Z., M.A.A.S.; writing—original draft preparation, M.I.Y., F.N.M.Y.; writing—review and editing, M.I.Y., F.N.M.Y., A.Z., M.A.A.S.; visualization, M.I.Y. and F.N.M.Y.; supervision, M.I.Y. and S.H.S.; project administration, M.I.Y. and F.N.M.Y.; funding acquisition, M.I.Y. and F.N.M.Y. All authors have read and agreed to the published version of the manuscript.
Funding
This work has been supported by the UniKL Industrial Matching Grant UniKL/CIL/UniKL IMG-25/0005.
Acknowledgments
The authors thank Universiti Kuala Lumpur (UniKL) for providing technical and administrative support. During the preparation of this manuscript, we used Paperpal to improve the language, grammar, and readability of the text. After using this tool, the authors reviewed and edited the content as required and took full responsibility for the final version of the manuscript.
Conflicts of Interest
The authors declare no conflicts of interest.
Abbreviations
The following abbreviations are used in this manuscript:

| UAV | Unmanned Aerial Vehicle |
| CNN | Convolutional Neural Network |
| SDG | Sustainable Development Goal |
| IoT | Internet of Things |
| AI | Artificial Intelligence |
References
- Steiner, G.; Geissler, B.; Schernhammer, E.S. Hunger and Obesity as Symptoms of Non-Sustainable Food Systems and Malnutrition. Applied Sciences 2019, 9. [CrossRef]
- Islam, S. Agriculture, food security, and sustainability: a review. Exploration of Foods and Foodomics 2025, 3. [CrossRef]
- Paudel, D.; Neupane, R.C.; Sigdel, S.; Poudel, P.; Khanal, A.R. COVID-19 Pandemic, Climate Change, and Conflicts on Agriculture: A Trio of Challenges to Global Food Security. Sustainability 2023, 15. [CrossRef]
- Raza, A.; Khare, T.; Zhang, X.; Rahman, M.M.; Hussain, M.; Gill, S.S.; Chen, Z.H.; Zhou, M.; Hu, Z.; Varshney, R.K. Novel Strategies for Designing Climate-Smart Crops to Ensure Sustainable Agriculture and Future Food Security. Journal of Sustainable Agriculture and Environment 2025, 4, e70048, [https://onlinelibrary.wiley.com/doi/pdf/10.1002/sae2.70048]. [CrossRef]
- Gamage, A.; Gangahagedara, R.; Subasinghe, S.; Gamage, J.; Guruge, C.; Senaratne, S.; Randika, T.; Rathnayake, C.; Hameed, Z.; Madhujith, T.; et al. Advancing sustainability: The impact of emerging technologies in agriculture. Current Plant Biology 2024, 40, 100420. [CrossRef]
- Nazarov, A.; Kulikova, E.; Molokova, E. Economic security through technological advancements in agriculture: A pathway to sustainable agro-industrial growth. BIO Web Conf. 2024, 121, 02012. [CrossRef]
- AlZubi, A.A.; Galyna, K. Artificial Intelligence and Internet of Things for Sustainable Farming and Smart Agriculture. IEEE Access 2023, 11, 78686–78692. [CrossRef]
- Patel, A.; Shukla, C.; Trivedi, A.; Balasaheb, K.S.; Sinha, M.K., Smart Farming: Utilization of Robotics, Drones, Remote Sensing, GIS, AI, and IoT Tools in Agricultural Operations and Water Management. In Integrated Land and Water Resource Management for Sustainable Agriculture Volume 1; Jadhav, D.A.; Khaple, S.; Wable, P.S.; Chendake, A.D., Eds.; Springer Nature Singapore: Singapore, 2025; pp. 127–151.
- Aarif K. O., M.; Alam, A.; Hotak, Y. Smart Sensor Technologies Shaping the Future of Precision Agriculture: Recent Advances and Future Outlooks. Journal of Sensors 2025, 2025, 2460098, [https://onlinelibrary.wiley.com/doi/pdf/10.1155/js/2460098]. [CrossRef]
- Moradi, S.; Bokani, A.; Hassan, J. UAV-based smart agriculture: A review of UAV sensing and applications. In Proceedings of the 2022 32nd international telecommunication networks and applications conference (ITNAC). IEEE, 2022, pp. 181–184.
- Lu, K.; Zhang, X.; Zhai, T.; Zhou, M. Adaptive sharding for UAV networks: A deep reinforcement learning approach to blockchain optimization. Sensors 2024, 24, 7279.
- Caruso, A.; Chessa, S.; Lopez, J.C.; Escolar, S.; Barba, J. Collection of Data With Drones in Precision Agriculture: Analytical Model and LoRa Case Study. IEEE Internet of Things Journal 2021, 8, 16692–16704. [CrossRef]
- Yin, J.; Lan, Y.; Long, Y.; Wu, B.; Jiang, L.; Zhan, H.; Zhu, J.; Xu, H.; Deng, H.; Chen, G. An Intelligent Field Monitoring System Based on Enhanced YOLO-RMD Architecture for Real-Time Rice Pest Detection and Management. Agriculture 2025, 15, 798. [CrossRef]
- Wang, T.; Zhao, Y.; Li Pang, L.L.; Cheng, Q. Evaluation method and design of greenhouse pear pollination drones based on grounded theory and integrated theory. PLOS ONE 2024, 19, 1–21. [CrossRef]
- Miyoshi, K.; Hiraguri, T.; Shimizu, H.; Hattori, K.; Kimura, T.; Okubo, S.; Endo, K.; Shimada, T.; Shibasaki, A.; Takemura, Y. Development of Pear Pollination System Using Autonomous Drones. AgriEngineering 2025, 7. [CrossRef]
- Bersani, C.; Ouammi, A.; Sacile, R.; Zero, E. Model Predictive Control of Smart Greenhouses as the Path towards Near Zero Energy Consumption. Energies 2020, 13. [CrossRef]
- Hati, A.J.; Singh, R.R. Smart Indoor Farms: Leveraging Technological Advancements to Power a Sustainable Agricultural Revolution. AgriEngineering 2021, 3, 728–767. [CrossRef]
- Singh, S.; Singh, P.; Kumar, A.; Baheliya, A.K.; Patel, K.K. Promoting Environmental Sustainability Through Vertical Farming: A Review. Journal of Advances in Biology & Biotechnology 2024, 27, 210–219. [CrossRef]
- Hiraguri, T.; Shimizu, H.; Kimura, T.; Matsuda, T.; Maruta, K.; Takemura, Y.; Ohya, T.; Takanashi, T. Autonomous Drone-Based Pollination System Using AI Classifier to Replace Bees for Greenhouse Tomato Cultivation. IEEE Access 2023, 11, 99352–99364. [CrossRef]
- V.J, R.; Inamdar, M.N. Impact of Autonomous Drone Pollination in Date Palms. International Journal of Innovative Research and Scientific Studies 2022, 5, 297–305. [CrossRef]
- Sadeh, A.; Shmida, A.; Keasar, T. The Carpenter Bee Xylocopa pubescens as an Agricultural Pollinator in Greenhouses. Apidologie 2007, 38, 508–517. [CrossRef]
- Slaa, E.J.; Sánchez Chaves, L.A.; Malagodi-Braga, K.S.; Hofstede, F.E. Stingless bees in applied pollination: practice and perspectives. Apidologie 2006, 37, 293–315. [CrossRef]
- Strader, J.; Nguyen, J.; Tatsch, C.; Du, Y.; Lassak, K.; Buzzo, B.; Watson, R.; Cerbone, H.; Ohi, N.; Yang, C.; et al. Flower Interaction Subsystem for a Precision Pollination Robot. In Proceedings of the 2019 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), 2019, pp. 5534–5541. [CrossRef]
- Ohi, N.; Lassak, K.; Watson, R.; Strader, J.; Du, Y.; Yang, C.; Hedrick, G.; Nguyen, J.; Harper, S.; Reynolds, D.; et al. Design of an Autonomous Precision Pollination Robot. In Proceedings of the 2018 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), 2018, pp. 7711–7718. [CrossRef]
- Lochan, K.; Khan, A.; Elsayed, I.; Suthar, B.; Seneviratne, L.; Hussain, I. Advancements in Precision Spraying of Agricultural Robots: A Comprehensive Review. IEEE Access 2024, 12, 129447–129483. [CrossRef]
- Karim, M.J. Autonomous Pollination System for Tomato Plants in Greenhouses: Integrating Deep Learning and Robotic Hardware Manipulation on Edge Device. In Proceedings of the 2024 International Conference on Innovations in Science, Engineering and Technology (ICISET), 2024, pp. 1–6. [CrossRef]
- Agrawal, J.; Arafat, M.Y. Transforming Farming: A Review of AI-Powered UAV Technologies in Precision Agriculture. Drones 2024, 8. [CrossRef]
- Broussard, M.A.; Coates, M.; Martinsen, P. Artificial Pollination Technologies: A Review. Agronomy 2023, 13. [CrossRef]
- Bell, J. Robots for Kiwifruit Harvesting and Pollination, 2025, [arXiv:cs.RO/2507.15484].
- Wu, P.; Lei, X.; Zeng, J.; Qi, Y.; Yuan, Q.; Huang, W.; Ma, Z.; Shen, Q.; Lyu, X. Research progress in mechanized and intelligentized pollination technologies for fruit and vegetable crops. International Journal of Agricultural and Biological Engineering 2024, 17, 11–21.
- Manzoor, S.H.; Kabir, M.H.; Zhang, Z. UAV-based apple flowers pollination system. In Towards Unmanned Apple Orchard Production Cycle: Recent New Technologies; Springer, 2023; pp. 211–236.
- Yablokova, A.; Kovalev, D.; Kovalev, I.; Podoplelova, V.; Astanakulov, K. Environmental safety problems of swarm use of UAVs in precision agriculture. In Proceedings of the E3S web of conferences. EDP Sciences, 2024, Vol. 471, p. 04018.
- Montilla-Pacheco, A.d.J.; Pacheco-Gil, H.A.; Pastrán-Calles, F.R.; Rodríguez-Pincay, I.R. Pollination with drones: A successful response to the decline of entomophiles pollinators? 2021.
- Francis, C.D.; Kleist, N.J.; Ortega, C.P.; Cruz, A. Noise pollution alters ecological services: enhanced pollination and disrupted seed dispersal. Proceedings of the Royal Society B: Biological Sciences 2012, 279, 2727–2735.
- Stehr, N.J. Drones: The Newest Technology for Precision Agriculture. Natural Sciences Education 2015, 44, 89–91, [https://acsess.onlinelibrary.wiley.com/doi/pdf/10.4195/nse2015.04.0772]. [CrossRef]
- García-Munguía, A.; Guerra-Ávila, P.L.; García-Munguía, A.M.; Islas-Ojeda, E.; Vázquez-Martínez, O.; Flores-Sánchez, J.L.; García-Munguía, O. A Review of Drone Technology and Operation Processes in Agricultural Crop Spraying. Drones 2024, 8, 674. [CrossRef]
- Akbar, J.U.M.; Kamarulzaman, S.F.; Muzahid, A.J.M.; Rahman, M.A.; Uddin, M. A comprehensive review on deep learning assisted computer vision techniques for smart greenhouse agriculture. IEEE Access 2024, 12, 4485–4522.
- Apriyanti, D.H.; Spreeuwers, L.J.; Lucas, P.J. Explainable automated wild-orchid identification combining deep neural networks and Bayesian networks. Engineering Applications of Artificial Intelligence 2025, 161, 111961.
- Arafat, M.Y.; Alam, M.M.; Moh, S. Vision-Based Navigation Techniques for Unmanned Aerial Vehicles: Review and Challenges. Drones 2023, 7, 89. [CrossRef]
- Choutri, K.; Shaiba, H.; Meshoul, S.; Chegrani, A.; Yahiaoui, M.; Lagha, M. Vision-Based UAV Detection and Localization to Indoor Positioning System. Sensors (Basel, Switzerland) 2024, 24, 4121. [CrossRef]
- Liu, Y.; Tan, Y. A Review of Visual SLAM Systems Based on Multi-Sensor Fusion. In Proceedings of the 2024 9th International Conference on Intelligent Informatics and Biomedical Sciences (ICIIBMS). IEEE, 2024, Vol. 9, pp. 304–310.
- Ruotsalainen, L.; Sokolova, N.; Morrison, A.; Rantanen, J.; Makela, M. Improving Computer Vision-Based Perception for Collaborative Indoor Navigation. IEEE Sensors Journal 2022, 22, 4816–4826. [CrossRef]
- Gupta, A.; Fernando, X. Simultaneous localization and mapping (slam) and data fusion in unmanned aerial vehicles: Recent advances and challenges. Drones 2022, 6, 85.
- Bach, S.H.; Yi, S.Y.; Khoi, P.B. Application of QR Code for Localization and Navigation of Indoor Mobile Robot. IEEE Access 2023, 11, 28384–28390. [CrossRef]
- Li, M.; Zhao, M.; Mao, H.; Gao, H. Development and Experimentation of a Real-Time Greenhouse Positioning System Based on IUKF-UWB. Agriculture 2024, 14, 1479. [CrossRef]
- Rahman, M.F.F.; Zhang, Y.; Chen, L.; Fan, S. A Comparative Study on Application of Unmanned Aerial Vehicle Systems in Agriculture. Agriculture 2021, 11, 22. [CrossRef]
- Suherman, S.; Pinem, M.; Putra, R.A. Ultrasonic Sensor Assessment for Obstacle Avoidance in Quadcopter-based Drone System. Institute of Electrical and Electronics Engineers, 2020, pp. 50–53. [CrossRef]
- Cheng, B.; He, X.; Li, X.; Zhang, N.; Song, W.; Wu, H. Research on Positioning and Navigation System of Greenhouse Mobile Robot Based on Multi-Sensor Fusion. Sensors (Basel, Switzerland) 2024, 24, 4998. [CrossRef]
- Tsai, P.S.; Wu, T.F.; Wang, Y.C. Automatic Quadrotor Dispatch Missions Based on Air-Writing Gesture Recognition. Processes 2025, 13. [CrossRef]
Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content.
© 2026 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (http://creativecommons.org/licenses/by/4.0/).