Submitted: 28 November 2025
Posted: 01 December 2025
Abstract
Keywords:
1. Introduction
2. Drone Sensor Technologies
2.1. Optical and Multispectral Imaging Sensors
2.2. Hyperspectral and Thermal Imaging Sensors
2.3. LiDAR and Radar Sensors
3. Drone Actuator Technologies
3.1. Gimbals and Stabilizers
3.2. Linear Actuators and Winches
3.3. Sampling and Dispersal Mechanisms
3.4. Rigid Grippers, Manipulators and Soft Robotic Actuators
4. Case Studies of Sensor-Actuator Deployments

| Year & study | Drone sensor used | Actuator description and how sensor and actuator work together | Usage |
| --- | --- | --- | --- |
| (McEvoy et al., 2016) Waterfowl survey evaluation | Phase One medium-format camera (50 MP, 80 mm lens), Sony A7-R (36.4 MP), Sony RX-1 (24.3 MP) and GoPro Hero (5 MP) mounted on fixed-wing and multirotor UAVs. | All cameras were mounted in gimbals to stabilise images. The gimbals kept the cameras level despite turbulence, and the UAVs’ power determined payload size. | Compared the disturbance caused by different UAV shapes and flight paths to free-living waterfowl while testing various camera payloads. High-resolution imagery allowed species recognition and demonstrated that flights ≥60 m (fixed-wing) or ≥40 m (multirotor) caused minimal disturbance. |
| (Shi et al., 2016) High-throughput phenotyping with fixed-wing and rotary UAVs (USA) | Three UAV platforms were tested: a Sentek GEMS multispectral camera (1.2 MP) on an Anaconda fixed-wing UAV, a Nikon J3 RGB camera and a modified NIR-enabled Nikon J3 multispectral camera on a Lancaster fixed-wing UAV, and a DJI P3-005 4K RGB camera on an X88 octocopter. The multispectral sensors provided reflectance in blue, green, red and near-infrared bands; the Nikon cameras provided RGB and NIR images. | Cameras were mounted on gimbal systems: the Lancaster’s cameras were triggered by an onboard controller that adjusted the frame rate based on flight speed; the X88 octocopter’s camera was remotely triggered. The gimbals kept the cameras level while the UAVs manoeuvred, and the onboard controllers triggered images at intervals to maintain sufficient forward and side overlap for mosaicking. This allowed the extraction of plant height, canopy coverage and phenotypic traits from multispectral and RGB images. | Provided early evidence that fixed-wing UAVs with multispectral cameras can rapidly acquire high-resolution imagery over breeding trials, enabling estimation of plant height and vegetation indices; allowed breeders to detect lodging, flowering time and yield differences across genotypes. |
| (Hodgson et al., 2016) Colony-nesting bird counts | Canon EOS M mirrorless camera (5184×3456 px) with 40 mm lens. | Camera mounted facing downward on an octocopter; vibration blur mitigated using a commercial vibration-dampening plate and iSPONGE for the flying-wing platform. Magic Lantern firmware controlled an intervalometer, capturing images every 2–3 seconds. | Obtained high-precision counts of colony-nesting birds in tropical and polar environments; UAV-derived counts were an order of magnitude more precise than ground counts. |
| (Kefauver et al., 2017) Comparative UAV and field phenotyping for barley | A Mikrokopter Oktokopter 6S12 XL multirotor UAV carried a Panasonic GX7 RGB camera (16 MP), a FLIR Tau2 640 thermal camera and a Tetracam mini-MCA multispectral camera (11 bands, 450–950 nm). | All cameras were mounted on an MK HiSight SLR2 active two-axis gimbal to correct for UAV pitch and roll; the gimbal ensured nadir pointing and consistent overlap, stabilising each sensor and allowing the UAV to carry different payloads in separate flights. | Combined RGB, thermal and multispectral imagery allowed estimation of canopy temperature, vegetation indices and nitrogen use efficiency across barley hybrids. UAV-based indices correlated strongly with yield, demonstrating that UAV phenotyping can replace labour-intensive field measurements. |
| (Raeva et al., 2019) Monitoring corn and barley fields with multispectral and thermal imagery | An eBee fixed-wing UAV (senseFly) carried a multiSPEC 4C four-band multispectral camera (green 550 nm, red 660 nm, red-edge 735 nm, NIR 790 nm) and a senseFly thermoMAP thermal camera (7.5–13.5 µm, 640×512 px). Image resolution was ~15 cm (multispectral) and 20 cm (thermal) at 40–150 m flight heights. | The UAV’s autopilot maintained consistent altitude and high overlap (~90%) to create large orthomosaics. The multispectral camera’s irradiance sensor and a calibration target allowed reflectance values to be computed; thermal images were calibrated by comparing shutter images with an internal temperature sensor. | Generated NDVI, NDRE and thermal maps for corn and barley fields over multiple months; thermal imagery complemented multispectral indices by revealing soil and canopy temperature variations. |
| (Lin et al., 2018) Aboveground tree biomass estimation of sparse subalpine coniferous forest with UAV oblique photography | A fixed-wing UAV carried a Sony ILCE-5100 RGB camera (6000×4000 px). The camera was mounted with its lens axis tilted ~20° from nadir to capture oblique views; images were processed with aerial triangulation (Structure-from-Motion) to produce point clouds and canopy height models. | The UAV platform had a 1.2 m wingspan, 0.8 m fuselage and 4.2 kg working weight, and used an ANXIANG 2012 flight controller to stabilise the aircraft during oblique imaging. An Inertial Measurement Unit (IMU) and GPS logged position and attitude; the ground control system handled trajectory planning and remote command. The camera’s tilt angle and the flight controller’s stabilisation allowed overlapping oblique photographs (80% longitudinal and 60% lateral overlap) to be taken at ~400 m altitude, producing high-resolution (0.05 m) point clouds over a sparse forest. | Demonstrated that oblique RGB photography can estimate above-ground biomass (AGB) in sparse subalpine forests. The allometric model using tree heights extracted from UAV point clouds achieved R² ≈ 0.96 and RMSE ~54.9 kg in AGB estimation, showing that fixed-wing UAVs with consumer cameras provide a cost-effective alternative to LiDAR for sparse forests. |
| (L. Zhang et al., 2019) Maize canopy temperature extraction with UAV thermal and RGB imagery | A hexacopter thermal remote sensing system with a PIXHAWK autopilot and a FLIR Vue Pro R 640 thermal camera (7.5–13.5 µm, 640×512 px) mounted on a Feiyu brushless gimbal, plus a DJI Phantom 4 Pro quadrotor with an integrated RGB camera (1-in CMOS, 4864×3648 px). | The Feiyu gimbal stabilised the thermal camera and kept it pointing vertically; calibration used black and white boards measured with an infrared thermometer. | Accurate co-registration of thermal and RGB orthomosaics enabled extraction of maize canopy temperature (Tc). Tc and derived indicators correlated strongly with stomatal conductance and water stress, demonstrating that high-resolution RGB imagery supplements thermal data for water stress monitoring. |
| (Shero et al., 2021) Grey seal energy dynamics via 3-D photogrammetry | 24.3 MP Sony α5100 mirrorless camera with 30 mm lens. | Mounted on a Freefly Mōvi M5 gimbal attached to a Freefly Alta 6 hexacopter. Two-person operation (pilot + gimbal/camera operator) enabled smooth 360° orbit flights; orbit-mode autopilot kept the UAV circling seal groups while the gimbal pointed the camera at the focal point. | Produced 3-D models and volumetric estimates of hundreds of grey seals simultaneously; 3-D body volume predicted true mass within 2–10% error. Enabled energy-transfer studies across lactation seasons. |
| (Costa et al., 2023) Whale blow sampling | Custom multirotor equipped with petri dishes (no imaging sensor) to collect whale exhaled mucus. | The drone carried a petri-dish payload; pilots positioned the multirotor above whale blowholes to capture droplets. Stabilisation and precise hovering acted as the actuator, enabling the petri dishes to intercept the exhaled plume. | Non-invasive collection of respiratory samples from humpback whales for microbiome and hormone analyses; avoided more invasive biopsy methods and reduced stress to the animals. |
| (Ma et al., 2022) Cotton yield estimation with RGB imagery | A DJI Phantom 4 Advanced quadrotor with an integrated RGB camera (5472×3648 px) captured ultra-high-resolution images (0.3 cm/pixel) at 10 m altitude. | The UAV’s built-in gimbal kept the camera pointing vertically. The gimbal and precise flight control enabled acquisition of 387 nadir images, which were mosaicked into orthophotographs retaining 8-bit RGB data. | Vegetation indices and texture features derived from the RGB images were used to build regression and machine-learning models (e.g., PLSR, support vector regression) for estimating cotton yield. The models achieved high correlation (R² ≈ 0.91) and demonstrated that ultra-high-resolution RGB imagery can monitor yield just before harvest. |
| (Deane, 2023) Low-cost drones for habitat change | DJI Mavic 2 Pro (20 MP Hasselblad camera), DJI Mavic 3T (48 MP), and 3DR Solo with GoPro Hero 4. | Integrated three-axis gimbals on each drone stabilised the RGB cameras during mapping flights; Mission Planner/Pix4D software executed automated grid flights with high overlap. | Produced high-resolution orthomosaics for mapping vegetation, mangroves and shoreline change in Bahamian national parks; demonstrated the feasibility of low-cost platforms for conservation monitoring. |
| (Camenzind & Yu, 2023) Multi-temporal multispectral UAV remote sensing for early wheat yield prediction | In 2021 the study used a DJI Phantom 4 Multispectral RTK UAV capturing reflectance at 450, 560, 650, 730 and 840 nm with an integrated sunlight sensor; a 10 m AGL flight height produced 0.7 cm GSD. In 2022 a MicaSense Dual Camera Kit (444, 560, 650, 717, 842 nm) mounted on a DJI Matrice M300 RTK at 30 m AGL produced 2.5 cm GSD. | Both UAVs had RTK positioning, and the cameras were mounted on integrated gimbals. The sunlight sensor and reflectance panels allowed radiometric calibration; the integrated gimbals stabilised the sensors during flight. | Time-series reflectance and texture features from heading to harvest enabled random-forest models to predict wheat grain yield weeks before flowering. Early-season spectral indices (e.g., red edge) were highly correlated with yield, offering breeders a non-destructive yield assessment tool. |
| (Hoffmann et al., 2023) Oyster reef volumetric monitoring | DJI Phantom 4 Pro multirotor with integrated 20 MP RGB camera. | Camera mounted on a three-axis gimbal; flights planned via DroneDeploy with high overlap; numerous ground control points (GCPs) ensured precise geo-referencing. | Generated centimetre-scale digital elevation models and orthomosaics to monitor seasonal growth of oyster reefs in the Wadden Sea. |
| (Demmer et al., 2024) Behavioural monitoring of grey-crowned cranes | DJI Mavic Air 2S with 1-inch 20 MP CMOS sensor and 8× zoom. | Three-axis gimbal stabilised the camera; low-noise propellers reduced disturbance. The UAV hovered or slowly circled above cranes at ≥50 m. | Recorded fine-scale behaviours of endangered cranes without eliciting flight responses; allowed quantification of foraging and social behaviours. |
| (Stern et al., 2024) Fine-scale soil-moisture mapping | Black Swift S2 fixed-wing and E2 multirotor carrying optical, near-infrared and thermal cameras and a passive L-band radiometer. | Sensors housed in the nose cone; the autopilot maintained stable flight to match the radiometer footprint. The multirotor performed low-altitude flights where the terrain required them. | Produced 3–50 m resolution soil-moisture maps over oak–grassland; data informed drought and wildfire risk management. |
| (Li et al., 2024) Winter wheat biomass estimation using UAV RGB and multispectral oblique photogrammetry | DJI Mavic 3M (20 MP RGB sensor and 5 MP multispectral sensor with green, red, red-edge and NIR bands) and DJI Mavic 3T (48 MP RGB sensor) were used. Oblique photography with a five-directional point cloud captured 3D structure. Flights were conducted at 30 m AGL with 80% forward and side overlap. | Cameras were maintained vertically or obliquely using the UAVs’ integrated gimbals; flight routes were planned in DJI Pilot 2. Nadir multispectral images provided spectral information, while oblique RGB images generated dense point clouds. | Integration of spectral indices and crop height metrics improved biomass estimation accuracy compared with spectral indices alone; oblique photogrammetry captured canopy structure that correlates with biomass. |
| (De Lima & Sepp, 2025) Fire-damage mapping in peatlands | eBee X fixed-wing drone with S.O.D.A. 20 MP RGB camera and Parrot Sequoia multispectral sensor (green, red, red-edge, NIR). | Both cameras mounted in stabilised bays; RTK GNSS provided precise geolocation. Flights with high forward and side overlap created dense image sets for photogrammetry. | Developed the triangular-area index (TAI) from multispectral reflectance to quantify fire-induced physiological damage in peatland vegetation; the index captured subtle stress signals. |
| (Sandino et al., 2025) Hyperspectral mapping of Antarctic mosses and lichens | Headwall Nano-Hyperspec camera (400–1000 nm, 270 bands) on a DJI M300 RTK; MicaSense Altum multispectral (RGB + NIR + thermal) and Sony α5100 on a BMR4-40 UAV. | Cameras mounted on Gremsy Pixy SM and Pixy U gimbals for stabilisation. Flights planned to achieve high overlap; RTK ensured accurate mosaicking. | Generated hyperspectral and multispectral maps of mosses and lichens with >95% classification accuracy using machine-learning algorithms. |
| (Wu et al., 2025) Developing a compatible biomass model with UAV LiDAR for Chinese fir | A HuaCe BB4 multirotor UAV equipped with an AS-1300HL LiDAR system featuring a Riegl VUX-1LR scanner (wavelength 1500 nm, pulse duration 3.5 ns, divergence 0.5 mrad). Flights at 200 m altitude and 10 m/s with a 30° scan angle produced ≈110 points/m². | The UAV flew a criss-cross trajectory with 50% lateral overlap to ensure uniform point distribution. The on-board flight control and GPS/IMU maintained stable altitude and heading; Corepore 2.0 and LiDAR360 software processed the point clouds. | Built component-wise biomass models for the trunk, bark, branches and leaves of Chinese fir across different growth stages. High-density UAV-LiDAR data enabled precise tree height and crown width extraction, yielding accurate AGB estimates. |
| (Dorelis et al., 2025) Multi-year assessment of winter rye rotation effects using XAG M500 | A XAG M500 RTK multirotor carried a 20 MP multispectral gimbal camera (6 bands: 450, 555, 660, 720, 750, 840 nm) with a mechanical shutter and automatic lens distortion correction. | The RTK system integrated multiple GNSS constellations, providing centimetre-level accuracy. The camera was mounted on a 3-axis stabilised gimbal, allowing consistent nadir imaging at a flight speed of 8 m/s; lens distortion correction and the mechanical shutter minimised motion blur. | Provided multi-year NDVI maps to evaluate crop rotation effects on winter rye; high temporal resolution enabled tracking canopy vigour and assessing long-term field management practices. |
5. Practical Constraints and Operational Considerations
6. Emerging Sensor Designs and Future Directions
7. Conclusions
References
- Acharya, B. S., Bhandari, M., Bandini, F., Pizarro, A., Perks, M., Joshi, D. R., Wang, S., Dogwiler, T., Ray, R. L., Kharel, G., & Sharma, S. (2021). Unmanned Aerial Vehicles in Hydrology and Water Management: Applications, Challenges, and Perspectives. In Water Resources Research (Vol. 57, Issue 11). John Wiley and Sons Inc. [CrossRef]
- Adhitama Putra Hernanda, R., Lee, H., Cho, J. il, Kim, G., Cho, B. K., & Kim, M. S. (2024). Current trends in the use of thermal imagery in assessing plant stresses: A review. In Computers and Electronics in Agriculture (Vol. 224). Elsevier B.V. [CrossRef]
- Alkan, M. N. (2024). High-Precision UAV Photogrammetry with RTK GNSS: Eliminating Ground Control Points. Hittite Journal of Science and Engineering, 11(4), 139–147. [CrossRef]
- Andersen, H. E., McGaughey, R. J., & Reutebuch, S. E. (2005). Estimating forest canopy fuel parameters using LIDAR data. Remote Sensing of Environment, 94(4), 441–449. [CrossRef]
- Arfaoui, A. (2017). Unmanned Aerial Vehicle: Review of Onboard Sensors, Application Fields, Open Problems and Research Issues. International Journal of Image Processing (IJIP), 11. https://www.researchgate.net/publication/315076314.
- Aromoye, I. A., Lo, H. H., Sebastian, P., Mustafa Abro, G. E., & Ayinla, S. L. (2025). Significant Advancements in UAV Technology for Reliable Oil and Gas Pipeline Monitoring. In CMES - Computer Modeling in Engineering and Sciences (Vol. 142, Issue 2, pp. 1155–1197). Tech Science Press. [CrossRef]
- Aurell, J., & Gullett, B. K. (2024). Effects of UAS Rotor Wash on Air Quality Measurements. Drones, 8(3). [CrossRef]
- Balestrieri, E., Daponte, P., De Vito, L., & Lamonaca, F. (2021). Sensors and measurements for unmanned systems: An overview. In Sensors (Vol. 21, Issue 4, pp. 1–27). MDPI AG. [CrossRef]
- Bannon, D. (2009). Hyperspectral imaging: Cubes and slices. Nature Photonics, 3(11), 627–629. [CrossRef]
- Beigi, P., Rajabi, M. S., & Aghakhani, S. (2022). An Overview of Drone Energy Consumption Factors and Models. In Handbook of Smart Energy Systems (pp. 1–20). Springer International Publishing. [CrossRef]
- Besenyő, J., & Őszi, A. (2025). Sensing From the Skies: A Comprehensive Analysis of the Latest Sensors on Drones. In Journal of Robotics (Vol. 2025, Issue 1). John Wiley and Sons Ltd. [CrossRef]
- Camenzind, M. P., & Yu, K. (2023). Multi temporal multispectral UAV remote sensing allows for yield assessment across European wheat varieties already before flowering. Frontiers in Plant Science, 14. [CrossRef]
- Castro, J., Morales-Rueda, F., Alcaraz-Segura, D., & Tabik, S. (2023). Forest restoration is more than firing seeds from a drone. Restoration Ecology, 31(1). [CrossRef]
- Chance, C. M., Coops, N. C., Plowright, A. A., Tooke, T. R., Christen, A., & Aven, N. (2016). Invasive shrub mapping in an urban environment from hyperspectral and LiDAR-derived attributes. Frontiers in Plant Science, 7. [CrossRef]
- Corte, A. P. D., Rex, F. E., de Almeida, D. R. A., Sanquetta, C. R., Silva, C. A., Moura, M. M., Wilkinson, B., Zambrano, A. M. A., da Cunha Neto, E. M., Veras, H. F. P., de Moraes, A., Klauberg, C., Mohan, M., Cardil, A., & Broadbent, E. N. (2020). Measuring individual tree diameter and height using gatoreye high-density UAV-lidar in an integrated crop-livestock-forest system. Remote Sensing, 12(5). [CrossRef]
- Costa, H., Rogan, A., Zadra, C., Larsen, O., Rikardsen, A. H., & Waugh, C. (2023). Blowing in the Wind: Using a Consumer Drone for the Collection of Humpback Whale (Megaptera novaeangliae) Blow Samples during the Arctic Polar Nights. Drones, 7(1). [CrossRef]
- Crowe, W., Costales, B., & Luthy, K. (2025). Design and fabrication of a 3D-printed drone-integrated winching system. [CrossRef]
- Daniels, L., Eeckhout, E., Wieme, J., Dejaegher, Y., Audenaert, K., & Maes, W. H. (2023). Identifying the Optimal Radiometric Calibration Method for UAV-Based Multispectral Imaging. Remote Sensing, 15(11). [CrossRef]
- De Lima, R. S., & Sepp, K. (2025). A novel spectral index designed for drone-based mapping of fire-damage levels: demonstration and relationship with biophysical variables in a peatland. Frontiers in Environmental Science, 13. [CrossRef]
- Deane, G. A. M. (2023). Using low-cost drones to map habitat change in Bahamian national parks.
- Demmer, C. R., Demmer, S., & McIntyre, T. (2024). Drones as a tool to study and monitor endangered Grey Crowned Cranes (Balearica regulorum): Behavioural responses and recommended guidelines. Ecology and Evolution, 14(2). [CrossRef]
- Dhruv, & Kaushal, H. (2025). A Review of Pointing Modules and Gimbal Systems for Free-Space Optical Communication in Non-Terrestrial Platforms. Photonics, 12(10), 1001. [CrossRef]
- Dorelis, M., Vaštakaitė-Kairienė, V., & Bogužas, V. (2025). UAV Multispectral Imaging for Multi-Year Assessment of Crop Rotation Effects on Winter Rye. Applied Sciences (Switzerland), 15(21). [CrossRef]
- Fumian, F., Di Giovanni, D., Martellucci, L., Rossi, R., & Gaudio, P. (2020). Application of miniaturized sensors to unmanned aerial systems, a new pathway for the survey of polluted areas: Preliminary results. Atmosphere, 11(5). [CrossRef]
- Galloway, K. C., Becker, K. P., Phillips, B., Kirby, J., Licht, S., Tchernov, D., Wood, R. J., & Gruber, D. F. (2016). Soft Robotic Grippers for Biological Sampling on Deep Reefs. Soft Robotics, 3(1), 23–33. [CrossRef]
- Ganesh Kumar, S. S., & Gudipalli, A. (2024). A comprehensive review on payloads of unmanned aerial vehicle. In Egyptian Journal of Remote Sensing and Space Science (Vol. 27, Issue 4, pp. 637–644). Elsevier B.V. [CrossRef]
- González-Desantos, L. M., Martínez-Sánchez, J., González-Jorge, H., Ribeiro, M., de Sousa, J. B., & Arias, P. (2019). Payload for contact inspection tasks with UAV systems. Sensors (Switzerland), 19(17). [CrossRef]
- Gu, R., Zhao, Y., & Ren, X. (2025). Integrating wind field analysis in UAV path planning: Enhancing safety and energy efficiency for urban logistics. Chinese Journal of Aeronautics, 103605. [CrossRef]
- Guebsi, R., Mami, S., & Chokmani, K. (2024). Drones in Precision Agriculture: A Comprehensive Review of Applications, Technologies, and Challenges. In Drones (Vol. 8, Issue 11). Multidisciplinary Digital Publishing Institute (MDPI). [CrossRef]
- Han, C., Jeong, Y., Ahn, J., Kim, T., Choi, J., Ha, J. H., Kim, H., Hwang, S. H., Jeon, S., Ahn, J., Hong, J. T., Kim, J. J., Jeong, J. H., & Park, I. (2023). Recent Advances in Sensor–Actuator Hybrid Soft Systems: Core Advantages, Intelligent Applications, and Future Perspectives. In Advanced Science (Vol. 10, Issue 35). John Wiley and Sons Inc. [CrossRef]
- Haque, K. M. S., Joshi, P., & Subedi, N. (2025). Integrating UAV-based multispectral imaging with ground-truth soil nitrogen content for precision agriculture: A case study on paddy field yield estimation using machine learning and plant height monitoring. Smart Agricultural Technology, 12, 101542. [CrossRef]
- Hodgson, J. C., Baylis, S. M., Mott, R., Herrod, A., & Clarke, R. H. (2016). Precision wildlife monitoring using unmanned aerial vehicles. Scientific Reports, 6. [CrossRef]
- Hoffmann, T. K., Pfennings, K., Hitzegrad, J., Brohmann, L., Welzel, M., Paul, M., Goseberg, N., Wehrmann, A., & Schlurmann, T. (2023). Low-cost UAV monitoring: insights into seasonal volumetric changes of an oyster reef in the German Wadden Sea. Frontiers in Marine Science, 10. [CrossRef]
- Hollaus, M., Wagner, W., Eberhöfer, C., & Karel, W. (2006). Accuracy of large-scale canopy heights derived from LiDAR data under operational constraints in a complex alpine environment. ISPRS Journal of Photogrammetry and Remote Sensing, 60(5), 323–338. [CrossRef]
- Hong, G., & Zhang, Y. (2008). A comparative study on radiometric normalization using high resolution satellite images. International Journal of Remote Sensing, 29(2), 425–438. [CrossRef]
- Huang, D., Zhou, Z., Zhang, Z., Du, X., Fan, R., Li, Q., & Huang, Y. (2025). From Application-Driven Growth to Paradigm Shift: Scientific Evolution and Core Bottleneck Analysis in the Field of UAV Remote Sensing. In Applied Sciences (Switzerland) (Vol. 15, Issue 15). Multidisciplinary Digital Publishing Institute (MDPI). [CrossRef]
- Huelsman, K., Epstein, H., Yang, X., Mullori, L., Červená, L., & Walker, R. (2022). Spectral variability in fine-scale drone-based imaging spectroscopy does not impede detection of target invasive plant species. Frontiers in Remote Sensing, 3. [CrossRef]
- Imran, & Li, J. (2025). UAV Aerodynamics and Crop Interaction: Revolutionizing Modern Agriculture with Drone. In Smart Agriculture (Singapore) (Vol. 13, pp. 1–462). Springer. [CrossRef]
- Jakob, S., Zimmermann, R., & Gloaguen, R. (2016). Processing of drone-borne hyperspectral data for geological applications. Workshop on Hyperspectral Image and Signal Processing, Evolution in Remote Sensing, 0. [CrossRef]
- Campbell, J. B., & Wynne, R. H. (2011). Introduction to Remote Sensing (5th ed.). Guilford Press. www.guilford.com/p/campbell2.
- Jaskierniak, D., Lucieer, A., Kuczera, G., Turner, D., Lane, P. N. J., Benyon, R. G., & Haydon, S. (2021). Individual tree detection and crown delineation from Unmanned Aircraft System (UAS) LiDAR in structurally complex mixed species eucalypt forests. ISPRS Journal of Photogrammetry and Remote Sensing, 171, 171–187. [CrossRef]
- Karahan, A., Demircan, N., Özgeriş, M., Gökçe, O., & Karahan, F. (2025). Integration of Drones in Landscape Research: Technological Approaches and Applications. In Drones (Vol. 9, Issue 9). Multidisciplinary Digital Publishing Institute (MDPI). [CrossRef]
- Kedzierski, J., & Chea, H. (2021). Multilayer microhydraulic actuators with speed and force configurations. Microsystems and Nanoengineering, 7(1). [CrossRef]
- Kefauver, S. C., Vicente, R., Vergara-Díaz, O., Fernandez-Gallego, J. A., Kerfal, S., Lopez, A., Melichar, J. P. E., Serret Molins, M. D., & Araus, J. L. (2017). Comparative UAV and field phenotyping to assess yield and nitrogen use efficiency in hybrid and conventional barley. Frontiers in Plant Science, 8. [CrossRef]
- Köppl, C. J., Malureanu, R., Dam-Hansen, C., Wang, S., Jin, H., Barchiesi, S., Serrano Sandí, J. M., Muñoz-Carpena, R., Johnson, M., Durán-Quesada, A. M., Bauer-Gottwein, P., McKnight, U. S., & Garcia, M. (2021). Hyperspectral reflectance measurements from UAS under intermittent clouds: Correcting irradiance measurements for sensor tilt. Remote Sensing of Environment, 267. [CrossRef]
- Lally, H. T., O’Connor, I., Jensen, O. P., & Graham, C. T. (2019). Can drones be used to conduct water sampling in aquatic environments? A review. In Science of the Total Environment (Vol. 670, pp. 569–575). Elsevier B.V. [CrossRef]
- Lee, Q. R., Hesse, H., Naruangsri, K., Takaew, W., Elliot, S., & Bhatia, D. (2025). UAV-Based Precision Seed Dropping for Automated Reforestation. [CrossRef]
- Leem, J., Mehrishal, S., Kang, I. S., Yoon, D. H., Shao, Y., Song, J. J., & Jung, J. (2025). Optimizing Camera Settings and Unmanned Aerial Vehicle Flight Methods for Imagery-Based 3D Reconstruction: Applications in Outcrop and Underground Rock Faces. Remote Sensing, 17(11). [CrossRef]
- Li, Y., Li, C., Cheng, Q., Chen, L., Li, Z., Zhai, W., Mao, B., & Chen, Z. (2024). Precision estimation of winter wheat crop height and above-ground biomass using unmanned aerial vehicle imagery and oblique photography point cloud data. Frontiers in Plant Science, 15. [CrossRef]
- Lin, J., Wang, M., Ma, M., & Lin, Y. (2018). Aboveground tree biomass estimation of sparse subalpine coniferous forest with UAV oblique photography. Remote Sensing, 10(11). [CrossRef]
- Liu, X., & Wang, L. (2018). Feasibility of using consumer-grade unmanned aerial vehicles to estimate leaf area index in mangrove forest. Remote Sensing Letters, 9(11), 1040–1049. [CrossRef]
- Loarie, S. R., Joppa, L. N., & Pimm, S. L. (2007). Satellites miss environmental priorities. In Trends in Ecology and Evolution (Vol. 22, Issue 12, pp. 630–632). [CrossRef]
- Lussem, U., Hollberg, J., Menne, J., Schellberg, J., & Bareth, G. (2017). Using calibrated RGB imagery from low-cost UAVs for grassland monitoring: Case study at the Rengen Grassland Experiment (RGE), Germany. International Archives of the Photogrammetry, Remote Sensing and Spatial Information Sciences - ISPRS Archives, 42(2W6), 229–233. [CrossRef]
- Lyu, X., Liu, S., Qiao, R., Jiang, S., & Wang, Y. (2025). Camera, LiDAR, and IMU Spatiotemporal Calibration: Methodological Review and Research Perspectives. In Sensors (Vol. 25, Issue 17). Multidisciplinary Digital Publishing Institute (MDPI). [CrossRef]
- Ma, Y., Ma, L., Zhang, Q., Huang, C., Yi, X., Chen, X., Hou, T., Lv, X., & Zhang, Z. (2022). Cotton Yield Estimation Based on Vegetation Indices and Texture Features Derived From RGB Image. Frontiers in Plant Science, 13. [CrossRef]
- Manfreda, S., McCabe, M. F., Miller, P. E., Lucas, R., Madrigal, V. P., Mallinis, G., Dor, E. Ben, Helman, D., Estes, L., Ciraolo, G., Müllerová, J., Tauro, F., de Lima, M. I., de Lima, J. L. M. P., Maltese, A., Frances, F., Caylor, K., Kohv, M., Perks, M., … Toth, B. (2018). On the use of unmanned aerial systems for environmental monitoring. In Remote Sensing (Vol. 10, Issue 4). MDPI AG. [CrossRef]
- Martínez-Carricondo, P., Agüera-Vega, F., & Carvajal-Ramírez, F. (2023). Accuracy assessment of RTK/PPK UAV-photogrammetry projects using differential corrections from multiple GNSS fixed base stations. Geocarto International, 38(1). [CrossRef]
- McEvoy, J. F., Hall, G. P., & McDonald, P. G. (2016). Evaluation of unmanned aerial vehicle shape, flight path and camera type for waterfowl surveys: Disturbance effects and species recognition. PeerJ, 2016(3). [CrossRef]
- Moghaddam, M., Saatchi, S., & Cuenca, R. H. (2000). Estimating subcanopy soil moisture with radar. Journal of Geophysical Research Atmospheres, 105(D11), 14899–14911. [CrossRef]
- Mohsan, S. A. H., Othman, N. Q. H., Li, Y., Alsharif, M. H., & Khan, M. A. (2023). Unmanned aerial vehicles (UAVs): practical aspects, applications, open challenges, security issues, and future trends. In Intelligent Service Robotics (Vol. 16, Issue 1, pp. 109–137). Springer Science and Business Media Deutschland GmbH. [CrossRef]
- Moreno-Martínez, Á., Izquierdo-Verdiguier, E., Maneta, M. P., Camps-Valls, G., Robinson, N., Muñoz-Marí, J., Sedano, F., Clinton, N., & Running, S. W. (2020). Multispectral high resolution sensor fusion for smoothing and gap-filling in the cloud. Remote Sensing of Environment, 247. [CrossRef]
- Mosharof, M. P. (n.d.). The Importance of Remote Sensing in Environmental Science. https://www.researchgate.net/publication/393977426.
- Mykytyn, P., Brzozowski, M., Dyka, Z., & Langendoerfer, P. (2024). A Survey on Sensor- and Communication-Based Issues of Autonomous UAVs. In CMES - Computer Modeling in Engineering and Sciences (Vol. 138, Issue 2, pp. 1019–1050). Tech Science Press. [CrossRef]
- Neuville, R., Bates, J. S., & Jonard, F. (2021). Estimating forest structure from UAV-mounted LiDAR point cloud using machine learning. Remote Sensing, 13(3), 1–19. [CrossRef]
- Nex, F., Armenakis, C., Cramer, M., Cucci, D. A., Gerke, M., Honkavaara, E., Kukko, A., Persello, C., & Skaloud, J. (2022). UAV in the advent of the twenties: Where we stand and what is next. In ISPRS Journal of Photogrammetry and Remote Sensing (Vol. 184, pp. 215–242). Elsevier B.V. [CrossRef]
- Nguyen, T. X. B., Rosser, K., & Chahl, J. (2021). A review of modern thermal imaging sensor technology and applications for autonomous aerial navigation. Journal of Imaging, 7(10). [CrossRef]
- Osmani, K., & Schulz, D. (2024). Comprehensive Investigation of Unmanned Aerial Vehicles (UAVs): An In-Depth Analysis of Avionics Systems. In Sensors (Vol. 24, Issue 10). Multidisciplinary Digital Publishing Institute (MDPI). [CrossRef]
- Ostrower, D. (2006). Optical Thermal Imaging - replacing microbolometer technology and achieving universal deployment. III-Vs Review, 19(6), 24–27. [CrossRef]
- Peksa, J., & Mamchur, D. (2024). A Review on the State of the Art in Copter Drones and Flight Control Systems. In Sensors (Vol. 24, Issue 11). Multidisciplinary Digital Publishing Institute (MDPI). [CrossRef]
- Poulsen, E., Rysgaard, S., Hansen, K., & Karlsson, N. B. (2024). Uncrewed aerial vehicle with onboard winch system for rapid, cost-effective, and safe oceanographic profiling in hazardous and inaccessible areas. HardwareX, 18, e00518. [CrossRef]
- Raeva, P. L., Šedina, J., & Dlesk, A. (2019). Monitoring of crop fields using multispectral and thermal imagery from UAV. European Journal of Remote Sensing, 52(sup1), 192–201. [CrossRef]
- Rietz, J., van Beeck Calkoen, S. T. S., Ferry, N., Schlüter, J., Wehner, H., Schindlatz, K. H., Lackner, T., von Hoermann, C., Conraths, F. J., Müller, J., & Heurich, M. (2023). Drone-Based Thermal Imaging in the Detection of Wildlife Carcasses and Disease Management. Transboundary and Emerging Diseases, 2023. [CrossRef]
- Saari, H., Aallos, V.-V., Akujärvi, A., Antila, T., Holmlund, C., Kantojärvi, U., Mäkynen, J., & Ollila, J. (2009). Novel miniaturized hyperspectral sensor for UAV and space applications. Sensors, Systems, and Next-Generation Satellites XIII, 7474, 74741M. [CrossRef]
- Sandino, J., Barthelemy, J., Doshi, A., Randall, K., Robinson, S. A., Bollard, B., & Gonzalez, F. (2025). Drone hyperspectral imaging and artificial intelligence for monitoring moss and lichen in Antarctica. Scientific Reports, 15(1). [CrossRef]
- Seidaliyeva, U., Ilipbayeva, L., Utebayeva, D., Smailov, N., & Matson, E. T. (2024). LiDAR Technology for UAV Detection: From Fundamentals and Operational Principles to Advanced Detection and Classification Techniques. [CrossRef]
- Sestras, P., Badea, G., Badea, A. C., Salagean, T., Oniga, V. E., Roșca, S., Bilașco, Ștefan, Bruma, S., Spalević, V., Kader, S., Billi, P., & Nedevschi, S. (2025). A novel method for landslide deformation monitoring by fusing UAV photogrammetry and LiDAR data based on each sensor’s mapping advantage in regards to terrain feature. Engineering Geology, 346. [CrossRef]
- Shahbazi, M., Théau, J., & Ménard, P. (2014). Recent applications of unmanned aerial imagery in natural resource management. GIScience and Remote Sensing, 51(4), 339–365. [CrossRef]
- Shaik, H. S. (2024). Design and development of a quadcopter for agricultural seeding. In Hyperautomation in Precision Agriculture: Advancements and Opportunities for Sustainable Farming (pp. 277–288). Elsevier. [CrossRef]
- Shero, M. R., Dale, J., Seymour, A. C., Hammill, M. O., Mosnier, A., Mongrain, S., & Johnston, D. W. (2021). Tracking wildlife energy dynamics with unoccupied aircraft systems and three-dimensional photogrammetry. Methods in Ecology and Evolution, 12(12), 2458–2472. [CrossRef]
- Shi, Y., Alex Thomasson, J., Murray, S. C., Ace Pugh, N., Rooney, W. L., Shafian, S., Rajan, N., Rouze, G., Morgan, C. L. S., Neely, H. L., Rana, A., Bagavathiannan, M. V., Henrickson, J., Bowden, E., Valasek, J., Olsenholler, J., Bishop, M. P., Sheridan, R., Putman, E. B., … Yang, C. (2016). Unmanned aerial vehicles for high-throughput phenotyping and agronomic research. PLoS ONE, 11(7). [CrossRef]
- Shin, J. Il, Seo, W. W., Kim, T., Park, J., & Woo, C. S. (2019). Using UAV multispectral images for classification of forest burn severity-A case study of the 2019 Gangneung forest fire. Forests, 10(11). [CrossRef]
- Sieberth, T., Wackrow, R., & Chandler, J. H. (2014). Motion blur disturbs - the influence of motion-blurred images in photogrammetry. Photogrammetric Record, 29(148), 434–453. [CrossRef]
- Stern, M., Ferrell, R., Flint, L., Kozanitas, M., Ackerly, D., Elston, J., Stachura, M., Dai, E., & Thorne, J. (2024). Fine-scale surficial soil moisture mapping using UAS-based L-band remote sensing in a mixed oak-grassland landscape. Frontiers in Remote Sensing, 5. [CrossRef]
- Straight, B. J., Castendyk, D. N., McKnight, D. M., Newman, C. P., Filiatreault, P., & Pino, A. (2021). Using an unmanned aerial vehicle water sampler to gather data in a pit-lake mining environment to assess closure and monitoring. Environmental Monitoring and Assessment, 193(9). [CrossRef]
- Stuart, M. B., McGonigle, A. J. S., & Willmott, J. R. (2019). Hyperspectral imaging in environmental monitoring: A review of recent developments and technological advances in compact field deployable systems. In Sensors (Switzerland) (Vol. 19, Issue 14). MDPI AG. [CrossRef]
- Stutsel, B., Johansen, K., Malbéteau, Y. M., & McCabe, M. F. (2021). Detecting Plant Stress Using Thermal and Optical Imagery From an Unoccupied Aerial Vehicle. Frontiers in Plant Science, 12. [CrossRef]
- Swaminathan, V., Thomasson, J. A., Hardin, R. G., Rajan, N., & Raman, R. (2024). Radiometric calibration of UAV multispectral images under changing illumination conditions with a downwelling light sensor. Plant Phenome Journal, 7(1). [CrossRef]
- Thompson, L. J., & Puntel, L. A. (2020). Transforming unmanned aerial vehicle (UAV) and multispectral sensor into a practical decision support system for precision nitrogen management in corn. Remote Sensing, 12(10). [CrossRef]
- Townsend, A., Jiya, I. N., Martinson, C., Bessarabov, D., & Gouws, R. (2020). A comprehensive review of energy sources for unmanned aerial vehicles, their shortfalls and opportunities for improvements. In Heliyon (Vol. 6, Issue 11). Elsevier Ltd. [CrossRef]
- Treccani, D., Adami, A., & Fregonese, L. (2024). Drones and Real-Time Kinematic Base Station Integration for Documenting Inaccessible Ruins: A Case Study Approach. Drones, 8(6). [CrossRef]
- Vera-Yanez, D., Pereira, A., Rodrigues, N., Molina, J. P., García, A. S., & Fernández-Caballero, A. (2024). Optical Flow-Based Obstacle Detection for Mid-Air Collision Avoidance. Sensors, 24(10). [CrossRef]
- Villi, Ö., & Yavuz, H. (2024). The utilization of gimbal systems in unmanned aerial vehicles. Advanced UAV, 2024(1), 19–30. http://publish.mersin.edu.tr/index.php/uav.
- Wagner, W., Pathe, C., Doubkova, M., Sabel, D., Bartsch, A., Hasenauer, S., Blöschl, G., Scipal, K., Martínez-Fernández, J., & Löw, A. (2008). Temporal Stability of Soil Moisture and Radar Backscatter Observed by the Advanced Synthetic Aperture Radar (ASAR). Sensors, 8, 1174–1197. www.mdpi.org/sensors.
- Wan, Q., Smigaj, M., Brede, B., & Kooistra, L. (2024). Optimizing UAV-based uncooled thermal cameras in field conditions for precision agriculture. International Journal of Applied Earth Observation and Geoinformation, 134. [CrossRef]
- Wu, Z., Xie, D., Liu, Z., Chen, Q., Ye, Q., Ye, J., Wang, Q., Liao, X., Wang, Y., Sharma, R. P., & Fu, L. (2025). Developing compatibility biomass model based on UAV LiDAR data of Chinese fir (Cunninghamia lanceolata) in Southern China. Frontiers in Plant Science, 16. [CrossRef]
- Xiao, C., Wang, B., Zhao, D., & Wang, C. (2023). Comprehensive investigation on Lithium batteries for electric and hybrid-electric unmanned aerial vehicle applications. Thermal Science and Engineering Progress, 38. [CrossRef]
- Xing, S., Zhang, X., Tian, J., Xie, C., Chen, Z., & Sun, J. (2024). Morphing Quadrotors: Enhancing Versatility and Adaptability in Drone Applications—A Review. In Drones (Vol. 8, Issue 12). Multidisciplinary Digital Publishing Institute (MDPI). [CrossRef]
- Xu, W., Yang, W., Wu, J., Chen, P., Lan, Y., & Zhang, L. (2023). Canopy Laser Interception Compensation Mechanism—UAV LiDAR Precise Monitoring Method for Cotton Height. Agronomy, 13(10). [CrossRef]
- Yan, Y., Lei, J., & Huang, Y. (2024). Forest Aboveground Biomass Estimation Based on Unmanned Aerial Vehicle–Light Detection and Ranging and Machine Learning. Sensors, 24(21). [CrossRef]
- Yu, K., Belwalkar, A., Wang, W., Hu, Y., Hunegnaw, A., Nurunnabi, A., Ruf, T., Li, F., Jia, L., Kooistra, L., Miao, Y., & Teferle, F. N. (2025). UAV Hyperspectral Remote Sensing for Crop Nitrogen Monitoring: Progress, Challenges, and Perspectives. Smart Agricultural Technology, 101507. [CrossRef]
- Zhan, Y., Chen, P., Xu, W., Chen, S., Han, Y., Lan, Y., & Wang, G. (2022). Influence of the downwash airflow distribution characteristics of a plant protection UAV on spray deposit distribution. Biosystems Engineering, 216, 32–45. [CrossRef]
- Zhang, G., & Hsu, L. T. (2018). Intelligent GNSS/INS integrated navigation system for a commercial UAV flight control system. Aerospace Science and Technology, 80, 368–380. [CrossRef]
- Zhang, L., Niu, Y., Zhang, H., Han, W., Li, G., Tang, J., & Peng, X. (2019). Maize Canopy Temperature Extracted From UAV Thermal and RGB Imagery and Its Application in Water Stress Monitoring. Frontiers in Plant Science, 10. [CrossRef]
- Zhao, J. S., Sun, X. C., & Sun, H. L. (2025). Design and strength analysis of a gimbaled nozzle mechanism. Mechanism and Machine Theory, 209. [CrossRef]

| Sensor Type | Key Specifications | Applications |
| --- | --- | --- |
| RGB/optical camera | High-resolution CMOS sensor (≥ 20 MP); global shutter; high dynamic range | Photogrammetry; structure-from-motion 3D models; land-cover and habitat mapping |
| Multispectral camera | Discrete bands (e.g., blue, green, red, red-edge, near-infrared); radiometric calibration; centimetre-level geolocation | Vegetation indices (e.g., NDVI) for crop monitoring, biomass estimation, irrigation and nitrogen management; burn-severity mapping |
| Thermal (microbolometer) | Uncooled microbolometer array; long-wave IR (7.5–13.5 µm) with < 50 mK thermal sensitivity; often integrated with an RGB camera | Detecting warm-blooded animals and wildlife carcasses; crop water-stress mapping; hotspot detection in wildfires; assessing heat leaks and smouldering areas |
| Hyperspectral camera | Hundreds of contiguous bands; example miniature sensor: 400–1100 nm, 5–10 nm spectral resolution, < 50 g weight | Species discrimination and invasive-species mapping; assessing crop nutrient status and plant stress |
| LiDAR | Scanning laser emits pulsed light and measures return time to generate a dense 3-D point cloud; typical range limited by power/weight budget | Forest inventory and above-ground biomass estimation; canopy height and crown dimension mapping; topographic mapping; landslide/erosion monitoring |
| Radar | Millimetre-wave or frequency-modulated continuous-wave sensors; emit radio waves and record frequency/time shifts | Collision avoidance and terrain following for small UAVs; canopy biomass and soil-moisture estimation via backscatter |
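As a concrete illustration of the vegetation indices listed above, NDVI is computed per pixel from the red and near-infrared reflectance bands of a multispectral camera. The sketch below is not taken from any cited study; it is a minimal example assuming calibrated reflectance values in the range 0–1, with a small epsilon added to the denominator to avoid division by zero over dark pixels.

```python
import numpy as np

def ndvi(nir, red, eps=1e-9):
    """Normalized Difference Vegetation Index: (NIR - Red) / (NIR + Red).

    nir, red: scalar or array reflectance values (0-1 after calibration).
    eps: small constant guarding against division by zero on dark pixels.
    """
    nir = np.asarray(nir, dtype=float)
    red = np.asarray(red, dtype=float)
    return (nir - red) / (nir + red + eps)

# Healthy vegetation reflects strongly in NIR and absorbs red light,
# so NDVI approaches +1; bare soil and water fall near zero or below.
vegetation = ndvi(0.45, 0.05)   # ~0.8, dense canopy
bare_soil = ndvi(0.30, 0.25)    # ~0.09, sparse cover
```

Applied over a whole co-registered raster, the same function yields an NDVI map that can be thresholded for crop-monitoring or burn-severity analysis.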
Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content.
© 2025 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (http://creativecommons.org/licenses/by/4.0/).