Preprint
Review

This version is not peer-reviewed.

Sensor and Actuator Technologies for Environmental Unmanned Aerial Vehicles Operations

Submitted: 28 November 2025
Posted: 01 December 2025


Abstract
Over the past decade, drones have moved from experimental tools to practical workhorses in environmental monitoring. Their real value lies in the sensors and actuators they carry, which determine the kind of information that can be collected and how reliably it can be gathered. This review looks at the range of sensor systems now used in ecological, forestry, and agricultural work—from standard RGB cameras to multispectral and hyperspectral units, LiDAR, thermal imagers, and radar. It also considers the often-overlooked role of actuators, such as gimbals and stabilizing mounts, that keep sensors steady and improve data quality, as well as devices for collecting physical samples. Drawing on field-based examples, this study explores how these technologies are deployed for tasks like mapping forest biomass, tracking habitat change, assessing fire recovery, and monitoring crops. It concludes by discussing practical constraints, emerging sensor designs, and the likely direction of drone-based monitoring in the years ahead.

1. Introduction

Remote sensing has long been a cornerstone of environmental research and management (Mosharof, 2025), yet conventional satellite and crewed-aircraft imagery impose significant limitations (Manfreda et al., 2018). These include the expense of acquiring high-resolution, multi-temporal data and the difficulty of obtaining image sequences with consistent spectral and spatial resolution, owing to atmospheric and radiometric correction requirements (Hong & Zhang, 2008; Moreno-Martínez et al., 2020). Many environmental and ecological phenomena are also missed because satellites observe them at spatial and temporal scales that do not match the scales at which they occur (Loarie et al., 2007). In the early 1990s, the emergence of lightweight unmanned aerial vehicles (UAVs), also known as drones, began to address these constraints (Imran & Li, 2025). Early UAVs were small-scale and often required custom payload mounts; within a decade, however, they evolved from experimental remote sensing platforms into consumer-grade platforms that enabled environmental users to collect high-precision, high-quality data over remote or inaccessible areas at relatively low cost (Acharya et al., 2021; Huang et al., 2025a).
A key driver of this transformation has been the parallel evolution of the UAV platform, including functional components such as sensors and actuators (Arfaoui, 2017; Balestrieri et al., 2021; Mykytyn et al., 2024). By the mid-2010s, three-axis gimbals capable of stabilising cameras across pitch, roll, and yaw had replaced the earlier two-axis gimbals and the vibration dampers used in early experiments (Zhao et al., 2025). These sensor mounts not only minimise motion blur and spectral smearing but also ensure consistent viewing geometry during manoeuvres, which is critical for both photogrammetry and radiometric calibration (Balestrieri et al., 2021). At the same time, advances in propulsion, flight control systems, and integrated Global Navigation Satellite System (GNSS) navigation dramatically improved positional accuracy (G. Zhang & Hsu, 2018). Modern UAVs often carry real-time kinematic (RTK) GNSS receivers that receive corrections from a ground base station, enabling centimetre-scale georeferencing without over-reliance on ground control points (Alkan, 2024), particularly when such points are unavailable or incomplete, while still allowing accurate stitching and alignment of images (Martínez-Carricondo et al., 2023; Treccani et al., 2024).
Sensor technology has matured in tandem with these actuator components (Han et al., 2023). At the start of the shift from experimental to consumer-grade systems, most UAVs carried consumer-grade RGB cameras, which limited image analysis to colour and texture (Liu & Wang, 2018; Lussem et al., 2017). Well before the COVID-19 pandemic, off-the-shelf multispectral and near-infrared cameras were widely available, allowing users to derive a variety of vegetation indices, including the normalised difference vegetation index (NDVI) (Li et al., 2025). Thermal cameras, capable of detecting emissivity and object temperature in a non-contact and non-destructive manner, also became accessible (Adhitama Putra Hernanda et al., 2024; Nguyen et al., 2021). More recent developments in compact LiDAR units, now mounted on multirotor platforms, have opened new possibilities for mapping canopy structure thanks to their canopy penetration and multiple-return capabilities (Neuville et al., 2021). Furthermore, hyperspectral sensors with hundreds of spectral bands (Bannon, 2009), far beyond what RGB and multispectral sensors offer, are also emerging, providing fine-grained spectral information for species identification and detailed assessment of forest and agricultural vegetation health.
Collectively, these developments have shifted UAVs from experimental tools to standard instruments across ecology, forestry, and agriculture (Haque et al., 2025; Shin et al., 2019; Stutsel et al., 2021; Thompson & Puntel, 2020). The ability to deploy lightweight platforms equipped with stabilised sensors, ranging from RGB cameras to multispectral, thermal, and LiDAR units, has given users unprecedented flexibility in designing spatially and temporally specified surveys (Guebsi et al., 2024; Karahan et al., 2025; Shahbazi et al., 2014). Demonstrations of these platforms include studies on forest biomass mapping, habitat change, fire recovery tracking, crop growth, and lodging monitoring. This study provides a comprehensive overview of current advancements in drone sensors and actuators, evaluating the capabilities of key sensor types, from standard RGB to hyperspectral and thermal imaging, and analysing the role of stabilisation and sampling actuators in enhancing data fidelity. We also discuss practical limitations, including payload capacity, endurance constraints, environmental hazards, data management complexities, and regulatory frameworks, while highlighting emerging innovations such as miniaturised hyperspectral sensors, solid-state LiDAR, on-board machine learning, and swarm robotics. By reviewing current capabilities alongside these gaps, the study offers foundational knowledge for deploying sensors and actuators in UAV-based environmental monitoring and identifies future research directions.

2. Drone Sensor Technologies

Environmental UAVs carry an array of onboard sensors that convert optical, structural, and thermal signals into digital data. When coupled with precise positioning and flight control systems, these sensors transform drones into versatile remote sensing platforms for ecology, forestry, and agriculture (Haque et al., 2025; Shin et al., 2019; Stutsel et al., 2021; Thompson & Puntel, 2020). Table 1 shows their specifications and applications, and Figure 1 provides a brief visual description of these components.

2.1. Optical and Multispectral Imagers

Building on foundational advances in remote sensing, optical cameras record broadband visible imagery, while multispectral cameras split incoming light into discrete bands—typically blue, green, red, red-edge, and near-infrared (NIR), spanning roughly 400 nm to beyond 700 nm—and record each band on dedicated detectors (James Campbell & Wynne, 2011). Reflectance-based indices such as the normalised difference vegetation index (NDVI) exploit differences between red and NIR reflectance to measure chlorophyll content and plant vigour and to detect shifts in vegetation health.
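To ground this, the short sketch below (Python/NumPy; the band values are hypothetical) shows the per-pixel NDVI arithmetic applied to co-registered red and NIR reflectance rasters—the same computation run over orthomosaics produced from multispectral flights.

```python
import numpy as np

def ndvi(red: np.ndarray, nir: np.ndarray) -> np.ndarray:
    """NDVI = (NIR - red) / (NIR + red), computed per pixel on reflectance bands."""
    red = red.astype(float)
    nir = nir.astype(float)
    denom = nir + red
    # Guard against division by zero where both bands are dark.
    return np.where(denom > 0, (nir - red) / denom, np.nan)

# Hypothetical 2x3 reflectance patches: left columns vegetated, right column bare soil.
red_band = np.array([[0.05, 0.06, 0.30], [0.04, 0.05, 0.28]])
nir_band = np.array([[0.45, 0.50, 0.32], [0.48, 0.52, 0.30]])
print(ndvi(red_band, nir_band))  # ~0.8 over vegetation, ~0.03 over soil
```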

2.2. Hyperspectral and Thermal Imagers

Hyperspectral systems use prisms and diffraction gratings to record hundreds of spectral bands for each pixel, yielding a full reflectance spectrum (Bannon, 2009). This detailed information is crucial for assessing pigment and nutrient concentrations of plant species and their biochemical properties (Chance et al., 2016; Huelsman et al., 2022). Sensor miniaturisation has delivered wavelength coverage from approximately 500 to 1000 nm and hyperspectral payloads weighing tens of grams, making them suitable for fixed-wing drones (Jakob et al., 2016; Stuart et al., 2019). Applications of hyperspectral sensors include invasive species mapping, plant-stress detection, and crop monitoring at high resolution (Yu et al., 2025). Thermal imagers use long-wave infrared sensors, typically uncooled microbolometer arrays, to measure thermal emission. For example, the DJI XT2 integrates the FLIR Tau microbolometer with a spectral range of 7.5 to 13.5 µm, coupled with a highly sensitive visual sensor, enabling temperature data to be overlaid on colour imagery (Ostrower, 2006).
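As a hedged illustration of how a per-pixel reflectance spectrum is used, the sketch below applies the spectral angle mapper—a classifier commonly used for tasks such as invasive species mapping—to a hypothetical reflectance cube; the cube dimensions, random data, and 0.1-radian threshold are illustrative assumptions, not values from any cited study.

```python
import numpy as np

def spectral_angle(cube: np.ndarray, reference: np.ndarray) -> np.ndarray:
    """Angle (radians) between each pixel spectrum and a reference spectrum.

    cube: (rows, cols, bands) reflectance cube; reference: (bands,) spectrum.
    Smaller angles mean closer spectral matches, e.g. to a target species."""
    dots = np.tensordot(cube, reference, axes=([2], [0]))
    norms = np.linalg.norm(cube, axis=2) * np.linalg.norm(reference)
    cosines = np.clip(dots / np.maximum(norms, 1e-12), -1.0, 1.0)
    return np.arccos(cosines)

# Hypothetical 270-band cube and a library spectrum for a target species.
rng = np.random.default_rng(0)
cube = rng.uniform(0.0, 1.0, size=(120, 120, 270))
target_spectrum = rng.uniform(0.0, 1.0, size=270)
candidates = spectral_angle(cube, target_spectrum) < 0.1  # boolean target mask
```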

2.3. LiDAR and Radar Sensors

Light detection and ranging (LiDAR) instruments emit laser pulses and measure their return time to generate three-dimensional point clouds that capture canopy height, crown size, and ground elevation (Andersen et al., 2005; Corte et al., 2020; Hollaus et al., 2006; Jaskierniak et al., 2021; Yan et al., 2024). However, the size, weight, and energy consumption of LiDAR scanners currently confine their use to UAVs with higher payload capacity (Seidaliyeva et al., 2024). Radar sensors, likewise, emit radio waves and record the time delay and frequency shift of the reflected signals to measure range and relative motion (Besenyő & Őszi, 2025). Small UAVs employ millimetre-wave or frequency-modulated continuous-wave (FMCW) radars for collision avoidance and terrain following (Markow & Catherall, 2025). Radar backscatter analysis can also be used to estimate canopy biomass and soil moisture (Moghaddam et al., 2000; Wagner et al., 2008).
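To sketch how return times become structure metrics, the toy example below grids a point cloud into a simple canopy height model by differencing the highest and lowest returns in each cell; operational pipelines instead classify ground returns and interpolate a terrain model, so this is an approximation that assumes some pulses reach the ground.

```python
import numpy as np

def canopy_height_model(points: np.ndarray, cell: float = 1.0) -> np.ndarray:
    """Grid an (N, 3) array of x, y, z LiDAR returns into a canopy height model.

    CHM per cell = highest return (canopy surface) minus lowest return
    (approximate ground); cells with no returns become NaN."""
    ix = ((points[:, 0] - points[:, 0].min()) / cell).astype(int)
    iy = ((points[:, 1] - points[:, 1].min()) / cell).astype(int)
    dsm = np.full((ix.max() + 1, iy.max() + 1), -np.inf)  # surface model
    dtm = np.full_like(dsm, np.inf)                       # crude terrain model
    for i, j, z in zip(ix, iy, points[:, 2]):
        dsm[i, j] = max(dsm[i, j], z)
        dtm[i, j] = min(dtm[i, j], z)
    chm = dsm - dtm
    chm[~np.isfinite(chm)] = np.nan
    return chm
```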

3. Drone Actuator Technologies

While sensors capture environmental information, actuators enable drones to physically interact with their surroundings and ensure consistent data collection (Shaik, 2024). They stabilise cameras and scanners, manipulate sampling devices, and disperse materials such as seeds or reforestation pellets (Kunz & Ertl, 2022). Advances in motor design, control algorithms, and mechanical integration have produced a diverse suite of actuators optimised for these tasks (Stamate et al., 2022; Shaik, 2025). The following subsections review the major actuator categories and their roles in ecological and agricultural missions.

3.1. Gimbals and Stabilizers

Motorised gimbals balance the camera across multiple axes, counteracting unwanted movement so that sensors deliver steady, smooth coverage. They use two or three brushless motors arranged orthogonally and guided by an inertial measurement unit (IMU) to counteract roll, pitch, and yaw (Zhao et al., 2025). By keeping the optical axis level, gimbals suppress motion blur and distortion, increasing the accuracy of photogrammetry and the signal-to-noise ratio of multispectral or hyperspectral measurements (Leem et al., 2025; Sieberth et al., 2014). Gimbals are rated by payload capacity—from a few hundred grams to several kilograms—and by their degrees of freedom: three-axis mounts are typical for mapping and surveying, while two-axis versions suffice for smaller fixed-wing aircraft (Dhruv & Kaushal, 2025). High-end mounts may also integrate optical zoom lenses, thermal modules, or laser rangefinders, delivering co-registered datasets from a single stabilised platform, and some incorporate linear servos that adjust the mount instantly to the UAV's motion.
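A simplified sketch of the control scheme such mounts typically combine is shown below: feedforward counter-rotation from the airframe IMU plus per-axis feedback that trims the residual error seen by a camera-side IMU. The gains, the small-angle per-axis decoupling, and the nadir setpoint are illustrative assumptions rather than any vendor's implementation.

```python
import numpy as np

def gimbal_joint_targets(airframe_rpy, camera_setpoint=(0.0, -90.0, 0.0)):
    """Feedforward joint angles that cancel airframe motion (small-angle model).

    airframe_rpy: IMU-measured (roll, pitch, yaw) of the airframe in degrees;
    camera_setpoint: desired camera attitude, with pitch -90 deg pointing nadir."""
    return np.asarray(camera_setpoint, float) - np.asarray(airframe_rpy, float)

class AxisPID:
    """Per-axis feedback loop that trims residual pointing error."""
    def __init__(self, kp=3.0, ki=0.4, kd=0.05):
        self.kp, self.ki, self.kd = kp, ki, kd
        self.integral = 0.0
        self.prev_error = 0.0

    def update(self, error_deg, dt):
        self.integral += error_deg * dt
        derivative = (error_deg - self.prev_error) / dt
        self.prev_error = error_deg
        return self.kp * error_deg + self.ki * self.integral + self.kd * derivative

# Turbulence tilts the airframe; feedforward cancels most of it, feedback the rest.
targets = gimbal_joint_targets(airframe_rpy=(4.2, 7.5, 181.0))
pitch_trim = AxisPID().update(error_deg=0.3, dt=0.005)  # 0.3 deg residual at 200 Hz
```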

3.2. Linear Actuators and Winches

Linear actuators convert electrical energy into controlled linear movement to operate hatches, deploy and extend probes, and manage valves within payload bays (McIvor & Chahl, 2020). Precision is essential: a mistimed release of a seed dispersal pod or sediment trap can endanger bystanders or yield poor sample quality. Modern micro actuators incorporate secure locking mechanisms, lead-screw or belt-drive transmissions, and low-power motors, allowing millimetre-scale accuracy on lightweight platforms (Kedzierski & Chea, 2021). Because sampling often requires lowering instruments below the rotor wash or into water (Aurell & Gullett, 2024; Zhan et al., 2022), motorised winches spool lines to raise and lower sensors or bottles, using feedback from tension and depth sensors to ensure safe deployment (Crowe et al., 2025). In complex terrain or dense canopies, it may be advantageous to insert cameras or sensors into cavities, beneath canopy layers, or into caves. Winch-borne camera systems lower a payload on a cable, allowing inspection with precise winch control and stabilisation at the camera end; these systems help minimise swinging and obtain usable footage (Poulsen et al., 2024).
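The sketch below illustrates the feedback logic described above for a winch deployment; read_depth, read_tension, and set_motor_speed are hypothetical hardware callbacks, and the speeds and slack threshold are placeholder values.

```python
import time

def lower_sampler(target_depth_m, read_depth, read_tension, set_motor_speed,
                  max_speed=0.3, slack_n=2.0, dt=0.1):
    """Lower a tethered sampler to a target depth using tension/depth feedback.

    Payout slows as the target nears and pauses when line tension drops,
    since a slack line can indicate a snag or a grounded payload."""
    while (depth := read_depth()) < target_depth_m:
        if read_tension() < slack_n:
            set_motor_speed(0.0)  # slack line: stop paying out
        else:
            remaining = target_depth_m - depth
            set_motor_speed(min(max_speed, 0.5 * remaining))  # m/s, ramped down
        time.sleep(dt)  # 10 Hz control loop
    set_motor_speed(0.0)  # hold at sampling depth
```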

3.3. Sampling and Dispersal Mechanisms

Ecological and agricultural missions often require UAVs to collect physical samples and disperse materials, using air pumps positioned above the rotor wash and enabled by linear actuators and winches. Examples include water-sampling drones that suspend Ruttner or Van Dorn bottles on tethers to collect uncontaminated samples from wetlands and rivers, while tethered surveillance platforms supply continuous power and high-bandwidth data for long-duration habitat monitoring, albeit at the cost of reduced mobility (Lally et al., 2019; Straight et al., 2021). The functionality of pump-based UAV devices was reported by Lally et al. (2019), who noted that winches aided flow-rate control and that sampling intervals could be adjusted remotely to avoid downwash and contamination. In habitat restoration, actuators can regulate the release of reforestation pellets or native seeds from servo-controlled hoppers to rehabilitate degraded or post-fire landscapes (Castro et al., 2023; Lee et al., 2025). Coupled with spectral data on burn severity or species composition, these dispersal systems enable targeted reseeding and minimise disturbance, as sketched below.
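As a minimal sketch of how burn-severity maps could drive a servo-controlled hopper, the example below converts per-cell severity classes into release quantities; the class labels, rates, and cell size are hypothetical, and the hopper servo would meter the computed amount as the UAV crosses each cell.

```python
def seed_release_plan(burn_severity, cell_area_m2=25.0):
    """Map per-cell burn severity classes to seeds dispensed per grid cell.

    burn_severity: list of (cell_id, severity_class) pairs derived from
    spectral burn-severity maps; rates are seeds per square metre."""
    rates = {"high": 12.0, "moderate": 6.0, "low": 0.0}  # illustrative rates
    return {cell: rates[severity] * cell_area_m2 for cell, severity in burn_severity}

plan = seed_release_plan([("a1", "high"), ("a2", "low"), ("a3", "moderate")])
# {'a1': 300.0, 'a2': 0.0, 'a3': 150.0} seeds per 25 m^2 cell
```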

3.4. Rigid Grippers, Manipulators and Soft Robotic Actuators

Traditional rigid grippers and manipulators are heavy and require precise models and control, which limits their use on small UAVs and poses significant challenges to manoeuvrability. Bio-inspired grippers instead use flexible materials and phase-change mechanisms to conform around objects, while self-contained soft grippers inspired by tendril-climbing plants can curl around targets and grasp them without complex planning (Guo et al., 2024). Looking forward, deployable arms and grippers that allow drones to perch and manipulate objects, together with soft robotic actuators inspired by plant tendrils and animal appendages, could enable drones to gently collect samples or install sensors in sensitive habitats (Galloway et al., 2016). Furthermore, miniaturisation and integration of actuators with control electronics will support more complex physical interactions while conserving weight and energy, thereby broadening the role of drones in environmental monitoring (Fumian et al., 2020; Nex et al., 2022).

4. Case Studies of Sensor-Actuator Deployments

We highlight diverse studies in Table 2 that span a decade of UAV-based environmental use cases and collectively show how UAV sensing and actuation have evolved and been applied across ecology, forestry, and agriculture. Firstly, our compilation shows that early ecological studies relied on single-camera sensing, specifically RGB sensors mounted on gimbals for wildlife surveys (McEvoy et al., 2016). As sensor technology progressed, multispectral and thermal cameras were used in tandem (Kefauver et al., 2017), while more recent studies combine these sensors with hyperspectral units and even LiDAR on a single platform to capture structural and physiological ecosystem complexities (Wu et al., 2025). This underscores the shift towards multi-modal data acquisition to address increasingly complex ecological questions.
Secondly, the actuators that stabilise the cameras carrying these sensors have evolved from simple mounts to active two-axis and three-axis gimbals. They synchronise with autopilot systems, ensuring consistent overlap and nadir pointing even when carrying heavy payloads such as LiDAR and thermal cameras. They also keep high-resolution cameras level and reduce motion blur and spectral smearing, which is essential for species identification, crop phenotyping, and precision mapping (McEvoy et al., 2016). Beyond stabilising cameras, actuators also enable UAVs to collect physical samples and assist in re-seeding, signalling a broadening of UAV applications from remote sensing towards active environmental sampling.
Significantly, ecology-based applications rely on high-resolution RGB or thermal cameras with gimbals to minimise disturbance while enabling species detection, count surveys, and behavioural-response monitoring. Custom actuators such as petri-dish arrays have been demonstrated, turning UAVs into non-invasive biological samplers. Similarly, agriculture has relied on high-resolution multispectral and RGB sensors, including thermal cameras, as well as RTK-enabled multirotors for repeated phenotyping and yield estimation. Gimbals and mechanical shutters ensure radiometric fidelity, while autopilots maintain consistent altitude and overlap (Dorelis et al., 2025). Forestry has increasingly used oblique photogrammetry point clouds, derived either from structure-from-motion applied to RGB or multispectral imagery or from LiDAR, to measure canopy structure and above-ground biomass.
Hence, we trace the technological evolution from early gimbal-stabilised RGB cameras to multi-sensor, multi-actuator systems of the past decade, comparing how sensor–actuator pairings align with domain-specific goals and how the integration of high-fidelity sensors is transforming ecological, forestry, and agricultural monitoring.
Figure 2. Flowchart categorizing the sensor and actuator technologies on environmental drones and their applications. The left branch lists key sensor types (RGB, multispectral, thermal, hyperspectral, LiDAR, radar) linked to typical uses like photogrammetry mapping, crop health monitoring, wildfire severity and risk hotspot assessment, species/invasive mapping, forest biomass estimation, and soil moisture assessment. The right branch shows major actuator types (gimbals, winches and sampling devices, grippers, robotic arms) associated with functions such as camera stabilization for blur-free imaging, lowering sensors for water sampling, and aerial seed dispersal for ecosystem restoration. Each category of sensor or actuator is thus mapped to environmental monitoring tasks that benefit from that technology.
Table 2. Summary of UAV sensor–actuator configurations and their ecological and agricultural applications across selected studies over the past decade.
Year & study | Drone sensor used | Actuator description / how sensor and actuator work together | Usage
(McEvoy et al., 2016)
Waterfowl survey evaluation
Phase 1 medium-format camera (50 MP, 80 mm lens), Sony A7-R (36.4 MP), Sony RX-1 (24.3 MP) and GoPro Hero (5 MP) mounted on fixed-wing and multirotor UAVs All cameras were mounted in gimbals to stabilise images. Gimbals kept the cameras level despite turbulence, and the UAVs’ power determined payload size. Compare disturbance of different UAV shapes and flight paths on free-living waterfowl while testing various camera payloads. High-resolution imagery allowed species recognition and demonstrated that flights ≥60 m (fixed-wing) or ≥40 m (multirotor) caused minimal disturbance
(Shi et al., 2016)
High-throughput phenotyping with fixed-wing and rotary UAVs (USA)
Three UAV platforms were tested: a Sentek GEMS multispectral camera (1.2 MP) on an Anaconda fixed-wing UAV, a Nikon J3 RGB camera and a modified NIR-enabled Nikon J3 multispectral camera on a Lancaster fixed-wing UAV, and a DJI P3-005 4 K RGB camera on an X88 octocopter. The multispectral sensors provided reflectance in blue, green, red and near-infrared bands; the Nikon cameras provided RGB and NIR images. Cameras were mounted on gimbal systems: the Lancaster’s cameras were triggered by an onboard controller that adjusted the frame rate based on flight speed; the X88 octocopter’s camera was remotely triggered. The gimbals kept the cameras level while the UAVs manoeuvred, and the onboard controllers triggered images at intervals to maintain sufficient forward and side overlap for mosaicking. This allowed the extraction of plant height, canopy coverage, and phenotypic traits from multispectral and RGB images.
Provided early evidence that fixed-wing UAVs with multispectral cameras can rapidly acquire high-resolution imagery over breeding trials, enabling estimation of plant height and vegetation indices; allowed breeders to detect lodging, flowering time and yield differences across genotypes.
(Hodgson et al., 2016)
Colony-nesting bird counts
Canon EOS M mirrorless camera (5184×3456 px) with 40 mm lens Camera mounted facing downward on an octocopter; vibration blur mitigated using a commercial vibration-dampening plate and iSPONGE for flying-wing platform. Firmware (MagicLantern) controlled an intervalometer for images every 2–3 seconds. Obtain high-precision counts of colony-nesting birds in tropical and polar environments. UAV-derived counts were an order of magnitude more precise than ground counts
(Kefauver et al., 2017)
Comparative UAV and field phenotyping for barley
A Mikrokopter Oktokopter 6S12 XL multirotor UAV carries a Panasonic GX7 RGB camera (16 MP), a FLIR Tau2 640 thermal camera and a Tetracam mini-MCA multispectral camera (11 bands from 450 to 950 nm). All cameras were mounted on an MK HiSight SLR2 active two-axis gimbal to correct for UAV pitch and roll; the gimbal ensured nadir pointing and consistent overlap. The gimbal stabilised each sensor, allowing the UAV to carry different payloads in separate flights. Combined RGB, thermal and multispectral imagery allows estimation of canopy temperature, vegetation indices and nitrogen use efficiency across barley hybrids. UAV-based indices correlated strongly with yield, demonstrating that UAV phenotyping can replace labour-intensive field measurements.
(Raeva et al., 2019)
Monitoring corn and barley fields with multispectral and thermal imagery
An eBee fixed-wing UAV (senseFly) carried a multiSPEC 4C four-band multispectral camera (green 550 nm, red 660 nm, red-edge 735 nm, NIR 790 nm) and a senseFly thermoMAP thermal camera (7.5–13.5 µm, 640×512 px). Image resolution was ~15 cm (multispectral) and 20 cm (thermal) at 40–150 m flight heights. The UAV’s autopilot maintained consistent altitude and high overlap (~90%) to create large orthomosaics. The multispectral camera irradiance sensor and calibration target allowed reflectance values to be computed; thermal images were calibrated by comparing shutter images with an internal temperature sensor. Generated NDVI, NDRE and thermal maps for corn and barley fields over multiple months; thermal imagery complemented multispectral indices by revealing soil and canopy temperature variations.
(Lin et al., 2018)
Aboveground Tree Biomass Estimation of Sparse Subalpine Coniferous Forest with UAV Oblique Photography
A fixed-wing UAV carried a Sony ILCE-5100 RGB camera (6000 × 4000 px). The camera was mounted so that its lens axis was tilted ~20° from nadir to capture oblique views; images were processed with aerial triangulation (Structure-from-Motion) to produce point clouds and canopy height models. The UAV platform had a 1.2 m wingspan, 0.8 m fuselage, 4.2 kg working weight and used an ANXIANG 2012 flight controller to stabilise the aircraft during oblique imaging. An Inertial Measurement Unit (IMU) and GPS logged position and attitude; the ground control system handled trajectory planning and remote command. The camera’s tilt angle and the flight controller’s stabilisation allowed overlapping oblique photographs (80 % longitudinal and 60 % lateral overlap) at ~400 m altitude to be taken, producing high-resolution (0.05 m) point clouds over a sparse forest. Demonstrated that oblique RGB photography can estimate above-ground biomass (AGB) in sparse subalpine forests. The allometric model using tree heights extracted from UAV point clouds achieved R² ≈ 0.96 and RMSE ~54.9 kg in AGB estimation. The study showed that fixed-wing UAVs with consumer cameras provide a cost-effective alternative to LiDAR for sparse forests.
(L. Zhang et al., 2019)
Maize canopy temperature extraction with UAV thermal and RGB imagery.
Developed a hexa-copter thermal remote sensing system with a PIXHAWK autopilot, a FLIR Vue Pro R 640 thermal camera (7.5–13.5 µm, 640×512 px) mounted on a Feiyu brushless gimbal, and a DJI Phantom 4 Pro quad-rotor with an integrated RGB camera (1-in CMOS, 4864×3648 px). The Feiyu gimbal stabilised the thermal camera and kept it pointing vertically; calibration used black and white boards measured with an infrared thermometer. Accurate co-registration of thermal and RGB orthomosaics enabled extraction of maize canopy temperature (Tc). Tc and derived indicators correlated strongly with stomatal conductance and water stress, demonstrating that high-resolution RGB imagery supplements thermal data for water stress monitoring.
(Shero et al., 2021)
Grey seal energy dynamics via 3-D photogrammetry
24.3 MP Sony α5100 mirrorless camera with 30 mm lens. Mounted on a Freefly Mōvi M5 gimbal attached to a Freefly Alta 6 hexacopter. Two-person operation (pilot + gimbal/camera operator) enables smooth 360° orbit flights. Orbit-mode autopilot kept the UAV circling seal groups while the gimbal pointed the camera at the focal point. Produced 3-D models and volumetric estimates of hundreds of grey seals simultaneously; 3-D body volume predicted true mass within 2–10 % error. Enabled energy-transfer studies across lactation seasons.
(Costa et al., 2023)
Whale blow sampling
Custom multirotor equipped with petri dishes (no imaging sensor) to collect whale exhaled mucus The drone carried a petri-dish payload; pilots positioned the multirotor above whale blowholes to capture droplets. Stabilisation and precise hovering acted as the actuator enabling the petri dishes to intercept the exhaled plume. Non-invasive collection of respiratory samples from humpback whales for microbiome and hormone analyses. Avoided more invasive biopsy methods and reduced stress to animals
(Ma et al., 2022)
Cotton yield estimation with RGB imagery
A DJI Phantom 4 Advanced quad-rotor equipped with an integrated RGB camera (5472×3648 px) captured ultra-high-resolution images (0.3 cm/pixel) at 10 m altitude. The UAV’s built-in gimbal kept the camera pointing vertically. The gimbal and precise flight control enabled acquisition of 387 nadir images, which were mosaicked into orthophotographs retaining 8-bit RGB data. Vegetation indices and texture features derived from the RGB images were used to build regression and machine-learning models (e.g., PLSR, support vector regression) for estimating cotton yield. The models achieved high correlation (R²≈0.91) and demonstrated that ultra-high-resolution RGB imagery can monitor yield just before harvest.
(Deane, 2023)
Low-cost drones for habitat change
DJI Mavic 2 Pro (20 MP Hasselblad camera), DJI Mavic 3T (48 MP), and 3DR Solo with GoPro Hero 4. Integrated three-axis gimbals on each drone stabilized the RGB cameras during mapping flights. Mission Planner/Pix4D software executed automated grid flights with high overlap. Produced high-resolution orthomosaics for mapping vegetation, mangroves and shoreline change in Bahamian national parks; demonstrated feasibility of low-cost platforms for conservation monitoring.
(Camenzind & Yu, 2023)
Multi-temporal multispectral UAV remote sensing for early wheat yield prediction
In 2021 the study used a DJI Phantom 4 Multispectral RTK UAV capturing reflectance at 450, 560, 650, 730 and 840 nm with an integrated sunlight sensor; flight height 10 m AGL produced 0.7 cm GSD. In 2022 a MicaSense Dual Camera Kit (444, 560, 650, 717, 842 nm) mounted on a DJI Matrice M300 RTK at 30 m AGL produced 2.5 cm GSD. Both UAVs had RTK positioning; cameras were mounted on integrated gimbals. The sunlight sensor and reflectance panels allowed radiometric calibration; the integrated gimbal stabilised the sensor during flight. Time-series reflectance and texture features from heading to harvest enabled random-forest models to predict wheat grain yield weeks before flowering. Early-season spectral indices (e.g., red edge) were highly correlated with yield, offering breeders a non-destructive yield assessment tool.
(Hoffmann et al., 2023)
Oyster reef volumetric monitoring
DJI Phantom 4 Pro multirotor with integrated 20 MP RGB camera. Camera mounted on a three-axis gimbal; flights planned via DroneDeploy with high overlap; numerous ground control points (GCPs) ensure precise geo-referencing. Generated centimetre-scale digital elevation models and orthomosaics to monitor seasonal growth of oyster reefs in the Wadden Sea.
(Demmer et al., 2024)
Behavioral monitoring of grey-crowned cranes
DJI Mavic Air 2S with 1-inch 20 MP CMOS sensor and 8× zoom. Three-axis gimbal stabilised camera; low-noise propellers reduced disturbance. The UAV hovered or slowly circled above cranes at ≥50 m. Recorded fine-scale behaviours of endangered cranes without eliciting flight responses; allowed quantification of foraging and social behaviours.
(Stern et al., 2024)
Fine-scale soil-moisture mapping
Black Swift S2 fixed-wing and E2 multirotor carrying optical, near-infrared, thermal cameras and a passive L-band radiometer. Sensors housed in the nose cone; autopilot maintained stable flight to match the radiometer footprint. The multirotor performed low-altitude flights where terrain required it. Produced 3–50 m resolution soil-moisture maps over oak–grassland; data informed drought and wildfire risk management.
(Li et al., 2024)
Winter wheat biomass estimation using UAV RGB and multispectral oblique photogrammetry
DJI Mavic 3M (20 MP RGB sensor and 5 MP multispectral sensor with green, red, red-edge and NIR bands) and DJI Mavic 3T (48 MP RGB sensor) were used. Oblique photography with a five-directional point cloud captured 3D structure. Flights were conducted at 30 m AGL with 80% forward and side overlap. Cameras were maintained vertically or obliquely using the UAVs’ integrated gimbals; flight routes planned in DJI Pilot 2. Nadir multispectral images provided spectral information, while oblique RGB images generated dense point clouds. Integration of spectral indices and crop height metrics improved biomass estimation accuracy compared with spectral indices alone; oblique photogrammetry captured canopy structure that correlates with biomass.
(De Lima & Sepp, 2025)
Fire-damage mapping in peatlands
eBee X fixed-wing drone with S.O.D.A. 20 MP RGB camera and Parrot Sequoia multispectral sensor (green, red, red-edge, NIR). Both cameras mounted in stabilised bays; RTK GNSS provided precise geolocation. Flights with high forward and side overlap created dense image sets for photogrammetry. Developed the triangular-area index (TAI) from multispectral reflectance to quantify fire-induced physiological damage in peatland vegetation; index captured subtle stress signals.
(Sandino et al., 2025)
Hyperspectral mapping of Antarctic mosses and lichens
Headwall Nano-Hyperspec camera (400–1000 nm, 270 bands) on DJI M300 RTK; MicaSense Altum multispectral (RGB + NIR + thermal) and Sony α5100 on BMR4-40 UAV Cameras mounted on Gremsy Pixy SM and Pixy U gimbals for stabilization. Flights planned to achieve high overlap; RTK ensured accurate mosaicking. Generated hyperspectral and multispectral maps of mosses and lichens with >95 % classification accuracy using machine-learning algorithms.
(Wu et al., 2025)
Developing compatibility biomass model with UAV LiDAR for Chinese fir
A HuaCe BB4 multirotor UAV equipped with an AS-1300HL LiDAR system featuring a Riegl VUX-1LR scanner (wavelength 1500 nm, pulse duration 3.5 ns, divergence 0.5 mrad). Flights at 200 m altitude and 10 m/s produced ≈110 points/m² with 30° scan angle. The UAV flew a criss-cross trajectory with 50% lateral overlap to ensure uniform point distribution. The on-board flight control and GPS/IMU maintained stable altitude and heading; Corepore 2.0 and LiDAR360 software processed the point clouds. Built component-wise biomass models for trunk, bark, branches and leaves of Chinese-fir across different growth stages. High-density UAV-LiDAR data enabled precise tree height and crown width extraction, yielding accurate AGB estimates.
(Dorelis et al., 2025)
Multi-year assessment of winter rye rotation effects using XAG M500
An XAG M500 RTK multirotor carried a 20 MP multispectral gimbal camera (6 bands: 450, 555, 660, 720, 750, 840 nm) with a mechanical shutter and automatic lens distortion correction. The RTK system integrated multiple GNSS constellations, providing centimetre-level accuracy. The camera was mounted on a 3-axis stabilised gimbal, allowing consistent nadir imaging at a flight speed of 8 m/s; lens distortion correction and the mechanical shutter minimised motion blur. Provided multi-year NDVI maps to evaluate crop rotation effects on winter rye; high temporal resolution enabled tracking canopy vigor and assessing long-term field management practices.

5. Practical Constraints and Operational Considerations

While sensors and actuators have helped UAVs become valuable tools across domains, including ecology and agriculture, these components face several recurrent challenges. These challenges can limit data quality, operational efficiency, and scalability because of the complex interactions among mechanisms for stabilisation, orientation, and flight control; sensor cameras and payloads are sensitive to changes in pitch, roll, and yaw (Balestrieri et al., 2021; González-Desantos et al., 2019; Köppl et al., 2021). For instance, multirotor UAVs rely heavily on two- or three-axis gimbals to stabilise sensor cameras (Dhruv & Kaushal, 2025; Zhao et al., 2025); however, these actuators introduce additional weight, power consumption, and mechanical complexity (Osmani & Schulz, 2024; Xing et al., 2024). In several phenotyping and wildlife ecology studies, gimbals successfully minimised motion blur and maintained consistent image overlap, but at the cost of reduced flight endurance and increased maintenance demands (Karahan et al., 2025). In contrast, fixed-wing UAVs often lack gimbals entirely, forcing users to depend solely on autopilot precision (Peksa & Mamchur, 2024). Under windy or turbulent conditions, rigidly mounted sensors experience unintended oblique viewing, which can distort geometric accuracy and reduce the quality of orthomosaics or 3D reconstructions.
Geometric and radiometric calibration remains a second major bottleneck: spectral camera sensors require irradiance sensors, calibration panels, and often temperature-based correction functions to ensure spectral integrity (Daniels et al., 2023; Sestras et al., 2025). LiDAR requires precise synchronisation between the scanner, GNSS receiver, and inertial measurement unit (IMU); small timing errors can distort point-cloud structure, affecting biomass and canopy height estimation (Lyu et al., 2025; Xu et al., 2023). Many of these sensors impose significant payload demands, which reduce flight endurance, restrict manoeuvrability, and increase energy consumption (Aromoye et al., 2025; Beigi et al., 2022; Ganesh Kumar & Gudipalli, 2024). For multirotor UAVs this directly limits coverage area per flight; for fixed-wing aircraft, it increases take-off requirements and susceptibility to wind. In some cases, multiple overlapping missions are required, increasing operational time and data-processing burdens (Gu et al., 2025; Huang et al., 2025b). Environmental sensitivity can also affect thermal cameras, since their readings are influenced by illumination, humidity, and the ambient temperature of surrounding objects (Wan et al., 2024).
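To make the calibration step concrete, the sketch below implements the empirical line method commonly used with reflectance panels: fit a linear DN-to-reflectance relation per band from panels of known reflectance, then invert it for the scene. The panel values are hypothetical.

```python
import numpy as np

def empirical_line(dn, panel_dn, panel_reflectance):
    """Convert raw digital numbers (DN) to surface reflectance for one band.

    panel_dn: mean DN over calibration panels imaged under the same
    illumination; panel_reflectance: their lab-certified reflectances.
    Fits DN = a * reflectance + b, then inverts the relation."""
    a, b = np.polyfit(panel_reflectance, panel_dn, deg=1)
    return (np.asarray(dn, float) - b) / a

# Hypothetical 5% and 50% panels imaged before the flight.
reflectance = empirical_line(dn=[1800, 9400, 21000],
                             panel_dn=[2300, 17800],
                             panel_reflectance=[0.05, 0.50])
print(np.round(reflectance, 3))  # [0.035 0.256 0.593]
```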

6. Emerging Sensor Designs and Future Directions

Future work on sensor and actuator adoption should prioritise lightweight, low-power gimbal systems, improved mechanical damping, and hybrid stabilisation architectures that combine digital deblurring with physical stabilisation (Dhruv & Kaushal, 2025; Villi & Yavuz, 2024; Zhao et al., 2025). Improved control algorithms could compensate for gusts through micro-adjustments in control surfaces to reduce unintended roll and camera tilt. In addition, the development of miniaturised, energy-efficient sensors paired with improved battery technologies and solar-assisted charging systems should be prioritised (Mohsan et al., 2023; Rietz et al., 2023; Saari et al., 2009; Townsend et al., 2020; Xiao et al., 2023). Furthermore, sensor-fusion cameras should be explored together with onboard calibration tools, such as integrated reflectance targets, thereby reducing the need for heavy multi-sensor payloads (Sestras et al., 2025). With respect to flight control, future UAVs could be upgraded with more advanced illumination and exposure correction algorithms, as well as turbulence-aware flight modes, to enable more efficient flight planning and minimise disturbance in wildlife monitoring use cases (McEvoy et al., 2016; Swaminathan et al., 2024; Vera-Yanez et al., 2024).
Artificial intelligence is also gaining traction: embedded processors and machine-learning models enable drones to classify crop health, identify pollution plumes in real time, and detect wildlife movement while reducing the transmission and storage of raw data (Panigrahi et al., 2023). Swarm robotics and cooperative control algorithms will allow drone fleets to coordinate their movements and behaviour, share information, and adapt their trajectories autonomously (Duan et al., 2023). Ultimately, drones will not operate in isolation. Future monitoring systems will integrate data from unmanned aircraft with measurements from satellites, ground stations, and other robotic platforms. High-resolution UAV observations can be calibrated and validated against satellite products, helping to fill gaps between ground plots and spaceborne measurements. As autonomy increases and communications improve, drones may evolve into mobile laboratories and network nodes, broadening the scope of environmental applications and supporting more sophisticated, multi-scale environmental sensing and management.

7. Conclusions

Unmanned aerial systems have evolved from niche gadgets into indispensable tools for environmental monitoring. Equipped with miniaturised cameras, LiDAR scanners, and thermal imagers, lightweight drones now collect spatial and temporal details that once required manned aircraft or painstaking ground surveys. Motorised gimbals, winches, pumps, and dispersal mechanisms keep these sensors steady, lower instruments into otherwise inaccessible environments, and enable direct interaction through seeding or sampling. Case studies illustrate the breadth of these capabilities: drones estimate forest biomass more accurately than traditional plots, reveal fine-scale patterns of vegetation recovery after fire, guide crop management using spectral and thermal indices, monitor wildlife with minimal disturbance, and even perform chemical analyses in flight. Nevertheless, the technology is not without limitations. Payload and endurance constraints restrict the number and weight of instruments that can be carried, and adverse weather, short battery life, and complex regulations further constrain missions.
Also, the volume of data generated can overwhelm pilots and end users, and ethical considerations must inform where and how flights are conducted. These difficulties demand more efficient power systems, durable technology, simplified data pipelines, and simpler regulations. Looking ahead, rapid advances promise to expand the role of drones from imaging platforms to mobile laboratories and networked agents. For instance, biomimetic and biodegradable sensor “seeds” and light-activated microrobots hint at new ways to sample the environment unobtrusively. Coupling drone data with satellite observations and ground networks will facilitate multi-scale environmental assessments and responsive management. The engineering, ecology, geography, and regulatory communities must continue to collaborate to expand these capabilities, enhancing payload connectivity and information exchange while increasing acceptance and interoperability. With these and many more uses, drones will become integral tools for supporting conservation, agriculture, forestry, and natural resource management.

References

  1. Acharya, B. S., Bhandari, M., Bandini, F., Pizarro, A., Perks, M., Joshi, D. R., Wang, S., Dogwiler, T., Ray, R. L., Kharel, G., & Sharma, S. (2021). Unmanned Aerial Vehicles in Hydrology and Water Management: Applications, Challenges, and Perspectives. In Water Resources Research (Vol. 57, Issue 11). John Wiley and Sons Inc. [CrossRef]
  2. Adhitama Putra Hernanda, R., Lee, H., Cho, J. il, Kim, G., Cho, B. K., & Kim, M. S. (2024). Current trends in the use of thermal imagery in assessing plant stresses: A review. In Computers and Electronics in Agriculture (Vol. 224). Elsevier B.V. [CrossRef]
  3. Alkan, M. N. (2024). High-Precision UAV Photogrammetry with RTK GNSS: Eliminating Ground Control Points. Hittite Journal of Science and Engineering, 11(4), 139–147. [CrossRef]
  4. Andersen, H. E., McGaughey, R. J., & Reutebuch, S. E. (2005). Estimating forest canopy fuel parameters using LIDAR data. Remote Sensing of Environment, 94(4), 441–449. [CrossRef]
  5. Arfaoui, A. (2017). Unmanned Aerial Vehicle: Review of Onboard Sensors, Application Fields, Open Problems and Research Issues. International Journal of Image Processing (IJIP), 11. https://www.researchgate.net/publication/315076314.
  6. Aromoye, I. A., Lo, H. H., Sebastian, P., Mustafa Abro, G. E., & Ayinla, S. L. (2025). Significant Advancements in UAV Technology for Reliable Oil and Gas Pipeline Monitoring. In CMES - Computer Modeling in Engineering and Sciences (Vol. 142, Issue 2, pp. 1155–1197). Tech Science Press. [CrossRef]
  7. Aurell, J., & Gullett, B. K. (2024). Effects of UAS Rotor Wash on Air Quality Measurements. Drones, 8(3). [CrossRef]
  8. Balestrieri, E., Daponte, P., De Vito, L., & Lamonaca, F. (2021). Sensors and measurements for unmanned systems: An overview. In Sensors (Vol. 21, Issue 4, pp. 1–27). MDPI AG. [CrossRef]
  9. Bannon, D. (2009). Hyperspectral imaging: Cubes and slices. Nature Photonics, 3(11), 627–629. [CrossRef]
  10. Beigi, P., Rajabi, M. S., & Aghakhani, S. (2022). An Overview of Drone Energy Consumption Factors and Models. In Handbook of Smart Energy Systems (pp. 1–20). Springer International Publishing. [CrossRef]
  11. Besenyő, J., & Őszi, A. (2025). Sensing From the Skies: A Comprehensive Analysis of the Latest Sensors on Drones. In Journal of Robotics (Vol. 2025, Issue 1). John Wiley and Sons Ltd. [CrossRef]
  12. Camenzind, M. P., & Yu, K. (2023). Multi temporal multispectral UAV remote sensing allows for yield assessment across European wheat varieties already before flowering. Frontiers in Plant Science, 14. [CrossRef]
  13. Castro, J., Morales-Rueda, F., Alcaraz-Segura, D., & Tabik, S. (2023). Forest restoration is more than firing seeds from a drone. Restoration Ecology, 31(1). [CrossRef]
  14. Chance, C. M., Coops, N. C., Plowright, A. A., Tooke, T. R., Christen, A., & Aven, N. (2016). Invasive shrub mapping in an urban environment from hyperspectral and LiDAR-derived attributes. Frontiers in Plant Science, 7(OCTOBER2016). [CrossRef]
  15. Corte, A. P. D., Rex, F. E., de Almeida, D. R. A., Sanquetta, C. R., Silva, C. A., Moura, M. M., Wilkinson, B., Zambrano, A. M. A., da Cunha Neto, E. M., Veras, H. F. P., de Moraes, A., Klauberg, C., Mohan, M., Cardil, A., & Broadbent, E. N. (2020). Measuring individual tree diameter and height using gatoreye high-density UAV-lidar in an integrated crop-livestock-forest system. Remote Sensing, 12(5). [CrossRef]
  16. Costa, H., Rogan, A., Zadra, C., Larsen, O., Rikardsen, A. H., & Waugh, C. (2023). Blowing in the Wind: Using a Consumer Drone for the Collection of Humpback Whale (Megaptera novaeangliae) Blow Samples during the Arctic Polar Nights. Drones, 7(1). [CrossRef]
  17. Crowe, W., Costales, B., & Luthy, K. (2025). Design and fabrication of a 3D-printed drone-integrated winching system. [CrossRef]
  18. Daniels, L., Eeckhout, E., Wieme, J., Dejaegher, Y., Audenaert, K., & Maes, W. H. (2023). Identifying the Optimal Radiometric Calibration Method for UAV-Based Multispectral Imaging. Remote Sensing, 15(11). [CrossRef]
  19. De Lima, R. S., & Sepp, K. (2025). A novel spectral index designed for drone-based mapping of fire-damage levels: demonstration and relationship with biophysical variables in a peatland. Frontiers in Environmental Science, 13. [CrossRef]
  20. Deane, G. A. M. (2023). Using low-cost drones to map habitat change in Bahamian national parks.
  21. Demmer, C. R., Demmer, S., & McIntyre, T. (2024). Drones as a tool to study and monitor endangered Grey Crowned Cranes (Balearica regulorum): Behavioural responses and recommended guidelines. Ecology and Evolution, 14(2). [CrossRef]
  22. Dhruv, & Kaushal, H. (2025). A Review of Pointing Modules and Gimbal Systems for Free-Space Optical Communication in Non-Terrestrial Platforms. Photonics, 12(10), 1001. [CrossRef]
  23. Dorelis, M., Vaštakaitė-Kairienė, V., & Bogužas, V. (2025). UAV Multispectral Imaging for Multi-Year Assessment of Crop Rotation Effects on Winter Rye. Applied Sciences (Switzerland), 15(21). [CrossRef]
  24. Fumian, F., Di Giovanni, D., Martellucci, L., Rossi, R., & Gaudio, P. (2020). Application of miniaturized sensors to unmanned aerial systems, a new pathway for the survey of polluted areas: Preliminary results. Atmosphere, 11(5). [CrossRef]
  25. Galloway, K. C., Becker, K. P., Phillips, B., Kirby, J., Licht, S., Tchernov, D., Wood, R. J., & Gruber, D. F. (2016). Soft Robotic Grippers for Biological Sampling on Deep Reefs. Soft Robotics, 3(1), 23–33. [CrossRef]
  26. Ganesh Kumar, S. S., & Gudipalli, A. (2024). A comprehensive review on payloads of unmanned aerial vehicle. In Egyptian Journal of Remote Sensing and Space Science (Vol. 27, Issue 4, pp. 637–644). Elsevier B.V. [CrossRef]
  27. González-Desantos, L. M., Martínez-Sánchez, J., González-Jorge, H., Ribeiro, M., de Sousa, J. B., & Arias, P. (2019). Payload for contact inspection tasks with UAV systems. Sensors (Switzerland), 19(17). [CrossRef]
  28. Gu, R., Zhao, Y., & Ren, X. (2025). Integrating wind field analysis in UAV path planning: Enhancing safety and energy efficiency for urban logistics. Chinese Journal of Aeronautics, 103605. [CrossRef]
  29. Guebsi, R., Mami, S., & Chokmani, K. (2024). Drones in Precision Agriculture: A Comprehensive Review of Applications, Technologies, and Challenges. In Drones (Vol. 8, Issue 11). Multidisciplinary Digital Publishing Institute (MDPI). [CrossRef]
  30. Han, C., Jeong, Y., Ahn, J., Kim, T., Choi, J., Ha, J. H., Kim, H., Hwang, S. H., Jeon, S., Ahn, J., Hong, J. T., Kim, J. J., Jeong, J. H., & Park, I. (2023). Recent Advances in Sensor–Actuator Hybrid Soft Systems: Core Advantages, Intelligent Applications, and Future Perspectives. In Advanced Science (Vol. 10, Issue 35). John Wiley and Sons Inc. [CrossRef]
  31. Haque, K. M. S., Joshi, P., & Subedi, N. (2025). Integrating UAV-based multispectral imaging with ground-truth soil nitrogen content for precision agriculture: A case study on paddy field yield estimation using machine learning and plant height monitoring. Smart Agricultural Technology, 12, 101542. [CrossRef]
  32. Hodgson, J. C., Baylis, S. M., Mott, R., Herrod, A., & Clarke, R. H. (2016). Precision wildlife monitoring using unmanned aerial vehicles. Scientific Reports, 6. [CrossRef]
  33. Hoffmann, T. K., Pfennings, K., Hitzegrad, J., Brohmann, L., Welzel, M., Paul, M., Goseberg, N., Wehrmann, A., & Schlurmann, T. (2023). Low-cost UAV monitoring: insights into seasonal volumetric changes of an oyster reef in the German Wadden Sea. Frontiers in Marine Science, 10. [CrossRef]
  34. Hollaus, M., Wagner, W., Eberhöfer, C., & Karel, W. (2006). Accuracy of large-scale canopy heights derived from LiDAR data under operational constraints in a complex alpine environment. ISPRS Journal of Photogrammetry and Remote Sensing, 60(5), 323–338. [CrossRef]
  35. Hong, G., & Zhang, Y. (2008). A comparative study on radiometric normalization using high resolution satellite images. International Journal of Remote Sensing, 29(2), 425–438. [CrossRef]
  36. Huang, D., Zhou, Z., Zhang, Z., Du, X., Fan, R., Li, Q., & Huang, Y. (2025a). From Application-Driven Growth to Paradigm Shift: Scientific Evolution and Core Bottleneck Analysis in the Field of UAV Remote Sensing. In Applied Sciences (Switzerland) (Vol. 15, Issue 15). Multidisciplinary Digital Publishing Institute (MDPI). [CrossRef]
  37. Huang, D., Zhou, Z., Zhang, Z., Du, X., Fan, R., Li, Q., & Huang, Y. (2025b). From Application-Driven Growth to Paradigm Shift: Scientific Evolution and Core Bottleneck Analysis in the Field of UAV Remote Sensing. In Applied Sciences (Switzerland) (Vol. 15, Issue 15). Multidisciplinary Digital Publishing Institute (MDPI). [CrossRef]
  38. Huelsman, K., Epstein, H., Yang, X., Mullori, L., Červená, L., & Walker, R. (2022). Spectral variability in fine-scale drone-based imaging spectroscopy does not impede detection of target invasive plant species. Frontiers in Remote Sensing, 3. [CrossRef]
  39. Imran, & Li, J. (2025). UAV Aerodynamics and Crop Interaction: Revolutionizing Modern Agriculture with Drone. In Smart Agriculture (Singapore) (Vol. 13, pp. 1–462). Springer. [CrossRef]
  40. Jakob, S., Zimmermann, R., & Gloaguen, R. (2016). Processing of drone-borne hyperspectral data for geological applications. Workshop on Hyperspectral Image and Signal Processing, Evolution in Remote Sensing, 0. [CrossRef]
  41. Campbell, J. B., & Wynne, R. H. (2011). Introduction to Remote Sensing (5th ed.). Guilford Press. www.guilford.com/p/campbell2.
  42. Jaskierniak, D., Lucieer, A., Kuczera, G., Turner, D., Lane, P. N. J., Benyon, R. G., & Haydon, S. (2021). Individual tree detection and crown delineation from Unmanned Aircraft System (UAS) LiDAR in structurally complex mixed species eucalypt forests. ISPRS Journal of Photogrammetry and Remote Sensing, 171, 171–187. [CrossRef]
  43. Karahan, A., Demircan, N., Özgeriş, M., Gökçe, O., & Karahan, F. (2025). Integration of Drones in Landscape Research: Technological Approaches and Applications. In Drones (Vol. 9, Issue 9). Multidisciplinary Digital Publishing Institute (MDPI). [CrossRef]
  44. Kedzierski, J., & Chea, H. (2021). Multilayer microhydraulic actuators with speed and force configurations. Microsystems and Nanoengineering, 7(1). [CrossRef]
  45. Kefauver, S. C., Vicente, R., Vergara-Díaz, O., Fernandez-Gallego, J. A., Kerfal, S., Lopez, A., Melichar, J. P. E., Serret Molins, M. D., & Araus, J. L. (2017). Comparative UAV and field phenotyping to assess yield and nitrogen use efficiency in hybrid and conventional barley. Frontiers in Plant Science, 8. [CrossRef]
  46. Köppl, C. J., Malureanu, R., Dam-Hansen, C., Wang, S., Jin, H., Barchiesi, S., Serrano Sandí, J. M., Muñoz-Carpena, R., Johnson, M., Durán-Quesada, A. M., Bauer-Gottwein, P., McKnight, U. S., & Garcia, M. (2021). Hyperspectral reflectance measurements from UAS under intermittent clouds: Correcting irradiance measurements for sensor tilt. Remote Sensing of Environment, 267. [CrossRef]
  47. Lally, H. T., O’Connor, I., Jensen, O. P., & Graham, C. T. (2019). Can drones be used to conduct water sampling in aquatic environments? A review. In Science of the Total Environment (Vol. 670, pp. 569–575). Elsevier B.V. [CrossRef]
  48. Lee, Q. R., Hesse, H., Naruangsri, K., Takaew, W., Elliot, S., & Bhatia, D. (2025). UAV-Based Precision Seed Dropping for Automated Reforestation. [CrossRef]
  49. Leem, J., Mehrishal, S., Kang, I. S., Yoon, D. H., Shao, Y., Song, J. J., & Jung, J. (2025). Optimizing Camera Settings and Unmanned Aerial Vehicle Flight Methods for Imagery-Based 3D Reconstruction: Applications in Outcrop and Underground Rock Faces. Remote Sensing, 17(11). [CrossRef]
  50. Li, Y., Li, C., Cheng, Q., Chen, L., Li, Z., Zhai, W., Mao, B., & Chen, Z. (2024). Precision estimation of winter wheat crop height and above-ground biomass using unmanned aerial vehicle imagery and oblique photoghraphy point cloud data. Frontiers in Plant Science, 15. [CrossRef]
  51. Lin, J., Wang, M., Ma, M., & Lin, Y. (2018). Aboveground tree biomass estimation of sparse subalpine coniferous forest with UAV oblique photography. Remote Sensing, 10(11). [CrossRef]
  52. Liu, X., & Wang, L. (2018). Feasibility of using consumer-grade unmanned aerial vehicles to estimate leaf area index in mangrove forest. Remote Sensing Letters, 9(11), 1040–1049. [CrossRef]
  53. Loarie, S. R., Joppa, L. N., & Pimm, S. L. (2007). Satellites miss environmental priorities. In Trends in Ecology and Evolution (Vol. 22, Issue 12, pp. 630–632). [CrossRef]
  54. Lussem, U., Hollberg, J., Menne, J., Schellberg, J., & Bareth, G. (2017). Using calibrated RGB imagery from low-cost UAVs for grassland monitoring: Case study at the Rengen Grassland Experiment (RGE), Germany. International Archives of the Photogrammetry, Remote Sensing and Spatial Information Sciences - ISPRS Archives, 42(2W6), 229–233. [CrossRef]
  55. Lyu, X., Liu, S., Qiao, R., Jiang, S., & Wang, Y. (2025). Camera, LiDAR, and IMU Spatiotemporal Calibration: Methodological Review and Research Perspectives. In Sensors (Vol. 25, Issue 17). Multidisciplinary Digital Publishing Institute (MDPI). [CrossRef]
  56. Ma, Y., Ma, L., Zhang, Q., Huang, C., Yi, X., Chen, X., Hou, T., Lv, X., & Zhang, Z. (2022). Cotton Yield Estimation Based on Vegetation Indices and Texture Features Derived From RGB Image. Frontiers in Plant Science, 13. [CrossRef]
  57. Manfreda, S., McCabe, M. F., Miller, P. E., Lucas, R., Madrigal, V. P., Mallinis, G., Dor, E. Ben, Helman, D., Estes, L., Ciraolo, G., Müllerová, J., Tauro, F., de Lima, M. I., de Lima, J. L. M. P., Maltese, A., Frances, F., Caylor, K., Kohv, M., Perks, M., … Toth, B. (2018). On the use of unmanned aerial systems for environmental monitoring. In Remote Sensing (Vol. 10, Issue 4). MDPI AG. [CrossRef]
  58. Martínez-Carricondo, P., Agüera-Vega, F., & Carvajal-Ramírez, F. (2023). Accuracy assessment of RTK/PPK UAV-photogrammetry projects using differential corrections from multiple GNSS fixed base stations. Geocarto International, 38(1). [CrossRef]
  59. McEvoy, J. F., Hall, G. P., & McDonald, P. G. (2016). Evaluation of unmanned aerial vehicle shape, flight path and camera type for waterfowl surveys: Disturbance effects and species recognition. PeerJ, 2016(3). [CrossRef]
  60. Moghaddam, M., Saatchi, S., & Cuenca, R. H. (2000). Estimating subcanopy soil moisture with radar. Journal of Geophysical Research Atmospheres, 105(D11), 14899–14911. [CrossRef]
  61. Mohsan, S. A. H., Othman, N. Q. H., Li, Y., Alsharif, M. H., & Khan, M. A. (2023). Unmanned aerial vehicles (UAVs): practical aspects, applications, open challenges, security issues, and future trends. In Intelligent Service Robotics (Vol. 16, Issue 1, pp. 109–137). Springer Science and Business Media Deutschland GmbH. [CrossRef]
  62. Moreno-Martínez, Á., Izquierdo-Verdiguier, E., Maneta, M. P., Camps-Valls, G., Robinson, N., Muñoz-Marí, J., Sedano, F., Clinton, N., & Running, S. W. (2020). Multispectral high resolution sensor fusion for smoothing and gap-filling in the cloud. Remote Sensing of Environment, 247. [CrossRef]
  63. Mosharof, M. P. (2025). The Importance of Remote Sensing in Environmental Science. https://www.researchgate.net/publication/393977426.
  64. Mykytyn, P., Brzozowski, M., Dyka, Z., & Langendoerfer, P. (2024). A Survey on Sensor- and Communication-Based Issues of Autonomous UAVs. In CMES - Computer Modeling in Engineering and Sciences (Vol. 138, Issue 2, pp. 1019–1050). Tech Science Press. [CrossRef]
  65. Neuville, R., Bates, J. S., & Jonard, F. (2021). Estimating forest structure from UAV-mounted LiDAR point cloud using machine learning. Remote Sensing, 13(3), 1–19. [CrossRef]
  66. Nex, F., Armenakis, C., Cramer, M., Cucci, D. A., Gerke, M., Honkavaara, E., Kukko, A., Persello, C., & Skaloud, J. (2022). UAV in the advent of the twenties: Where we stand and what is next. In ISPRS Journal of Photogrammetry and Remote Sensing (Vol. 184, pp. 215–242). Elsevier B.V. [CrossRef]
  67. Nguyen, T. X. B., Rosser, K., & Chahl, J. (2021). A review of modern thermal imaging sensor technology and applications for autonomous aerial navigation. Journal of Imaging, 7(10). [CrossRef]
  68. Osmani, K., & Schulz, D. (2024). Comprehensive Investigation of Unmanned Aerial Vehicles (UAVs): An In-Depth Analysis of Avionics Systems. In Sensors (Vol. 24, Issue 10). Multidisciplinary Digital Publishing Institute (MDPI). [CrossRef]
  69. Ostrower, D. (2006). Optical Thermal Imaging - replacing microbolometer technology and achieving universal deployment. III-Vs Review, 19(6), 24–27. [CrossRef]
  70. Peksa, J., & Mamchur, D. (2024). A Review on the State of the Art in Copter Drones and Flight Control Systems. In Sensors (Vol. 24, Issue 11). Multidisciplinary Digital Publishing Institute (MDPI). [CrossRef]
  71. Poulsen, E., Rysgaard, S., Hansen, K., & Karlsson, N. B. (2024). Uncrewed aerial vehicle with onboard winch system for rapid, cost-effective, and safe oceanographic profiling in hazardous and inaccessible areas. HardwareX, 18, e00518. [CrossRef]
  72. Raeva, P. L., Šedina, J., & Dlesk, A. (2019). Monitoring of crop fields using multispectral and thermal imagery from UAV. European Journal of Remote Sensing, 52(sup1), 192–201. [CrossRef]
  73. Rietz, J., van Beeck Calkoen, S. T. S., Ferry, N., Schlüter, J., Wehner, H., Schindlatz, K. H., Lackner, T., von Hoermann, C., Conraths, F. J., Müller, J., & Heurich, M. (2023). Drone-Based Thermal Imaging in the Detection of Wildlife Carcasses and Disease Management. Transboundary and Emerging Diseases, 2023. [CrossRef]
  74. Saari, H., Aallos, V.-V., Akujärvi, A., Antila, T., Holmlund, C., Kantojärvi, U., Mäkynen, J., & Ollila, J. (2009). Novel miniaturized hyperspectral sensor for UAV and space applications. Sensors, Systems, and Next-Generation Satellites XIII, 7474, 74741M. [CrossRef]
  75. Sandino, J., Barthelemy, J., Doshi, A., Randall, K., Robinson, S. A., Bollard, B., & Gonzalez, F. (2025). Drone hyperspectral imaging and artificial intelligence for monitoring moss and lichen in Antarctica. Scientific Reports, 15(1). [CrossRef]
  76. Seidaliyeva, U., Ilipbayeva, L., Utebayeva, D., Smailov, N., & Matson, E. T. (2024). LiDAR Technology for UAV Detection: From Fundamentals and Operational Principles to Advanced Detection and Classification Techniques. [CrossRef]
  77. Sestras, P., Badea, G., Badea, A. C., Salagean, T., Oniga, V. E., Roșca, S., Bilașco, Ștefan, Bruma, S., Spalević, V., Kader, S., Billi, P., & Nedevschi, S. (2025). A novel method for landslide deformation monitoring by fusing UAV photogrammetry and LiDAR data based on each sensor’s mapping advantage in regards to terrain feature. Engineering Geology, 346. [CrossRef]
  78. Shahbazi, M., Théau, J., & Ménard, P. (2014). Recent applications of unmanned aerial imagery in natural resource management. GIScience and Remote Sensing, 51(4), 339–365. [CrossRef]
  79. Shaik, H. S. (2024). Design and development of a quadcopter for agricultural seeding. In Hyperautomation in Precision Agriculture: Advancements and Opportunities for Sustainable Farming (pp. 277–288). Elsevier. [CrossRef]
  80. Shero, M. R., Dale, J., Seymour, A. C., Hammill, M. O., Mosnier, A., Mongrain, S., & Johnston, D. W. (2021). Tracking wildlife energy dynamics with unoccupied aircraft systems and three-dimensional photogrammetry. Methods in Ecology and Evolution, 12(12), 2458–2472. [CrossRef]
  81. Shi, Y., Thomasson, J. A., Murray, S. C., Pugh, N. A., Rooney, W. L., Shafian, S., Rajan, N., Rouze, G., Morgan, C. L. S., Neely, H. L., Rana, A., Bagavathiannan, M. V., Henrickson, J., Bowden, E., Valasek, J., Olsenholler, J., Bishop, M. P., Sheridan, R., Putman, E. B., … Yang, C. (2016). Unmanned aerial vehicles for high-throughput phenotyping and agronomic research. PLoS ONE, 11(7). [CrossRef]
  82. Shin, J. Il, Seo, W. W., Kim, T., Park, J., & Woo, C. S. (2019). Using UAV multispectral images for classification of forest burn severity-A case study of the 2019 Gangneung forest fire. Forests, 10(11). [CrossRef]
  83. Sieberth, T., Wackrow, R., & Chandler, J. H. (2014). Motion blur disturbs - the influence of motion-blurred images in photogrammetry. Photogrammetric Record, 29(148), 434–453. [CrossRef]
  84. Stern, M., Ferrell, R., Flint, L., Kozanitas, M., Ackerly, D., Elston, J., Stachura, M., Dai, E., & Thorne, J. (2024). Fine-scale surficial soil moisture mapping using UAS-based L-band remote sensing in a mixed oak-grassland landscape. Frontiers in Remote Sensing, 5. [CrossRef]
  85. Straight, B. J., Castendyk, D. N., McKnight, D. M., Newman, C. P., Filiatreault, P., & Pino, A. (2021). Using an unmanned aerial vehicle water sampler to gather data in a pit-lake mining environment to assess closure and monitoring. Environmental Monitoring and Assessment, 193(9). [CrossRef]
  86. Stuart, M. B., McGonigle, A. J. S., & Willmott, J. R. (2019). Hyperspectral imaging in environmental monitoring: A review of recent developments and technological advances in compact field deployable systems. In Sensors (Switzerland) (Vol. 19, Issue 14). MDPI AG. [CrossRef]
  87. Stutsel, B., Johansen, K., Malbéteau, Y. M., & McCabe, M. F. (2021). Detecting Plant Stress Using Thermal and Optical Imagery From an Unoccupied Aerial Vehicle. Frontiers in Plant Science, 12. [CrossRef]
  88. Swaminathan, V., Thomasson, J. A., Hardin, R. G., Rajan, N., & Raman, R. (2024). Radiometric calibration of UAV multispectral images under changing illumination conditions with a downwelling light sensor. Plant Phenome Journal, 7(1). [CrossRef]
  89. Thompson, L. J., & Puntel, L. A. (2020). Transforming unmanned aerial vehicle (UAV) and multispectral sensor into a practical decision support system for precision nitrogen management in corn. Remote Sensing, 12(10). [CrossRef]
  90. Townsend, A., Jiya, I. N., Martinson, C., Bessarabov, D., & Gouws, R. (2020). A comprehensive review of energy sources for unmanned aerial vehicles, their shortfalls and opportunities for improvements. In Heliyon (Vol. 6, Issue 11). Elsevier Ltd. [CrossRef]
  91. Treccani, D., Adami, A., & Fregonese, L. (2024). Drones and Real-Time Kinematic Base Station Integration for Documenting Inaccessible Ruins: A Case Study Approach. Drones, 8(6). [CrossRef]
  92. Vera-Yanez, D., Pereira, A., Rodrigues, N., Molina, J. P., García, A. S., & Fernández-Caballero, A. (2024). Optical Flow-Based Obstacle Detection for Mid-Air Collision Avoidance. Sensors, 24(10). [CrossRef]
  93. Villi, Ö., & Yavuz, H. (2024). The utilization of gimbal systems in unmanned aerial vehicles. Advanced UAV, 2024(1), 19–30. http://publish.mersin.edu.tr/index.php/uav.
  94. Wagner, W., Pathe, C., Doubkova, M., Sabel, D., Bartsch, A., Hasenauer, S., Blöschl, G., Scipal, K., Martínez-Fernández, J., & Löw, A. (2008). Temporal Stability of Soil Moisture and Radar Backscatter Observed by the Advanced Synthetic Aperture Radar (ASAR). Sensors, 8, 1174–1197. www.mdpi.org/sensors.
  95. Wan, Q., Smigaj, M., Brede, B., & Kooistra, L. (2024). Optimizing UAV-based uncooled thermal cameras in field conditions for precision agriculture. International Journal of Applied Earth Observation and Geoinformation, 134. [CrossRef]
  96. Wu, Z., Xie, D., Liu, Z., Chen, Q., Ye, Q., Ye, J., Wang, Q., Liao, X., Wang, Y., Sharma, R. P., & Fu, L. (2025). Developing compatibility biomass model based on UAV LiDAR data of Chinese fir (Cunninghamia lanceolata) in Southern China. Frontiers in Plant Science, 16. [CrossRef]
  97. Xiao, C., Wang, B., Zhao, D., & Wang, C. (2023). Comprehensive investigation on Lithium batteries for electric and hybrid-electric unmanned aerial vehicle applications. Thermal Science and Engineering Progress, 38. [CrossRef]
  98. Xing, S., Zhang, X., Tian, J., Xie, C., Chen, Z., & Sun, J. (2024). Morphing Quadrotors: Enhancing Versatility and Adaptability in Drone Applications—A Review. In Drones (Vol. 8, Issue 12). Multidisciplinary Digital Publishing Institute (MDPI). [CrossRef]
  99. Xu, W., Yang, W., Wu, J., Chen, P., Lan, Y., & Zhang, L. (2023). Canopy Laser Interception Compensation Mechanism—UAV LiDAR Precise Monitoring Method for Cotton Height. Agronomy, 13(10). [CrossRef]
  100. Yan, Y., Lei, J., & Huang, Y. (2024). Forest Aboveground Biomass Estimation Based on Unmanned Aerial Vehicle–Light Detection and Ranging and Machine Learning. Sensors, 24(21). [CrossRef]
  101. Yu, K., Belwalkar, A., Wang, W., Hu, Y., Hunegnaw, A., Nurunnabi, A., Ruf, T., Li, F., Jia, L., Kooistra, L., Miao, Y., & Teferle, F. N. (2025). UAV Hyperspectral Remote Sensing for Crop Nitrogen Monitoring: Progress, Challenges, and Perspectives. Smart Agricultural Technology, 101507. [CrossRef]
  102. Zhan, Y., Chen, P., Xu, W., Chen, S., Han, Y., Lan, Y., & Wang, G. (2022). Influence of the downwash airflow distribution characteristics of a plant protection UAV on spray deposit distribution. Biosystems Engineering, 216, 32–45. [CrossRef]
  103. Zhang, G., & Hsu, L. T. (2018). Intelligent GNSS/INS integrated navigation system for a commercial UAV flight control system. Aerospace Science and Technology, 80, 368–380. [CrossRef]
  104. Zhang, L., Niu, Y., Zhang, H., Han, W., Li, G., Tang, J., & Peng, X. (2019). Maize Canopy Temperature Extracted From UAV Thermal and RGB Imagery and Its Application in Water Stress Monitoring. Frontiers in Plant Science, 10. [CrossRef]
  105. Zhao, J. S., Sun, X. C., & Sun, H. L. (2025). Design and strength analysis of a gimbaled nozzle mechanism. Mechanism and Machine Theory, 209. [CrossRef]
Figure 1. Schematic diagram of a multirotor drone outfitted with key environmental monitoring components. An RGB & multispectral camera is mounted on a 3-axis gimbal at the front underside (for stabilized visible/NIR imaging), alongside a dedicated thermal camera and an optional miniaturized hyperspectral camera. A scanning LiDAR unit is attached at the front (underside) for 3D structural mapping. On the underside rear, a seed dispersal/sampling mechanism (e.g., a winch-operated drop pod) is shown for tasks like seed release or water sampling. The drone’s top side carries a GNSS receiver (RTK) for high-precision positioning, and a power module (battery) at the back. These components enable the UAV to capture multi-modal environmental data (optical, spectral, thermal, LiDAR) and perform physical actions such as stable sensor pointing and targeted payload deployment.
Table 1. Principal sensor categories currently employed on environmental drones, with representative technical specifications (e.g., spectral range, spatial resolution, or detection mechanism) and their corresponding applications.
| Sensor Type | Key Specifications | Applications |
|---|---|---|
| RGB/optical camera | High-resolution CMOS sensor (≥ 20 MP); global shutter; high dynamic range | Photogrammetry; structure-from-motion 3D models; land-cover and habitat mapping |
| Multispectral camera | Discrete bands (e.g., blue, green, red, red-edge, near-infrared); radiometric calibration; centimetre-level geolocation | Vegetation indices (e.g., NDVI) for crop monitoring, biomass estimation, and irrigation and nitrogen management; burn-severity mapping |
| Thermal (microbolometer) | Uncooled microbolometer array; long-wave IR (7.5–13.5 µm) with < 50 mK thermal sensitivity; often integrated with an RGB camera | Detecting warm-blooded animals and wildlife carcasses; crop water-stress mapping; hotspot detection in wildfires; assessing heat leaks and smouldering areas |
| Hyperspectral camera | Hundreds of contiguous bands; example miniaturised sensor: 400–1100 nm, 5–10 nm spectral resolution, < 50 g weight | Species discrimination and invasive-species mapping; assessing crop nutrient status and plant stress |
| LiDAR | Scanning laser emits pulsed light and measures return time to generate a dense 3D point cloud; typical range limited by power/weight | Forest inventory and above-ground biomass estimation; canopy height and crown dimension mapping; topographic mapping; landslide/erosion monitoring |
| Radar | Millimetre-wave or frequency-modulated continuous-wave sensors; emit radio waves and record frequency/time shifts | Collision avoidance and terrain following for small UAVs; canopy biomass and soil-moisture estimation via backscatter |
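To make the quantitative entries in Table 1 concrete, the short sketch below illustrates two of the computations these sensors support: the normalised difference vegetation index (NDVI) derived from a multispectral camera's red and near-infrared reflectance bands, and the conversion of a LiDAR pulse's round-trip time into range. This is a minimal illustrative sketch; the array values and function names are placeholders rather than the output or API of any particular sensor.

```python
import numpy as np

def ndvi(nir: np.ndarray, red: np.ndarray) -> np.ndarray:
    """Per-pixel NDVI = (NIR - Red) / (NIR + Red) from calibrated reflectance bands."""
    nir = nir.astype(np.float64)
    red = red.astype(np.float64)
    denom = nir + red
    out = np.full_like(denom, np.nan)
    # Divide only where the denominator is positive (skips dark/no-data pixels).
    np.divide(nir - red, denom, out=out, where=denom > 0)
    return out

def lidar_range_m(round_trip_time_s: float) -> float:
    """Two-way time of flight to range: the pulse travels out and back,
    so range = c * t / 2."""
    c = 299_792_458.0  # speed of light, m/s
    return c * round_trip_time_s / 2.0

if __name__ == "__main__":
    # Placeholder reflectance values, not real sensor output.
    nir = np.array([[0.42, 0.55], [0.60, 0.05]])
    red = np.array([[0.08, 0.10], [0.12, 0.04]])
    print(ndvi(nir, red))        # dense-vegetation pixels approach +1
    print(lidar_range_m(8e-7))   # ~119.9 m for a 0.8 microsecond round trip
```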
Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content.
Copyright: This open access article is published under a Creative Commons CC BY 4.0 license, which permits the free download, distribution, and reuse, provided that the author and preprint are cited in any reuse.