Preprint
Article


Research on Cam-Kalm Automatic Tracking Technology of Low, Slow and Small Target Based on Gm-APD LiDAR


Submitted: 12 November 2024
Posted: 18 November 2024


Abstract
With the wide application of UAVs in modern intelligent warfare and civil fields, the demand for counter-unmanned aircraft system (C-UAS) technology is increasingly urgent. Traditional detection methods have many limitations in dealing with "low, slow, and small" targets. This paper presents a purely laser-based automatic tracking system built on a Geiger-mode avalanche photodiode (Gm-APD) LiDAR. By combining the target motion-state prediction of the Kalman filter with the adaptive target tracking of Camshift, a Cam-Kalm algorithm is proposed to achieve high-precision, stable tracking of moving targets. The system also introduces two-dimensional Gaussian fitting and edge detection algorithms to automatically determine the target's centre position and tracking rectangle, improving the automation of target tracking. The experimental results show that the system designed in this paper can effectively track UAVs in a 70 m laboratory environment and in a long-distance scene from 3.07 km to 3.32 km, with low centre-positioning error and MSE. This technology provides a new solution for real-time tracking and ranging of long-distance UAVs, demonstrates the potential of pure laser sensing for long-distance "low-slow-small" target tracking, and provides essential technical support for C-UAS technology.

1. Introduction

Uncrewed Aerial Vehicles (UAVs), as representative "low, slow and small" targets, have the advantages of good concealment, strong anti-jamming ability, low requirements on the landing environment, and low cost. They are gradually moving into operational use and have become essential intelligence, early-warning, and reconnaissance equipment in many countries [1,2]. At the same time, with the rapid expansion of the civilian UAV market, their security risks have become increasingly apparent, posing a severe threat to civil aviation traffic, social security, and personal privacy [3,4]. In January 2015, a DJI Phantom UAV crashed onto the White House grounds in the United States; since then, unauthorized "black flying" and reckless flying of UAVs have persisted despite repeated bans. In May 2017, Kunming Changshui International Airport in China was disrupted by drones, forcing 35 flights to land and closing the airport runway for 45 minutes. In August 2018, Venezuelan President Maduro was attacked by a drone during a televised speech, further causing social panic.
Therefore, developing early-warning and defence technology against UAVs has become an urgent problem for all countries [5]. At present, detection methods for low-altitude UAVs include radar early warning, passive imaging detection, acoustic array detection and other technologies, as summarized below [6,7,8]:
  • Radar early warning technology:
    (1)
    The American Echo Shield 4D radar combines Ku-band BSA beamforming with dynamic waveform synthesis. It offers a wireless positioning service at 15.7-16.6 GHz and a wireless navigation function at 15.4-15.7 GHz, with a tracking accuracy of 0.5° and an effective detection range of 3 km;
    (2)
    The HARRIER DSR UAV surveillance radar developed by the DeTect company of the United States adopts a solid-state Doppler radar system designed for small-RCS targets in complex clutter environments. Its detection range for UAVs flying without RF emissions or GPS guidance is 3.2 km;
    (3)
    The Fraunhofer Institute for High Frequency Physics and Radar Techniques (FHR) in Germany proposed the Scanning Surveillance Radar System (SSRS). Its FMCW radar sensor is scanned mechanically at a scanning frequency of 8 Hz with a bandwidth of 1 GHz, giving a range resolution of 15 cm and a maximum detection range of 153.6 m.
  • Passive optical imaging technology:
    (1)
    Poland's Advanced Protection Systems company designed the SKYctrl anti-UAV system, which includes an ultra-precision FIELDctrl 3D MIMO radar, a PTZ day/night camera, a WiFi sensor, and an acoustic array. The minimum detectable target height is 1 m;
    (2)
    The French-German Research Institute of Saint-Louis (ISL) in France uses two optical sensing channels; the passive colour camera provides an image of the target area with a field of view of 4°×3° and detects DJI UAVs at distances of 100 m to 200 m;
    (3)
    The Agency for Defense Development of Korea proposed a UAV tracking method using KCF and an adaptive threshold. The system includes a visual camera, a pan/tilt unit, an image processing system, a pan/tilt control computer, a driving system and a liquid-crystal display, and can detect a UAV about 30×30 cm in size, flying at about 10 m/s, at a distance of 1.5 km;
    (4)
    The DMA low-altitude early-warning and tracking photoelectric system, developed by China's Hepu company, integrates a high-definition visible-light camera and an infrared imaging sensor. The system can detect micro UAVs within 2 km, and its tracking angular velocity can reach 200°/s.
  • Acoustic detection technology:
    (1)
    Dedrone of Germany has developed DroneTracker, a UAV detection system that uses distributed acoustic and optical sensors for comprehensive UAV detection. It can detect illegally intruding UAVs 50-80 m in advance;
    (2)
    Holy Cross University of Technology in Poland studied the acoustic signature of a rotary-wing UAV. Using an Olympus LS-11 digital recorder, the acoustic signal was recorded at a 44 kHz sampling rate with 16-bit resolution, and detection was achieved at distances of up to 200 m.
However, the concealment and flexibility of low-slow small UAVs make the existing detection technology face many challenges [9,10]:
  • The target is small and difficult to detect: the radar cross-section of a UAV is extremely small, making it difficult for traditional radar to capture effectively;
  • Complex background interference: detection is strongly affected by ground clutter, which increases the detection difficulty;
  • Low noise and low infrared radiation: the acoustic and infrared signatures generated by UAVs are weak, making them difficult to detect with acoustic or infrared detectors;
  • Significant environmental impact: weather conditions such as smog and rain degrade the performance of traditional optical and acoustic detection.
Compared with the above technologies, single-photon LiDAR has clear advantages: by actively emitting a narrow laser beam for detection, clutter interference is effectively reduced; it can respond to a single echo photon and can still capture weak echo signals in complex environments, making it especially suitable for long-distance detection of small, low-reflectivity targets; it is strongly resistant to weather interference and can work around the clock, day and night and in light fog; and its nanosecond-scale time resolution allows real-time, high-precision three-dimensional localization of the target. The high-precision, high-sensitivity technology of single-photon LiDAR breaks through the limitations of traditional technology in detection range, resolution and anti-interference ability, making up for the shortcomings of traditional detection technology [11,12,13,14].
In recent years, with the continuous development of high-repetition-rate lasers [15], time-correlated single photon counting (TCSPC) technology [16,17], and semiconductor photodetectors [18,19,20,21,22], the performance of area-array Gm-APD single-photon LiDAR has been significantly improved. Compared with LiDAR based on Lm-APD, PMT, Si PMT or SP-SPD [23,24], Gm-APD detectors have a higher degree of avalanche ionization and larger pixel arrays, and their applications have been studied in fields ranging from fast long-distance three-dimensional imaging to biological imaging [25,26,27,28,29,30,31,32]. However, existing research on UAV early-warning detection remains limited and needs further exploration. In 2021, Ding Yuanxue et al. of the Harbin Institute of Technology proposed an improved YOLOv3 network detection method based on Gm-APD LiDAR. Although real-time detection was realized in the range of 238 m, the UAV tracking problem remained unsolved [33].
This paper proposes a UAV detection and tracking method based on an area-array Gm-APD LiDAR. Automatic target extraction is realized by an improved two-dimensional Gaussian model, and automatic tracking is realized by combining the Kalman filter with the Camshift tracking algorithm (Cam-Kalm).
Research on moving-object tracking has focused on machine vision, where the Camshift and particle filter algorithms are widely studied and used [34,35]. The particle filter algorithm can track the target well, but its complexity greatly limits real-time performance [36]. Compared with the particle filter, the Camshift algorithm based on colour features is more straightforward and can achieve a better tracking effect [37]. Target tracking algorithms in machine vision rely on the initial appearance model (colour, shape, texture, etc.) for subsequent tracking. In a complex environment, background noise, illumination changes, occlusion and other factors increase the similarity between the tracked object and the background, raising the risk of false and missed detections. Therefore, the initial target must be accurately calibrated through manual intervention to ensure that the algorithm can lock onto the target from the beginning and to reduce errors in subsequent tracking.
Compared with machine-vision imaging, the echo signal of a single-photon LiDAR shows a noticeable intensity-gradient difference between the target and the background, and it follows a point-spread distribution. In addition, a single-photon LiDAR is insensitive to environmental illumination and similar factors, so the target position can be identified and tracked automatically through physical quantities such as signal strength, time delay, and echo characteristics. Based on these characteristics, this paper improves the two-dimensional Gaussian model for target detection and extraction and combines it with the active-sensing characteristics of single-photon LiDAR to achieve higher detection accuracy and environmental robustness, so that tracking can proceed automatically without human intervention [38,39]. A fusion algorithm of the Kalman filter and Camshift is then designed, adding prediction of the target's speed and acceleration, which further improves tracking accuracy and real-time performance [40,41,42]. The performance of the Cam-Kalm algorithm and traditional tracking algorithms is evaluated through a close-range experiment at 70 m to 100 m. Finally, dynamic tracking of UAVs in the range of 3077.8 m to 3320.3 m is realized.
The structure of this paper is as follows: Section 1 introduces the research background, Section 2 describes the area-array Gm-APD LiDAR imaging system, Section 3 describes the designed automatic tracking algorithm, and Section 4 analyzes the tracking results. Finally, the conclusions and outlook are summarized.

2. System Design

2.1. Design of Gm-APD LiDAR System

Figure 1 shows the design of the LiDAR imaging system based on the area-array Gm-APD. The system consists of five subsystems: the laser transmitting system, the laser receiving system, the Gm-APD detector, the FPGA-based data acquisition and processing system, and the servo system. The laser wavelength is 1064 nm, the output power is dynamically adjustable between 0 and 30 W, and the repetition frequency is 20 kHz. The Gm-APD detector has 64 × 64 pixels. The servo system carries the LiDAR imaging system to realize dynamic tracking. The data acquisition and processing system is responsible for controlling the laser equipment, communicating with the host computer, and sending control instructions to the two-dimensional servo turntable.

2.2. Detection Process of Gm-APD LiDAR Imaging System

(1)
Laser emission: The high-power and narrow-pulse laser is pointed to the UAV target after optical shaping;
(2)
Laser echo reception: Echo photons reflected by the target are focused onto the Gm-APD detector through the receiving optical system, triggering the avalanche effect;
(3)
Signal recording: The Gm-APD detector records the echo signal and transmits it to the FPGA data acquisition and processing system;
(4)
Signal processing: The acquisition system processes the signal and reconstructs the intensity image and the range image of the UAV target;
(5)
Target fitting and tracking: Threshold filtering, Gaussian fitting, and Canny edge detection methods are adopted, and the Cam-Kalm algorithm is combined to realize real-time and automatic target tracking;
(6)
Dynamic adjustment: The FPGA system calculates the correction angle from the miss distance, adjusts the pitch or azimuth of the servo platform, and dynamically tracks the target (a minimal sketch of this angle conversion is given below).
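As an illustration of step (6), the following minimal sketch converts a fitted target centre into azimuth and pitch corrections for the servo turntable, assuming a 64 × 64 focal-plane array and a square field of view; the function and parameter names are illustrative, not part of the system's actual firmware.

```python
import numpy as np

def miss_distance_to_angles(target_px, fov_deg=0.06, n_pixels=64):
    """Convert the target's pixel offset from the array centre (the miss distance)
    into azimuth/pitch correction angles for the servo turntable.

    target_px : (x, y) centre of the tracked target, in pixels.
    fov_deg   : full field of view along one axis, in degrees (0.06 deg long-range setting).
    n_pixels  : pixels of the Gm-APD array along one axis (64 here).
    """
    deg_per_pixel = fov_deg / n_pixels           # angular size of one pixel
    cx = cy = (n_pixels - 1) / 2.0               # optical centre of the array
    dx = target_px[0] - cx                       # horizontal miss distance (pixels)
    dy = target_px[1] - cy                       # vertical miss distance (pixels)
    d_azimuth = dx * deg_per_pixel               # azimuth correction (degrees)
    d_pitch = -dy * deg_per_pixel                # pitch correction (image y grows downward)
    return d_azimuth, d_pitch

# Example: a target fitted at pixel (40, 25)
print(miss_distance_to_angles((40, 25)))
```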

3. Description of Tracking Algorithm

This section proposes an automatic target tracking algorithm—the Cam-Kalm algorithm—based on the Gm-APD LiDAR imaging system, as shown in Figure 2. The algorithm mainly includes two essential parts: target fitting and tracking. Next, these two parts will be introduced in detail.

3.1. Target Extraction

Compared with false "noise targets" in the natural environment, the geometry of a UAV target is more regular, and exploiting this feature effectively is the key to fitting and detecting the tracking-target centre. In the grey-level morphological analysis of LiDAR intensity images, UAV targets usually appear as convex regions of pixel values relative to the background, showing a noticeable grey-level difference. That is, the UAV target has an "additive" relationship with the background information in the LiDAR intensity sequence, which can be expressed as:
F(x, y, t) = F_T(x, y, t) + F_B(x, y, t) + F_n(x, y, t)    (1)
where F(x, y, t) is the LiDAR intensity value; F_T(x, y, t) is the intensity value of the target; F_B(x, y, t) is the intensity value of the background; F_n(x, y, t) is the intensity value of the noise; (x, y) is the pixel coordinate in the intensity image; and t is the frame sequence number of the intensity image.
Taking the optical diffusion function as its core, Ref. [44] describes the ideal grey-level distribution of a UAV target in airspace:
F(x, y, t) = \tau(t)\, \exp\left\{ -\frac{1}{2}\left[ \left( \frac{x - c_x}{r_x(t)} \right)^2 + \left( \frac{y - c_y}{r_y(t)} \right)^2 \right] \right\}    (2)
where F(x, y, t) is the pixel intensity of the target at a given time and position; (c_x, c_y) is the centroid of the target; \tau(t) is the centre intensity of the target at that moment, which is generally stable between adjacent frames; and r_x(t) and r_y(t) are the standard deviations of the target along the X and Y directions of the two-dimensional space, respectively.
According to Eq. (2), a standard target Gaussian shape is established, as shown in Figure 3, with a peak intensity of 80 and the centre located at c_x = c_y = 32. Figure 3 also shows the spatial intensity distribution of a real UAV target detected by the Gm-APD LiDAR.
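For readers who want to reproduce the standard shape of Figure 3, the short sketch below evaluates Eq. (2) on a 64 × 64 grid with a peak intensity of 80 and centre (32, 32); the spread values rx and ry are illustrative assumptions, since the paper does not state them.

```python
import numpy as np

def gaussian_target(n=64, tau=80.0, cx=32.0, cy=32.0, rx=3.0, ry=3.0):
    """Ideal grey-level distribution of a point-like target, Eq. (2):
    a 2-D Gaussian with peak intensity tau at (cx, cy) and spreads rx, ry.
    The spread values rx, ry are illustrative; the paper does not state them."""
    x, y = np.meshgrid(np.arange(n), np.arange(n))
    return tau * np.exp(-0.5 * (((x - cx) / rx) ** 2 + ((y - cy) / ry) ** 2))

F = gaussian_target()
print(F.max(), np.unravel_index(F.argmax(), F.shape))  # ~80 at row 32, column 32
```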
Comparing the spatial distribution of the modelled target with the imaging results of the Gm-APD LiDAR:
(1)
    The spatial distribution of the target intensity image differs little from the ideal geometric shape;
(2)
The actual detected target presents an irregular convex pattern relative to the background, consistent with the smooth central convex feature of the standard model.
Based on this, we use a two-dimensional Gaussian model to fit the grey-level morphology of the target; the algorithm flow is shown in Figure 4. In the LiDAR intensity image, the target intensity is higher than the background noise, so percentage threshold filtering is used to remove the low-intensity background and keep the high-intensity signal. Two-dimensional Gaussian fitting then yields the target centroid, which serves as the tracking centre; Canny edge detection is applied to the intensity image, and connected-region properties are extracted with the regionprops function to determine the region of interest (ROI) of the target, i.e., the bounding box of the tracking target.
The LiDAR intensity image and the filtering result are shown in Figure 5. The filtered result is then fitted, and the fitting result is shown in Figure 6: the red circle marks the fitted target centre position, and the red dotted rectangle is the detected tracking box.
Through the above fitting algorithm, we can accurately mark the centre position of the target in the LiDAR intensity image and complete the extraction of the ROI with the Canny edge detection algorithm, providing the target centre position (x_0, y_0) and tracking rectangle (w, l) of the first frame for the subsequent tracking algorithm.
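The first-frame extraction pipeline described above can be sketched as follows. This is a minimal Python illustration, not the authors' implementation: the 95th-percentile threshold, the SciPy curve_fit call, and the scikit-image Canny/regionprops combination are assumptions standing in for the paper's percentage threshold filtering, two-dimensional Gaussian fitting, and edge-based ROI detection.

```python
import numpy as np
from scipy.optimize import curve_fit
from skimage import feature, measure

def gauss2d(coords, tau, cx, cy, rx, ry):
    """2-D Gaussian of Eq. (2), flattened for curve_fit."""
    x, y = coords
    return (tau * np.exp(-0.5 * (((x - cx) / rx) ** 2 + ((y - cy) / ry) ** 2))).ravel()

def extract_target(intensity, keep_percent=95):
    """First-frame target extraction: percentage threshold filtering, 2-D Gaussian
    fitting for the tracking centre, and Canny + connected-region properties for
    the tracking box. Returns ((x0, y0), (w, l))."""
    # 1) Percentage threshold: keep only the brightest pixels, suppress background.
    thr = np.percentile(intensity, keep_percent)
    filtered = np.where(intensity >= thr, intensity, 0.0)

    # 2) 2-D Gaussian fit of the filtered image gives the target centroid.
    h, w = filtered.shape
    x, y = np.meshgrid(np.arange(w), np.arange(h))
    cy0, cx0 = np.unravel_index(filtered.argmax(), filtered.shape)  # initial guess
    p0 = [filtered.max(), cx0, cy0, 2.0, 2.0]
    popt, _ = curve_fit(gauss2d, (x, y), filtered.ravel(), p0=p0)
    x0, y0 = popt[1], popt[2]                                       # fitted centre

    # 3) Canny edges + connected-region properties give the ROI (tracking box).
    edges = feature.canny(filtered / filtered.max(), sigma=1.0)
    regions = measure.regionprops(measure.label(edges))
    box = max(regions, key=lambda r: r.area).bbox   # (min_row, min_col, max_row, max_col)
    w_box, l_box = box[3] - box[1], box[2] - box[0]
    return (x0, y0), (w_box, l_box)
```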

3.2. Target Tracking and State Prediction

Recognizing the particularities of UAV tracking in airspace, the Kalman filtering algorithm is introduced for three purposes [45]:
(1)
    Increase prediction ability: in addition to predicting the target position, velocity and acceleration variables are introduced to improve adaptability to rapid changes in UAV position. This not only improves the system's robustness but also reduces the iterative computation of the Camshift algorithm and speeds up coordination with the two-dimensional tracking platform;
(2)
    Reduce the false alarm rate: by predicting the speed and acceleration of the target, its motion trend can be judged and the influence of ghost artefacts in LiDAR imaging can be reduced, thus lowering the false alarm rate;
(3)
    Adaptive adjustment of the search window: adaptively adjust the search window, effectively avoiding the loss of the tracking target caused by uncontrolled expansion of the Camshift tracking window.
The tracking algorithm flow is shown in Figure 7.
Step 1 
Initialization.
(1)
Automatic extraction of the first frame target. Use the fitting method in Section 3.1 to obtain the ROI of the target point and synchronously initialize the center position ( x 0 , y 0 ) and size ( w , l ) of the search window.
(2)
Initialize the Kalman filter. The state prediction equation and the observation equation of the tracking system are:
X_{t_k} = A X_{t_{k-1}} + V_{t_k}    (3)
Y_{t_k} = H X_{t_k} + W_{t_k}    (4)
where X_{t_k} and X_{t_{k-1}} represent the motion state vectors at times t_k and t_{k-1}, respectively; Y_{t_k} is the system state observation vector at time t_k; A is the state transition matrix, which contains the dynamic model of target position, velocity and acceleration; H is the observation matrix, indicating that only the position is observed directly; V_{t_k} and W_{t_k} represent the system state noise and the Gaussian-distributed observation noise, respectively [46].
A = \begin{bmatrix}
1 & 0 & \Delta t & 0 & 0.5\Delta t^2 & 0 \\
0 & 1 & 0 & \Delta t & 0 & 0.5\Delta t^2 \\
0 & 0 & 1 & 0 & \Delta t & 0 \\
0 & 0 & 0 & 1 & 0 & \Delta t \\
0 & 0 & 0 & 0 & 1 & 0 \\
0 & 0 & 0 & 0 & 0 & 1
\end{bmatrix}
H = \begin{bmatrix}
1 & 0 & 0 & 0 & 0 & 0 \\
0 & 1 & 0 & 0 & 0 & 0
\end{bmatrix}
where \Delta t = t_k - t_{k-1} is the time interval between two adjacent frames. Accordingly, the covariance matrices of V_{t_k} and W_{t_k}, denoted Q and R, represent the uncertainty of the modelling process and the observation noise, respectively:
Q = \begin{bmatrix}
\Delta t^4/4 & 0 & \Delta t^3/2 & 0 & \Delta t^2/2 & 0 \\
0 & \Delta t^4/4 & 0 & \Delta t^3/2 & 0 & \Delta t^2/2 \\
\Delta t^3/2 & 0 & \Delta t^2 & 0 & \Delta t & 0 \\
0 & \Delta t^3/2 & 0 & \Delta t^2 & 0 & \Delta t \\
\Delta t^2/2 & 0 & \Delta t & 0 & 1 & 0 \\
0 & \Delta t^2/2 & 0 & \Delta t & 0 & 1
\end{bmatrix}
R = \begin{bmatrix} 1 & 0 \\ 0 & 1 \end{bmatrix}
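A minimal sketch of this Step 1 initialization follows, building the constant-acceleration matrices A, H, Q and R as written above; the function name and the NumPy representation are illustrative, not the authors' code.

```python
import numpy as np

def init_kalman(dt):
    """Constant-acceleration Kalman model used by Cam-Kalm.
    State: [x, y, vx, vy, ax, ay]; observation: [x, y].
    dt is the time interval between adjacent LiDAR frames."""
    A = np.array([[1, 0, dt, 0, 0.5 * dt**2, 0],
                  [0, 1, 0, dt, 0, 0.5 * dt**2],
                  [0, 0, 1, 0, dt, 0],
                  [0, 0, 0, 1, 0, dt],
                  [0, 0, 0, 0, 1, 0],
                  [0, 0, 0, 0, 0, 1]], dtype=float)          # state transition matrix A
    H = np.array([[1, 0, 0, 0, 0, 0],
                  [0, 1, 0, 0, 0, 0]], dtype=float)           # observation matrix H
    Q = np.array([[dt**4/4, 0, dt**3/2, 0, dt**2/2, 0],
                  [0, dt**4/4, 0, dt**3/2, 0, dt**2/2],
                  [dt**3/2, 0, dt**2, 0, dt, 0],
                  [0, dt**3/2, 0, dt**2, 0, dt],
                  [dt**2/2, 0, dt, 0, 1, 0],
                  [0, dt**2/2, 0, dt, 0, 1]], dtype=float)    # process noise covariance Q
    R = np.eye(2)                                              # observation noise covariance R
    return A, H, Q, R
```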
Step 2 
Backprojection of histogram of LiDAR intensity image.
(1)
    Colour space conversion. Read the Gm-APD LiDAR intensity image, convert it into HSV colour space, and extract the H and S components to enhance the colour difference between the target and the background [47]. At the same time, the corresponding background components are set to zero to realize background filtering.
(2)
    Back-projection of the histogram over the search area (a minimal sketch of this step follows).
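A possible implementation of Step 2 with OpenCV is sketched below. It assumes the LiDAR intensity image has been rendered as a colour (BGR) frame before the HSV conversion; the mask bounds and histogram bin counts are illustrative choices, not values from the paper.

```python
import cv2
import numpy as np

def backproject_search_area(frame_bgr, roi):
    """Step 2 sketch: convert the colour-mapped LiDAR intensity frame to HSV,
    build an H-S histogram of the initial target ROI, and back-project it so that
    each pixel value becomes the likelihood of belonging to the target.
    roi = (x0, y0, w, l) from the first-frame fitting step."""
    x0, y0, w, l = roi
    hsv = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2HSV)
    hsv_roi = hsv[y0:y0 + l, x0:x0 + w]

    # Suppress very dark / unsaturated pixels: the "background set to zero" step.
    mask = cv2.inRange(hsv_roi, np.array((0, 30, 10)), np.array((180, 255, 255)))

    # 2-D histogram over the H and S channels of the ROI (bin counts are illustrative).
    roi_hist = cv2.calcHist([hsv_roi], [0, 1], mask, [30, 32], [0, 180, 0, 256])
    cv2.normalize(roi_hist, roi_hist, 0, 255, cv2.NORM_MINMAX)

    # Back-project the ROI histogram onto the whole frame.
    backproj = cv2.calcBackProject([hsv], [0, 1], roi_hist, [0, 180, 0, 256], 1)
    return backproj, roi_hist
```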
Step 3 
Calculate the search window.
The centroid of the LiDAR search window is calculated using its zero-order and first-order moments. Let I_k(x, y) denote the pixel value at (x, y) in the back-projection image, where x and y vary within the search window (w, l); the calculation steps are as follows:
(1)
Calculate the zero-order moment at the initial moment as follows:
M_{00,k} = \sum_{x}^{width} \sum_{y}^{height} I_k(x, y)
(2)
Calculate the first moment in the x and y directions as follows:
x: \; M_{10,k} = \sum_{x}^{width} \sum_{y}^{height} x \, I_k(x, y)
y: \; M_{01,k} = \sum_{x}^{width} \sum_{y}^{height} y \, I_k(x, y)
(3)
Calculate the centroid coordinates of the tracking window as follows:
(x_c, y_c) = \left( \frac{M_{10,k}}{M_{00,k}}, \; \frac{M_{01,k}}{M_{00,k}} \right)
(4)
    Move the centre of the search window (x_k, y_k) to the centre-of-mass position (x_c, y_c) (a short sketch of these moment computations is given below).
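The moment computations of Step 3 can be written compactly as follows; this is an illustrative sketch operating on the back-projection image, with the window convention (x, y, w, l) assumed for clarity.

```python
import numpy as np

def search_window_centroid(backproj, win):
    """Step 3 sketch: zero- and first-order moments of the back-projection inside
    the current search window, and the resulting centroid.
    win = (x, y, w, l): top-left corner and size of the search window."""
    x, y, w, l = win
    patch = backproj[y:y + l, x:x + w].astype(float)
    xs, ys = np.meshgrid(np.arange(x, x + w), np.arange(y, y + l))

    m00 = patch.sum()                    # zero-order moment
    if m00 == 0:                         # empty window: keep the previous centre
        return x + w / 2.0, y + l / 2.0
    m10 = (xs * patch).sum()             # first-order moment in x
    m01 = (ys * patch).sum()             # first-order moment in y
    return m10 / m00, m01 / m00          # centroid (xc, yc) of the window
```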
Step 4 
Calculate the tracking window.
The Camshift algorithm obtains the oriented target region from the second-order moments:
(1)
Calculate the second moment in the x and y directions as follows:
M_{20,k} = \sum_{x}^{width} \sum_{y}^{height} x^2 \, I_k(x, y), \quad M_{02,k} = \sum_{x}^{width} \sum_{y}^{height} y^2 \, I_k(x, y), \quad M_{11,k} = \sum_{x}^{width} \sum_{y}^{height} x y \, I_k(x, y)
(2)
Calculate the new tracking window size ( W , L ) as:
W = 2\sqrt{\frac{(a + c) - \sqrt{b^2 + (a - c)^2}}{2}}, \qquad L = 2\sqrt{\frac{(a + c) + \sqrt{b^2 + (a - c)^2}}{2}}
where
a = \frac{M_{20,k}}{M_{00,k}} - x_c^2, \qquad b = 2\left( \frac{M_{11,k}}{M_{00,k}} - x_c y_c \right), \qquad c = \frac{M_{02,k}}{M_{00,k}} - y_c^2
(3)
    Iterate until the centroid position converges. The converged target centroid (x_c, y_c) gives the measurement (x_{t_k}, y_{t_k}) used for the update and prediction of the Kalman filter at time t_k (a sketch of this window-adaptation step follows).
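A sketch of the Step 4 window adaptation is given below, using the second-order moments and the W, L formulas above; as before, the window convention and function name are illustrative assumptions.

```python
import numpy as np

def adapted_window_size(backproj, win, xc, yc):
    """Step 4 sketch: second-order moments of the back-projection inside the search
    window give the orientation-aware width W and length L of the new tracking window."""
    x, y, w, l = win
    patch = backproj[y:y + l, x:x + w].astype(float)
    xs, ys = np.meshgrid(np.arange(x, x + w), np.arange(y, y + l))

    m00 = patch.sum()
    m20 = (xs ** 2 * patch).sum()
    m02 = (ys ** 2 * patch).sum()
    m11 = (xs * ys * patch).sum()

    a = m20 / m00 - xc ** 2
    b = 2.0 * (m11 / m00 - xc * yc)
    c = m02 / m00 - yc ** 2

    root = np.sqrt(b ** 2 + (a - c) ** 2)
    W = 2.0 * np.sqrt(max(((a + c) - root) / 2.0, 0.0))  # short axis of the target blob
    L = 2.0 * np.sqrt(((a + c) + root) / 2.0)            # long axis of the target blob
    return W, L
```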
Step 5 
Kalman filter prediction.
At the moment t k , the state vector of UAV target motion is:
X_{t_k} = \left[ x_{t_k}, \; y_{t_k}, \; v_{x,t_k}, \; v_{y,t_k}, \; \alpha_{v_x,t_k}, \; \alpha_{v_y,t_k} \right]^T
where v_{x,t_k} and v_{y,t_k} are the speeds of the target along the X and Y axes of the field of view, and \alpha_{v_x,t_k} and \alpha_{v_y,t_k} are the corresponding accelerations. According to Eq. (3) and Eq. (4), the motion state equation of the system is updated as follows:
\begin{bmatrix} x_{t_k} \\ y_{t_k} \\ v_{x,t_k} \\ v_{y,t_k} \\ \alpha_{v_x,t_k} \\ \alpha_{v_y,t_k} \end{bmatrix}
=
\begin{bmatrix}
1 & 0 & \Delta t & 0 & 0.5\Delta t^2 & 0 \\
0 & 1 & 0 & \Delta t & 0 & 0.5\Delta t^2 \\
0 & 0 & 1 & 0 & \Delta t & 0 \\
0 & 0 & 0 & 1 & 0 & \Delta t \\
0 & 0 & 0 & 0 & 1 & 0 \\
0 & 0 & 0 & 0 & 0 & 1
\end{bmatrix}
\begin{bmatrix} x_{t_{k-1}} \\ y_{t_{k-1}} \\ v_{x,t_{k-1}} \\ v_{y,t_{k-1}} \\ \alpha_{v_x,t_{k-1}} \\ \alpha_{v_y,t_{k-1}} \end{bmatrix}
+ V_{t_k}
The motion observation equation of the system is updated as follows:
\begin{bmatrix} x_{t_k} \\ y_{t_k} \end{bmatrix}
=
\begin{bmatrix}
1 & 0 & 0 & 0 & 0 & 0 \\
0 & 1 & 0 & 0 & 0 & 0
\end{bmatrix}
\begin{bmatrix} x_{t_k} \\ y_{t_k} \\ v_{x,t_k} \\ v_{y,t_k} \\ \alpha_{v_x,t_k} \\ \alpha_{v_y,t_k} \end{bmatrix}
+ W_{t_k}
Then, the measurements obtained by the Camshift algorithm are used for the prediction and update of the Kalman filter [48,49].
Step 6 
Repeat Steps 2, 3, 4 and 5.
Take the predicted value X_{t_{k+1}} from the Kalman prediction in Step 5 as the search-window centre for the (k+1)th frame search area.
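Putting Steps 2-6 together, one Cam-Kalm iteration can be sketched as a standard Kalman predict/update cycle driven by the Camshift centroid measurement. The sketch below is an illustration of the fusion logic, not the authors' code; it reuses the matrices from the initialization sketch above.

```python
import numpy as np

def cam_kalm_step(state, P, A, H, Q, R, camshift_measurement):
    """One Cam-Kalm iteration (Steps 2-6): predict the state with the
    constant-acceleration model, correct it with the Camshift centroid
    measurement, and return the predicted centre for the next search window.

    state : 6-vector [x, y, vx, vy, ax, ay];  P : 6x6 state covariance;
    camshift_measurement : (xc, yc) centroid returned by Camshift this frame."""
    # Kalman prediction (time update).
    state_pred = A @ state
    P_pred = A @ P @ A.T + Q

    # Kalman correction with the Camshift centroid (measurement update).
    z = np.asarray(camshift_measurement, dtype=float)
    S = H @ P_pred @ H.T + R                      # innovation covariance
    K = P_pred @ H.T @ np.linalg.inv(S)           # Kalman gain
    state_new = state_pred + K @ (z - H @ state_pred)
    P_new = (np.eye(6) - K @ H) @ P_pred

    # The position predicted for the next frame becomes the new search-window centre.
    next_center = (A @ state_new)[:2]
    return state_new, P_new, next_center
```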

4. Analysis of Experimental Results

4.1. Estimation Results of Target Tracking Center of Gm-APD LiDAR System

In this section, we evaluate the performance of the fitting method proposed in Section 3.1 for target tracking centre estimation and compare it with the local-peak and centroid-weighting techniques.
Firstly, the centre point of the imaging result of the Gm-APD LiDAR system is manually marked and used as the reference standard for algorithm evaluation. The Centre Location Error (CLE) is defined as the Euclidean distance between the calculated and true centre positions; the closer the CLE curve is to the X-axis, the closer the algorithm's error is to zero, indicating better fitting performance. The mean squared error (MSE) is the mean of the squared errors between the calculated and true values, reflecting the average deviation between them. The coefficient of determination (R2) is a quantitative index of fitting quality, measuring the goodness of fit between the values produced by the fitting model and the true values.
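For concreteness, the evaluation metrics can be computed as in the sketch below; the exact MSE and R2 conventions used by the authors are not spelled out in the text, so the definitions here (per-frame squared Euclidean error and a coefficient of determination over both coordinates) are one reasonable reading.

```python
import numpy as np

def tracking_metrics(estimated, truth):
    """Per-frame centre location error (CLE), plus MSE and R^2 over the sequence.
    estimated, truth : arrays of shape (n_frames, 2) holding (x, y) centres."""
    estimated = np.asarray(estimated, dtype=float)
    truth = np.asarray(truth, dtype=float)

    cle = np.linalg.norm(estimated - truth, axis=1)           # Euclidean error per frame
    mse = np.mean(np.sum((estimated - truth) ** 2, axis=1))   # mean squared error
    ss_res = np.sum((truth - estimated) ** 2)
    ss_tot = np.sum((truth - truth.mean(axis=0)) ** 2)
    r2 = 1.0 - ss_res / ss_tot                                 # coefficient of determination
    return cle, mse, r2
```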
In the experiment, three fitting methods are used to fit the multi-frame UAV imaging results, and Figure 8 shows the distribution of the calculated value and the actual value of the model’s centre position.
The comparison of centre positioning errors of the different algorithms is shown in Figure 9, and the MSE and R2 values are listed in Table 1. Because the location of the maximum intensity shifts between successive laser illuminations of the target, the local-peak method produces a large centre error, a high MSE, and a negative R2, which shows that its fit to this data set is very poor and can hardly explain the trend of the data. The centroid-weighting method is easily influenced by noise and also has a high MSE: when the noise is strong, the centroid position may shift, causing the fitting result to deviate from the true value and increasing the CLE.
In contrast, the fitting method proposed in Section 3.1 fully accounts for the influence of noise and outliers, and its centre positioning error is less than 1 pixel, showing high accuracy in the presence of data deviations. Its R2 is close to 1, meaning the fitting model effectively captures the variability of the data and has strong explanatory power for the intensity data. Its MSE is 0.25, showing that the deviation between the fitting result and the true value is minimal and further verifying the accuracy and reliability of this method in practical applications.

4.2. Results of Tracking Algorithm for Gm-APD LiDAR System

In this section, the tracking performance of the Cam-Kalm algorithm proposed in Section 3.2 is compared with the Meanshift and Camshift algorithms in dynamic tracking. In the experiment, a DJI Phantom 4 UAV is used at a flying distance of 70-100 m. The UAV is shown in Figure 10, and the flight scene is shown in Figure 11.
The tracking results of the Cam-Kalm algorithm proposed in this paper are shown in Figure 12.
The fitting method proposed in Section 3.1 is used to mark the centre of each imaging result, and this serves as the reference for the error calculation of the Camshift, Meanshift, and Cam-Kalm algorithms. The three algorithms are used to track the UAV, and Figure 13 shows the multi-frame tracking trajectories together with the marked true centre trajectory.
As can be seen from Figure 13, the blue line shows the tracking-centre trajectory of the Meanshift algorithm, the red line that of the Camshift algorithm, the yellow line that of the Cam-Kalm algorithm, and the purple line the true target-centre trajectory obtained by fitting. The Meanshift algorithm loses the target around frame 215 and the Camshift algorithm loses the target around frame 160, while the Cam-Kalm algorithm maintains stable tracking.
Figure 14 shows the per-frame CLE for the different tracking algorithms.
As seen in Figure 14, the Meanshift algorithm shows the largest CLE between the 215th and 250th frames because it loses the target. The search-window size of the Meanshift algorithm is fixed: when the target is at the edge of the field of view and its imaged size is less than half of its actual size, the algorithm cannot readjust the search window to capture the correct target boundary, cannot track effectively, and therefore fails. The Camshift algorithm begins to lose the target at the 162nd frame, which is attributed to a ghost artefact in the imaging result of that frame; the algorithm mistakes the ghost for the real target, leading to tracking failure. In addition, when the target moves too fast, the Meanshift and Camshift algorithms exhibit tracking-window lag, which prevents effective prediction of the target position from frames 67 to 89 and results in a sizeable CLE.
Over the 250-frame tracking process, the Cam-Kalm algorithm proposed in this paper shows stable tracking performance. The Kalman filter smooths the target's trajectory and corrects jitter caused by noise or temporary occlusion during tracking, making the tracking more robust and continuous. At the same time, by modelling the speed and acceleration of the target, the target's position in the next frame can be predicted, especially when the target suddenly accelerates or decelerates, so that the continuity and accuracy of tracking are maintained. Moreover, the Kalman filter reduces the range of the Camshift search window through frame-to-frame prediction, improving tracking efficiency and reducing the computational load. The Kalman filter can handle not only uniform motion but also non-uniform, complex motion (when the acceleration of the target UAV changes), so the tracking is more robust than with Camshift alone. Compared with the Meanshift and Camshift algorithms, the CLE of the Cam-Kalm algorithm is consistently low, with all values below 3 pixels. The mean multi-frame CLE is 0.8964, less than 1 pixel, and the variance of the multi-frame CLE is 0.4028, showing high tracking accuracy and stability.

4.3. Long-Distance Tracking Results of Gm-APD LiDAR Based on Cam-Kalm Algorithm

To further realize long-distance target tracking, the field of view of the Gm-APD LiDAR imaging system is automatically adjusted to 0.06° according to the target distance. In the experiment, a MAVIC 3 UAV completes a long-distance flight mission; the UAV is shown in Figure 15.
At this range, the Gm-APD LiDAR detected the MAVIC 3 UAV in 3 km airspace, and the imaging results of the system are shown in Figure 16. The fitting distribution of the two-dimensional Gaussian model is also shown in Figure 16; the model's MSE is 0.0909 and its R2 value is 0.9952.
The corresponding tracking results of the Cam-Kalm tracking algorithm proposed in this paper are shown in Figure 17.
Based on the Cam-Kalm algorithm, the Gm-APD LiDAR calculates the miss distance of the target from the fitted centroid change and the field of view of the LiDAR, and adjusts the servo turntable accordingly for target tracking. After smoothing, the spatial trajectory of the UAV target is shown in Figure 18. During tracking, the initial target distance measured by the LiDAR is 3077.8 m; at the end of tracking, the target distance is 3320.3 m. The effective tracking distance is 242.5 m.

5. Conclusions

In this paper, an automatic laser tracking and ranging system based on Gm-APD LiDAR is designed and implemented, and a Cam-Kalm algorithm combining the Kalman filter and the Camshift algorithm is proposed, which significantly improves the tracking accuracy and stability for "low, slow and small" targets. By introducing two-dimensional Gaussian fitting and an edge detection algorithm, the system can independently determine the target's centre position and tracking box and realize automatic tracking. Experiments show that the system not only achieves high fitting accuracy at a range of 70 m but also successfully tracks a UAV in real time in a long-distance scene from 3.07 km to 3.32 km, verifying its practicability for long-distance target detection.
This study provides a new solution based on LiDAR for UAV early warning technology, especially in long-distance tracking and ranging of non-cooperative targets in airspace. Future work will focus on expanding the system’s non-cooperative target recognition ability, realizing real-time three-dimensional positioning and tracking of various types of UAVs, and further improving the operational effectiveness of the UAV early warning system.

Author Contributions

Conceptualization, Dongfang Guo and Yancheng Qu; methodology, Yancheng Qu and Xin Zhou; software, Jie Lu and Feng Liu; validation, Dongfang Guo, and Shengwen Yin; formal analysis, Dongfang Guo; investigation, Jie Lu and Feng Liu; resources, Dongfang Guo; data curation, Jianfeng Sun and Xin Zhou; writing—original draft preparation, Dongfang Guo; writing—review and editing, Yancheng Qu and Jianfeng Sun; visualization, Dongfang Guo and Xin Zhou; supervision, Yancheng Qu and Xin Zhou; project administration, Xin Zhou and Jianfeng Sun; funding acquisition, Jianfeng Sun. All authors have read and agreed to the published version of the manuscript.

Funding

This research received no external funding.

Data Availability Statement

The original contributions presented in the study are included in the article, further inquiries can be directed to the corresponding author.

Conflicts of Interest

The authors declare no conflicts of interest.

References

  1. Farlík J, Gacho L. Researching UAV threat–new challenges[C]//2021 International Conference on Military Technologies (ICMT). IEEE, 2021: 1-6. [CrossRef]
  2. Lyu C, Zhan R. Global analysis of active defense technologies for unmanned aerial vehicle[J]. IEEE Aerospace and Electronic Systems Magazine, 2022, 37(1): 6-31. [CrossRef]
  3. Zhou Y, Rao B, Wang W. UAV swarm intelligence: Recent advances and future trends[J]. Ieee Access, 2020, 8: 183856-183878. [CrossRef]
  4. Mohsan S A H, Khan M A, Noor F, et al. Towards the unmanned aerial vehicles (UAVs): A comprehensive review[J]. Drones, 2022, 6(6): 147. [CrossRef]
  5. Wang J, Liu Y, Song H. Counter-unmanned aircraft system (s)(C-UAS): State of the art, challenges, and future trends[J]. IEEE Aerospace and Electronic Systems Magazine, 2021, 36(3): 4-29. [CrossRef]
  6. Anil A, Hennemann A, Kimmel H, et al. PERSEUS-Post-Emergency Response and Surveillance UAV System[M]. Deutsche Gesellschaft für Luft-und Raumfahrt-Lilienthal-Oberth eV, 2024. [CrossRef]
  7. Bi Z, Chen H, Hu J, et al. Analysis of UAV Typical War Cases and Combat Assessment Research[C]//2022 IEEE International Conference on Unmanned Systems (ICUS). IEEE, 2022: 1449-1453. [CrossRef]
  8. Wang J, Liu Y, Song H. Counter-unmanned aircraft system (s)(C-UAS): State of the art, challenges, and future trends[J]. IEEE Aerospace and Electronic Systems Magazine, 2021, 36(3): 4-29. [CrossRef]
  9. Lykou G, Moustakas D, Gritzalis D. Defending airports from UAS: A survey on cyber-attacks and counter-drone sensing technologies[J]. Sensors, 2020, 20(12): 3537. [CrossRef]
  10. Park S, Kim H T, Lee S, et al. Survey on anti-drone systems: Components, designs, and challenges[J]. IEEE access, 2021, 9: 42635-42659. [CrossRef]
  11. McManamon P F. Review of ladar: a historic, yet emerging, sensor technology with rich phenomenology[J]. Optical Engineering, 2012, 51(6): 060901. [CrossRef]
  12. Advanced time-correlated single photon counting applications[M]. Switzerland: Springer International Publishing, 2015. [CrossRef]
  13. Becker W, Bergmann A. Multi-dimensional time-correlated single photon counting[J]. Reviews in fluorescence 2005, 2005: 77-108. [CrossRef]
  14. Prochazka I, Hamal K, Sopko B. Recent achievements in single photon detectors and their applications[J]. Journal of Modern Optics, 2004, 51(9-10): 1289-1313. [CrossRef]
  15. Natarajan C M, Tanner M G, Hadfield R H. Superconducting nanowire single-photon detectors: physics and applications[J]. Superconductor science and technology, 2012, 25(6): 063001. [CrossRef]
  16. Yuan Z L, Kardynal B E, Sharpe A W, et al. High speed single photon detection in the near infrared[J]. Applied Physics Letters, 2007, 91(4). [CrossRef]
  17. Buller G S, Collins R J. Single-photon generation and detection[J]. Measurement Science and Technology, 2009, 21(1): 012002. [CrossRef]
  18. Eisaman M D, Fan J, Migdall A, et al. Invited review article: Single-photon sources and detectors[J]. Review of scientific instruments, 2011, 82(7). [CrossRef]
  19. Fersch T, Weigel R, Koelpin A. Challenges in miniaturized automotive long-range LiDAR system design[C] //Three-Dimensional Imaging, Visualization, and Display 2017. SPIE, 2017, 10219: 160-171. [CrossRef]
  20. Raj T, Hanim Hashim F, Baseri Huddin A, et al. A survey on LiDAR scanning mechanisms[J]. Electronics, 2020, 9(5): 741. [CrossRef]
  21. Pfeifer N, Briese C. Laser scanning–principles and applications[C]//Geosiberia 2007-international exhibition and scientific congress. European Association of Geoscientists & Engineers, 2007: cp-59-00077. [CrossRef]
  22. Kim B H, Khan D, Bohak C, et al. V-RBNN based small drone detection in augmented datasets for 3D LADAR system[J]. Sensors, 2018, 18(11): 3825. [CrossRef]
  23. Chen Z, Liu B, Guo G. Adaptive single photon detection under fluctuating background noise[J]. Optics express, 2020, 28(20): 30199-30209. [CrossRef]
  24. Pfennigbauer M, Möbius B, do Carmo J P. Echo digitizing imaging LiDAR for rendezvous and docking[C]//Laser Radar Technology and Applications XIV. SPIE, 2009, 7323: 9-17. [CrossRef]
  25. McCarthy A, Ren X, Della Frera A, et al. Kilometer-range depth imaging at 1550 nm wavelength using an InGaAs/InP single-photon avalanche diode detector[J]. Optics express, 2013, 21(19): 22098-22113. [CrossRef]
  26. Pawlikowska A M, Halimi A, Lamb R A, et al. Single-photon three-dimensional imaging at up to 10 kilometers range[J]. Optics express, 2017, 25(10): 11919-11931. [CrossRef]
  27. Zhou H, He Y, You L, et al. Few-photon imaging at 1550 nm using a low-timing-jitter superconducting nanowire single-photon detector[J]. Optics express, 2015, 23(11): 14603-14611. [CrossRef]
  28. Liu B, Yu Y, Chen Z, et al. True random coded photon counting LiDAR[J]. Opto-Electronic Advances, 2020, 3(2): 190044-1-190044-6. [CrossRef]
  29. Li Z P, Ye J T, Huang X, et al. Single-photon imaging over 200 km[J]. Optica, 2021, 8(3): 344-349. [CrossRef]
  30. Kirmani A, Venkatraman D, Shin D, et al. First-photon imaging[J]. Science, 2014, 343(6166): 58-61. [CrossRef]
  31. Hua K, Liu B, Chen Z, et al. Fast photon-counting imaging with low acquisition time method[J]. IEEE Photonics Journal, 2021, 13(3): 1-12. [CrossRef]
  32. Chen Z, Liu B, Guo G, et al. Single photon imaging with multi-scale time resolution[J]. Optics Express, 2022, 30(10): 15895-15904. [CrossRef]
  33. Ding Y, Qu Y, Zhang Q, et al. Research on UAV detection technology of Gm-APD LiDAR based on YOLO model[C]//2021 IEEE International Conference on Unmanned Systems (ICUS). IEEE, 2021: 105-109. [CrossRef]
  34. Bi H, Ma J, Wang F. An improved particle filter algorithm based on ensemble Kalman filter and Markov chain Monte Carlo method[J]. IEEE journal of selected topics in applied earth observations and remote sensing, 2014, 8(2): 447-459. [CrossRef]
  35. Kulkarni M, Wadekar P, Dagale H. Block division based camshift algorithm for real-time object tracking using distributed smart cameras[C]//2013 IEEE International Symposium on Multimedia. IEEE, 2013: 292-296. [CrossRef]
  36. Yang, P. Efficient particle filter algorithm for ultrasonic sensor-based 2D range-only simultaneous localisation and mapping application[J]. IET Wireless Sensor Systems, 2012, 2(4): 394-401. [CrossRef]
  37. Cong D, Shi P, Zhou D. An improved camshift algorithm based on RGB histogram equalization[C]//2014 7th International Congress on Image and Signal Processing. IEEE, 2014: 426-430. [CrossRef]
  38. Xu X, Zhang H, Luo M, et al. Research on target echo characteristics and ranging accuracy for laser radar[J]. Infrared Physics & Technology, 2019, 96: 330-339. [CrossRef]
  39. Chauve A, Mallet C, Bretar F, et al. Processing full-waveform lidar data: modelling raw signals[J]. International archives of photogrammetry, remote sensing and spatial information sciences, 2007, 36(part 3): W52.
  40. Laurenzis M, Bacher E, Christnacher F. Measuring laser reflection cross-sections of small unmanned aerial vehicles for laser detection, ranging and tracking[C]//Laser Radar Technology and Applications XXII. SPIE, 2017, 10191: 74-82. [CrossRef]
  41. Zhang, Y. Detection and tracking of human motion targets in video images based on Camshift algorithms[J]. IEEE Sensors Journal, 2019, 20(20): 11887-11893. [CrossRef]
  42. Bankar R, Salankar S. Improvement of head gesture recognition using Camshift based face tracking with UKF[C]//2019 9th International Conference on Emerging Trends in Engineering and Technology-Signal and Information Processing (ICETET-SIP-19). IEEE, 2019: 1-5. [CrossRef]
  43. Zhang J, Zhang Y, Shen S, et al. Research and application of police UAVs target tracking based on improved Camshift algorithm[C]//2019 3rd International Conference on Electronic Information Technology and Computer Engineering (EITCE). IEEE, 2019: 1238-1242. [CrossRef]
  44. Nishiguchi K I, Kobayashi M, Ichikawa A. Small target detection from image sequences using recursive max filter[C]//Signal and Data Processing of Small Targets 1995. SPIE, 1995, 2561: 153-166. [CrossRef]
  45. Zhang, Y. Detection and tracking of human motion targets in video images based on Camshift algorithms[J]. IEEE Sensors Journal, 2019, 20(20): 11887-11893. [CrossRef]
  46. Solanki P B, Al-Rubaiai M, Tan X. Extended Kalman filter-based active alignment control for LED optical communication[J]. IEEE/ASME Transactions on Mechatronics, 2018, 23(4): 1501-1511. [CrossRef]
  47. Exner D, Bruns E, Kurz D, et al. Fast and robust Camshift tracking[C]//2010 IEEE Computer Society Conference on Computer Vision and Pattern Recognition-Workshops. IEEE, 2010: 9-16. [CrossRef]
  48. Guo G, Zhao S. 3D multi-object tracking with adaptive cubature Kalman filter for autonomous driving[J]. IEEE Transactions on Intelligent Vehicles, 2022, 8(1): 512-519. [CrossRef]
  49. Li Y, Bian C, Chen H. Object tracking in satellite videos: Correlation particle filter tracking method with motion estimation by Kalman filter[J]. IEEE Transactions on Geoscience and Remote Sensing, 2022, 60: 1-12. [CrossRef]
Figure 1. Principle block diagram of Gm-APD LiDAR system.
Figure 2. Schematic diagram of automatic tracking algorithm.
Figure 3. Comparison of intensity distribution between standard Gaussian model and real target.
Figure 4. Algorithm flow of target center fitting.
Figure 5. LiDAR intensity image and threshold filtering results.
Figure 6. Target centre estimation and tracking frame detection results.
Figure 7. Principle block diagram of Cam-Kalm tracking algorithm.
Figure 8. Comparison of distribution between center position and model calculation position under different frame numbers.
Figure 9. Comparison of center positioning errors of different fitting methods under different frame numbers.
Figure 10. Physical object of flying target.
Figure 11. Close-range tracking experimental scene.
Figure 12. Multi-frame UAV dynamic tracking results of Gm-APD LiDAR based on Cam-Kalm algorithm.
Figure 13. Trajectory comparison using different tracking algorithms under multiple frames.
Figure 14. Comparison of center positioning errors using different tracking algorithms under multiple frames.
Figure 15. Physical map of UAV in long-distance flight experiment.
Figure 16. Reconstruction results and fitting distribution of Gm-APD LiDAR detecting the MAVIC 3 UAV at long range.
Figure 17. Multi-frame UAV dynamic tracking results at 3 km of Gm-APD LiDAR based on Cam-Kalm algorithm.
Figure 18. Trajectory of UAV after smoothing (3 km).
Table 1. Comparison of quantitative indexes of different fitting methods.

Fitting method            MSE      R2
Local peak                17.08    -5.1084
Centroid weighting        0.70     0.7569
Proposed in this paper    0.25     0.9143