Preprint Article, Version 1 (not peer-reviewed; preserved in Portico)

Event-Assisted Object Tracking on High-Speed Drones under Harsh Illumination Environment

Version 1: Received: 5 December 2023 / Approved: 14 December 2023 / Online: 14 December 2023 (06:40:14 CET)

A peer-reviewed version of this preprint also exists:

Han, Y.; Yu, X.; Luan, H.; Suo, J. Event-Assisted Object Tracking on High-Speed Drones in Harsh Illumination Environment. Drones 2024, 8, 22.

Abstract

Drones are used in a variety of scenarios, such as atmospheric monitoring, fire rescue, and agricultural irrigation, in which accurate environmental perception is crucial for both decision-making and control. Among drone sensors, the RGB camera is indispensable for capturing the rich visual information needed for navigation, but it faces a major challenge in high-dynamic-range scenes, which occur frequently in real applications: the recorded frames suffer from under-exposure and over-exposure simultaneously, degrading downstream vision tasks. To address this problem, we take object tracking as an example and leverage the superior response of event cameras over a large intensity range to propose an event-assisted object tracking algorithm that achieves reliable tracking under large intensity variations. Specifically, we perform feature matching on dense event signals and, building on these matches, (i) design a UNet-based image enhancement algorithm that balances RGB intensity with the help of temporally neighboring frames, and (ii) construct a dual-input tracking model that tracks moving objects from the intensity-balanced RGB video and the event sequence. The proposed approach is comprehensively validated in both simulated and real experiments.
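To make step (i) concrete, below is a minimal PyTorch sketch of a UNet-style enhancement network of the kind the abstract describes: it consumes the current frame stacked with two temporally neighboring frames and predicts an intensity-balanced frame. This is not the authors' released code; the 9-channel input layout, the layer widths, and the class name EnhanceUNet are all illustrative assumptions.

```python
# Hedged sketch of a UNet-based enhancement model (not the paper's implementation).
# Assumption: input = current RGB frame + two neighboring frames, channel-stacked
# (3 frames x 3 channels = 9 channels); output = one 3-channel enhanced frame.
import torch
import torch.nn as nn

def conv_block(in_ch, out_ch):
    # Two 3x3 convolutions with ReLU: the standard UNet building block.
    return nn.Sequential(
        nn.Conv2d(in_ch, out_ch, 3, padding=1), nn.ReLU(inplace=True),
        nn.Conv2d(out_ch, out_ch, 3, padding=1), nn.ReLU(inplace=True),
    )

class EnhanceUNet(nn.Module):
    def __init__(self, in_ch=9, base=32):
        super().__init__()
        self.enc1 = conv_block(in_ch, base)           # full resolution
        self.enc2 = conv_block(base, base * 2)        # 1/2 resolution
        self.bottleneck = conv_block(base * 2, base * 4)
        self.pool = nn.MaxPool2d(2)
        self.up2 = nn.ConvTranspose2d(base * 4, base * 2, 2, stride=2)
        self.dec2 = conv_block(base * 4, base * 2)    # skip connection from enc2
        self.up1 = nn.ConvTranspose2d(base * 2, base, 2, stride=2)
        self.dec1 = conv_block(base * 2, base)        # skip connection from enc1
        self.head = nn.Conv2d(base, 3, 1)             # 3-channel enhanced frame

    def forward(self, frames):
        # frames: (B, 9, H, W), H and W divisible by 4.
        e1 = self.enc1(frames)
        e2 = self.enc2(self.pool(e1))
        b = self.bottleneck(self.pool(e2))
        d2 = self.dec2(torch.cat([self.up2(b), e2], dim=1))
        d1 = self.dec1(torch.cat([self.up1(d2), e1], dim=1))
        return torch.sigmoid(self.head(d1))  # intensities normalized to [0, 1]
```

In the pipeline the abstract outlines, the neighboring frames would first be aligned to the current frame using the event-based feature matches before being stacked as input.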
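Step (ii) can likewise be sketched as a two-branch model: one encoder for the enhanced RGB frame and one for an event representation, with the fused features correlated against a target template in a Siamese style. The event voxel-grid input (5 bins), the tiny backbones, and the name EventRGBTracker are assumptions for illustration, not details from the paper.

```python
# Hedged sketch of a dual-input (RGB + event) tracker, Siamese-correlation style.
# Assumption: events are binned into a 5-channel voxel grid per frame interval.
import torch
import torch.nn as nn
import torch.nn.functional as F

def encoder(in_ch):
    # A tiny strided CNN standing in for a real tracking backbone.
    return nn.Sequential(
        nn.Conv2d(in_ch, 32, 3, stride=2, padding=1), nn.ReLU(inplace=True),
        nn.Conv2d(32, 64, 3, stride=2, padding=1), nn.ReLU(inplace=True),
        nn.Conv2d(64, 128, 3, stride=2, padding=1), nn.ReLU(inplace=True),
    )

class EventRGBTracker(nn.Module):
    def __init__(self, event_bins=5):
        super().__init__()
        self.rgb_enc = encoder(3)            # intensity-balanced RGB branch
        self.evt_enc = encoder(event_bins)   # event voxel-grid branch
        self.fuse = nn.Conv2d(256, 128, 1)   # 1x1 fusion of the two branches

    def embed(self, rgb, events):
        f = torch.cat([self.rgb_enc(rgb), self.evt_enc(events)], dim=1)
        return self.fuse(f)

    def forward(self, template_rgb, template_evt, search_rgb, search_evt):
        z = self.embed(template_rgb, template_evt)  # target template features
        x = self.embed(search_rgb, search_evt)      # search region features
        # Cross-correlate template against the search region (batch size 1);
        # the response peak indicates the predicted target location.
        return F.conv2d(x, z)                       # (1, 1, h, w) response map
```

In such a tracker, the template embedding z would be computed once from the annotated first frame and correlated against each subsequent search region, so the event branch keeps the response informative even where the RGB frame is under- or over-exposed.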

Keywords

Drones; harsh illumination; image enhancement; event-assisted object tracking; multi-sensor fusion

Subject

Computer Science and Mathematics, Computer Vision and Graphics
