Preprint Article · Version 1 · Preserved in Portico · This version is not peer-reviewed

Intelligent Space Object Detection Driven by Space Object Data

Version 1: Received: 9 November 2023 / Approved: 9 November 2023 / Online: 9 November 2023 (11:00:23 CET)

A peer-reviewed article of this Preprint also exists.

Tang, Q.; Li, X.; Xie, M.; Zhen, J. Intelligent Space Object Detection Driven by Data from Space Objects. Appl. Sci. 2024, 14, 333.

Abstract

With the rapid development of space programs in various countries, the number of satellites in orbit is increasing, making the space environment ever more complex. Improving space object identification technology has therefore become highly important. We propose a method that applies deep learning to the intelligent detection of space objects. We use 49 authentic 3D satellite models across 16 scenarios to generate a dataset of 17,942 images, which also includes more than 500 real satellite photographs. Additionally, we acquire a large amount of annotated data with a semi-automatic labeling method, obtaining a total of 39,000 labels at a substantially reduced labeling cost. We validate the feasibility of the dataset using the YOLOv3 and YOLOv7 models. Furthermore, we optimize YOLOv7 by integrating the deformable convolution module RepPoints into its backbone, obtaining the YOLOv7-R model. Experimental results show that YOLOv3 achieves an accuracy of 0.927, YOLOv7 reaches 0.964, and YOLOv7-R achieves the highest accuracy of 0.983. This provides an effective solution for intelligent space object detection.
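To make the YOLOv7-R modification concrete, the sketch below shows how a deformable convolution can stand in for a standard 3x3 convolution inside a YOLO-style backbone block: a small regular convolution predicts per-location sampling offsets, so the kernel adapts its receptive field to the object's shape. This is a minimal illustration assuming PyTorch and torchvision; the block name DeformableConvBlock and its wiring are assumptions for exposition, not the authors' released code, and the paper's RepPoints-based integration may differ in detail.

```python
# Minimal sketch: a deformable-convolution block as a drop-in replacement for a
# standard 3x3 conv in a YOLOv7-style backbone. Assumes PyTorch + torchvision;
# names and wiring are illustrative, not the paper's implementation.
import torch
import torch.nn as nn
from torchvision.ops import DeformConv2d


class DeformableConvBlock(nn.Module):
    """Conv -> BN -> SiLU, with the convolution replaced by DeformConv2d."""

    def __init__(self, in_ch: int, out_ch: int, k: int = 3, s: int = 1):
        super().__init__()
        p = k // 2
        # 2 * k * k offset channels: one (dx, dy) pair per kernel tap.
        self.offset = nn.Conv2d(in_ch, 2 * k * k, kernel_size=k, stride=s, padding=p)
        self.deform = DeformConv2d(in_ch, out_ch, kernel_size=k, stride=s, padding=p)
        self.bn = nn.BatchNorm2d(out_ch)
        self.act = nn.SiLU(inplace=True)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        offsets = self.offset(x)  # (N, 2*k*k, H, W) learned sampling offsets
        return self.act(self.bn(self.deform(x, offsets)))


if __name__ == "__main__":
    # Smoke test on a dummy feature map.
    block = DeformableConvBlock(64, 128)
    y = block(torch.randn(1, 64, 80, 80))
    print(y.shape)  # torch.Size([1, 128, 80, 80])
```

In a full detector, blocks like this would replace selected convolutions in the backbone so that feature sampling can deform around satellite components (panels, antennas) rather than staying on a rigid grid.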

Keywords

space object identification; deep learning; YOLO; deformable convolution

Subject

Computer Science and Mathematics, Computer Vision and Graphics
