Preprint Article · Version 1 · Preserved in Portico · This version is not peer-reviewed

YOLOv5s-Fog: An Improved Model Based on YOLOv5s for Object Detection in Foggy Weather Scenarios

Version 1 : Received: 9 May 2023 / Approved: 10 May 2023 / Online: 10 May 2023 (10:05:03 CEST)

A peer-reviewed article of this Preprint also exists.

Meng, X.; Liu, Y.; Fan, L.; Fan, J. YOLOv5s-Fog: An Improved Model Based on YOLOv5s for Object Detection in Foggy Weather Scenarios. Sensors 2023, 23, 5321.

Abstract

In foggy weather scenarios, the scattering and absorption of light by water droplets and particulate matter cause object features in images to become blurred or lost, presenting a significant challenge for object detection in autonomous vehicles. To tackle this problem, this study proposes a foggy weather detection method, YOLOv5s-Fog, based on the YOLOv5s framework. The model enhances the feature extraction and expression capabilities of YOLOv5s by introducing a novel object detection layer, SwinFocus. Additionally, this research incorporates a decoupled head into the model and replaces the conventional non-maximum suppression (NMS) method with Soft-NMS. Experimental results demonstrate that these improvements effectively enhance detection performance for blurry objects and small targets in foggy weather conditions. Compared to the baseline model YOLOv5s, YOLOv5s-Fog achieves a 5.4% increase in mAP on the RTTS dataset, reaching 73.4%. This method provides technical support for rapid and accurate object detection by autonomous vehicles in adverse weather conditions such as fog.
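The abstract's replacement of conventional NMS with Soft-NMS can be illustrated with a minimal NumPy sketch of the Gaussian Soft-NMS variant: instead of discarding boxes that overlap the current best detection, their confidence scores are decayed in proportion to the overlap, which helps retain partially occluded or closely spaced objects. The hyperparameter values below (`sigma`, `score_threshold`) are illustrative defaults, not the settings used in the paper.

```python
import numpy as np

def soft_nms(boxes, scores, sigma=0.5, score_threshold=0.001):
    """Gaussian Soft-NMS sketch: decay overlapping scores instead of
    discarding boxes outright.

    boxes:  (N, 4) array of [x1, y1, x2, y2]
    scores: (N,) confidence scores
    Returns indices of kept boxes in descending (decayed) score order.
    """
    boxes = boxes.astype(float)
    scores = scores.astype(float).copy()
    indices = np.arange(len(scores))
    keep = []
    while len(indices) > 0:
        # Select the highest-scoring remaining box.
        best = np.argmax(scores[indices])
        best_idx = indices[best]
        keep.append(int(best_idx))
        indices = np.delete(indices, best)
        if len(indices) == 0:
            break
        # IoU of the selected box against all remaining boxes.
        b, rest = boxes[best_idx], boxes[indices]
        x1 = np.maximum(b[0], rest[:, 0])
        y1 = np.maximum(b[1], rest[:, 1])
        x2 = np.minimum(b[2], rest[:, 2])
        y2 = np.minimum(b[3], rest[:, 3])
        inter = np.clip(x2 - x1, 0, None) * np.clip(y2 - y1, 0, None)
        area_b = (b[2] - b[0]) * (b[3] - b[1])
        area_r = (rest[:, 2] - rest[:, 0]) * (rest[:, 3] - rest[:, 1])
        iou = inter / (area_b + area_r - inter)
        # Gaussian decay: heavily overlapping boxes lose score smoothly,
        # rather than being hard-suppressed as in standard NMS.
        scores[indices] *= np.exp(-(iou ** 2) / sigma)
        # Drop boxes whose score has fallen below the floor.
        indices = indices[scores[indices] > score_threshold]
    return keep
```

For example, two heavily overlapping boxes both survive with the second one's score decayed, whereas hard NMS would delete the second outright; this smoother behavior is what helps with the blurred, low-contrast detections typical of foggy scenes.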

Keywords

foggy weather scenarios; deep learning; SwinFocus; decoupled head; Soft-NMS

Subject

Computer Science and Mathematics, Artificial Intelligence and Machine Learning

