Working Paper / Version 1 / This version is not peer-reviewed

Single Image Rain Removal Based on Multi-Scale and Channel Attention Mechanism

Version 1 : Received: 28 May 2021 / Approved: 31 May 2021 / Online: 31 May 2021 (11:41:25 CEST)

How to cite: Liu, Q.; Zhou, G.; Jia, Z. Single Image Rain Removal Based on Multi-Scale and Channel Attention Mechanism. Preprints 2021, 2021050753.

Abstract

Deep convolutional neural networks (CNNs) have shown great advantages in the single-image deraining task. However, most existing CNN-based single-image deraining methods still suffer from residual rain streaks and loss of detail. In this paper, we propose a deep neural network comprising a multi-scale feature extraction module and a channel attention module, which are embedded in the feature extraction sub-network and the rain removal sub-network, respectively. In the feature extraction sub-network, the multi-scale feature extraction module is constructed from a multi-layer Laplacian pyramid, and the resulting multi-scale feature maps are then integrated by a feature fusion module. In the rain removal sub-network, the channel attention module, which assigns different weights to different channels, is introduced to preserve image details. Visual and quantitative experimental comparisons demonstrate that the proposed method performs favorably against other state-of-the-art approaches.
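The channel attention module described above can be illustrated with a generic squeeze-and-excitation style sketch (the paper does not specify the exact module structure; the two-layer bottleneck, ReLU/sigmoid choice, and NumPy formulation below are assumptions for illustration, not the authors' implementation):

```python
import numpy as np

def channel_attention(feature_maps, w1, w2):
    """Generic channel attention sketch: global-average-pool each channel,
    pass the pooled vector through a small two-layer bottleneck, and
    rescale every channel by its learned weight in (0, 1)."""
    # feature_maps: (C, H, W); w1: (C_reduced, C); w2: (C, C_reduced)
    squeeze = feature_maps.mean(axis=(1, 2))            # per-channel descriptor, (C,)
    hidden = np.maximum(0.0, w1 @ squeeze)              # ReLU bottleneck, (C_reduced,)
    weights = 1.0 / (1.0 + np.exp(-(w2 @ hidden)))      # sigmoid gates, (C,)
    return feature_maps * weights[:, None, None]        # per-channel rescaling
```

The key property is that channels carrying rain-streak features can be suppressed while channels carrying image detail are amplified, since each channel receives its own multiplicative weight.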

Subject Areas

Single image deraining; Multi-layer Laplacian pyramid; Multi-scale feature extraction module; Channel attention module.
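The multi-layer Laplacian pyramid listed above can be sketched as follows (a minimal illustration only: the blur kernel, number of levels, and nearest-neighbor upsampling are assumptions, not the paper's configuration):

```python
import numpy as np

def gaussian_blur(img, k=5):
    # Box-filter stand-in for a Gaussian kernel (illustration only).
    pad = k // 2
    padded = np.pad(img, pad, mode="edge")
    out = np.zeros_like(img, dtype=float)
    for dy in range(k):
        for dx in range(k):
            out += padded[dy:dy + img.shape[0], dx:dx + img.shape[1]]
    return out / (k * k)

def laplacian_pyramid(img, levels=3):
    """Each level keeps the band-pass detail: the image minus its blurred,
    downsampled, then re-upsampled copy. The final level is the coarse residue."""
    pyramid, current = [], img.astype(float)
    for _ in range(levels - 1):
        blurred = gaussian_blur(current)
        down = blurred[::2, ::2]                                  # downsample by 2
        up = np.kron(down, np.ones((2, 2)))                       # nearest-neighbor upsample
        up = up[:current.shape[0], :current.shape[1]]
        pyramid.append(current - up)                              # band-pass detail level
        current = down
    pyramid.append(current)                                       # coarsest level
    return pyramid
```

Feeding the rain-degraded image through such a pyramid separates fine streak-scale detail from coarse structure, which is what allows a multi-scale feature extraction module to process each band separately.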
