Article
Version 1
This version is not peer-reviewed
Single Image Rain Removal Based on Multi-Scale and Channel Attention Mechanism
Received: 28 May 2021 / Approved: 31 May 2021 / Online: 31 May 2021 (11:41:25 CEST)
How to cite: Liu, Q.; Zhou, G.; Jia, Z. Single Image Rain Removal Based on Multi-Scale and Channel Attention Mechanism. Preprints 2021, 2021050753.
Abstract
Deep convolutional neural networks (CNNs) have shown great advantages in the single image deraining task. However, most existing CNN-based single image deraining methods still suffer from residual rain streaks and loss of image details. In this paper, we propose a deep neural network comprising a multi-scale feature extraction module and a channel attention module, embedded in the feature extraction sub-network and the rain removal sub-network respectively. In the feature extraction sub-network, the multi-scale feature extraction module is constructed from a multi-layer Laplacian pyramid, and the resulting multi-scale feature maps are then integrated by a feature fusion module. In the rain removal sub-network, the channel attention module, which assigns different weights to different channels, is introduced to preserve image details. Visual and quantitative experimental comparisons demonstrate that the proposed method performs favorably against other state-of-the-art approaches.
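The abstract names two building blocks without specifying them: a multi-layer Laplacian pyramid for multi-scale feature extraction, and a channel attention module that reweights feature channels. A minimal NumPy sketch of the two standard constructions is given below; it assumes a conventional Laplacian pyramid (blur, downsample, band-pass residual) and a squeeze-and-excitation style channel attention, which are common choices but not confirmed to match the authors' exact design. The blur kernel, pooling, and MLP weights here are illustrative placeholders.

```python
import numpy as np

def _blur(img):
    # Simple 3x3 box blur with edge padding (stand-in for a Gaussian filter).
    p = np.pad(img, 1, mode="edge")
    h, w = img.shape
    return sum(p[i:i + h, j:j + w] for i in range(3) for j in range(3)) / 9.0

def laplacian_pyramid(img, levels=3):
    """Decompose a 2-D image (even side lengths assumed) into band-pass
    residuals plus one coarse level, as in a standard Laplacian pyramid."""
    pyr, cur = [], img
    for _ in range(levels - 1):
        low = _blur(cur)[::2, ::2]                    # blur + 2x downsample
        up = np.repeat(np.repeat(low, 2, 0), 2, 1)    # nearest-neighbour upsample
        pyr.append(cur - up[:cur.shape[0], :cur.shape[1]])  # band-pass residual
        cur = low
    pyr.append(cur)                                   # coarsest (low-pass) level
    return pyr

def channel_attention(feat, w1, w2):
    """SE-style channel attention on a (C, H, W) feature map.
    w1: (C, C//r) and w2: (C//r, C) are learned MLP weights (passed in here)."""
    s = feat.mean(axis=(1, 2))                 # squeeze: global average pooling -> (C,)
    h = np.maximum(s @ w1, 0.0)                # excitation: ReLU hidden layer
    a = 1.0 / (1.0 + np.exp(-(h @ w2)))        # sigmoid -> per-channel weights in (0, 1)
    return feat * a[:, None, None]             # rescale each channel
```

By construction each pyramid level satisfies `cur = residual + upsample(low)`, so the decomposition is exactly invertible, and the attention weights merely rescale channels without changing the feature map's shape.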
Keywords
single image deraining; multi-layer Laplacian pyramid; multi-scale feature extraction module; channel attention module
Copyright: This is an open access article distributed under the Creative Commons Attribution License which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.