Preprint Article · Version 1 · Preserved in Portico · This version is not peer-reviewed

Generative Adversarial Unsupervised Image Restoration in Hybrid Degradation Scenes

Version 1 : Received: 10 February 2022 / Approved: 11 February 2022 / Online: 11 February 2022 (08:42:43 CET)

How to cite: Tang, F.; Zhu, X.; Hu, J.; Tie, J.; Zhou, J.; Fu, Y. Generative Adversarial Unsupervised Image Restoration in Hybrid Degradation Scenes. Preprints 2022, 2022020159. https://doi.org/10.20944/preprints202202.0159.v1

Abstract

In this paper, we propose an unsupervised blind restoration model for images in hybrid degradation scenes. The proposed model encodes the content information and degradation information of images and then uses an attention module to disentangle the two. This improves the disentangled representation learning ability of a generative adversarial network (GAN) for restoring images in hybrid degradation scenes and, by combining adversarial loss, cycle-consistency loss, and perceptual loss, enhances the detailed features of the restored image and removes artifacts. Experimental results on the DIV2K dataset and medical images show that the proposed method outperforms existing unsupervised image restoration algorithms in terms of peak signal-to-noise ratio (PSNR), structural similarity (SSIM), and subjective visual evaluation.
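The abstract combines three training objectives: an adversarial loss, a cycle-consistency loss, and a perceptual loss. A minimal NumPy sketch of how such a weighted combination might look is given below; the specific loss forms (least-squares adversarial, L1 cycle-consistency, L2 perceptual) and the weights are illustrative assumptions, not the paper's exact formulation:

```python
import numpy as np

def adversarial_loss(d_fake):
    # Least-squares GAN generator loss: push discriminator scores on
    # restored images toward 1 ("real"). Assumed form, not the paper's.
    return np.mean((d_fake - 1.0) ** 2)

def cycle_consistency_loss(x, x_reconstructed):
    # L1 distance between an input and its round-trip reconstruction
    # (degrade -> restore -> re-degrade, or vice versa).
    return np.mean(np.abs(x - x_reconstructed))

def perceptual_loss(feat_real, feat_fake):
    # L2 distance in a feature space (e.g. pretrained-network
    # activations, assumed precomputed and passed in as arrays).
    return np.mean((feat_real - feat_fake) ** 2)

def total_loss(d_fake, x, x_rec, feat_real, feat_fake,
               w_adv=1.0, w_cyc=10.0, w_per=1.0):
    # Weighted sum of the three objectives; the weights are
    # hypothetical placeholders, not values from the paper.
    return (w_adv * adversarial_loss(d_fake)
            + w_cyc * cycle_consistency_loss(x, x_rec)
            + w_per * perceptual_loss(feat_real, feat_fake))
```

With a perfect reconstruction, matching features, and discriminator scores of 1, every term vanishes and the total loss is zero, which is the fixed point this kind of objective drives the generator toward.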

Keywords

hybrid degradation; generative adversarial network; attention mechanism; disentangled representation

Subject

Computer Science and Mathematics, Applied Mathematics


