Preprint Article, Version 1 (preserved in Portico). This version is not peer-reviewed.

Low Tensor Rank Constrained Image Inpainting Using a Novel Arrangement Scheme

Version 1 : Received: 23 October 2023 / Approved: 23 October 2023 / Online: 24 October 2023 (11:23:06 CEST)

How to cite: Ma, S.; Fan, Y.; Fang, S.; Yang, W.; Li, L. Low Tensor Rank Constrained Image Inpainting Using a Novel Arrangement Scheme. Preprints 2023, 2023101523. https://doi.org/10.20944/preprints202310.1523.v1

Abstract

Employing low tensor rank decompositions in image inpainting has attracted increasing attention. This paper exploits a novel tensor-augmentation scheme to transform an image (a low-order tensor) into a higher-order tensor without changing the total number of pixels. The developed augmentation schemes enhance the low-rankness of an image under three tensor decompositions: the matrix SVD, the tensor train (TT) decomposition, and the tensor singular value decomposition (t-SVD). Exploiting these schemes, we solve the image inpainting problem with three low-rank constrained models that use the matrix rank, the TT rank, and the tubal rank as constrained priors, respectively. The tensor tubal rank and the tensor train multi-rank are derived from the t-SVD and the TT decomposition, respectively. We develop efficient ADMM algorithms for solving the three models. Experimental results demonstrate that our methods are effective for image inpainting and superior to numerous closely related methods.
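To illustrate the core idea of the abstract, the sketch below lifts a 2D image into a higher-order tensor while preserving the total pixel count. Note this is only a minimal stand-in using a plain NumPy reshape; the paper's actual arrangement scheme imposes a more structured pixel ordering (e.g. block-wise rearrangement), so the shapes and the `np.reshape` call here are illustrative assumptions, not the authors' method.

```python
import numpy as np

# Hypothetical example: lift a 256x256 image (an order-2 tensor) into an
# order-8 tensor of shape (4, 4, ..., 4). Since 4**8 == 256*256, the total
# number of pixels is unchanged -- the property the abstract requires.
img = np.arange(256 * 256, dtype=float).reshape(256, 256)
high_order = img.reshape((4,) * 8)

assert high_order.size == img.size   # pixel count preserved
assert high_order.ndim == 8          # higher-order tensor

# A balanced mode split unfolds the order-8 tensor back into a matrix;
# the rank of such unfoldings is what low-rank constrained models
# (matrix rank, TT rank, tubal rank) would penalize.
unfolded = high_order.reshape(4 ** 4, 4 ** 4)
print(unfolded.shape)  # (256, 256)
```

In practice, the benefit of augmentation comes from choosing an arrangement whose unfoldings have lower rank than the original image matrix, which a plain reshape does not by itself guarantee.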

Keywords

image inpainting; tensor decomposition; rearrangement scheme; unfolding matrix; alternating direction multiplier method

Subject

Computer Science and Mathematics, Computer Vision and Graphics
