Preprint · Article · Version 1 · Preserved in Portico · This version is not peer-reviewed

Rectification for Image Stitching with Deformable Meshes and Residual Networks

Version 1 : Received: 19 January 2024 / Approved: 19 January 2024 / Online: 19 January 2024 (13:35:31 CET)

How to cite: Fan, Y.; Mao, S.; Mei, L.; Zheng, W.; Jitong, K.; Ben, L. Rectification for Image Stitching with Deformable Meshes and Residual Networks. Preprints 2024, 2024011513. https://doi.org/10.20944/preprints202401.1513.v1

Abstract

Image stitching is a crucial aspect of image processing. However, factors such as perspective and environment often lead to irregular shapes in stitched images, and cropping or completion methods typically result in substantial loss of information. This paper proposes a method for rectifying irregularly shaped images into rectangles using deformable meshes and residual networks. The method uses a convolutional neural network to quantify the rigid structures of an image. Based on the extraction results, the most suitable mesh structure is chosen from triangular, rectangular, and hexagonal options. Subsequently, the irregularly shaped image, the predefined mesh, and the predicted mesh are fed into a wide residual neural network for regression. The loss function comprises local and global terms, aimed at minimizing the loss of image information within each mesh cell and over the global target. This method not only significantly reduces information loss during rectification but also adapts to images with various rigid structures. Validation on the DIR-D dataset shows that the method outperforms state-of-the-art approaches in image rectification.
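As a rough illustration of the pipeline described in the abstract, the sketch below regresses per-vertex offsets of a predefined rectangular mesh from a stitched image and its boundary mask using stacked wide residual blocks, and combines a local (per-vertex) term with a global (boundary) term in the loss. The network layout, mesh resolution, mask input, and loss weighting are assumptions made for illustration, not the authors' implementation.

```python
# Minimal sketch of the described pipeline (hypothetical shapes and names,
# not the authors' code): wide residual blocks regress mesh-vertex offsets,
# and the loss mixes a local per-vertex term with a global boundary term.
import torch
import torch.nn as nn
import torch.nn.functional as F

class WideResBlock(nn.Module):
    """Basic wide residual block: two 3x3 convolutions plus a skip connection."""
    def __init__(self, in_ch, out_ch):
        super().__init__()
        self.conv1 = nn.Conv2d(in_ch, out_ch, 3, padding=1)
        self.conv2 = nn.Conv2d(out_ch, out_ch, 3, padding=1)
        self.skip = nn.Conv2d(in_ch, out_ch, 1) if in_ch != out_ch else nn.Identity()

    def forward(self, x):
        h = F.relu(self.conv1(x))
        h = self.conv2(h)
        return F.relu(h + self.skip(x))

class MeshRegressionNet(nn.Module):
    """Regresses per-vertex offsets of a predefined mesh from the stitched
    image and its valid-pixel mask (the irregular boundary)."""
    def __init__(self, grid_h=8, grid_w=8, width=64):
        super().__init__()
        self.grid_h, self.grid_w = grid_h, grid_w
        self.backbone = nn.Sequential(
            WideResBlock(4, width),            # 3 image channels + 1 mask channel
            nn.MaxPool2d(2),
            WideResBlock(width, width * 2),
            nn.MaxPool2d(2),
            WideResBlock(width * 2, width * 2),
            nn.AdaptiveAvgPool2d(1),
        )
        # Two coordinates (dx, dy) per mesh vertex.
        self.head = nn.Linear(width * 2, (grid_h + 1) * (grid_w + 1) * 2)

    def forward(self, image, mask):
        feat = self.backbone(torch.cat([image, mask], dim=1)).flatten(1)
        offsets = self.head(feat)
        return offsets.view(-1, self.grid_h + 1, self.grid_w + 1, 2)

def combined_loss(pred_mesh, target_mesh, lam_global=0.1):
    """Local term: per-vertex mesh error. Global term: deviation of the outer
    boundary vertices from the rectangular target (illustrative choice)."""
    local = F.l1_loss(pred_mesh, target_mesh)
    boundary_pred = torch.cat([pred_mesh[:, 0], pred_mesh[:, -1],
                               pred_mesh[:, :, 0], pred_mesh[:, :, -1]], dim=1)
    boundary_tgt = torch.cat([target_mesh[:, 0], target_mesh[:, -1],
                              target_mesh[:, :, 0], target_mesh[:, :, -1]], dim=1)
    return local + lam_global * F.l1_loss(boundary_pred, boundary_tgt)

if __name__ == "__main__":
    net = MeshRegressionNet()
    img = torch.randn(2, 3, 256, 256)     # stitched image with irregular border
    msk = torch.rand(2, 1, 256, 256)      # valid-pixel mask
    pred = net(img, msk)
    tgt = torch.zeros_like(pred)          # placeholder ground-truth mesh
    print(pred.shape, combined_loss(pred, tgt).item())
```

In practice the predicted mesh would be used to warp the stitched image toward a rectangular boundary; the triangular and hexagonal mesh variants mentioned in the abstract would change only the vertex layout, not the regression structure shown here.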

Keywords

image rectangling; deformable mesh; wide residual network; global loss function

Subject

Computer Science and Mathematics, Computer Vision and Graphics
