Preprint Article, Version 1 (not peer-reviewed). Preserved in Portico.

Dual-Pyramid Wide Residual Network for Semantic Segmentation on Cross-Style Datasets

Version 1: Received: 24 October 2023 / Approved: 25 October 2023 / Online: 25 October 2023 (09:55:08 CEST)

A peer-reviewed article of this Preprint also exists.

Shen, G.-T.; Huang, Y.-F. Dual-Pyramid Wide Residual Network for Semantic Segmentation on Cross-Style Datasets. Information 2023, 14, 630.

Abstract

Image segmentation partitions an image into multiple segments, simplifying its representation and making it more meaningful and easier to analyze. In particular, semantic segmentation assigns a class label to every pixel. In the past, most semantic segmentation models targeted only a single image style, such as urban street views, medical images, or even Manga. In this paper, we propose a semantic segmentation model called the Dual-Pyramid Wide Residual Network (DPWRN) to handle segmentation on cross-style datasets, making it suitable for diverse segmentation applications. The DPWRN integrates a Pyramid of Kernel paralleled with Dilation (PKD) and Multi-Feature Fusion (MFF) to improve segmentation accuracy. To evaluate the generalization of the DPWRN and its superiority over most state-of-the-art models, we conduct experiments on three datasets with completely different styles. The results verify the generalization of the DPWRN and show its superiority over most state-of-the-art models.
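
To make the architectural terms concrete, the sketch below shows one way a PKD-style block could be built: a pyramid of kernel sizes run in parallel with a pyramid of dilation rates, with the branch outputs fused by concatenation and a 1x1 projection as a simple stand-in for MFF. Every name, channel count, kernel size, dilation rate, and the fusion choice here is an assumption for illustration only; this is a minimal sketch, not the authors' implementation.

# Hypothetical PKD-style block: a kernel-size pyramid paralleled with a
# dilation-rate pyramid, fused by concatenation and a 1x1 projection.
# All sizes and names below are illustrative assumptions.
import torch
import torch.nn as nn

def conv_bn_relu(in_ch, out_ch, k, dilation=1):
    # "same" padding so every branch keeps the input's spatial size
    pad = dilation * (k - 1) // 2
    return nn.Sequential(
        nn.Conv2d(in_ch, out_ch, k, padding=pad, dilation=dilation, bias=False),
        nn.BatchNorm2d(out_ch),
        nn.ReLU(inplace=True),
    )

class PKDBlockSketch(nn.Module):
    def __init__(self, in_ch=64, branch_ch=32,
                 kernel_sizes=(1, 3, 5), dilation_rates=(2, 4, 8)):
        super().__init__()
        # Kernel pyramid: standard convolutions with growing kernel sizes
        self.kernel_branches = nn.ModuleList(
            [conv_bn_relu(in_ch, branch_ch, k) for k in kernel_sizes])
        # Dilation pyramid: 3x3 convolutions with growing dilation rates
        self.dilated_branches = nn.ModuleList(
            [conv_bn_relu(in_ch, branch_ch, 3, dilation=d) for d in dilation_rates])
        n_branches = len(kernel_sizes) + len(dilation_rates)
        # Simple multi-feature fusion: concatenate all branches, project back with 1x1 conv
        self.fuse = nn.Conv2d(branch_ch * n_branches, in_ch, kernel_size=1)

    def forward(self, x):
        branches = list(self.kernel_branches) + list(self.dilated_branches)
        feats = [b(x) for b in branches]
        return self.fuse(torch.cat(feats, dim=1))

# Example: a 64-channel feature map keeps its 128x128 spatial resolution
x = torch.randn(1, 64, 128, 128)
print(PKDBlockSketch()(x).shape)  # torch.Size([1, 64, 128, 128])

With the default values above, each branch contributes context at a different receptive-field scale while the fused output preserves the input's channel count and resolution.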

Keywords

semantic segmentation; dilated convolution; multi-scale objects; feature fusions; cross-style datasets

Subject

Computer Science and Mathematics, Artificial Intelligence and Machine Learning
