Preprint · Technical Note · Version 1 · This version is not peer-reviewed

An Algorithm for High-Resolution Satellite Imagery Pre-processing

Version 1: Received: 4 March 2022 / Approved: 7 March 2022 / Online: 7 March 2022 (09:43:08 CET)

How to cite: Shrawankar, U.; Shrawankar, C. An Algorithm for High-Resolution Satellite Imagery Pre-processing. Preprints 2022, 2022030095. https://doi.org/10.20944/preprints202203.0095.v1

Abstract

Over the past few years, various algorithms have been developed to extract features from high-resolution satellite imagery, and several complex algorithms have been developed to classify these extracted features. However, these algorithms lack the critical refinement stages needed to process the data at the preliminary phase. Various satellite sensors have been launched, such as LISS-3, IKONOS, QuickBird, and WorldView. Before classification and extraction of semantic data, high-resolution imagery must be refined. The refinement process involves several steps of interaction with the data; these steps are the pre-processing algorithms presented in this paper. Pre-processing steps include geometric correction, radiometric correction, noise removal, and image enhancement. These pre-processing algorithms increase the accuracy of the data. Pre-processed data finds applications in meteorology, hydrology, soil science, forestry, physical planning, and related fields. This paper also provides a brief description of the local maximum likelihood method, the fuzzy method, the stretch method, and the pre-processing methods that are used before classifying and extracting features from an image.
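To make the image-enhancement step concrete, the following is a minimal sketch of one stretch-based enhancement, a percentile-based linear contrast stretch applied to a single band. It is not the authors' implementation; the function name, the 2%/98% cut-offs, and the simulated 16-bit band are illustrative assumptions.

import numpy as np

def linear_percent_stretch(band, lower_pct=2.0, upper_pct=98.0):
    # Rescale one band to 0-255 using percentile cut-offs (assumed 2%/98%)
    # so that a small number of extreme pixels does not dominate the range.
    lo, hi = np.percentile(band, [lower_pct, upper_pct])
    scaled = np.clip((band.astype(np.float64) - lo) / (hi - lo), 0.0, 1.0)
    return (scaled * 255).astype(np.uint8)

# Hypothetical usage on a simulated 16-bit band standing in for raw sensor data.
raw_band = np.random.randint(0, 4096, size=(512, 512), dtype=np.uint16)
enhanced = linear_percent_stretch(raw_band)

In practice the same pattern would be applied band by band before classification, so that feature-extraction algorithms operate on data with a consistent dynamic range.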

Keywords

pre-processing; image transformation; image enhancement; geometric correction; radiometric correction; satellite imagery

Subject

Engineering, Control and Systems Engineering
