Preprint Article · Version 1 · Preserved in Portico · This version is not peer-reviewed

Vegetation Identification in Hyperspectral Images Using Distance/Correlation Metrics

Version 1: Received: 8 June 2023 / Approved: 9 June 2023 / Online: 9 June 2023 (04:15:53 CEST)

A peer-reviewed article of this Preprint also exists.

Chanchí Golondrino, G.E.; Ospina Alarcón, M.A.; Saba, M. Vegetation Identification in Hyperspectral Images Using Distance/Correlation Metrics. Atmosphere 2023, 14, 1148.

Abstract

Vegetation identification using remote sensing is essential for understanding terrestrial ecosystems and monitoring climate change. By providing high-resolution images of the Earth's surface from satellites, aircraft, and drones, this technology has revolutionized the way scientists study and understand vegetation patterns, providing information on the distribution, structure, and health of plant communities across regions and biomes. However, identifying vegetation in multi- and hyperspectral images remains a difficult, time-consuming, and costly process. Traditional methods for vegetation mapping involve the interpretation of spectral indices such as the Normalized Difference Vegetation Index (NDVI), although they often suffer from limitations such as manual interpretation and the limited spatial resolution of satellite images. Machine learning techniques such as artificial neural networks, decision trees, support vector machines, and random forests have shown promising results in identifying and classifying vegetation cover from hyperspectral images; however, because they require personnel with specialized expertise, their application on a large scale remains challenging. Correlation methods between curves, by contrast, have emerged as a powerful and simple tool for comparing the spectral signatures of hyperspectral image pixels and efficiently classifying vegetation in a study area. Pearson's correlation coefficient and the spectral angle mapper are among the correlation methods most commonly used in vegetation mapping. These correlation methods, among many others available, offer an easily implemented and computationally efficient approach, making them particularly useful in developing countries or regions with limited resources. In this article, a comparative study of correlation/distance metrics was conducted for the detection of vegetation pixels in hyperspectral images. The study compared five distance and/or correlation metrics: direct correlation, cosine similarity, normalized Euclidean distance, Bray-Curtis distance, and Pearson correlation. The two metrics that yielded the best accuracy in vegetation pixel detection were direct correlation and Pearson correlation. Based on the selected metrics, a vegetation detection algorithm was implemented and validated on a hyperspectral image of the Manga neighborhood in Cartagena de Indias, Colombia. The Python spectral library was used for image processing, while the numpy and scipy libraries were used for the mathematical calculation of the correlations. Both the study's approach and the implemented algorithm are intended to serve as a reference for detection studies of various material types in hyperspectral images using open-access programming platforms.
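As an illustration of the approach described above, the sketch below compares a single pixel's spectral signature against a reference vegetation signature using the five metrics named in the abstract, with numpy and scipy as the abstract states. It is a minimal sketch, not the authors' published code: the paper's exact definition of "direct correlation" is not given here, so the zero-lag cross-correlation of unit-normalized signatures stands in for it, and the function name, the placeholder signatures, and the 0.9 decision threshold are hypothetical.

import numpy as np
from scipy.spatial import distance
from scipy.stats import pearsonr

def metric_scores(pixel, reference):
    """Five distance/correlation values for two 1-D spectral signatures."""
    p = np.asarray(pixel, dtype=float)
    r = np.asarray(reference, dtype=float)
    p_unit = p / np.linalg.norm(p)
    r_unit = r / np.linalg.norm(r)
    return {
        # Stand-in for "direct correlation": zero-lag cross-correlation
        # of the unit-normalized signatures (an assumption, see above).
        "direct_correlation": float(np.correlate(p_unit, r_unit)[0]),
        # scipy returns the cosine *distance*; similarity is its complement.
        "cosine_similarity": 1.0 - distance.cosine(p, r),
        # Euclidean distance between unit-normalized signatures.
        "normalized_euclidean": float(distance.euclidean(p_unit, r_unit)),
        "bray_curtis": float(distance.braycurtis(p, r)),
        "pearson": float(pearsonr(p, r)[0]),
    }

# Hypothetical usage: flag a pixel as vegetation when its Pearson
# correlation with the reference exceeds a chosen threshold (0.9 is an
# illustrative value, not one taken from the paper).
if __name__ == "__main__":
    rng = np.random.default_rng(0)
    reference = rng.random(200)                  # placeholder reference signature
    pixel = reference + 0.05 * rng.random(200)   # placeholder pixel signature
    scores = metric_scores(pixel, reference)
    print(scores)
    print("vegetation" if scores["pearson"] > 0.9 else "non-vegetation")

In a full pipeline, the same comparison would be repeated for every pixel of the hyperspectral cube (loaded, per the abstract, with the Python spectral library) to produce a vegetation mask for the study area.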

Keywords

Distance/correlation-based method; Urban vegetation; Vegetation clustering; Vegetation classification interpolation

Subject

Environmental and Earth Sciences, Environmental Science
