Preprint Article, Version 1. Preserved in Portico. This version is not peer-reviewed.

Classification of Hyperspectral Image Based on Double-Branch Dual-Attention Mechanism Network

Version 1 : Received: 4 December 2019 / Approved: 5 December 2019 / Online: 5 December 2019 (04:05:48 CET)
Version 2 : Received: 11 February 2020 / Approved: 12 February 2020 / Online: 12 February 2020 (05:40:08 CET)

A peer-reviewed article of this preprint also exists.

Li, R.; Zheng, S.; Duan, C.; Yang, Y.; Wang, X. Classification of Hyperspectral Image Based on Double-Branch Dual-Attention Mechanism Network. Remote Sens. 2020, 12, 582.

Abstract

In recent years, more and more researchers have paid attention to Hyperspectral Image (HSI) classification. It is important to investigate how to exploit the abundant spectral and spatial information in HSIs to its fullest potential. To capture spectral and spatial features, we propose a Double-Branch Dual-Attention mechanism network (DBDA) for HSI classification in this paper. Two branches are designed to extract spectral and spatial features separately, reducing the interference between these two kinds of features. Moreover, because the two branches have distinct characteristics, a different attention mechanism is applied in each branch, ensuring that spectral and spatial features are exploited more discriminatively. Finally, the extracted features are fused for classification. A series of empirical studies has been conducted on four hyperspectral datasets, and the results show that the proposed method outperforms state-of-the-art methods.
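The abstract describes the architecture only at a high level. For a concrete picture, below is a minimal PyTorch sketch of the general double-branch, dual-attention idea: a spectral branch refined by channel-wise attention, a spatial branch refined by spatial-wise attention, and a fusion step before classification. The layer widths, the particular attention formulations (a squeeze-and-excitation-style channel mask and a CBAM-style spatial mask), and fusion by concatenation are illustrative assumptions, not the authors' DBDA implementation; the peer-reviewed article (Remote Sens. 2020, 12, 582) specifies the exact network.

```python
# Minimal sketch of a double-branch, dual-attention classifier for HSI patches.
# NOT the authors' DBDA network: layer sizes, attention forms, and the fusion
# head are assumptions chosen for brevity.
import torch
import torch.nn as nn

class ChannelAttention(nn.Module):
    """Channel-wise attention (squeeze-and-excitation style, assumed form)."""
    def __init__(self, channels, reduction=4):
        super().__init__()
        self.fc = nn.Sequential(
            nn.AdaptiveAvgPool2d(1),                       # global spatial pooling
            nn.Conv2d(channels, channels // reduction, 1),
            nn.ReLU(inplace=True),
            nn.Conv2d(channels // reduction, channels, 1),
            nn.Sigmoid(),
        )

    def forward(self, x):
        return x * self.fc(x)                              # reweight each channel

class SpatialAttention(nn.Module):
    """Spatial-wise attention over the H x W map (CBAM-style, assumed form)."""
    def __init__(self):
        super().__init__()
        self.conv = nn.Conv2d(2, 1, kernel_size=7, padding=3)

    def forward(self, x):
        avg = x.mean(dim=1, keepdim=True)                  # average over channels
        mx, _ = x.max(dim=1, keepdim=True)                 # max over channels
        mask = torch.sigmoid(self.conv(torch.cat([avg, mx], dim=1)))
        return x * mask                                    # reweight each location

class DoubleBranchNet(nn.Module):
    """Two branches extract spectral and spatial features separately, each
    refined by its own attention mechanism, then fused for classification."""
    def __init__(self, bands, n_classes, width=64):
        super().__init__()
        # Spectral branch: 1x1 convolutions mix bands only, no spatial mixing.
        self.spectral = nn.Sequential(
            nn.Conv2d(bands, width, 1), nn.BatchNorm2d(width), nn.ReLU(inplace=True),
            ChannelAttention(width),
        )
        # Spatial branch: 3x3 convolutions mix local neighbourhoods.
        self.spatial = nn.Sequential(
            nn.Conv2d(bands, width, 3, padding=1), nn.BatchNorm2d(width), nn.ReLU(inplace=True),
            SpatialAttention(),
        )
        # Fuse by concatenation, then classify the centre pixel's patch.
        self.head = nn.Sequential(
            nn.AdaptiveAvgPool2d(1), nn.Flatten(), nn.Linear(2 * width, n_classes)
        )

    def forward(self, x):                                  # x: (batch, bands, H, W)
        return self.head(torch.cat([self.spectral(x), self.spatial(x)], dim=1))

# Example: classify 9x9 patches from a 200-band HSI cube into 16 classes.
model = DoubleBranchNet(bands=200, n_classes=16)
logits = model(torch.randn(8, 200, 9, 9))                  # -> shape (8, 16)
```

Keeping the two branches separate until the final concatenation mirrors the abstract's rationale: spectral and spatial features are computed without interfering with each other, and each branch's attention mechanism is matched to the kind of feature it extracts.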

Keywords

hyperspectral image classification; spectral-spatial feature fusion; channel-wise attention; spatial-wise attention

Subject

Computer Science and Mathematics, Computer Vision and Graphics
