Preprint · Technical Note · Version 1 · Preserved in Portico · This version is not peer-reviewed

Interpretability in Convolutional Neural Networks for Building Damage Classification in Satellite Imagery

Version 1: Received: 1 January 2021 / Approved: 4 January 2021 / Online: 4 January 2021 (15:58:21 CET)

How to cite: Chen, T. Interpretability in Convolutional Neural Networks for Building Damage Classification in Satellite Imagery. Preprints 2021, 2021010053. https://doi.org/10.20944/preprints202101.0053.v1

Abstract

Natural disasters ravage the world's cities, valleys, and shores every month. Precise and efficient mechanisms for assessing infrastructure damage are essential for channeling resources and minimizing the loss of life. Using a dataset of labeled pre- and post-disaster satellite imagery, we train multiple convolutional neural networks to assess building damage on a per-building basis. To investigate how best to classify building damage, we present a highly interpretable deep-learning methodology that seeks to explicitly convey the most useful information required to train an accurate classification model. We also examine which loss functions best optimize these models. We find that ordinal cross-entropy loss is the best-performing loss function and that providing the type of disaster that caused the damage, in combination with pre- and post-disaster images, yields the most accurate predictions of damage level. Our research seeks to contribute computational tools to this ongoing and growing humanitarian crisis, one heightened by climate change.
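For readers unfamiliar with ordinal cross-entropy, the sketch below illustrates one common formulation (the cumulative "extended binary" encoding, in which an ordered damage level k is represented by k ones followed by zeros and each threshold is scored with binary cross-entropy). This is an illustrative assumption, not necessarily the exact loss used in the paper; the class count, tensor shapes, and names are hypothetical.

```python
# Minimal sketch of an ordinal cross-entropy loss for K ordered damage
# classes (e.g., no-damage < minor < major < destroyed), assuming the
# cumulative-threshold encoding; the paper's exact formulation may differ.
import torch
import torch.nn as nn

class OrdinalCrossEntropyLoss(nn.Module):
    def __init__(self, num_classes: int):
        super().__init__()
        self.num_classes = num_classes
        self.bce = nn.BCEWithLogitsLoss()

    def forward(self, logits: torch.Tensor, targets: torch.Tensor) -> torch.Tensor:
        # logits:  (batch, num_classes - 1) threshold scores "damage > k?"
        # targets: (batch,) integer damage levels in [0, num_classes - 1]
        thresholds = torch.arange(self.num_classes - 1, device=targets.device)
        # Encode label k as [1]*k + [0]*(num_classes - 1 - k).
        cumulative = (targets.unsqueeze(1) > thresholds).float()
        return self.bce(logits, cumulative)

# Usage sketch: a CNN head would emit num_classes - 1 logits per building.
loss_fn = OrdinalCrossEntropyLoss(num_classes=4)
logits = torch.randn(8, 3)            # dummy per-building predictions
labels = torch.randint(0, 4, (8,))    # dummy damage levels
print(loss_fn(logits, labels).item())
```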

Keywords

climate; disasters; interpretability; relief; satellite imagery

Subject

Engineering, Safety, Risk, Reliability and Quality
