Preprint Article · Version 1 · Preserved in Portico · This version is not peer-reviewed

Crop-Type Recognition in Street-level Images with Convolutional Neural Networks

Version 1: Received: 11 July 2023 / Approved: 11 July 2023 / Online: 12 July 2023 (04:40:51 CEST)

A peer-reviewed article of this Preprint also exists.

Orduna-Cabrera, F.; Sandoval-Gastelum, M.; McCallum, I.; See, L.; Fritz, S.; Karanam, S.; Sturn, T.; Javalera-Rincon, V.; Gonzalez-Navarro, F.F. Investigating the Use of Street-Level Imagery and Deep Learning to Produce In-Situ Crop Type Information. Geographies 2023, 3, 563-573.

Abstract

The creation of crop-type maps from satellite data has proven challenging, often impeded by a lack of accurate in-situ data. This paper aims to demonstrate a method for crop-type recognition (i.e., maize, wheat, and other) based on Convolutional Neural Networks using a bottom-up approach. We trained the model with a highly accurate dataset of crowdsourced, labelled street-level imagery. Classification results achieved an AUC of 0.87 for wheat, 0.85 for maize, and 0.73 for other. Given that wheat and maize are the two most common food crops globally, and that the amount of available street-level imagery is ever increasing, this approach could help address the need for improved crop-type monitoring worldwide. Challenges remain in addressing the noisy nature of street-level imagery (e.g., buildings, hedgerows, automobiles), where a variety of objects can restrict the view and confound the algorithms.
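For illustration only, the sketch below shows one way a three-class (maize / wheat / other) street-level image classifier and its per-class AUC evaluation could be set up in PyTorch. The pretrained ResNet-18 backbone, hyperparameters, and folder layout are assumptions made for this sketch and are not the architecture or training setup reported in the paper.

# Illustrative sketch only: a three-class (maize / wheat / other) image
# classifier built on a pretrained backbone, with per-class AUC evaluation.
# The backbone, hyperparameters, and data layout are assumptions, not the
# model described in the paper.
import torch
import torch.nn as nn
from torchvision import models, transforms, datasets
from torch.utils.data import DataLoader
from sklearn.metrics import roc_auc_score

CLASSES = ["maize", "wheat", "other"]  # hypothetical class names

# Standard ImageNet preprocessing for street-level photos.
preprocess = transforms.Compose([
    transforms.Resize(256),
    transforms.CenterCrop(224),
    transforms.ToTensor(),
    transforms.Normalize(mean=[0.485, 0.456, 0.406],
                         std=[0.229, 0.224, 0.225]),
])

# Hypothetical layout: data/train/<class>/*.jpg and data/val/<class>/*.jpg
train_ds = datasets.ImageFolder("data/train", transform=preprocess)
val_ds = datasets.ImageFolder("data/val", transform=preprocess)
train_dl = DataLoader(train_ds, batch_size=32, shuffle=True)
val_dl = DataLoader(val_ds, batch_size=32)

device = "cuda" if torch.cuda.is_available() else "cpu"

# Pretrained ResNet-18 with a new three-class head (an assumption).
model = models.resnet18(weights=models.ResNet18_Weights.DEFAULT)
model.fc = nn.Linear(model.fc.in_features, len(CLASSES))
model = model.to(device)

criterion = nn.CrossEntropyLoss()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-4)

for epoch in range(5):
    model.train()
    for x, y in train_dl:
        x, y = x.to(device), y.to(device)
        optimizer.zero_grad()
        loss = criterion(model(x), y)
        loss.backward()
        optimizer.step()

# Per-class one-vs-rest AUC, the kind of metric quoted in the abstract.
model.eval()
probs, labels = [], []
with torch.no_grad():
    for x, y in val_dl:
        probs.append(torch.softmax(model(x.to(device)), dim=1).cpu())
        labels.append(y)
probs = torch.cat(probs).numpy()
labels = torch.cat(labels).numpy()
for i, name in enumerate(train_ds.classes):
    auc = roc_auc_score((labels == i).astype(int), probs[:, i])
    print(f"AUC ({name}): {auc:.2f}")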

Keywords

crop type recognition; deep learning; crowdsourcing; street-level imagery

Subject

Environmental and Earth Sciences, Remote Sensing
