Preprint · Article · Version 1 · Preserved in Portico · This version is not peer-reviewed

UAV-LiDAR-based 3D Mapping of Apple Orchards and Automatic Georeferencing of Segmented Apple Tree Locations

Version 1 : Received: 28 March 2023 / Approved: 29 March 2023 / Online: 29 March 2023 (11:47:00 CEST)

How to cite: Soki, N.; Emaru, T. UAV-LiDAR-based 3D Mapping of Apple Orchards and Automatic Georeferencing of Segmented Apple Tree Locations. Preprints 2023, 2023030507. https://doi.org/10.20944/preprints202303.0507.v1

Abstract

In this paper, we propose a system that creates high-precision maps of apple orchards using UAV-LiDAR and determines the location of individual fruit trees (apple trees) on those maps. A UAV was flown over an actual orchard while the system recorded the point cloud from the onboard LiDAR together with RTK-GNSS position information. Automated software processes the point cloud data offline and segments each apple tree in the resulting map. The positional information obtained from RTK-GNSS is then used to georeference the segmented trees without the need for ground control points. To evaluate the positional accuracy of the trees, reference positions were measured at the trees using the Quasi-Zenith Satellite System (QZSS) MICHIBIKI. The results show that the alignment accuracy is sufficient to identify individual fruit trees.
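The abstract describes georeferencing the segmented trees from the recorded RTK-GNSS trajectory rather than from ground control points. The minimal sketch below illustrates one way such an alignment could be computed, assuming corresponding 2D trajectory samples are available in the map frame and in a local GNSS plane coordinate frame; the function names and data are hypothetical and this is not the authors' implementation.

```python
# Minimal sketch (assumed workflow, not the paper's code): align the LiDAR map
# frame to the RTK-GNSS frame with a rigid 2D fit (Kabsch method), then apply
# the transform to segmented tree centroids.
import numpy as np

def estimate_rigid_transform(local_pts, global_pts):
    """Estimate rotation R and translation t such that global ~ R @ local + t."""
    mu_l = local_pts.mean(axis=0)
    mu_g = global_pts.mean(axis=0)
    H = (local_pts - mu_l).T @ (global_pts - mu_g)
    U, _, Vt = np.linalg.svd(H)
    R = Vt.T @ U.T
    if np.linalg.det(R) < 0:          # guard against a reflection solution
        Vt[-1, :] *= -1
        R = Vt.T @ U.T
    t = mu_g - R @ mu_l
    return R, t

# Hypothetical corresponding samples: mapping/odometry positions in the map
# frame and RTK-GNSS positions projected to a local planar frame (e.g., ENU).
traj_local  = np.array([[0.0, 0.0], [10.2, 0.1], [20.1, 0.3], [20.0, 15.2]])
traj_global = np.array([[500.0, 300.0], [510.1, 300.4], [520.0, 300.9], [519.6, 316.0]])

R, t = estimate_rigid_transform(traj_local, traj_global)

# Centroids of segmented apple trees in the map frame -> georeferenced positions.
tree_centroids = np.array([[5.0, 7.5], [15.3, 8.1]])
tree_geo = tree_centroids @ R.T + t
print(tree_geo)
```

In practice the transform would be estimated from many trajectory samples (and possibly in 3D with a time offset), but the idea is the same: the GNSS trajectory itself supplies the map-to-world registration, so no surveyed ground control points are required.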

Keywords

UAV; LiDAR; Apple Orchard; georeferencing; GNSS

Subject

Engineering, Other
