Preprint (Article, Version 1, preserved in Portico). This version is not peer-reviewed.

Research on the Relative Position Detection Method between Orchard Robots and Fruit Tree Rows

Version 1: Received: 18 September 2023 / Approved: 18 September 2023 / Online: 19 September 2023 (04:00:26 CEST)

A peer-reviewed article of this Preprint also exists.

Gu, B.; Liu, Q.; Gao, Y.; Tian, G.; Zhang, B.; Wang, H.; Li, H. Research on the Relative Position Detection Method between Orchard Robots and Fruit Tree Rows. Sensors 2023, 23, 8807.

Abstract

The relative position of an orchard robot with respect to the rows of fruit trees is a key parameter for autonomous navigation. Current methods for estimating the inter-row position parameters of orchard robots achieve low accuracy; to address this problem, this paper proposes a machine-vision-based method for detecting the relative position between orchard robots and fruit tree rows. First, fruit tree trunks are identified with an improved YOLOv4 model; second, the camera coordinates of each trunk are calculated from the principle of binocular triangulation, and the trunk's ground-projection coordinates are obtained through coordinate conversion; finally, the midpoints of the projected coordinates on opposite sides are combined, the navigation path is obtained by least-squares linear fitting, and the robot's position parameters are computed from the fitted path. Experimental results show that the improved YOLOv4 model achieves an average precision of 97.05% and an average recall of 95.42% for trunk detection, 5.92 and 7.91 percentage points higher than the original YOLOv4 model, respectively. The average errors of the heading-angle and lateral-deviation estimates obtained with the proposed method are 0.57° and 0.02 m. The method accurately calculates heading-angle and lateral-deviation values at different inter-row positions and can provide a reference for autonomous visual navigation of orchard robots.
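
For illustration, the following is a minimal Python sketch of the pipeline described in the abstract, not the authors' implementation: the stereo parameters (FX, BASELINE, CX, CY), the flat-ground projection, and all function names are assumptions introduced here. It shows how triangulated trunk positions on the two sides of the robot can be projected onto the ground, combined into midpoints, fitted with a least-squares line, and turned into heading-angle and lateral-deviation estimates.

```python
import numpy as np

# Illustrative stereo parameters (assumptions, not values from the paper):
FX = 1000.0            # focal length in pixels (rectified pair)
BASELINE = 0.12        # stereo baseline in metres
CX, CY = 640.0, 360.0  # principal point in pixels

def trunk_camera_coords(u_left, u_right, v):
    """Binocular triangulation: recover the camera-frame 3D position of a
    trunk seen at column u_left in the left image and u_right in the right."""
    disparity = u_left - u_right
    z = FX * BASELINE / disparity   # depth along the optical axis
    x = (u_left - CX) * z / FX      # lateral offset
    y = (v - CY) * z / FX           # vertical offset (image-down positive)
    return np.array([x, y, z])

def ground_projection(p_cam):
    """Project a camera-frame point onto the ground plane. Here this is
    simplified to dropping the vertical component, which assumes the
    camera's optical axis is parallel to the ground."""
    x, _, z = p_cam
    return np.array([x, z])

def fit_navigation_path(left_pts, right_pts):
    """Combine midpoints of trunk projections on opposite sides and fit the
    line x = a*z + b by least squares; this line is the navigation path."""
    mids = np.array([(l + r) / 2.0 for l, r in zip(left_pts, right_pts)])
    a, b = np.polyfit(mids[:, 1], mids[:, 0], 1)
    return a, b

def position_parameters(a, b):
    """Heading angle: angle between the fitted path and the robot's forward
    (z) axis. Lateral deviation: offset of the path at the robot (z = 0)."""
    return np.degrees(np.arctan(a)), b

# Example: three trunk pairs on each side at increasing forward distances.
left = [ground_projection(trunk_camera_coords(400, 360, 380)),
        ground_projection(trunk_camera_coords(420, 390, 380)),
        ground_projection(trunk_camera_coords(430, 406, 380))]
right = [ground_projection(trunk_camera_coords(900, 860, 380)),
         ground_projection(trunk_camera_coords(880, 850, 380)),
         ground_projection(trunk_camera_coords(870, 846, 380))]
a, b = fit_navigation_path(left, right)
print(position_parameters(a, b))  # (heading angle in degrees, lateral deviation in m)
```

Fitting x as a function of the forward distance z keeps the heading angle as a simple arctangent of the slope, and the intercept at z = 0 can be read directly as the lateral deviation.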

Keywords

Orchard robot; Autonomous navigation; Positional parameters; Machine vision; YOLO

Subject

Computer Science and Mathematics, Robotics
