Autonomous navigation in agriculture is highly challenging: it usually takes place outdoors, over rough terrain, under uncontrolled natural lighting, in constantly changing organic scenarios, and sometimes without a Global Navigation Satellite System (GNSS) signal. In this work, a monocular visual system is proposed to estimate angular orientation and navigate between woody crops, specifically in a vineyard, using a Proportional-Integral-Derivative (PID)-based controller. Guidance is provided by combining two ways of finding the center of the vineyard row: first, by estimating the vanishing point, and second, by averaging the positions of the two closest trunk-base detections. The angular error is then determined through monocular angle perception. Trunk positions in the image are obtained with object detection using Deep Learning (DL)-based Neural Networks (NN). To evaluate the proposed controller, a visual vineyard simulation is created in Gazebo. The proposed joint controller is able to travel along a simulated straight vineyard row with an RMS error of 1.17 cm. Moreover, a simulated curved vineyard modeled after the Douro region is also tested, where the robot is able to steer with an RMS error of 7.28 cm.
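As a rough illustration of the guidance scheme described in the abstract, the sketch below combines a vanishing-point estimate with the midpoint of the two closest trunk detections to get a row-center pixel, converts the pixel offset into an angular error via a pinhole camera model, and feeds that error to a PID steering controller. This is a minimal sketch, not the authors' implementation: the function names, fusion weight, PID gains, and camera parameters are all illustrative assumptions.

```python
# Minimal sketch (not the authors' code) of the guidance idea: fuse two row-center
# estimates, convert the pixel offset to an angular error, and steer with a PID.
import math


class PID:
    """Basic PID controller acting on the angular error (radians)."""

    def __init__(self, kp: float, ki: float, kd: float) -> None:
        self.kp, self.ki, self.kd = kp, ki, kd
        self.integral = 0.0
        self.prev_error = 0.0

    def step(self, error: float, dt: float) -> float:
        self.integral += error * dt
        derivative = (error - self.prev_error) / dt if dt > 0 else 0.0
        self.prev_error = error
        return self.kp * error + self.ki * self.integral + self.kd * derivative


def pixel_to_angle(x_px: float, image_width: int, hfov_rad: float) -> float:
    """Map a horizontal pixel coordinate to a bearing angle (pinhole model)."""
    focal_px = (image_width / 2.0) / math.tan(hfov_rad / 2.0)
    return math.atan2(x_px - image_width / 2.0, focal_px)


def row_center_x(vanishing_point_x: float,
                 trunk_xs: list[float],
                 weight_vp: float = 0.5) -> float:
    """Fuse the vanishing-point estimate with the midpoint of the two closest
    trunk detections (assumed to be the first two entries)."""
    if len(trunk_xs) >= 2:
        trunk_mid = sum(trunk_xs[:2]) / 2.0
    else:
        trunk_mid = vanishing_point_x  # fall back to the vanishing point alone
    return weight_vp * vanishing_point_x + (1.0 - weight_vp) * trunk_mid


if __name__ == "__main__":
    pid = PID(kp=1.2, ki=0.05, kd=0.1)           # illustrative gains
    image_width, hfov = 640, math.radians(90.0)  # assumed camera parameters

    # Fake per-frame perception outputs (would come from the VP estimator and
    # the DL trunk detector in a real pipeline).
    vp_x, trunk_xs = 352.0, [310.0, 380.0, 500.0]

    center_x = row_center_x(vp_x, trunk_xs)
    angular_error = pixel_to_angle(center_x, image_width, hfov)
    steering_cmd = pid.step(angular_error, dt=0.1)
    print(f"angular error = {angular_error:.3f} rad, steering = {steering_cmd:.3f}")
```

The equal-weight fusion and the pinhole-based angle conversion are stand-ins for whatever combination and monocular angle-perception step the paper actually uses.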
Robot navigation in vineyards based on the visual vanish point concept
2021-09-20
2,577,564 bytes
Conference paper
Electronic Resource
English
Automated Visual Yield Estimation in Vineyards
British Library Online Contents | 2014
Erratum to "Automated Visual Yield Estimation in Vineyards"
British Library Online Contents | 2014
Position-Agnostic Autonomous Navigation in Vineyards with Deep Reinforcement Learning
ArXiv | 2022