Abstract
Autonomous navigation is a defining capability that enables agricultural robots to perform automated farming tasks. Global navigation satellite system (GNSS) technology provides autonomous navigation for current commercial robotic platforms, achieving centimeter-level accuracy when real-time kinematic (RTK) corrections are available. However, GNSS-based solutions are expensive and require a long preparation phase in which the field must be surveyed with a GNSS rover to collect waypoints for the navigation path. An alternative navigation approach relies on local perception sensors, such as LiDAR scanners, which track geometric features in the perceived scene. This paper presents a robust LiDAR-based solution for structure tracking along vine rows. The proposed method requires no prior field surveying and is insensitive to crop characteristics such as row width and spacing. Moreover, the algorithm identifies the row structure and builds an online regression model of it by applying the Hough transform with a parameterization and search method motivated by a practical interpretation of point cloud statistics. The proposed method was tested on a commercial robotic platform in two vineyard configurations. The experiments show that the algorithm achieves consistent and accurate row tracking, validated against a reliable RTK-GNSS ground truth.
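To illustrate the line-detection principle the abstract refers to, the sketch below shows the classic (ρ, θ) Hough transform applied to a 2-D point set, as one would obtain from a LiDAR scan projected to the ground plane. This is a minimal textbook sketch, not the authors' parameterization or search method; the function name, bin counts, and accumulator layout are illustrative assumptions.

```python
import numpy as np

def hough_line(points, n_theta=180, n_rho=200):
    """Find the dominant line in a 2-D point set (N x 2 array) using the
    (rho, theta) Hough transform: each point votes for every line that
    passes through it, and the accumulator cell with the most votes wins.
    Returns (rho, theta) such that x*cos(theta) + y*sin(theta) = rho."""
    thetas = np.linspace(0.0, np.pi, n_theta, endpoint=False)
    rho_max = np.hypot(points[:, 0], points[:, 1]).max()
    rho_bins = np.linspace(-rho_max, rho_max, n_rho)
    acc = np.zeros((n_rho, n_theta), dtype=int)
    for x, y in points:
        # rho value of the line through (x, y) for each candidate theta
        rhos = x * np.cos(thetas) + y * np.sin(thetas)
        idx = np.clip(np.digitize(rhos, rho_bins) - 1, 0, n_rho - 1)
        acc[idx, np.arange(n_theta)] += 1
    i, j = np.unravel_index(acc.argmax(), acc.shape)
    return rho_bins[i], thetas[j]
```

For example, 50 points sampled along the horizontal line y = 1 yield θ ≈ π/2 and ρ ≈ 1. A real row-tracking pipeline would additionally filter the point cloud, restrict the (ρ, θ) search region using the previous estimate, and refit the line over time, which is where the statistics-driven parameterization described in the paper comes in.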
Acknowledgements
This work is partially supported by the Agence Nationale de la Recherche (ANR) through the ROSE challenge (ANR-17-ROSE-0002-01) and by the Association Nationale de la Recherche et de la Technologie (ANRT) through a CIFRE PhD grant (CIFRE No. 2018/0792).
Author information
Authors and Affiliations
Contributions
All authors contributed to the study's conception, design, and methodology. Formal analysis: Hassan Nehme; Software and experimental validation: Hassan Nehme, Clément Aubry, Thomas Solatges; Resources: Hassan Nehme, Clément Aubry, Thomas Solatges; Writing: the first draft of the manuscript was written by Hassan Nehme, all authors commented on previous versions, and all authors read and approved the final manuscript; Supervision and administration: Clément Aubry, Romain Rossi, Rémi Boutteau; Funding acquisition: Thomas Solatges, Xavier Savatier.
Corresponding author
Ethics declarations
Consent for Publication
All participants consented to the submission of this article to the journal.
Conflict of Interest
The authors have no conflicts of interest to declare that are relevant to the content of this article.
Additional information
Consent to participate
Informed consent was obtained from all participants in the study.
Publisher’s Note
Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.
Cite this article
Nehme, H., Aubry, C., Solatges, T. et al. LiDAR-based Structure Tracking for Agricultural Robots: Application to Autonomous Navigation in Vineyards. J Intell Robot Syst 103, 61 (2021). https://doi.org/10.1007/s10846-021-01519-7