3D Maize Plant Reconstruction Based on Georeferenced Overlapping LiDAR Point Clouds
Abstract
1. Introduction
2. Materials and Methods
2.1. Hardware, Sensors and Configuration
| Functional Data | General Data |
|---|---|
| Operating range: 0.5 m to 20 m | LiDAR class: 1 (IEC 60825-1) |
| Field of view/scanning angle: 270° | Enclosure rating: IP 67 |
| Scanning frequency: 25 Hz | Temperature range: −30 °C to +50 °C |
| Angular resolution: 0.5° | Light source: infrared (905 nm) |
| Systematic error: ±30 mm | Total weight: 1.1 kg |
| Statistical error: 12 mm | Light spot size at optics cover/at 18 m: 8 mm/300 mm |
2.2. Field Experiments
2.3. Data Processing
2.3.1. Pre-Processing Data
2.3.2. Serial Transformations and Translations
From Total Station to LiDAR Sensor
- Firstly, the LiDAR coordinates in the robot’s coordinate system were obtained, taking the prism as the origin, by applying the rigid translation of each LiDAR (see Equation (1); translation values in Table 2).
- Secondly, a transformation accounting for the three-dimensional orientation of the robot was applied to the coordinates obtained for each LiDAR (see Equation (2)). For that, the IMU values (roll “φ”, pitch “θ”, and yaw “ψ”) at the LiDAR’s timestamp “t” were considered.
- Finally, a second translation was performed (see Equation (2)) using the prism coordinates tracked by the total station for that specific timestamp. Thus, the LiDAR coordinates in the total station’s coordinate system at the evaluated time were obtained.
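The chain of steps above amounts to one rigid-body transform per timestamp. The following is a minimal sketch, not the paper’s implementation: it assumes a ZYX (yaw, pitch, roll) Euler convention, and both function names are hypothetical; the exact rotation convention is the one defined by Equations (1) and (2).

```python
import numpy as np

def euler_to_matrix(roll, pitch, yaw):
    """Rotation matrix R = Rz(yaw) @ Ry(pitch) @ Rx(roll); angles in radians."""
    cr, sr = np.cos(roll), np.sin(roll)
    cp, sp = np.cos(pitch), np.sin(pitch)
    cy, sy = np.cos(yaw), np.sin(yaw)
    Rx = np.array([[1, 0, 0], [0, cr, -sr], [0, sr, cr]])
    Ry = np.array([[cp, 0, sp], [0, 1, 0], [-sp, 0, cp]])
    Rz = np.array([[cy, -sy, 0], [sy, cy, 0], [0, 0, 1]])
    return Rz @ Ry @ Rx

def lidar_position_total_station(lidar_offset, imu_rpy, prism_xyz):
    """Map a LiDAR mounting offset (robot frame, prism at the origin) into the
    total station frame, using the IMU attitude at the LiDAR timestamp."""
    R = euler_to_matrix(*imu_rpy)          # robot attitude from the IMU, Eq. (2)
    # rotate the mounting offset, then translate by the tracked prism position
    return R @ np.asarray(lidar_offset) + np.asarray(prism_xyz)
```

For a level robot (zero roll, pitch, and yaw) the result reduces to the tracked prism position plus the mounting offset of Table 2.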
From LiDAR Sensor to Scanned Point
- The starting points were the Cartesian coordinates obtained at every LiDAR scan for the evaluated time, using the horizontal orientation as reference.
- In order to integrate the LiDAR’s scan into the robot’s coordinate system, a different transformation was applied (see Equation (3)) depending on the LiDAR’s orientation and the robot’s direction (see Table 2). Table 2 presents the first rough calibration of the LiDAR positions and orientations, performed with a measuring tape and a bullseye level for the two-dimensional plane levelling (roll and pitch angles); an angle meter was used to precisely measure the slope of the inclined LiDAR. The fine calibration was performed later, during the implementation of the ICP algorithm in the point cloud overlapping step.
| LiDAR Orientation | Direction | Roll (°) | Pitch (°) | Yaw (°) | x Translation (m) | y Translation (m) | z Translation (m) |
|---|---|---|---|---|---|---|---|
| Horizontal | Go | 0 | −180 | 0 | −0.45 | 0 | 0.205 |
| Horizontal | Return | 180 | 0 | 0 | 0.45 | 0 | 0.205 |
| Inclined | Go | 0 | −30 | 180 | −0.51 | 0 | 0.58 |
| Inclined | Return | 0 | −30 | 0 | 0.51 | 0 | 0.58 |
| Vertical | Go | 0 | −90 | 0 | 0.5726 | −0.0177 | 0.1997 |
| Vertical | Return | 180 | −90 | 0 | −0.5726 | 0.0177 | 0.1997 |
- Then, a second transformation accounting for the three-dimensional orientation of the robot was applied to the LiDAR scan (see Equation (4)). For that, the IMU values (roll “φ”, pitch “θ”, and yaw “ψ”) at timestamp “t” were considered.
- Finally, a translation from the LiDAR coordinates to the total station’s coordinate system at the evaluated time “t” was performed (see Equation (4)); thereby, each LiDAR-scanned point in the total station’s coordinate system was obtained.
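The scan-side chain (Equations (3) and (4)) can be sketched the same way: the mount rotation and translation of Table 2 first map a scanned point into the robot frame, after which the IMU attitude and the tracked prism position map it into the total station frame. The function name and the ZYX Euler convention are illustrative assumptions, not the paper’s exact formulation.

```python
import numpy as np

def euler_to_matrix(roll, pitch, yaw):
    """Rotation matrix R = Rz(yaw) @ Ry(pitch) @ Rx(roll); angles in radians."""
    cr, sr = np.cos(roll), np.sin(roll)
    cp, sp = np.cos(pitch), np.sin(pitch)
    cy, sy = np.cos(yaw), np.sin(yaw)
    return (np.array([[cy, -sy, 0], [sy, cy, 0], [0, 0, 1]])
            @ np.array([[cp, 0, sp], [0, 1, 0], [-sp, 0, cp]])
            @ np.array([[1, 0, 0], [0, cr, -sr], [0, sr, cr]]))

def scan_point_to_total_station(p_scan, mount_rpy_deg, mount_t, imu_rpy, prism_xyz):
    # Eq. (3): LiDAR scan frame -> robot frame, using the Table 2 mount values
    R_mount = euler_to_matrix(*np.radians(mount_rpy_deg))
    p_robot = R_mount @ np.asarray(p_scan) + np.asarray(mount_t)
    # Eq. (4): robot frame -> total station frame via IMU attitude and prism position
    return euler_to_matrix(*imu_rpy) @ p_robot + np.asarray(prism_xyz)
```

With the vertical LiDAR in the Go direction (pitch −90°, Table 2) and a level robot, a point measured 1 m along the scanner’s x axis ends up 1 m above the mount position, as expected for a downward-tilted scan plane.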
2.4. Point Cloud Overlapping
- Gridding filter: returns a downsampled point cloud using a box-grid filter. GridStep specifies the size of a 3D box; points within the same box are merged into a single point in the output (see Table 3).
- Noise removal filter: returns a filtered point cloud with outliers removed. A point is considered an outlier if the average distance to its K nearest neighbors is above a threshold [34] (see Table 3). In the case of the inclined LiDAR orientation, it was necessary, as mentioned earlier, to apply a second noise removal filter in series.
- RANSAC plane fit: this function, developed by Peter Kovesi [35], uses the RANSAC algorithm to robustly fit a plane. It only requires a distance threshold between a data point and the fitted plane to decide whether the point is an inlier. Plane detection, which in our case was equivalent to ground detection, was conducted at every defined evaluation interval of vehicle advance (see Table 3).
| LiDAR Orient. | GridStep (m³) | Noise Removal I: Neighbors | Noise Removal I: Std. Threshold | Noise Removal II: Neighbors | Noise Removal II: Std. Threshold | RANSAC: Eval. Interval (m) | RANSAC: Threshold (m) |
|---|---|---|---|---|---|---|---|
| Vertical | (3 × 3 × 3) × 10⁻⁹ | 10 | 1 | – | – | 0.195 | 0.07 |
| Inclined | (3 × 3 × 3) × 10⁻⁹ | 5 | 0.3 | 10 | 1 | 0.195 | 0.07 |
- Iterative Closest Point (ICP): the ICP algorithm takes two point clouds as input and returns the rigid transformation (rotation matrix R and translation vector T) that best aligns the new point cloud to the previously referenced one, using a least-squares minimization criterion [36]. The initial reference point cloud was the one obtained with the vertical LiDAR orientation in the Go direction.
- Gridding filter: the same 3D box size was used as defined for the aerial point cloud extraction (see Table 3).
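The per-step filters above can be sketched with brute-force NumPy implementations. These are illustrative stand-ins, not the toolchain used in the paper (a MATLAB box-grid filter, the K-nearest-neighbor outlier criterion of [34], and Kovesi’s RANSAC plane fit [35]); the function names and the exact thresholding rule in `noise_removal` (global mean plus `std_ratio` standard deviations) are assumptions.

```python
import numpy as np

def grid_filter(points, step):
    """Box-grid downsampling: merge the points in each voxel into their centroid."""
    keys = np.floor(points / step).astype(np.int64)
    _, inv = np.unique(keys, axis=0, return_inverse=True)
    counts = np.bincount(inv).astype(float)
    out = np.zeros((inv.max() + 1, 3))
    for d in range(3):
        out[:, d] = np.bincount(inv, weights=points[:, d]) / counts
    return out

def noise_removal(points, k, std_ratio):
    """Drop points whose mean distance to their k nearest neighbors exceeds
    global_mean + std_ratio * global_std. O(n^2); for small clouds only."""
    d = np.linalg.norm(points[:, None, :] - points[None, :, :], axis=2)
    d.sort(axis=1)
    mean_knn = d[:, 1:k + 1].mean(axis=1)   # column 0 is the self-distance
    thr = mean_knn.mean() + std_ratio * mean_knn.std()
    return points[mean_knn <= thr]

def ransac_plane(points, dist_thr, iters=200, rng=np.random.default_rng(0)):
    """Return an inlier mask for the best plane found by RANSAC."""
    best = np.zeros(len(points), dtype=bool)
    for _ in range(iters):
        sample = points[rng.choice(len(points), 3, replace=False)]
        n = np.cross(sample[1] - sample[0], sample[2] - sample[0])
        norm = np.linalg.norm(n)
        if norm < 1e-12:          # degenerate (collinear) sample
            continue
        dist = np.abs((points - sample[0]) @ (n / norm))
        inliers = dist <= dist_thr
        if inliers.sum() > best.sum():
            best = inliers
    return best
```

A k-d tree would replace the O(n²) distance matrix in `noise_removal` for clouds of the sizes reported in the Results.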
3. Results and Discussion
| Row | Vertical (Go) | Inclined (Go) | Horizontal (Go) | Vertical (Return) | Inclined (Return) | Horizontal (Return) |
|---|---|---|---|---|---|---|
| 1 | 147,839 (67.8%) | 52,267 (24.0%) | 17,788 (8.2%) | 230,918 (72.6%) | 61,832 (19.4%) | 25,453 (8.0%) |
| 2 | 135,406 (67.2%) | 52,069 (25.8%) | 14,107 (7.0%) | 118,372 (66.3%) | 49,799 (27.9%) | 10,503 (5.9%) |
| 3 | 176,956 (68.4%) | 74,659 (28.9%) | 6928 (2.7%) | 122,287 (63.3%) | 62,085 (32.1%) | 8894 (4.6%) |
| 4 | 182,801 (63.6%) | 83,829 (29.2%) | 20,891 (7.3%) | 110,882 (64.4%) | 49,799 (28.9%) | 11,568 (6.7%) |
| 5 | 96,887 (61.3%) | 46,535 (29.4%) | 14,703 (9.3%) | 121,867 (58.3%) | 74,329 (35.5%) | 12,969 (6.2%) |
| LiDAR | Direction | Row | Raw Data | Gridding | Noise Removal | Aerial Points |
|---|---|---|---|---|---|---|
| Vertical | Go | 1 | 147,839 (100%) | 138,832 (93.9%) | 133,035 (90.0%) | 3392 (2.3%) |
| Vertical | Go | 2 | 135,406 (100%) | 129,824 (95.9%) | 124,515 (92.0%) | 3667 (2.7%) |
| Vertical | Go | 3 | 176,956 (100%) | 152,789 (86.3%) | 147,981 (83.6%) | 1765 (1.0%) |
| Vertical | Go | 4 | 182,801 (100%) | 158,960 (87.0%) | 153,960 (84.2%) | 3615 (2.0%) |
| Vertical | Go | 5 | 96,887 (100%) | 93,892 (96.9%) | 90,659 (93.6%) | 1739 (1.8%) |
| Vertical | Return | 1 | 230,918 (100%) | 187,909 (81.4%) | 182,355 (79.0%) | 4466 (1.9%) |
| Vertical | Return | 2 | 118,372 (100%) | 100,197 (84.6%) | 95,605 (80.8%) | 3333 (2.8%) |
| Vertical | Return | 3 | 122,287 (100%) | 115,917 (94.8%) | 111,752 (91.4%) | 1786 (1.5%) |
| Vertical | Return | 4 | 110,882 (100%) | 107,510 (97.0%) | 103,605 (93.4%) | 3153 (2.8%) |
| Vertical | Return | 5 | 121,867 (100%) | 102,549 (84.1%) | 98,623 (80.9%) | 2086 (1.7%) |
| Inclined | Go | 1 | 52,267 (100%) | 41,913 (80.2%) | 33,965 (65.0%) | 3808 (7.3%) |
| Inclined | Go | 2 | 52,069 (100%) | 48,364 (92.9%) | 39,149 (75.2%) | 2418 (4.6%) |
| Inclined | Go | 3 | 74,659 (100%) | 64,623 (86.6%) | 52,327 (70.1%) | 1052 (1.4%) |
| Inclined | Go | 4 | 83,829 (100%) | 65,919 (78.6%) | 55,350 (66.0%) | 3259 (3.9%) |
| Inclined | Go | 5 | 46,535 (100%) | 42,868 (92.1%) | 35,045 (75.3%) | 1274 (2.7%) |
| Inclined | Return | 1 | 61,832 (100%) | 57,381 (92.8%) | 46,914 (75.9%) | 3845 (6.2%) |
| Inclined | Return | 2 | 49,799 (100%) | 46,843 (94.1%) | 38,192 (76.7%) | 4593 (9.2%) |
| Inclined | Return | 3 | 62,085 (100%) | 56,734 (91.4%) | 46,328 (74.6%) | 2123 (3.4%) |
| Inclined | Return | 4 | 60,891 (100%) | 58,000 (95.3%) | 48,954 (80.4%) | 2925 (4.8%) |
| Inclined | Return | 5 | 74,329 (100%) | 58,928 (79.3%) | 48,663 (65.5%) | 1846 (2.5%) |
| Total | | | 2,062,510 (100%) | 1,829,952 (88.7%) | 1,686,977 (81.8%) | 56,145 (2.7%) |
Row | VGo | VReturn | VGo + VReturn “Grid” | VGo + VReturn + IGo “Grid” | VGo + VReturn + IGo + IReturn “Grid” |
---|---|---|---|---|---|
1 | 3392 (22.4%) | 4466 (29.4%) | 7711 (50.8%) | 11,391 (75.1%) | 15,173 (100%) |
2 | 3667 (26.6%) | 3333 (24.2%) | 6915 (50.2%) | 9297 (67.5%) | 13,775 (100%) |
3 | 1765 (26.8%) | 1786 (27.1%) | 3491 (53.0%) | 4500 (68.3%) | 6591 (100%) |
4 | 3615 (28.5%) | 3153 (24.8%) | 6684 (52.6%) | 9820 (77.3%) | 12,705 (100%) |
5 | 1739 (25.4%) | 2086 (30.5%) | 3809 (55.7%) | 5072 (74.2%) | 6839 (100%) |
Total | 14,178 (25.7%) | 14,824 (26.9%) | 28,610 (51.9%) | 40,080 (72.8%) | 55,083 (100%) |
4. Conclusions
Acknowledgments
Author Contributions
Conflicts of Interest
References
1. Omasa, K.; Hosoi, F.; Konishi, A. 3D LiDAR imaging for detecting and understanding plant responses and canopy structure. J. Exp. Bot. 2007, 58, 881–898.
2. Li, F.; Cohen, S.; Naor, A.; Shaozong, K.; Erez, A. Studies of canopy structure and water use of apple trees on three rootstocks. Agric. Water Manag. 2002, 55, 1–14.
3. Kjaer, K.H.; Ottosen, C.-O. 3D laser triangulation for plant phenotyping in challenging environments. Sensors 2015, 15, 13533–13547.
4. Li, L.; Zhang, Q.; Huang, D. A review of imaging techniques for plant phenotyping. Sensors 2014, 14, 20078–20111.
5. Ivanov, N.; Boissard, P.; Chapron, M.; Andrieu, B. Computer stereo plotting for 3-D reconstruction of a maize canopy. Agric. For. Meteorol. 1995, 75, 85–102.
6. De Moraes Frasson, R.P.; Krajewski, W.F. Three-dimensional digital model of a maize plant. Agric. For. Meteorol. 2010, 150, 478–488.
7. Nguyen, T.T.; Slaughter, D.C.; Max, N.; Maloof, J.N.; Sinha, N. Structured light-based 3D reconstruction system for plants. Sensors 2015, 15, 18587–18612.
8. Chaivivatrakul, S.; Tang, L.; Dailey, M.N.; Nakarmi, A.D. Automatic morphological trait characterization for corn plants via 3D holographic reconstruction. Comput. Electron. Agric. 2014, 109, 109–123.
9. Li, D.; Xu, L.; Tan, C.; Goodman, E.D.; Fu, D.; Xin, L. Digitization and visualization of greenhouse tomato plants in indoor environments. Sensors 2015, 15, 4019–4051.
10. Andújar, D.; Fernández-Quintanilla, C.; Dorado, J. Matching the best viewing angle in depth cameras for biomass estimation based on poplar seedling geometry. Sensors 2015, 15, 12999–13011.
11. Palleja, T.; Landers, A.J. Real time canopy density estimation using ultrasonic envelope signals in the orchard and vineyard. Comput. Electron. Agric. 2015, 115, 108–117.
12. Arikapudi, R.; Vougioukas, S.; Saracoglu, T. Orchard tree digitization for structural-geometrical modeling. In Precision Agriculture 2015; Stafford, J.V., Ed.; Wageningen Academic: Volcani Center, Israel, 2015; pp. 329–336.
13. Rose, J.C.; Paulus, S.; Kuhlmann, H. Accuracy analysis of a multi-view stereo approach for phenotyping of tomato plants at the organ level. Sensors 2015, 15, 9651–9665.
14. Rosell, J.; Sanz, R. A review of methods and applications of the geometric characterization of tree crops in agricultural activities. Comput. Electron. Agric. 2012, 81, 124–141.
15. Walklate, P.; Cross, J.; Richardson, G.; Murray, R.; Baker, D. Comparison of different spray volume deposition models using LiDAR measurements of apple orchards. Biosyst. Eng. 2002, 82, 253–267.
16. Rosell Polo, J.R.; Sanz, R.; Llorens, J.; Arnó, J.; Ribes-Dasi, M.; Masip, J.; Camp, F.; Gràcia, F.; Solanelles, F.; Pallejà, T. A tractor-mounted scanning lidar for the non-destructive measurement of vegetative volume and surface area of tree-row plantations: A comparison with conventional destructive measurements. Biosyst. Eng. 2009, 102, 128–134.
17. Rosell, J.R.; Llorens, J.; Sanz, R.; Arno, J.; Ribes-Dasi, M.; Masip, J.; Escolà, A.; Camp, F.; Solanelles, F.; Gràcia, F. Obtaining the three-dimensional structure of tree orchards from remote 2D terrestrial lidar scanning. Agric. For. Meteorol. 2009, 149, 1505–1515.
18. Shi, Y.; Wang, N.; Taylor, R.K.; Raun, W.R.; Hardin, J.A. Automatic corn plant location and spacing measurement using laser line-scan technique. Precis. Agric. 2013, 14, 478–494.
19. Garrido, M.; Perez-Ruiz, M.; Valero, C.; Gliever, C.J.; Hanson, B.D.; Slaughter, D.C. Active optical sensors for tree stem detection and classification in nurseries. Sensors 2014, 14, 10783–10803.
20. Paraforos, D.S.; Griepentrog, H.W.; Geipel, J.; Stehle, T. Fused inertial measurement unit and real time kinematic-global navigation satellite system data assessment based on robotic total station information for in-field dynamic positioning. In Precision Agriculture 2015; Stafford, J.V., Ed.; Wageningen Academic: Volcani Center, Israel, 2015; pp. 275–282.
21. Gold, S.; Rangarajan, A.; Lu, C.-P.; Pappu, S.; Mjolsness, E. New algorithms for 2D and 3D point matching: Pose estimation and correspondence. Pattern Recognit. 1998, 31, 1019–1031.
22. Besl, P.J.; McKay, N.D. Method for registration of 3-D shapes. IEEE Trans. Pattern Anal. Mach. Intell. 1992, 14, 239–254.
23. Liu, Y. Automatic registration of overlapping 3D point clouds using closest points. Image Vis. Comput. 2006, 24, 762–781.
24. Hosoi, F.; Nakabayashi, K.; Omasa, K. 3-D modeling of tomato canopies using a high-resolution portable scanning LiDAR for extracting structural information. Sensors 2011, 11, 2166–2174.
25. Keightley, K.E.; Bawden, G.W. 3D volumetric modeling of grapevine biomass using tripod LiDAR. Comput. Electron. Agric. 2010, 74, 305–312.
26. Reiser, D.; Garrido, M.; Vazquez, M.; Paraforos, D.S.; Griepentrog, H.W. Crop row detection in maize for developing navigation algorithms under changing plant growth stages. In Proceedings of Robot 2015, Second Iberian Robotics Conference, Lisbon, Portugal, 19–21 November 2015; pp. 371–382.
27. SICK AG, Waldkirch. Laser Measurement Systems of the LMS100 Product Family: Operating Instructions. Available online: https://mysick.com/saqqara/im0031331.pdf (accessed on 2 October 2015).
28. Ritchie, S.; Hanway, J.; Benson, G. How A Corn Plant Develops; Special Report No. 48; Iowa State University of Science and Technology Cooperative Extension Service: Ames, IA, USA, 1992.
29. Orfanidis, S.J. Introduction to Signal Processing; Prentice Hall; Pearson Education, Inc.: Upper Saddle River, NJ, USA, 1995.
30. Trimble. SPS930 Universal Total Stations Datasheet. Available online: http://construction.trimble.com/sites/construction.trimble.com/files/marketing_material/022482–1867_Trimble_SPSx30_UTS_DS_0611_sec.pdf (accessed on 7 September 2015).
31. Kahaner, D.; Moler, C.; Nash, S. Numerical Methods and Software; Prentice-Hall, Inc.: Englewood Cliffs, NJ, USA, 1989.
32. Sotoodeh, S. Outlier detection in laser scanner point clouds. In Proceedings of the ISPRS Commission V Symposium “Image Engineering and Vision Metrology”, Dresden, Germany, 25–27 September 2006; pp. 297–302.
33. Sanz-Cortiella, R.; Llorens-Calveras, J.; Rosell-Polo, J.R.; Gregorio-Lopez, E.; Palacin-Roca, J. Characterisation of the LMS200 laser beam under the influence of blockage surfaces. Influence on 3D scanning of tree orchards. Sensors 2011, 11, 2751–2772.
34. Rusu, R.B.; Marton, Z.C.; Blodow, N.; Dolha, M.; Beetz, M. Towards 3D point cloud based object maps for household environments. Robot. Auton. Syst. 2008, 56, 927–941.
35. Kovesi, P. Matlab and Octave Functions for Computer Vision and Image Processing. Available online: http://www.peterkovesi.com/matlabfns (accessed on 15 July 2015).
36. Bergström, P.; Edlund, O. Robust registration of point sets using iteratively reweighted least squares. Comput. Optim. Appl. 2014, 58, 543–561.
© 2015 by the authors; licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (http://creativecommons.org/licenses/by/4.0/).
Share and Cite
Garrido, M.; Paraforos, D.S.; Reiser, D.; Vázquez Arellano, M.; Griepentrog, H.W.; Valero, C. 3D Maize Plant Reconstruction Based on Georeferenced Overlapping LiDAR Point Clouds. Remote Sens. 2015, 7, 17077-17096. https://doi.org/10.3390/rs71215870