Technical Note

Imagine All the Plants: Evaluation of a Light-Field Camera for On-Site Crop Growth Monitoring

Robert Schima, Hannes Mollenhauer, Görres Grenzdörffer, Ines Merbach, Angela Lausch, Peter Dietrich and Jan Bumberger

1 Department Monitoring and Exploration Technologies, UFZ – Helmholtz Centre for Environmental Research, Permoser Straße 15, Leipzig 04315, Germany
2 Chair of Geodesy and Geoinformatics, University of Rostock, Justus-von-Liebig-Weg 6, Rostock 18059, Germany
3 Department of Community Ecology, UFZ – Helmholtz Centre for Environmental Research, Theodor-Lieser-Straße 4, Halle 06120, Germany
4 Department of Landscape Ecology, UFZ – Helmholtz Centre for Environmental Research, Permoser Straße 15, Leipzig 04315, Germany
* Author to whom correspondence should be addressed.
Remote Sens. 2016, 8(10), 823; https://doi.org/10.3390/rs8100823
Submission received: 29 July 2016 / Revised: 6 September 2016 / Accepted: 22 September 2016 / Published: 7 October 2016
(This article belongs to the Special Issue Remote Sensing in Precision Agriculture)

Abstract

The desire to obtain a better understanding of ecosystems and process dynamics in nature accentuates the need for observing these processes at higher temporal and spatial resolutions. Linked to this, the measurement of changes in the external structure and phytomorphology of plants is of particular interest. In the fields of environmental research and agriculture, an inexpensive and field-applicable on-site imaging technique to derive three-dimensional information about plants and vegetation would represent a considerable improvement upon existing monitoring strategies. This is particularly true for the monitoring of plant growth dynamics, due to the often cited lack of morphological information. To this end, an innovative low-cost light-field camera, the Lytro LF (Light-Field), was evaluated in a long-term field experiment. The experiment showed that the camera is suitable for monitoring plant growth dynamics and plant traits while being largely unaffected by ambient conditions. This represents a decisive contribution for a variety of monitoring and modeling applications, as well as for the validation of remote sensing data. It confirms the assumption that the light-field camera presented in this study has the potential to be a lightweight and easy-to-use measurement tool for on-site environmental monitoring and remote sensing purposes.

1. Introduction

Global warming, a growing world population and a rising demand for food and energy mean that new management strategies, not least in the fields of precision agriculture and environmental research, are required [1]. As a consequence, special emphasis is placed on the development of more precise environmental monitoring strategies and more reliable climate models [2]. In order to provide recommendations for action and political guidance in a changing world, it is necessary to assess the impacts of climate change and process dynamics in nature with the best possible accuracy and with respect to the inhomogeneity of ecosystems. This especially concerns plant growth and harvest yields in agriculture, as well as adaptation strategies for different types of land utilization under changing climate conditions [3]. Numerous studies have focused on this topic to date, with the impact of climatic changes on plant growth predominantly simulated and measured in small laboratory or pilot-scale experiments, such as isolated chambers or greenhouse tests. Examples of such extensive surveys are provided by [4,5].
However, a constantly recurring problem with these studies is the small size of the observation plots, the low temporal resolution and the short investigation periods, which are due to the relatively high investment costs for the field equipment. The prohibitive nature of conducting this research in turn reduces the quality and significance of the data. To address this issue, the Global Change Experimental Facility (GCEF) project, founded and established by the German Helmholtz Centre for Environmental Research, follows a larger-scale approach, which, despite sounding rather mundane in theory, is extremely challenging to implement in practice, since the applied monitoring strategies have to be field-applicable and fully comprehensive. Therefore, new imaging methods and monitoring strategies have to be investigated that combine low financial costs with low methodological effort. Such methods could be particularly appropriate for research areas related to agriculture, remote sensing and the modeling of plants, plant traits and ecosystems, for example by providing on-site references or in situ ground truth data for remote sensing applications.
In particular, the extraction of morphological traits and the comprehensive monitoring of plant growth and growth dynamics based on imaging methods have become important working areas in the field of environmental research and ecosystem observation in forest and agricultural sciences [6,7,8,9]. Another important aspect concerns the need for in situ data in ground truth sampling to evaluate and calibrate remote sensing data [10].
In previous studies, most of the imaging techniques used to derive the morphological traits of plants were comparatively cost-intensive and relied either on stereo vision [7,9,11,12,13,14] or on laser scanning [15,16,17,18,19]. Both approaches require a high level of methodological effort during field measurements and therefore offer reduced field capability [20,21].
In order to fulfill the requirements of an appropriate on-site monitoring tool, an ideal system should be able to derive the aforesaid plant characteristics in an automated, inexpensive and easy-to-use way, regardless of the ambient conditions. This includes an appropriate calibration, as well as a stable performance to ensure reliable and repeatable results even under harsh conditions. To overcome these challenges, we tested an innovative and inexpensive light-field camera, the Lytro LF (Light-Field) (see Figure 1), in a large-scale field experiment designed to illustrate the benefits and drawbacks of this technique via a feasibility and “proof of concept” study. For this purpose, it is of particular interest whether the light-field technique is applicable and can provide in situ three-dimensional information about the phytomorphology and growth dynamics of the plants, e.g., plant heights or plant canopy parameters over time.
Table 1 contains detailed information about the Lytro LF light-field camera used in this study. The following section gives an overview of light-field optics and the working principle of the camera.

2. Preliminary Consideration and Definitions

Since the optics and mechanisms of image capturing are complex, the following simplifications are proposed. The object space is simplified to a two-dimensional scene, where light is defined as a scalar value spreading in a straight line [23]. In this case, light from a certain point of a scene can be described as shown in Figure 2. Assuming initially that a camera can be regarded as a pinhole camera, any object placed in front of it would create a real image located in the plane of convergence [24]. This simple setup is shown in Figure 2a. It must be pointed out that this real image projection provides no useful depth information about the scene [25] unless a second pinhole camera is added, as is done in Figure 2b.
Based on these two images (Figure 2b), it is possible to extract further information about the structure and the depth of the scene. This stereoscopic effect is used in binocular stereo systems or stereo vision systems. Instead of using two or more single cameras to increase the number of viewpoints, it is also possible to move the camera (see Figure 2c), a technique also known as structure from motion or motion parallax [24].
If a lens is now placed at the pinhole, the light that hits the plane is captured from an endless variety of viewpoints located at the lens plane (see Figure 2d). In other words, different viewpoints refer to the different images visible through different positions at the lens aperture plane. Usually, the light is captured as an averaged signal at the sensor plane without respect to its direction or origin [24]. This means that the incoming light from each point of the aperture plane contains more optical information than the averaged signal projected onto the image plane represents. The entirety of these optical properties (intensity, direction, wavelength) is summarized in the so-called plenoptic function [24,26]. To retrieve this optical information, a simple assumption can be made. Figure 3a shows an object in the focal plane of the camera creating a sharp image of the object. If the object is, for instance, located closer to the camera, the resulting image would be out of focus, and the object would appear wider on the image plane (see Figure 3b).
The same phenomenon would occur if the object were placed farther from the focal plane. To distinguish whether the object is close to or far away from the camera (or rather the focal plane), it is assumed that the camera has a non-centric aperture, as drawn in Figure 3c. Then, the object would only appear on one side of the image plane, due to the rays of light passing through the lens. Depending on the selected aperture, as well as on the distance from the camera, the object will appear either on the left or the right side of the optical axis, while the displacement between the object projection and the optical axis depends on the selected aperture [24].
By knowing this, it is possible to derive the depth of the scene from the known displacement of the aperture at the lens plane and the displacement of the object from the optical axis at the image plane. Figure 4 contains a simplified geometry of the relation between the displacement and the object distance described above.
The object distance d (here simplified for a single object point), or rather the depth of the scene, can be calculated from the known aperture displacement v and the displacement h of the object's image in the sensor plane. The triangle between the focal point beyond the sensor plane and the image displacement h can be assumed to be similar to the triangle formed by the focal point and the aperture displacement v, which leads to Equation (1) [24].
$$\frac{v - h}{l} = \frac{v}{g} \qquad (1)$$
This can be transformed to:
$$\frac{1}{g} = \frac{1}{l}\left(1 - \frac{h}{v}\right) \qquad (2)$$
and, in combination with the lens equation:
$$\frac{1}{f} = \frac{1}{g} + \frac{1}{d} \qquad (3)$$
the following expression can be formed:
$$\frac{1}{d} = \frac{1}{f} - \frac{1}{l}\left(1 - \frac{h}{v}\right) \qquad (4)$$
Equation (4) can be used to determine the distance d from the known camera parameters and the measured displacement h in the image [24].
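As a quick numerical illustration, the following MATLAB sketch evaluates Equation (4) for a single object point. All parameter values are illustrative assumptions and do not correspond to the actual internal geometry of the Lytro LF.

% Evaluate Equation (4): depth from a single-lens displacement measurement.
% Symbols follow Figure 4; all numeric values are assumed for demonstration.
f = 0.043;   % focal length in m (assumed)
l = 0.045;   % distance between lens plane and sensor plane in m (assumed)
v = 0.010;   % displacement of the aperture in m (assumed)
h = 0.002;   % measured displacement of the object's image in m (assumed)

inv_d = 1/f - (1/l) * (1 - h/v);   % right-hand side of Equation (4)
d = 1/inv_d;                       % object distance in m
fprintf('Estimated object distance: %.3f m\n', d);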
In theory, this could be achieved by separating the area of the sensor plane into an array of sub-systems, for instance, small pinhole cameras. Now, each pinhole camera captures light from a certain direction or rather from a certain position in the aperture plane of the lens. Since a pinhole camera is always capturing the incoming light in the form of a real image (see Figure 2a), the light rays that are now hitting the sensor plane can be described according to their direction and intensity [24].
Figure 5 shows the basic design of such a simplified plenoptic camera and the principle of capturing the light field for three different cases, assuming that each camera has a resolution of three pixels (labeled u, v and w).
Particular attention should be given to the opening of each pinhole camera towards the center of the main lens. The position of the pinholes changes slightly because each pinhole camera is sensitive to a specific subset of the incoming light through the main lens. Therefore, each camera is aimed at the center of the lens plane (see the blue lines in Figure 5). In the first case, the object point is located at the focal plane of the main lens. As a consequence, the resulting image is projected inside the center pinhole camera, which is solely responsible for the corresponding sensor signal. If the object is placed either farther away or closer than the focal plane of the main lens, the corresponding signal would be caused by three pinhole cameras (for this simple design). However, according to the angle of incidence, only specific pixels respond, which leads to a measurable displacement [24]. Finally, this displacement can be used to determine the distance of the object according to the above-mentioned Equation (4). In practice, the technical realization of the considerations described above is achieved by using a micro lens array instead of the assumed pinhole cameras.

3. Materials and Methods

3.1. Hardware Setup

In late 2011, the U.S. company Lytro Inc. launched the first consumer light-field camera, the Lytro LF (Lytro LF 8 GB, Lytro Inc., Mountain View, CA, USA). The release of this camera gained much publicity and attention, as it represented a revolutionary change in digital photography. The most fascinating aspect is the possibility of refocusing after a picture has been taken. The figures below illustrate the refocusing effect, from a very close focal point to a far focal point within the same image file (see Figure 6).
To be more precise, the fact that the focal point can be changed after capturing an image indicates that the camera obtains spatial information about the object as it captures the entire light-field of an image (direction and intensity of the incoming light), which in practice allows for a variety of possible applications, e.g., depth estimation or three-dimensional visualization. The depth information can be extracted from a depth map (see Figure 6d), which is generated by the Lytro software. Due to its low price (the device cost only $99 in 2015) and small dimensions, the Lytro LF is an excellent piece of technology, which allows us to explore the behavior and possible uses of a low-cost light-field camera, e.g., for environmental monitoring purposes.
Although the basic idea of light-field vision is more than 100 years old [27,28], only a few light-field cameras are available on the market [22,29,30]. A literature review on light-field technology clarifies that the majority of publications deal with the theoretical background of light-field cameras [24,26,31,32]. One of the few published studies to date that deals with the applications of light-field cameras focuses mainly on clinical photography and documentation of the growth stage of skin cancer [33]. In another study, the Lytro LF was used to diagnose and identify abnormalities in pediatric eyes [34]. However, due to the novelty of light-field photography, the cameras are not yet widely used, which could explain the limited number of studies carried out on this topic.
Nevertheless, the Lytro LF seems suitable for environmental monitoring especially with regard to ease of use and the small size of the camera. For this reason, the Lytro LF was used in this study, both for conceptual investigation and to obtain a practical proof that the device can be used for the monitoring of crop growth under field conditions.

3.2. Camera Preparation and Calibration

In order to carry out the proposed experiments, additional hardware was required. As the Lytro LF was designed as a consumer camera intended for everyday use, it does not support external triggering by default. Therefore, some modifications were necessary to make the camera remotely controllable. To this end, the camera was disassembled, and two cable strands were soldered onto the On/Off button, as well as onto the shutter button. In addition, a switching device was designed based on two opto-couplers. In combination with an A/D converter (NI USB-6008, National Instruments, Austin, TX, USA), this switching device allows software-controlled operation of the camera. Setting the output channels (a0, a1) of the A/D converter to either “logic high” or “logic low” via the control software (LabView, Version 2009, National Instruments, Austin, TX, USA) switches the camera on or off and captures an image.
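To make the triggering sequence more tangible, the following MATLAB sketch emulates it using the session-based NI-DAQ interface of the Data Acquisition Toolbox. The study itself used a LabView program, so the device ID ('Dev1'), the 5 V logic level and all timing values are assumptions.

% Hedged sketch of the software-controlled camera operation: drive the two
% opto-couplers via the analog outputs a0 (On/Off button) and a1 (shutter).
s = daq.createSession('ni');
addAnalogOutputChannel(s, 'Dev1', 0, 'Voltage');   % a0: On/Off button
addAnalogOutputChannel(s, 'Dev1', 1, 'Voltage');   % a1: shutter button

outputSingleScan(s, [5 0]);   % a0 high: press the power button
pause(0.5);
outputSingleScan(s, [0 0]);   % release
pause(3);                     % wait for the camera to boot (assumed)
outputSingleScan(s, [0 5]);   % a1 high: press the shutter, capture an image
pause(0.5);
outputSingleScan(s, [0 0]);   % release
outputSingleScan(s, [5 0]);   % a0 high again: switch the camera off
pause(0.5);
outputSingleScan(s, [0 0]);   % release; total on-time stays within seconds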
The most significant aspect when preparing the field experiment was the calibration of the depth estimation, or rather the evaluation of the depth map quality. The depth map calibration was done empirically by placing the Lytro LF perpendicular to a calibration plane at a distance of d = 1.00 m. Prior to the experiment, the Lytro LF was activated, and the focal point of interest was set at the calibration plane. Between the plane and the Lytro LF, a movable object was inserted, starting at the camera housing (d = 0.00 m). In this context, it should be mentioned that both the object and the background (here, the calibration plane) should be visible. The depth estimation works well regardless of whether the object is placed in the center or at the edge of the image, as long as the field of view contains enough visible structural geometry. Figure 7c clearly shows that the whole object was assigned properly over its entire area. However, very smooth and homogeneous objects cannot be recognized adequately by the Lytro LF. Once a picture was taken, the object was moved farther away with a step width of Δd = 0.01 m until the calibration plane was reached (see Figure 7).
Subsequently, the generated data were evaluated using MATLAB (R2014b, The MathWorks, Inc., Natick, MA, USA). As a result of the experiment, a relation between the grey values in the depth map and the object distance was found (see Figure 8), which remains unaffected even when the camera is turned off and on again.
Figure 8 illustrates the grey value in the depth map as a function of the object distance in the range of d = 0 cm to 100 cm. It is clearly visible that, for larger distances, the same grey values are allocated to different distances, indicating the limited range of proper distance recognition. Of particular note is that object distances of less than 10 cm are also assigned to the same grey value. The exponential relation given in Figure 8 allows the object distance to be calculated with a good degree of accuracy (R² = 0.9343). Figure 8 also shows that the measurement range for obtaining good results is limited to close-range applications, e.g., 10 cm to 50 cm. Figure 9 shows the depth estimation error depending on the object distance, which further underscores the good performance of the Lytro LF within the range mentioned above.
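The fitting step of this calibration can be sketched as follows. Since the paper reports only an exponential relation with R² = 0.9343, and not the fitted coefficients, both the sample data and the model form d = a·exp(b·g) are illustrative assumptions.

% Fit an exponential model between depth-map grey values g and the known
% object distances d; the data points below are hypothetical, not from Figure 8.
grey = [40 60 80 100 120 140 160 180]';               % grey values (assumed)
dist = [0.10 0.13 0.17 0.22 0.28 0.36 0.45 0.55]';    % distances in m (assumed)

% Linearize d = a*exp(b*g) via log(d) = log(a) + b*g and fit with polyfit.
p = polyfit(grey, log(dist), 1);
a = exp(p(2));
b = p(1);
dist_hat = a * exp(b * grey);

% Coefficient of determination of the fit
R2 = 1 - sum((dist - dist_hat).^2) / sum((dist - mean(dist)).^2);
fprintf('d(g) = %.4f * exp(%.4f g), R^2 = %.4f\n', a, b, R2);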

3.3. Experimental Design

The field experiment took place at a project site of the Helmholtz Centre for Environmental Research in Bad Lauchstädt from 7 August 2014 to 7 September 2014 (see Section 3.4). As a first step, the light-field camera had to be prepared for the field experiment. For reasons of simplification, a custom housing was designed for the Lytro LF: a waste water pipe was fitted with a fused quartz window, chosen for its excellent optical and mechanical properties. To keep the Lytro LF in position inside the pipe, two retaining brackets were inserted, which can be tightened from the outside. Once the Lytro LF was placed inside, the switching device was mounted, and the housing was closed with a screw cap. Here, a cable gland was used for strain relief and to ensure the housing remained watertight. In order to avoid a build-up of moisture and humidity in the interior, several bags of silica gel were placed inside the housing. Considering the limited depth resolution of the Lytro LF (established during calibration), the camera was mounted on a tripod at a height of one meter within the plot (see Figure 10). The cable was fixed to avoid cable movement and fitted to a distribution box, where a computer and the A/D converter for controlling the Lytro LF were stored.
Since the Lytro LF was fully calibrated in the laboratory, use of the camera in the field turned out to be very easy. Besides the installation of the stand and the adjustment of the camera itself, only the cable between the switching device and the controlling unit had to be connected. From this point on, the system ran automatically and very stably during the whole measurement campaign. It should be noted that the camera was not driven by any external power supply. After two months in the field and more than 200 images, taken four times a day, the battery charge still remained at over 40%. This was achieved by using an optimized switching procedure, which ensured that the camera was not turned on for more than five seconds per image acquisition.
Figure 11a shows the field of view of the Lytro LF relative to the corn rows. Expressed in real-world dimensions, the Lytro LF covered an area of A ≈ 0.17 m². In order to evaluate the feasibility and quality of the plant height estimation, the plant height of the crop was also measured manually. The plant height was determined by measuring the distance between the soil surface and the highest point of the arch of the uppermost leaf within the field of view of the Lytro LF [35]. The tips of the next emerging leaves above were not measured, since the steeply rising tips are poorly captured during image recognition.
Processing of the data was performed in a mainly automated way. Since the Lytro LF stores the light-field images in its own format, the images were imported from the camera using the Lytro software. During this transfer, the software processes the images (depth map computation), which turned out to be time consuming given the size of the dataset. Once the images were computed, they could be exported and further processed in MATLAB, using the known correlation between the grey values in the depth map and the object distance from the preliminary calibration (see the program Algorithm A1 in Appendix A). This post-processing is fully automated and does not require any input parameters (which was intended from the beginning of this investigation). The results of the algorithm programmed in MATLAB show the plant coverage and the plant heights in chronological order, depicting the dynamics of plant growth over time.
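A minimal sketch of this post-processing chain is given below. The file name and the calibration coefficients are assumptions; only the 1 m mounting height and the use of an upper 0.01% quantile follow the description in this paper, and the actual Algorithm A1 may differ in detail.

% Convert an exported depth map into a plant height estimate.
depthMap = double(imread('depthmap_2014-08-20.png'));   % 328 x 328 grey values
a = 0.05;                            % hypothetical calibration coefficient
b = 0.02;                            % hypothetical calibration coefficient
distance = a * exp(b * depthMap);    % grey value -> object distance in m

cameraHeight = 1.00;                 % camera mounted 1 m above the ground
height = cameraHeight - distance;    % per-pixel plant height in m

% Tallest plant part: average of the uppermost 0.01% of the height values
h = sort(height(:), 'descend');
n = max(1, round(1e-4 * numel(h)));
plantHeight = mean(h(1:n));
fprintf('Estimated plant height: %.2f m\n', plantHeight);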

3.4. Experimental Sites Description

The experiment was performed at the test site of the Global Change Experimental Facility (GCEF) project of the Helmholtz Centre for Environmental Research [6], a large-scale outdoor test facility located in Bad Lauchstädt (Saxony-Anhalt, Germany). The project site of the GCEF consists of 50 plots, each covered by a 5 m high metal construction with a surface area of about 384 m² per plot. The plots are distributed over ten blocks, five of which are equipped with temporarily-closing mobile roofs and side panels, which allows the simulation of expected regional climate conditions and the assessment of the effects and consequences of climate change on different forms of land utilization [6]. In order to measure changes in plant growth or the biocenosis in general, an applicable imaging technique is needed, especially to help improve the assessment of the different impacts and influences of climate change on the growth dynamics of plants. The field experiment of this study was installed on a pilot-scale maize field (Zea mays), which covers a surface area of about 384 m². Since the maize was specially sown for this study in late July 2014, the maximum plant height was about 1 m at the end of the measurement campaign. At this growth stage, the flag leaf does not yet exist, which is why, in this study, the top leaf normally represents the tallest part of the plant. This is important when comparing manually-measured plant heights with the estimations derived by the light-field camera.

4. Results

4.1. Canopy Measurement

The canopy measurement was carried out by developing a green filtering algorithm in MATLAB for automated leaf detection (see the program Algorithm A1 (lines 106–121) in Appendix A). In order to illustrate the quality of the green filtering, an example in the form of two raw images and the filtered results is given in Figure 12. Although canopy measurements based on RGB cameras are common, it should be noted that the images acquired by the Lytro LF are processed and computed using internal algorithms. Laboratory tests with a calibration grid showed that images acquired by the Lytro LF exhibit negligible lens distortion and can be compared to rectified images, which is particularly advantageous for photogrammetry or machine vision purposes. Moreover, this could be used to determine more specific plant characteristics, such as leaf area index or biomass.
Based on the green-filtered images, it was possible to estimate the fractional vegetation cover. The result of this estimation is given in Figure 13, where each point represents the ratio between pixels assigned to vegetation and non-vegetation during the field campaign. The coverage ratio increased steadily during the measurement campaign, except for a few outliers. At the end of the observation, the canopy values declined, which is due to the fact that the plants had reached the camera housing. The camera was therefore very close to the top leaves while measuring the lower leaves, and a larger share of soil was visible, which slightly distorted the vegetation cover curve.
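A compact sketch of both steps is given below: an excess-green index separates vegetation from background, and the resulting mask yields the coverage ratio. The paper's actual green filter (Algorithm A1, lines 106–121) may differ; the index and the threshold of 20 are assumptions.

% Green filtering and fractional vegetation cover for one exported image.
rgb = double(imread('lytro_2014-08-20.png'));   % 1080 x 1080 RGB export (assumed name)
R = rgb(:,:,1);
G = rgb(:,:,2);
B = rgb(:,:,3);

exg = 2*G - R - B;            % excess-green index per pixel
mask = exg > 20;              % vegetation mask (threshold assumed)

coverage = nnz(mask) / numel(mask);   % share of pixels classified as vegetation
fprintf('Fractional vegetation cover: %.1f %%\n', 100 * coverage);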

4.2. Plant Height Measurement

Besides canopy coverage, the Lytro LF was mainly used to measure the plant heights of the maize. During the field campaign, four images were taken daily (9 a.m., 12 p.m., 3 p.m., 6 p.m.). Table 2 contains the estimated plant heights for three days on which the plant height was also measured manually. Since five maize plants were within the field of view of the Lytro LF (see Figure 11a), the manually-measured plant height shown in Table 2 is the average plant height of all plants, accompanied by the standard deviation. The Lytro LF value is the average value of the 0.01% quantile, accompanied by the inaccuracy of the measurement range known from the calibration (see Section 3.2).
The data obtained during the three investigation days given in Table 2 correspond to the data represented in Figure 14, beginning with 20 August 2014. Here, an RGB image, as well as the corresponding depth map, is given for each day. As Figure 14 illustrates, the depth maps provide a good approximation of phenology and plant height during the measurement campaign. It is evident that the maximum values increase from Figure 14b (plant height approximately 50 cm) to Figure 14f (plant height approximately 90 cm). The figures also indicate some incoherent areas within the leaves of the plants, discernible as dark blue spots inside the leaf shape (e.g., Figure 14d); these areas were overexposed, leading to reduced structure of the object within the image. Furthermore, the figures demonstrate that shadows (which are visible in the RGB images) do not affect the quality of the depth maps, which represents a fundamental finding of the experiment and proves that artificial illumination is not required in the field. Besides the depth map, the RGB images give an idea of the growth stage and the situation within the plot. This could be valuable additional information, especially with regard to devising a comprehensive and wide-ranging monitoring strategy.
Figure 15 gives an impression of plant growth during the entire measurement campaign, gathered by the Lytro LF. Here, again, only the maximum values per observation were recognized (tallest plant part). In order to evaluate the data more thoroughly, a quantile plot of the plant growth is given. The plot confirms the assumption mentioned above that the depth estimation is only reliable for plant heights in the range of 50 cm < plant height < 90 cm. For values outside this range, the horizontal path indicates another distribution, which is likely due to an estimation error. The course of the plant height estimation shows a high dispersion of the values, especially for plant heights below 50 cm. It is conspicuous that no values below 45 cm were measured during the whole campaign, which is again related to the fact that objects far away from the camera cannot be detected properly. The image recognition is weaker for object distances larger than 45 cm, leading to an estimation error of up to 13 cm (see Figure 9). To this end, the horizontal paths within the data are a result of the decreasing depth resolution and the less accurate value assignment of about ±10 cm. Another difficulty lies in the measured object itself. During the early growth stages of maize plants, top leaves grow straight and vertically upwards before they curve downwards when a new plant stem appears. This succession of sprouting and bending of the leaves can cause an unsteady growth rate. Moreover, it should be emphasized that the experiment took place under field conditions, where the plants were influenced by wind, dryness and humidity.

The data derived by the Lytro LF can also be used to depict growth dynamics in the form of a temporal histogram plot. As the colorbar exemplifies, each recording is transformed into a histogram, and the intensity of the color indicates the relative share of the population. The step width is 10 cm in a range from 0 cm to 100 cm. Especially for the second and the third observation point, the curves show a similar course, while the plant height was overestimated at the first point. This is due to the fact that the plants were at distances of more than 50 cm, at the limit of the reliable measurement range of the Lytro LF. Outside of this range, the underlying depth map shows grey values that can correspond to several depths, as mentioned in the calibration section (see Section 3.2 and Figure 9).
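The histogram transformation can be sketched as follows; the synthetic demo data stand in for the per-pixel plant heights of each recording and are purely illustrative assumptions.

% Temporal histogram plot: bin the per-pixel plant heights of each recording
% into 10 cm classes from 0 cm to 100 cm and show the relative shares over time.
heightsPerDay = {45 + 10*rand(100,1), 60 + 15*rand(100,1), 75 + 15*rand(100,1)};  % synthetic data (cm)
edges = 0:10:100;
H = zeros(numel(edges) - 1, numel(heightsPerDay));
for k = 1:numel(heightsPerDay)
    counts = histc(heightsPerDay{k}, edges);             % counts per 10 cm bin
    H(:,k) = counts(1:end-1) / numel(heightsPerDay{k});  % relative share
end
imagesc(1:numel(heightsPerDay), edges(1:end-1) + 5, H);  % recordings vs. height
axis xy;
colorbar;
xlabel('Recording');
ylabel('Plant height in cm');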
In addition, Figure 15 also illustrates manually-measured plant heights in comparison to the estimations by the Lytro LF. The blue curve represents the manually-measured plant height of the maize according to Table 2. Unfortunately, unexpectedly fast plant growth at the end of the experiment was responsible for the small number of comparable measurements, since the plants exceeded the measurement distance of 100 cm, leading to incorrect values and depth estimations. However, the temporal plot allows valuable insight into the growth dynamics of the crop, since all visible plant parts are taken into account. This allows the plant growth dynamics to be discussed depending on different plant levels or leaf stages.

4.3. Time Lapse Plant Observation

Based on the acquired depth maps, it is possible to illustrate the growth dynamics of the plants in the form of a video or time-lapse animation. The animation provides an intelligible and easily accessible way to investigate differences and anomalies of the leaf arrangement, the phenotypes or the plant growth in general as a function of time (see Figure 16). The short video points out the advantages of automated long-term on-site monitoring, since it clarifies the varying conditions and the phenotypical changes of the plants during the field experiment. Therefore, the authors would like to highlight the time-lapse animation video, which can be found in the multimedia section.
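A time-lapse of this kind can be assembled from the exported depth maps with a few lines of MATLAB; the file name pattern, the output file name and the frame rate below are assumptions, not the settings used for Video S1.

% Write the exported depth maps to a time-lapse video file.
files = dir('depthmap_*.png');           % exported depth maps (assumed naming)
v = VideoWriter('PlantGrowthTimeLapse.avi');
v.FrameRate = 5;                         % frames per second (assumed)
open(v);
for k = 1:numel(files)
    frame = imread(files(k).name);       % one depth map per acquisition
    writeVideo(v, frame);
end
close(v);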

5. Discussion

Based on the evaluation of the laboratory experiment, the most suitable distance for a good depth estimation is between 10 cm and 50 cm from the Lytro LF, which is a constraint of the small sensor size leading to a very small parallax. The stereoscopic basis of the device is therefore limited (sensor size 6.451 mm × 4.603 mm) and very small compared to a common stereo vision system. For this reason, the estimation of plant heights and, moreover, the generation of a digital plant model were expected to be inaccurate for distances of more than 50 cm from the camera. Due to the limited range of the Lytro LF, the camera was installed at a height of only 1 m. As a result, the plants reached the Lytro LF housing on 3 September 2014, which also clearly indicates the limitations of the Lytro LF.
To improve this, a larger sensor would be useful. This has already been implemented in the successor of the Lytro LF, the Lytro Illum, which, however, was not available at the time the experiment was conducted. As the prices for such devices fall, so will the limits of what is possible. Another way to improve the depth resolution is to increase the pixel size and the number of pixels under each micro lens, or more precisely, to increase the disparity, which is responsible for the depth resolution and the light-field capability of the camera in general. Since the production of micro lens arrays in combination with a high-resolution sensor (large number of pixels) is expensive and relatively complex, this would increase the costs for such a light-field camera drastically.
Considering the low price, it may be sufficient for the moment to simply change the position of the Lytro LF gradually depending on plant growth or to install several cameras at different heights in the field. This simple modification would ensure that the object of interest or the top leaves are always within an appropriate measuring range of the camera (or cameras).
However, as a result of this study, the relative canopy coverage and the plant height were successfully estimated based on the light-field images acquired by the Lytro LF, which allowed us to gain initial insights into the plant population of the plot. Furthermore, the Lytro LF is an appropriate test object for investigating the feasibility of a light-field camera as a monitoring tool under field conditions. To this end, the results illustrate that this device is a very promising tool for the monitoring of plant growth and plant traits and especially for the identification of process dynamics. This becomes apparent, above all, through the comparison of the manually-performed measurements and the estimations generated by the Lytro LF. Even though the first value was overestimated by about 12 cm (see Table 2), the other results were in good accordance. As a consequence, this confirms the assumption that this light-field camera has the potential to be a suitable measurement tool for environmental monitoring purposes and a promising alternative to cost-intensive imaging techniques.
Although the overall course of plant growth exhibited a fairly wide dispersion, in consideration of the simple experimental design, the results achieved have to be regarded as a helpful insight into the growth dynamics (see Figure 15). In addition, this indicates that initial estimations concerning plant height monitoring can be derived and verified by a light-field camera, even by an inexpensive consumer product like the Lytro LF.

6. Conclusions

The most significant feature of the Lytro LF is its ability to provide three-dimensional information about the image. To this end, a laboratory experiment was performed to assess the capabilities of the Lytro LF and its depth resolution. The test demonstrated a relation between the object distance and the grey values in the corresponding depth maps of the light-field images generated by the Lytro software, showing good performance for close-range applications up to 50 cm. In this manner, the Lytro LF can be utilized to extract real-world distances. Moreover, the field test results showed that the Lytro LF is suitable for the monitoring of crop growth under field conditions, albeit with certain restrictions. Thus, the Lytro LF has the potential to become a promising alternative to conventional stereo vision solutions. Furthermore, the Lytro LF can be regarded as a stand-alone technique, since no further enhancements are needed to fulfill the depth estimation requirements, even under field conditions.
In conclusion, the Lytro LF approach can be used as an inexpensive measurement tool for environmental monitoring purposes, although its scope is limited to close-range applications. The need for higher pixel resolution and image depth may be accomplished by newly-developed upcoming light-field cameras, which introduce a variety of possible applications. Furthermore, it is expected that the costs for such cameras will decrease over time. As a consequence, new applications will be discovered in the field of remote sensing, environmental monitoring and forest and agricultural sciences in general.
Furthermore, special attention should be given to the simplicity of the experimental design and the data processing, especially the ability to provide three-dimensional information based on one single shot, without any additional requirements during field measurements. This represents a decisive and cost-effective contribution in the area of environmental research and the monitoring of plant growth.

Supplementary Materials

The following are available online at www.mdpi.com/2072-4292/8/10/823/s1, Video S1: 3DPlantGrowthLytro.avi. The video gives an impression of the growth dynamics in the form of a time-lapse video. In addition, it highlights the method's advantage of conveying an impression of the surrounding conditions in addition to the pure information on plant heights or canopy coverage. Therefore, the video shows the total focus and the depth image over time.

Acknowledgments

The Global Change Experimental Facility (GCEF) is funded by the Federal Ministry of Education and Research, the State Ministry for Science and Economy of Saxony-Anhalt and the State Ministry for Higher Education, Research, and the Arts of Saxony. Additional thanks to the co-authors for providing data and ideas, as well as to the technicians Konrad Kirsch and Steffen Lehmann for their excellent fieldwork and support. We cordially thank English native speaker Christopher Higgins for proofreading this text.

Author Contributions

Robert Schima was responsible for the main part of the experimental design, analysis and writing of the article. Hannes Mollenhauer and Görres Grenzdörffer contributed important aspects regarding the experimental design. Moreover, Görres Grenzdörffer contributed information on machine vision and image recognition. Ines Merbach contributed information and support on setting up the field experiment and the preparation of the maize field at the experimental site in Bad Lauchstädt. Angela Lausch provided impulses on remote sensing and applications in agriculture. Peter Dietrich, Jan Bumberger and Angela Lausch contributed experience and information on environmental monitoring with a focus on field measurements and campaign planning. Besides this, Jan Bumberger contributed knowledge about terrestrial sensor networks and sensor integration. Robert Schima and Jan Bumberger initiated and managed the review. All authors checked and contributed to the final text.

Conflicts of Interest

The authors declare no conflict of interest.

Abbreviations

The following abbreviations are used in this manuscript:
GCEF	Global Change Experimental Facility
Lytro LF	Lytro Light-Field camera (name)

Appendix A. Source Codes

Algorithm A1. Source code of the plant height measurement for the Lytro LF.

References

  1. Purkis, S.; Klemas, V. Remote Sensing and Global Environmental Change, 1st ed.; Wiley: Hoboken, NJ, USA, 2011. [Google Scholar]
  2. Kasperson, J.; Kasperson, R. Global Environmental Risk; United Nations University Press: Tokyo, Japan; Earthscan: London, UK, 2001. [Google Scholar]
  3. Morison, J.I.L.; Morecroft, M.D. Plant Growth and Climate Change; Blackwell Publishing Ltd.: Oxford, UK, 2006. [Google Scholar]
  4. Aronson, E.; McNulty, S. Appropriate experimental ecosystem warming methods by ecosystem, objective, and practicality. Agric. For. Meteorol. 2009, 149, 1791–1799. [Google Scholar] [CrossRef]
  5. Shaver, G.R.; Canadell, J.; Chapin, F.S.; Gurevitch, J.; Harte, J.; Henry, G.; Ineson, P.; Jonasson, S.; Melillo, J.; Pitelka, L.; et al. Global warming and terrestrial ecosystems: A conceptual framework for analysis. BioScience 2000, 50, 871–882. [Google Scholar] [CrossRef]
  6. Apelt, F.; Breuer, D.; Nikoloski, Z.; Stitt, M.; Kragler, F. Phytotyping 4D: A light-field imaging system for non-invasive and accurate monitoring of spatio-temporal plant growth. Plant J. 2015, 82, 693–706. [Google Scholar] [CrossRef] [PubMed]
  7. Kazmi, W.; Foix, S.; Alenyà, G.; Andersen, H.J. Indoor and outdoor depth imaging of leaves with time-of-flight and stereo vision sensors: Analysis and comparison. ISPRS J. Photogramm. Remote Sens. 2014, 88, 128–146. [Google Scholar] [CrossRef] [Green Version]
  8. De Moraes Frasson, R.P.; Krajewski, W.F. Three-dimensional digital model of a maize plant. Agric. For. Meteorol. 2010, 150, 478–488. [Google Scholar] [CrossRef]
  9. Dornbusch, T.; Wernecke, P.; Diepenbrock, W. A method to extract morphological traits of plant organs from 3D point clouds as a database for an architectural plant model. Ecol. Model. 2007, 200, 119–129. [Google Scholar] [CrossRef]
  10. Bhatta, B. Research Methods in Remote Sensing; SpringerBriefs in Earth Sciences; Springer: Dordrecht, The Netherlands, 2013. [Google Scholar]
  11. Busemeyer, L.; Mentrup, D.; Müller, K.; Wunder, E.; Alheit, K.; Hahn, V.; Maurer, H.P.; Reif, J.C.; Würschum, T.; Müller, J.; et al. BreedVision—A multi-sensor platform for non-destructive field-based phenotyping in plant breeding. Sensors 2013, 13, 2830–2847. [Google Scholar] [CrossRef] [PubMed]
  12. Jin, J.; Tang, L. Corn plant sensing using real-time stereo vision. J. Field Robot. 2009, 26, 591–608. [Google Scholar] [CrossRef]
  13. Biskup, B.; Scharr, H.; Schurr, U.; Rascher, U. A stereo imaging system for measuring structural parameters of plant canopies. Plant Cell Environ. 2007, 30, 1299–1308. [Google Scholar] [CrossRef] [PubMed]
  14. Shrestha, D.; Steward, B.; Kaspar, T. Determination of Early Stage Corn Plant Height Using Stereo Vision. In Proceedings of the 6th International Conference on Precision Agriculture and Other Precision Resources Management, Minneapolis, MN, USA, 14–17 July 2002; pp. 1382–1394.
  15. Schaefer, M.T.; Lamb, D.W. A combination of plant NDVI and LiDAR measurements improve the estimation of pasture biomass in Tall Fescue (Festuca arundinacea var. Fletcher). Remote Sens. 2016, 8, 109. [Google Scholar] [CrossRef]
  16. Crommelinck, S.; Höfle, B. Simulating an autonomously operating low-cost static terrestrial LiDAR for multitemporal maize crop height measurements. Remote Sens. 2016, 8, 205. [Google Scholar] [CrossRef]
  17. Paulus, S.; Schumann, H.; Kuhlmann, H.; Léon, J. High-precision laser scanning system for capturing 3D plant architecture and analysing growth of cereal plants. Biosyst. Eng. 2014, 121, 1–11. [Google Scholar] [CrossRef]
  18. Paulus, S.; Behmann, J.; Mahlein, A.K.; Plümer, L.; Kuhlmann, H. Low-cost 3D systems: Suitable tools for plant phenotyping. Sensors 2014, 14, 3001–3018. [Google Scholar] [CrossRef] [PubMed]
  19. Sanz-Cortiella, R.; Llorens-Calveras, J.; Escolà, A.; Arnó-Satorra, J.; Ribes-Dasi, M.; Masip-Vilalta, J.; Camp, F.; Gràcia-Aguilá, F.; Solanelles-Batlle, F.; Planas-DeMartí, S.; et al. Innovative LIDAR 3D dynamic measurement system to estimate fruit-tree leaf area. Sensors 2011, 11, 5769–5791. [Google Scholar] [CrossRef] [PubMed]
  20. Li, L.; Zhang, Q.; Huang, D. A review of imaging techniques for plant phenotyping. Sensors 2014, 14, 20078–20111. [Google Scholar] [CrossRef] [PubMed]
  21. McCarthy, C.; Hancock, N.; Raine, S. Applied machine vision of plants: A review with implications for field deployment in automated farming operations. Intell. Serv. Robot. 2010, 3, 209–217. [Google Scholar] [CrossRef] [Green Version]
  22. Lytro Inc. The First Generation Lytro Camera, 8 GB. Available online: https://store.lytro.com/collections/the-first-generation-product-list (accessed on 15 October 2014).
  23. Slevogt, H. Technische Optik; Sammlung Göschen, De Gruyter: Berlin, Germany, 1974. [Google Scholar]
  24. Adelson, E.H.; Wang, J.Y.A. Single lens stereo with a plenoptic camera. IEEE Trans. Pattern Anal. Mach. Intell. 1992, 14, 99–106. [Google Scholar] [CrossRef]
  25. Wöhler, C. 3D Computer Vision. Efficient Methods and Applications; Springer: Dordrecht, The Netherlands, 2009. [Google Scholar]
  26. Adelson, E.H.; Bergen, J.R. The Plenoptic Function and the Elements of Early Vision; Technical Report 148; Vision and Modeling Group, Media Laboratory, Massachusetts Institute of Technology: Cambridge, MA, USA, 1991. [Google Scholar]
  27. Ives, F. Parallax Stereogram and Process of Making Same. U.S. Patent 725,567, 14 April 1903. [Google Scholar]
  28. Lippmann, G. Epreuves reversibles donnant la sensation du relief. J. Phys. Théor. Appl. 1908, 7, 821–825. [Google Scholar] [CrossRef]
  29. Raytrix. 3D Light Field Camera Technology. Raytrix GmbH, Online. 2014. Available online: http://www.raytrix.de/index.php/Kameras.html (accessed on 20 October 2014).
  30. Lytro Inc. LYTRO ILLUM. Available online: https://store.lytro.com/products/lytro-illum (accessed on 20 October 2014).
  31. Kučera, J. Computational Photography of Light-Field Camera and Application to Panoramic Photography. Master’s Thesis, Charles University in Prague, Faculty of Mathematics and Physics, Prague, Czech Republic, 2014. [Google Scholar]
  32. Ng, R. Digital Light Field Photography. Ph.D. Thesis, Stanford University, Stanford, CA, USA, 2006. [Google Scholar]
  33. Baghdadchi, S.; Liu, K.; Knapp, J.; Prager, G.; Graves, S.; Akrami, K.; Manuel, R.; Bastos, R.; Reid, E.; Carson, D. An innovative system for 3D clinical photography in the resource-limited settings. J. Transl. Med. 2014, 12, 1–8. [Google Scholar] [CrossRef] [PubMed]
  34. Marcus, I.; Tung, I.T.; Dosunmu, E.O.; Thiamthat, W.; Freedman, S.F. Anterior segment photography in pediatric eyes using the Lytro light field handheld noncontact camera. J. Am. Assoc. Pediatr. Ophthalmol. Strabismus 2014, 17, 572–577. [Google Scholar] [CrossRef] [PubMed]
  35. Abendroth, L.; Elmore, R.; Hartzler, R.G.; McGrath, C.; Mueller, D.S.; Munkvold, G.P.; Pope, R.; Rice, M.E.; Robertson, A.E.; Sawyer, J.E.; et al. Corn Field Guide; Iowa State University, Extension Service: Ames, IA, USA, 2009. [Google Scholar]
Figure 1. The Lytro LF (Light-Field) camera used in this study.
Figure 2. Different pinhole camera image acquisition schemes; (a) Shows a single pinhole camera image acquisition; (b) Stereo image acquisition with two viewpoints; (c) Sequential image acquisition by a motion parallax system; (d) Illustrates how a lens would collect light from a variety of view points (adapted from [24]).
Figure 3. Principle of single lens stereo: (a) Point object in the focal plane of the main lens; (b) Point object closer than the focal plane; (c) Eccentric aperture creating a cropped image (adapted from [24]).
Figure 4. Geometry of a single aperture of an out-of-focus image (adapted from [24]). Here, the following abbreviations are used: D is the distance of an object to the lens plane, f the focal length, v the displacement of the aperture, d the distance of the object to the lens plane, l the distance between the lens plane and the sensor plane, e the distance of the conjugate focal point beyond the sensor, g the distance to the conjugate focus of the object and h the displacement of the object’s image in the sensor plane.
Figure 5. Schematic model of a plenoptic camera with an array of micro lenses (adapted from [24]).
Figure 6. Based on one single image (1080 × 1080 pixels) taken by the Lytro LF light-field camera, it is possible to change the focal point within the light-field image file. In addition, the camera provides a 328 × 328 pixel depth map of the image. (a) Shows a close range focal point; (b) Shows a mid range focal point and (c) a focal point set to a larger distance of the same image file; (d) Represents the corresponding depth map of the light-field image.
Figure 7. Exemplary light-field image file (close focal point and far away focal point of the same scene and object distance) and corresponding depth map during the calibration process of the Lytro LF. (a) Shows the RGB image focused on the object plane, whereas (b) is focused on the calibration board. (c) Contains the corresponding depth map indicating the object distances.
Figure 8. Correspondence between the depth map grey values and object distance determined empirically during the calibration of the Lytro LF light-field camera.
Figure 9. Corresponding error plot of the depth estimation depending on the object distance.
Figure 10. Installation of the Lytro LF and experimental setup in the field at the Global Change Experimental Facility project (GCEF) test site. (a) Shows the tripod and housing of the Lytro LF at the beginning of the experiment, whereas (b) shows the situation at the end of the field campaign.
Figure 11. On-site crop growth monitoring: field of view of the Lytro LF and experimental design. (a) Field of view of the Lytro LF camera during the field experiment; (b) Experimental setup and reference points for the manual plant height estimation.
Figure 12. (a–d) Extraction of the leaf area using a green filtering algorithm in MATLAB compared to the raw images. (a,c) show exemplary raw RGB images as input data and the resulting green filtered masks in (b,d).
Figure 13. Fractional vegetation cover over time using Lytro LF image files and a green filtering algorithm in MATLAB.
Figure 14. (a–f) RGB images on the left (a,c,e) show exemplary dates during the field campaign compared to the corresponding depth maps allowing the approximation of plant height and morphological traits represented in the images on the right (b,d,f).
Figure 15. Comparison of manually-measured plant heights and Lytro LF depth estimation plus a temporal plot of plant height distribution (histogram bin counts) during the field campaign.
Figure 16. Time lapse animations or time series of the acquired depth maps allow an impressive insight into the plant growth and growth dynamics of the plants.
Table 1. Lytro specifications [22].

Lens
  Focal Length: 43–344 mm
  Zoom: 8× optical
  Aperture: Constant f/2.0
Image Sensor
  Sensor Type: CMOS
  Light-Field Resolution: 11 Megaray (the number of light rays captured by the light-field sensor)
  Active Area: 4.6 mm × 4.6 mm
Image
  Format: Light-Field Picture (.lfp)
  Aspect Ratio: 1:1
  2D Export Resolution: 1080 × 1080 pixels (approx. 1 MP peak output)
  File/Picture Storage: 350 files (8 GB)
Exposure
  Modes: Full Auto, Full Manual, Shutter Priority or ISO Priority
  Shutter Priority: 1/250–8 s
  ISO Priority: 80–3200
  Exposure Lock: Yes
  Neutral Density (ND) Filter: 4-stop
  Control Interface: Tap on touchscreen
Screen
  Touchscreen: Yes
  Size: 1.52 in (diagonal)
  Screen Type: Back-lit LCD
  Live View: Yes
Playback
  In-Camera Picture Review: Yes
Power
  Battery: Built-in, rechargeable long-life lithium-ion
  Battery Charging: Via Micro-USB to computer or Lytro Fast Charger
External
  Controls: Power button, Shutter button, Zoom Slider, Touchscreen
  USB: Micro-USB
  Tripod Socket: Available via Lytro custom accessory mount, sold separately
Miscellaneous
  Software: Includes LYTRO Desktop for importing, organizing, processing and interacting with living pictures. See more in the LYTRO Desktop Fact Sheet
  Wireless: 802.11 b/g/n, Wi-Fi Protected Access (WPA2)
  E-Waste: RoHS certified (Restriction of Hazardous Substances Directive 2002/95/EC)
  Materials: Lightweight anodized aluminum with silicone grip. Camera Kit includes: LYTRO Magnetic Lens Cap, lens cleaning cloth, wrist strap, USB cable for data transfer and charging and Quick Start guide
  Dimensions: 41 mm × 41 mm × 112 mm
  Weight: 214 g
Table 2. Manually-measured plant heights in comparison with the automated estimation of the Lytro LF.

Date                 Manually-Measured (cm)   Lytro LF Estimation (cm)
20 August 2014       42.5 ± 1.4               54.7 ± 7
28 August 2014       66.9 ± 4.1               70.6 ± 5
4 September 2014     93.4 ± 5.3               86.7 ± 1
10 September 2014    127.7 ± 14.7             -
