Article

Estimating Leaf Chlorophyll Content of Moso Bamboo Based on Unmanned Aerial Vehicle Visible Images

1 State Key Laboratory of Subtropical Silviculture, Zhejiang A & F University, Hangzhou 311300, China
2 Zhejiang Provincial Collaborative Innovation Center for Bamboo Resources and High-Efficiency Utilization, Zhejiang A & F University, Hangzhou 311300, China
3 Key Laboratory of Carbon Cycling in Forest Ecosystems and Carbon Sequestration of Zhejiang Province, Zhejiang A & F University, Hangzhou 311300, China
* Author to whom correspondence should be addressed.
† These authors contributed equally to this work.
Remote Sens. 2022, 14(12), 2864; https://doi.org/10.3390/rs14122864
Submission received: 6 May 2022 / Revised: 9 June 2022 / Accepted: 14 June 2022 / Published: 15 June 2022

Abstract

Leaf chlorophyll content is an important indicator of the physiological and ecological functions of plants. Accurate estimation of leaf chlorophyll content is necessary to understand energy, carbon, and water exchange between plants and the atmosphere. The leaf chlorophyll content index (CCI) of 109 Moso bamboo samples (19 for training data, 19 for validation data, and 71 for extrapolation data) was measured from December 2019 to May 2021, while their corresponding red–green–blue (RGB) images were acquired using an unmanned aerial vehicle (UAV) platform. A method for estimating leaf CCI based on constructing relationships between field leaf CCI measurements and UAV RGB images was evaluated. The results showed that a modified excess blue minus excess red index and 1.4 × H-S in the hue–saturation–value (HSV) color space were the most suitable variables for estimating the leaf CCI of Moso bamboo. No noticeable difference in accuracy between the linear regression model and backpropagation neural network (BPNN) model was found. Both models performed well in estimating leaf CCI, with an R2 > 0.85 and relative root mean square error (RMSEr) < 15.0% for the validation data. Both models failed to accurately estimate leaf CCI during the leaf-changing period (April to May in off-year), with the problems being overestimation in low leaf CCI and underestimation in high leaf CCI values. At a flight height of 120 m and illumination between 369 and 546 W/m2, the CCI for an independent sample dataset was accurately estimated by the models, with an R2 of 0.83 and RMSEr of 13.78%. Flight height and solar intensity played a role in increasing the generality of the models. This study provides a feasible and straightforward method to estimate the leaf CCI of Moso bamboo based on UAV RGB images.

1. Introduction

Leaf chlorophyll content (LCC) is an important indicator of the status of plant physiological and ecological functions [1,2,3]. LCC regulates energy, carbon, and water exchanges between plants and the atmosphere because it largely affects photosynthetic activities [4,5]. The LCC is also closely related to nitrogen content, which indirectly supplies information on nutrition in leaves and soil [6,7,8,9]. The LCC can quickly respond to diseases, insect attacks, and environmental stresses [10]. Therefore, accurate monitoring of variation in LCC at high spatial resolution and temporal frequency is necessary to understand the plant’s ecological functions and health status and provides timely information for decision-makers [11,12,13].
Compared with in situ LCC estimation methods, satellite remote sensing is an efficient way to detect LCC at larger scales because chlorophyll absorbs solar radiation for the production of biochemical energy [14,15,16]. Satellite remote sensing has been widely used to determine LCC from local to global scales [3,17]. However, there are significant challenges in accurately determining LCC from satellite remote sensing data. First, spectral reflectance at the canopy scale is governed not only by LCC but also by the leaf area index (LAI) and nonphotosynthetic elements [18,19,20]. This makes the determination of LCC from satellite-derived canopy reflectance challenging [3,21]. Although variables more sensitive to LCC than to other factors have been used, LAI still affects the accuracy of LCC determination to some extent [22,23]. The other source of uncertainty in LCC estimation based on satellite remote sensing data is the difficulty of matching canopy spectral reflectance with ground-based LCC observations. This is due to the coarse spatial resolution of most satellite remote sensing data and is especially challenging for tall plants and where LCC is spatially heterogeneous within a coarse pixel [24,25]. This results in a weak relationship between canopy spectral reflectance and ground-based LCC observations and further reduces the accuracy of LCC determinations.
Unmanned aerial vehicles (UAVs) can capture images with the fine spatial resolution and temporal frequency specified by users [11,26]. Hence, leaf-level spectral reflectance is easily extracted from UAV images with centimeter-level spatial resolution [27]. Compared to canopy-scale spectral reflectance, leaf-level spectral reflectance derived from pure pixels, which are covered only with leaves, yields a stronger relationship between LCC and spectral reflectance. The effects of plant structure and background information on the relationships between spectral reflectance and LCC are largely eliminated when leaf-level spectral reflectance is used [28]. In addition, LCC observations and spectral reflectance can be well matched using fine-spatial-resolution images, which further strengthens the relationship between them. Previous studies have shown that lower errors in LCC estimates were obtained using images with fine spatial resolution than using images with relatively coarse spatial resolution [28]. Very strong relationships between LCC observations and UAV-based spectral reflectance have been widely reported [29,30,31]. Digital images with spectral reflectance in the visible bands obtained from UAVs are the most commonly used and have sufficient capacity to capture variation in LCC across different plants [30], because chlorophyll mainly absorbs visible light in the blue and red spectral regions [32,33,34]. The green band and combinations of visible bands have been widely used to accurately estimate changes in LCC [13,30].
Radiative transfer models (RTMs) and empirical models are the two methods commonly used to construct relationships between LCC observations and UAV-based spectral reflectance. RTMs provide an explicit connection between vegetation biophysical variables and canopy reflectance based on the transfer and interaction of radiation within the canopy [35]. The estimation of vegetation parameters with RTMs is not constrained by geographic location, data collection time, or sensor configuration [36]. However, different combinations of parameters input to the RTMs may produce similar spectra, resulting in nonuniqueness of the parameter estimates [37]. Empirical models, including regression algorithms and machine learning algorithms, have been widely applied to estimate LCC [16,31,38,39]. Regression algorithms are used to find linear or nonlinear relationships between LCC observations and their spectral reflectance or combinations of these reflectances, and they are simple and have been found to be feasible [20,38]. Machine learning algorithms, such as artificial neural networks, random forests, support vector machines, and decision trees, can solve complicated nonlinear problems and generate adaptive and robust estimates [40,41,42,43,44]. Among artificial neural network algorithms, the backpropagation (BP) neural network algorithm is well known for successfully and accurately estimating LCC [31,45]. One problem with the BP algorithm is overfitting to the training dataset, which results in large prediction errors for unknown situations [46]. In other words, the transferability of the BP algorithm remains doubtful [47]. To the best of our knowledge, UAV images have mainly been applied to estimate the LCC of crops and have rarely been applied to estimate the LCC of forests [27,30,34]. Therefore, broad bands in the visible region were used in this study to retrieve the LCC of Moso bamboo forests based on both empirical and BP algorithms.
Moso bamboo is a unique plant with high economic and ecological value. Its culm can be made into a variety of products, such as boards and furniture [48]. Many people depend heavily on Moso bamboo for a living and are willing to manage Moso bamboo forests intensively to obtain higher income, which often results in a surplus soil nitrogen supply [49,50,51]. On the other hand, Moso bamboo has a strong capacity for carbon sequestration [52,53,54,55]. Hence, it is necessary to accurately detect the LCC of Moso bamboo to scientifically guide soil nitrogen supply and estimate carbon sequestration by Moso bamboo forests. Variation in the LCC of Moso bamboo is complicated at spatial and temporal scales due to its unique physiological characteristics [56]. In on-year bamboo forests, the leaves of old bamboo change from green to yellow and the LCC gradually decreases from March to June because old bamboo supplies nutrition to young shoots to sustain their growth [57,58,59]. Spatial variation in the LCC of Moso bamboo forests is evident from the dramatic differences in LCC between old and young bamboos [57]. In off-year Moso bamboo forests, the LCC gradually decreases during the leaf-changing period from March to mid-May, increases during the leaf expansion period from mid-May to late June, and then remains stable after late June [58,59,60]. This implies that temporal variations in the LCC of Moso bamboo forests are pronounced. Therefore, remote sensing images with high spatial resolution and temporal frequency are necessary to accurately determine the LCC of Moso bamboo forests.
In this study, a long-term series of the leaf chlorophyll content index (CCI), a good indicator of LCC, was measured for individual crowns of Moso bamboo as ground observations. The corresponding spectral reflectance (digital numbers) was obtained using a DJI (Dajiang) UAV platform. Variables sensitive to CCI were selected from many commonly used variables derived from the UAV-based digital numbers. The relationships between CCI observations and the sensitive variables were constructed using empirical and BP algorithms. The algorithms were validated using independent samples and applied to retrieve the temporal and spatial distributions of the CCI of the Moso bamboo forest. The objectives of this study were to (1) select variables derived from UAV visible broad bands that are sensitive to the CCI of Moso bamboo and have strong antidisturbance ability, (2) construct feasible algorithms for accurate CCI determination, and (3) validate the transferability and representativeness of the algorithms for application to datasets from different sites and times without complicated processing, such as calibration for each dataset.

2. Materials and Methods

2.1. Study Areas

The study areas are in Anji County and Lin'an District, northwestern Zhejiang Province, China (Figure 1). The climate is subtropical monsoon. The annual average precipitation is between 1100 and 1900 mm, and the annual average temperature ranges from 12.20 to 15.60 °C. Anji County contains 757 km2 of bamboo forests, of which 79.30% are Moso bamboo forests [23]. Lin'an District has 587 km2 of bamboo forests, of which Moso bamboo forests account for 39.69%. Intensive management activities, including cutting, fertilizing, and weeding, were conducted on the Moso bamboo in the study areas. The Moso bamboo forests are healthy, without indications of stress or insect attack. The average canopy height is approximately 11 m after shoot cutting, and the diameter at breast height (DBH) is between 7.0 and 13.8 cm.

2.2. Leaf Chlorophyll Content Index Measurement

Moso bamboo samples were selected from a plot covering a 1 km × 1 km area centered on the flux tower (30.476°N, 119.673°E). Eight samples per month were collected over five months between December 2019 and October 2020, resulting in a total of 40 samples (Table 1 and Figure 1a). These samples equally represented different age and DBH classes. To quickly locate the samples on the corresponding UAV red–green–blue (RGB) images, a red marker was placed beside each sample (Figure 1e). The samples were collected after the UAV RGB images were obtained. To obtain more representative leaf CCI measurements, the canopy was divided into several sections at intervals of 0.5 m from the top down after the Moso bamboo was felled. All the leaves of each section were then removed and mixed. Approximately 100 leaves were randomly selected from each section, so that between 500 and 700 leaves were collected for each bamboo sample. The CCI of each leaf was measured once at the middle of the leaf using a CCM-200 plus Chlorophyll Content Meter (Opti-Sciences, Inc., Hudson, NH, USA). The average CCI of all leaf samples from a bamboo sample was taken as the leaf CCI of that bamboo sample. Because the leaves of two bamboo samples collected in December 2019 were not fresh, they were discarded. The CCI estimation models were therefore trained and validated using 38 samples.
The extrapolation reliability of the models was tested with sample datasets collected independently from both study sites (Table 1). We collected 36 and 20 bamboo samples on 19 April 2021 (Figure 1b) and 21 May 2021 (Figure 1c), respectively, in Anji County, and 15 bamboo samples on 20 April 2021 (Figure 1d) in Lin'an District. Between 100 and 150 leaf samples were collected from different canopy heights for each independent bamboo sample. The leaf CCI of each independent bamboo sample was measured using the process described above.

2.3. UAV Images Acquisition and Processing

RGB images covering the sample areas were obtained using a DJI Phantom 4 Pro V2.0 UAV system (Shenzhen Dajiang Baiwang Technology Co., Ltd., Shenzhen, China). This UAV has an integrated camera with a 1-inch complementary metal-oxide-semiconductor sensor that captures red, green, and blue spectral information [61]. The UAV flights commenced at 10 a.m. under clear skies and light wind conditions. The flight height was set at 120 m, giving an image spatial resolution of approximately 3.9 cm/pixel. Nine flights were conducted between December 2019 and April 2021, with flight dates coinciding with the bamboo sample collection dates (Table 1). The images corresponding to the independent bamboo samples collected in 2021 were obtained under different flight heights and solar intensities to analyze the influences of these factors on CCI estimation (Table 1). Images were obtained on 19 April 2021 under two levels of solar intensity at a flight height of 120 m, on 20 April 2021 at four flight heights (80, 100, 120, and 140 m) under the same solar intensity, and on 21 May 2021 under four levels of solar intensity, with flights under each level conducted at four flight heights.

2.4. Variables for Modeling

The canopy of each bamboo sample was extracted from the UAV RGB image by manual delineation (Figure 1f). The non-leaf parts and the very bright and shadowed pixels within the canopy were masked using a threshold method. The RGB values of the remaining pixels were averaged to obtain the RGB values of the bamboo sample. Many variables were derived from the RGB values and used for the CCI estimation (Table S1).
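For concreteness, the per-sample masking and averaging described above can be sketched as follows in Python; the brightness thresholds are hypothetical placeholders, since the paper does not report its threshold values.

    import numpy as np

    def mean_canopy_rgb(crown_rgb, shadow_thr=40, bright_thr=220):
        # crown_rgb: (H, W, 3) array of digital numbers clipped to one hand-delineated crown.
        # The two intensity thresholds are hypothetical, not taken from the paper;
        # non-leaf pixels would additionally need their own mask (e.g., a color rule).
        intensity = crown_rgb.mean(axis=2)
        keep = (intensity > shadow_thr) & (intensity < bright_thr)
        return crown_rgb[keep].mean(axis=0)  # mean (R, G, B) of the bamboo sample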
The normalized RGB (rgb) values were calculated to limit the effects of solar illumination and terrain shadows on the values in the RGB color space. Commonly used vegetation indices, such as the excess red vegetation index (ExR) [62], excess blue vegetation index (ExB) [63], excess green vegetation index (ExG) [64], excess green minus excess red (ExGR) [65], and the Kawashima index (IKAW) [66], were calculated from the rgb values.
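A minimal sketch of the band normalization and of these indices is given below; the index definitions follow their usual forms in the cited literature and are assumptions rather than formulas quoted from this paper.

    def normalized_rgb(R, G, B):
        # Normalize the digital numbers so that r + g + b = 1.
        total = R + G + B
        return R / total, G / total, B / total

    def classic_indices(r, g, b):
        # Definitions as commonly given in the cited literature (assumed here).
        ExG = 2 * g - r - b            # excess green
        ExR = 1.4 * r - g              # excess red
        ExB = 1.4 * b - g              # excess blue
        ExGR = ExG - ExR               # excess green minus excess red
        IKAW = (r - b) / (r + b)       # Kawashima index
        return {"ExG": ExG, "ExR": ExR, "ExB": ExB, "ExGR": ExGR, "IKAW": IKAW}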
Hue–saturation–value (HSV) is another color space that can be converted from the RGB color space. Hue is the pure color, describing the “redness”, “greenness”, and “blueness” of an object [64]. Saturation is the degree to which the color approaches a pure color [64]. Value is the brightness of the color and varies with the color saturation. A previous study has shown that the HSV color space is not sensitive to changes in illumination, and H-S was found to be the best variable for extracting vegetation from the background [67]. The HSV values were calculated using the rgb2hsv function in MATLAB [68]. The H and S values ranged from 0 to 1, and the V values ranged from 0 to 255.
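An equivalent conversion in Python (a sketch; the MATLAB rgb2hsv call used in the paper is assumed to behave the same way for 0–255 inputs) yields H and S in [0, 1] and V in [0, 255], matching the ranges stated above, and the H-S combinations evaluated later can be derived directly from it.

    import colorsys

    def hsv_variables(R, G, B):
        # R, G, B: mean digital numbers of a bamboo sample (0-255 scale).
        H, S, V = colorsys.rgb_to_hsv(R, G, B)   # H, S in [0, 1]; V in [0, 255]
        return {"H-S": H - S,
                "1.4H-S": 1.4 * H - S,
                "(H-S)/(H+S)": (H - S) / (H + S)}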
In addition, the relationship between the CCI and the rgb values was analyzed. The CCI of Moso bamboo decreased from December 2019, reached a minimum in May 2020, then increased and remained relatively stable from July 2020 (Figure 2a). Among the rgb bands, the trend of the b band was closest to that of the CCI. The relationship between the g band and the CCI was weak, as indicated by the difference in their trends (Figure 2a,b): the g band values were high in May, but the corresponding CCI values were low (Figure 2b). The b band was more sensitive to the CCI of Moso bamboo than the g band (Figure 2a,c). Compared with the b band, the g band varied little during the year and was therefore more suitable as a reference band. Accordingly, new vegetation indices, namely 1.4 × r-b (modified excess red index, MExR), 1.4 × g-b (modified excess green index, MExG), 2 × b-r-g (modified excess blue index, MExB), and (2 × b-r-g)-(1.4 × r-b) (modified excess blue minus excess red index, MExBR), were proposed by replacing the normalized green band with the normalized blue band.
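The proposed indices follow directly from the expressions above; a short sketch:

    def modified_indices(r, g, b):
        # Normalized blue band replaces the normalized green band as the sensitive band.
        MExR = 1.4 * r - b
        MExG = 1.4 * g - b
        MExB = 2 * b - r - g
        MExBR = MExB - MExR            # (2b - r - g) - (1.4r - b)
        return {"MExR": MExR, "MExG": MExG, "MExB": MExB, "MExBR": MExBR}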

2.5. Models for CCI Estimation

Linear regression and BP neural network models are widely used to estimate the CCI of vegetation based on UAV RGB images [31,66]. High accuracy of CCI estimates for rice and maize crops has been achieved based on the two methods [13,30]. Therefore, the two methods were used to estimate the CCI of Moso bamboo in this study based on MATLAB R2021a for Windows.

2.5.1. Linear Regression Model

CCI = a × X + b
where X is the independent variable strongly related to the CCI, a is the slope, and b is the intercept.
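Fitting this model is an ordinary least-squares problem; a minimal sketch (variable names are illustrative):

    import numpy as np

    def fit_linear_cci(x, cci):
        # x: 1-D array of the selected index (e.g., MExBR) for the training samples;
        # cci: matching measured leaf CCI values. Returns slope a and intercept b.
        a, b = np.polyfit(x, cci, deg=1)
        return a, b

    # usage: a, b = fit_linear_cci(mexbr_train, cci_train); cci_pred = a * mexbr_val + b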

2.5.2. Backpropagation Neural Network Model

The BP neural network used here takes the Gaussian error function (erf) as the activation function in the hidden layer and the logsig function as the activation function in the output layer (Erf-BP). It has the advantages of high robustness and fitting accuracy in estimating vegetation carbon storage [69]. The Erf-BP model was therefore used for the CCI estimation in this study. There is still no rigorous method for determining the number of neurons in the hidden layer, so a trial-and-error method is often used. In addition, the training performance (error goal) has a significant influence on the prediction ability of the network: if it is too large, the prediction ability is poor; if it is too small, the fit to the training data is strong, but overfitting may occur for the test data and the generalization ability is weak. It is therefore essential to select a suitable training performance. Considering these issues, the number of neurons in the hidden layer and the training performance were set within the ranges 5–15 and 0.01–0.3, respectively, and the network was trained using a one-by-one search strategy with steps of 1 and 0.01, respectively. The learning rate and momentum factor were set to 0.2 and 0.9, respectively. The optimal combination of the two parameters was selected by minimizing the root mean square error (RMSE) for the test data. The resulting weights of the Erf-BP model were then used to estimate the CCI of the Moso bamboo forests.
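The training scheme described above can be illustrated with the rough Python sketch below. It implements a one-hidden-layer network with an erf hidden activation and a logsig output (rescaled here to [-1, 1] so that it matches the CCI normalization given in the next subsection, which is an assumption on our part), trained by gradient descent with momentum and stopped once the training mean squared error reaches the performance goal; the one-by-one search then selects the hidden-layer size and goal that minimize the test RMSE. Initialization, epoch limit, and function names are illustrative rather than taken from the paper.

    import numpy as np
    from scipy.special import erf

    def train_erf_bp(X, y, n_hidden, goal, lr=0.2, momentum=0.9, epochs=20000, seed=0):
        # X: (n, p) input variables; y: (n,) normalized CCI in [-1, 1].
        rng = np.random.default_rng(seed)
        n, p = X.shape
        W1 = rng.normal(scale=0.5, size=(p, n_hidden)); b1 = np.zeros(n_hidden)
        W2 = rng.normal(scale=0.5, size=(n_hidden, 1)); b2 = np.zeros(1)
        vW1, vb1 = np.zeros_like(W1), np.zeros_like(b1)
        vW2, vb2 = np.zeros_like(W2), np.zeros_like(b2)
        y = y.reshape(-1, 1)
        for _ in range(epochs):
            z1 = X @ W1 + b1
            a1 = erf(z1)                                   # hidden layer: erf activation
            s = 1.0 / (1.0 + np.exp(-(a1 @ W2 + b2)))      # output layer: logsig
            out = 2.0 * s - 1.0                            # rescaled to [-1, 1] (assumption)
            err = out - y
            if np.mean(err ** 2) <= goal:                  # training performance goal reached
                break
            d2 = err * s * (1.0 - s)                       # output error signal (constants folded into lr)
            d1 = (d2 @ W2.T) * (2.0 / np.sqrt(np.pi)) * np.exp(-z1 ** 2)   # erf'(z)
            vW2 = momentum * vW2 - lr * (a1.T @ d2) / n;  vb2 = momentum * vb2 - lr * d2.mean(axis=0)
            vW1 = momentum * vW1 - lr * (X.T @ d1) / n;   vb1 = momentum * vb1 - lr * d1.mean(axis=0)
            W2 += vW2; b2 += vb2; W1 += vW1; b1 += vb1
        return W1, b1, W2, b2

    def predict_erf_bp(net, X):
        W1, b1, W2, b2 = net
        s = 1.0 / (1.0 + np.exp(-(erf(X @ W1 + b1) @ W2 + b2)))
        return 2.0 * s - 1.0

    def grid_search(X_tr, y_tr, X_te, y_te):
        # One-by-one search: hidden neurons 5-15 (step 1), goal 0.01-0.30 (step 0.01).
        best, best_rmse = None, np.inf
        for n_hidden in range(5, 16):
            for goal in np.arange(0.01, 0.301, 0.01):
                net = train_erf_bp(X_tr, y_tr, n_hidden, goal)
                rmse = np.sqrt(np.mean((predict_erf_bp(net, X_te).ravel() - y_te) ** 2))
                if rmse < best_rmse:
                    best, best_rmse = (n_hidden, round(float(goal), 2), net), rmse
        return best, best_rmse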
In this study, the five vegetation indices, namely IKAW, MExBR, H-S, 1.4H-S, and (H-S)/(H+S), were closely related to CCI with correlation coefficients greater than 0.90 and were chosen as the inputs for the Erf-BP model. The sample data with odd numbers were selected as the training dataset. The remaining samples were used as the test dataset. CCI values were normalized using the following equation:
CCIn = 2 × (CCI − CCImin)/(CCImax − CCImin) − 1
where CCIn is the normalized CCI. CCImin and CCImax denote the minimum and maximum CCI values, respectively. They were, respectively, set as 3 and 30 according to the ranges of the measured leaf CCI. The CCImin and CCImax derived from the measured leaf CCI were well represented because nearly 30,000 leaves were collected during the four seasons.
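A small sketch of this normalization and its inverse (the inverse is implied but not written out in the text):

    def normalize_cci(cci, cci_min=3.0, cci_max=30.0):
        # Scale measured CCI to [-1, 1] using the fixed bounds given above.
        return 2.0 * (cci - cci_min) / (cci_max - cci_min) - 1.0

    def denormalize_cci(cci_n, cci_min=3.0, cci_max=30.0):
        # Map normalized network outputs back to CCI units.
        return (cci_n + 1.0) / 2.0 * (cci_max - cci_min) + cci_min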

2.6. Accuracy Evaluation of Models

Sampling for model training and validation is challenging, so systematic sampling was used in this study. Of the 38 samples, the 19 with odd numbers were used as training data for building the linear regression and Erf-BP models, and the remaining 19 samples were used as test data to validate the prediction ability of the models. The 71 independent samples collected on 19–20 April and 21 May 2021 were used to evaluate the extrapolation ability of the models. Metrics including the coefficient of determination (R2), RMSE, relative RMSE (RMSEr), and bias were used to quantify the agreement between CCI measurements and CCI estimates.
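A compact sketch of these metrics is given below; R2 is computed here as the coefficient of determination, and RMSEr and bias follow the usual conventions (RMSE relative to the mean observation, and mean of estimate minus observation), which is an assumption where the paper does not spell out its exact formulas.

    import numpy as np

    def accuracy_metrics(obs, est):
        obs, est = np.asarray(obs, float), np.asarray(est, float)
        rmse = np.sqrt(np.mean((est - obs) ** 2))
        return {
            "R2": 1.0 - np.sum((obs - est) ** 2) / np.sum((obs - obs.mean()) ** 2),
            "RMSE": rmse,
            "RMSEr_percent": 100.0 * rmse / obs.mean(),
            "bias": np.mean(est - obs),
        }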

3. Results

3.1. Correlation Analysis

Correlation analysis between the 21 variables and CCI showed that, in the RGB space, CCI had the highest correlation with the B band, followed by the G and R bands (Figure 3). In the normalized RGB space, the correlations between the b and r bands and CCI were 0.87 and −0.73, respectively, markedly stronger than in the RGB space. However, the correlation between g and CCI was weak, even weaker than that between G and CCI. Among the vegetation indices derived from the normalized RGB bands, MExBR had the highest correlation with CCI, followed by IKAW, MExB, and MExR.
In the HSV space, H and S were significantly correlated with CCI, with correlation coefficients of 0.71 and −0.51, respectively, whereas V was not significantly correlated with CCI. Compared with H and S individually, variables derived from combinations of H and S showed substantially stronger correlations with CCI: 1.4H-S, (H-S)/(H+S), and H-S were significantly related to CCI, with correlation coefficients higher than 0.90.

3.2. Accuracy of the Linear Regression Models

The correlation analysis showed that MExBR in the RGB color space and 1.4H-S in the HSV color space were highly correlated with the CCI of Moso bamboo, with absolute correlation coefficients of 0.93 at the 0.01 significance level. Therefore, these two vegetation indices were selected to estimate the CCI of Moso bamboo.
The linear regression models between CCI and MExBR and 1.4H-S were CCI = 34.978 × MExBR + 29.617 and CCI = 39.592 × (1.4H-S) + 15.336, respectively. For the training samples, the R2 values between the estimated and measured values for both models were higher than 0.88 (Figure 4). The RMSE (1.38) and RMSEr (10.01%) for the model driven by MExBR were slightly lower than those for the model driven by 1.4H-S (RMSE = 1.41 and RMSEr = 10.28%). For the validation samples, the R2 between the estimated and measured values was higher than 0.85 for both models. The errors for the model driven by 1.4H-S (RMSE = 1.88 and RMSEr = 14.35%) were slightly smaller than those for the model driven by MExBR (RMSE = 1.91 and RMSEr = 14.56%). The RMSE and RMSEr during the winter and spring seasons were relatively higher than those during the summer and autumn seasons.
For MExBR, the slopes were 0.89 and 0.84 for the training and validation data, with confidence intervals of 0.73–1.05 and 0.66–1.02, respectively. For 1.4H-S, the slopes were 0.88 and 0.82 for the training and validation data, with confidence intervals of 0.72–1.05 and 0.65–0.98, respectively. The regression lines between the estimated and measured values were very close to the 1:1 line for both the training and validation datasets, indicating no systematic underestimation or overestimation. The differences between estimates and measurements in the low CCI range (less than 10) were evident for both the training and validation datasets. The difference between the estimates from the models driven by the two vegetation indices was insignificant (p > 0.05). Overall, both linear regression models, driven by MExBR and 1.4H-S, were accurate and could be used to estimate the CCI of the Moso bamboo forest.
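Because the fitted coefficients are reported above, applying either regression to a sample-level (or pixel-level) index value is a one-line operation; a minimal sketch:

    def cci_from_mexbr(mexbr):
        # Fitted model reported above, driven by MExBR.
        return 34.978 * mexbr + 29.617

    def cci_from_14h_minus_s(x):
        # Fitted model reported above, driven by 1.4H-S (x = 1.4*H - S).
        return 39.592 * x + 15.336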

3.3. Accuracy of the Erf-BP Model

The predictive accuracy of the Erf-BP model was more sensitive to the training performance than to the number of hidden neurons. The RMSE for the training dataset decreased as the training performance decreased, whereas the RMSE for the test dataset first decreased and then increased as the training performance decreased (Figure 5). The RMSE for the test dataset reached a minimum when the training performance was 0.08 and the number of hidden neurons was eight. The corresponding weights and thresholds of the Erf-BP neural network model were recorded to estimate CCI (Tables S2 and S3).
The R2 values between the estimated and measured values for the training and test datasets were greater than 0.88 (Figure 6). The RMSE and RMSEr were 1.61 and 11.72% for the training dataset and 1.61 and 12.27% for the test dataset. The regression lines between the measured and estimated values for the training and validation datasets were very close to the 1:1 line, with confidence intervals for their slopes of 0.81–1.19 and 0.77–1.12, respectively. The differences between estimates and observations when CCI was less than 10 (RMSE = 1.40 and RMSEr = 16.13%) were smaller than those from the linear regression model (RMSE = 1.75 and RMSEr = 20.22%). This implies that the Erf-BP model performed better than the linear regression model in estimating low CCI values.
The RMSE and RMSEr of the Erf-BP model were greater than those of the linear regression model driven by MExBR for the training dataset, with absolute differences of 0.23 and 1.71%, respectively. For the validation dataset, the RMSE and RMSEr of the Erf-BP model were smaller than those of the linear regression model, with absolute differences of 0.30 and 2.29%, respectively. However, the differences between the estimates from the Erf-BP model and the linear regression models were insignificant (p > 0.05), implying that both models could be used to estimate the CCI of Moso bamboo forests accurately.

3.4. Effects of Illumination and Flight Height on CCI Estimates

Comparison of the CCI estimates derived from the MExBR images acquired under an illumination of 938 W/m2 at 10:16 (CCI938) and under an illumination of 546 W/m2 at 15:38 (CCI546) on 19 April 2021, at the same flight height of 120 m, showed that the CCI estimates derived from images acquired under different light intensities varied significantly (Figure 7). CCI938 was significantly higher than CCI546 and was also significantly higher than the measured values, with an RMSE of 5.58 and a bias of 5.12, whereas CCI546 was very close to the measured values, with an RMSE of 1.60 and a bias of −0.03. Therefore, illumination significantly affects the extrapolation accuracy of the model, and different illumination levels can result in significantly different CCI estimates at the same flight height.
The CCI estimates derived from the MExBR images acquired at different flight heights under the same illumination level (377 W/m2) on 20 April 2021, were significantly different (Figure 8). As the flight height decreased from 140 to 80 m, the CCI estimates decreased. Compared to the measured CCI, all the CCI estimates obtained at different flight heights were underestimated (Figure 8). The difference between the estimated and measured CCI was the smallest when the flight height was 140 m, with an RMSE of 1.70, an RMSEr of 18.60%, and a bias of −1.26. Hence, flight height also had a significant influence on the accuracy of the CCI estimates.
Comparison of CCI estimates from different flight heights and illumination levels on 21 May 2021 showed that the CCI estimates decreased with decreased flight height under illumination at levels of 369 and 450 W/m2 (Figure 9a,b). Under the illumination of 369 W/m2, the CCI estimates obtained at a flight height of 120 m were the closest to the measured values, with RMSE, RMSEr, and bias values of 2.09, 12.98%, and −1.40, respectively. Under 450 W/m2 illumination, the CCI estimates obtained at a flight height of 100 m were the closest to the measured values, with RMSE, RMSEr, and bias values of 1.97, 12.23%, and 1.22, respectively. The CCI estimates obtained at flight heights of 120 and 140 m under the 450 W/m2 illumination were very similar and correlated highly with the measured values. However, they were significantly greater than the measured values with bias values of 4.26 and 3.87, respectively.
Under relatively high illumination (887 and 894 W/m2), the differences in CCI estimates were slight between different flight heights, and the CCI estimates were not sensitive to flight heights when the flight height was over 100 m (Figure 9c,d). The CCI estimates were strongly related to but greater than the measured values when the flight height was over 100 m. On the contrary, the CCI estimates obtained at a flight height of 80 m were smaller than the measured values and were significantly different from those obtained at flight heights greater than 100 m. Hence, the effect of flight heights on CCI estimates decreased under strong illumination and high flight height conditions.
In summary, the flight height largely affected the CCI estimates. Under low illumination levels, the estimates gradually decreased as the flight height decreased, whereas the sensitivity of the CCI estimates to flight height decreased under high illumination levels. Hence, it is suggested that images be acquired under high illumination levels to reduce the effect of flight height on CCI estimates. The CCI estimates also changed significantly with illumination level at the same flight height. The illumination effect on the CCI estimates decreased as the flight height increased: the CCI estimates did not vary with illumination level when the flight height was over 120 m, whereas they were sensitive to illumination and decreased with decreasing illumination when the flight height was below 100 m. It is therefore advisable to acquire images at a high flight height to reduce the effect of illumination on the CCI estimates.

3.5. Accuracy Evaluation of Model Extrapolation

Accuracy evaluation of the extrapolation model based on independent samples collected under a flight height of 120 m and illumination between 369 and 546 W/m2 showed high correlation between predicted and measured values with an R2 higher than 0.80 for the linear regression and Erf-BP models (Figure 10). There was a clear underestimation in CCI estimates for the samples collected on 20 April 2021, when the Moso bamboo changed leaves. There was no systematic bias in any of the other samples. The accuracy of the linear regression model driven by MExBR was slightly higher than that driven by 1.4H-S, with an RMSE and RMSEr of 2.12 and 13.78%, respectively. The Erf-BP was not superior to the linear regression model, with an RMSE and RMSEr of 2.44 and 15.88%, respectively. This study obtained highly accurate CCI estimates using the linear regression and the Erf-BP models under suitable flight height and illumination conditions.
The spatial distribution of the leaf CCI estimates for each image is shown in Figure 11. The results showed that both models could accurately predict the spatial distribution of leaf CCI and that their spatial distribution patterns were very similar (Figure 11a,b). This further confirmed the versatility of the models. On 19 April and 21 May 2021, the leaf CCI in Anji County for on-year Moso bamboo was visibly greater than that on 20 April 2021, in Lin’an district for off-year Moso bamboo forest. The means of leaf CCI estimates from both models were 16.3 and 15.8, respectively, on 19 April 2021, and 16.2 and 16.3, respectively, on 21 May 2021. This was very close to the means of the measured leaf CCI of 17.4 on April 19 and 16.0 on 21 May (Figure 11c). The mean leaf CCI estimates on April 20 from both models were visibly lower than that of measured leaf CCI, implying that the estimates on 20 April were underestimated. The ranges of leaf CCI estimates from both models were similar to those of the measured leaf CCI. The percentages of each leaf CCI class from both models were comparable to those of the measured leaf CCI on 19 April and 21 May 2021. The percentage of each leaf CCI class from the linear regression model was closer to the measured leaf CCI than the Erf-BP model. The percentages of leaf CCI for six and eight classes on 20 April from both models were clearly higher than the measured leaf CCI. Both models failed to estimate the high leaf CCI values, and the Erf-BP model overestimated the low leaf CCI classes (0–4 classes) on 20 April.

4. Discussion

4.1. Variables Related to CCI

The results showed that CCI has a strong relationship with the b and r bands and with some vegetation indices derived from the normalized RGB color space, and, in the HSV color space, with H-S, 1.4H-S, and (H-S)/(H+S). These results are consistent with those of previous studies. For example, the correlation between the r band and leaf chlorophyll content was greater than 0.90 [31], and the r band was selected as the primary variable for estimating the CCI of maize plants [70]. The b band has been reported to be one of the most important variables for CCI modeling of crops [13]. The normalized difference between the r and b bands had a high and stable correlation with the CCI of crop plants, with a correlation coefficient of −0.81 [30,66]. Our results showed that the rgb values had higher correlations with CCI than the RGB values, consistent with previous studies [66,71,72,73]. A long-term series of sample data from December 2019 to October 2020 was used in this study, which represents the variation in CCI caused by leaf changes. The CCI is more sensitive to changes in the physiological status of plants (such as leaf changes) than other biophysical characteristics [13,74], which explains the high correlations between visible band values and CCI in this study.

4.2. Importance of Blue Band for CCI Estimation

Most previous studies have shown that the relationship between the green band value and CCI is stronger than that between the blue band and CCI [23,30,75], and that the blue band is not linked to the CCI [76]. Therefore, many variables were constructed using the blue band as a reference, such as (g-b)/(g+b), 2G-R-B, and 1.4G-B [2,76]. However, our results differed from those findings: the blue band showed higher correlations with the CCI of Moso bamboo than the green band, and as CCI increased, the blue band value also increased. In contrast, the green band was not sensitive to CCI and was used as a reference to construct variables. These findings are primarily due to the particular physiological status of Moso bamboo.
Moso bamboo forests change leaves from March to May, and young leaves start sprouting in May [57,58,59]. The young leaves are bright green, so the green band value in May is high and comparable to the green band values from July to December. However, the CCI values at the beginning of leaf growth in May are far lower than those in July and August [58,59] because chlorophyll biosynthesis is not complete in young leaves in May. Therefore, the relationship between CCI and the green band was weak because the low CCI corresponded to a high green band value in May (Figure 2b), and variables that enhance the green band were not considered for the CCI estimation models. In contrast, the blue band value during the leaf-changing period in May was visibly different from that after the leaf-changing period in July. As the CCI decreased from January to May, the blue band value also showed a decreasing trend (Figure 2a), which indicates that the blue band can represent the seasonal variation in CCI [2,76]. As a result, the relationship between the normalized blue band and CCI was stronger than that between the normalized green band and CCI (Figure 2b,c). Previous studies showed that the absolute correlation coefficients between the normalized red and blue bands and CCI were higher than 0.78, whereas that between the normalized green band and CCI was less than 0.50 [66,73]. These studies confirmed that the normalized green band could be used as a reference band for constructing variables. Overall, our study shows that the blue band is a useful and sensitive band for determining the CCI.
However, there are some drawbacks to using the blue band to estimate the CCI. The blue band value was unstable under different solar light conditions because, compared with the green and red bands, it is greatly influenced by atmospheric scattering [76]. Previous studies indicated that the accuracy of the blue band value was reduced by the reflected light from the sparse canopy and diffuse radiation from the atmosphere, which results in uncertainty in CCI estimation among different light conditions during photography [77]. Our study supports this conclusion. The CCI estimates derived from MExBR, which strengthens the blue band, were largely affected by the flight height and intensity of the illumination conditions. Hence, only under a suitable flight height and solar light condition can the accuracy of the CCI estimated from MExBR be guaranteed.

4.3. Comparison between Linear Regression and Erf-BP Models

The accuracy of the CCI estimates from the Erf-BP model was slightly higher than that of the linear regression model for the validation samples. The Erf-BP model has an advantage in data fitting because it can flexibly perform nonlinear calculations [13]. However, the CCI values were normalized based on the CCImin and CCImax derived from the measured leaf CCI. Although CCImin and CCImax are well represented in this study, they still affect the accuracy of the CCI estimates: the CCI values estimated by the Erf-BP model are necessarily confined to the range between CCImin and CCImax, so the Erf-BP model cannot accurately predict CCI values beyond this range. The difference in CCI estimates between the Erf-BP and linear regression models was insignificant, and the accuracy of the Erf-BP model was slightly lower than that of the linear regression model for the independent samples. There are two main reasons for this insignificant difference. First, the highly linear relationships between the variables and CCI (Figure 3) make the Erf-BP model's capacity for solving nonlinear and complicated relationships largely unnecessary. Second, the very strong relationships among the variables leave the BPNN model's ability to handle multicollinearity unused. Therefore, although several variables were used in the Erf-BP model, the accuracy of the CCI estimates did not improve. A previous study indicated that complex methods were impractical and produced less reliable results than more straightforward regression methods when the relationships between variables and CCI were strong [78]. The linear model is also likely to be more widely applicable because of its simplicity. Hence, the linear regression model is sufficient to estimate the CCI of Moso bamboo accurately. It is worth mentioning that sufficient training data collected under comparable acquisition conditions are an essential prerequisite for the high accuracy of linear regression models [11].

4.4. Adaptability and Universality of Models

Although the linear regression and Erf-BP models for CCI estimation had high prediction accuracies, they could not accurately estimate CCI for independent data unless the data were collected under specific flight height and solar light conditions. Previous studies have shown that UAV RGB image values are affected by many factors, such as weather conditions, flight height, and terrain [67,79,80,81,82].
Variations in illumination cause differences in the RGB values obtained by digital cameras and lead to inconsistent color representations of vegetation [80,83,84], because illumination and color components are mixed in the RGB color space [80,85]. The color depth of the same leaf changed from dark green to bright green during the day [79], due to large variations in leaf radiance under different illuminations [76]. RGB image values are unstable when the photographic conditions change [77]. The difference between the green and blue band values under intense illumination was greater than under cloudy conditions [66]. The blue band value was much greater than the red band value on a clear day, but the difference was relatively small on a cloudy day [86]. In addition, results derived from UAV images obtained at different flight heights also differ [77,87,88]. As the flight height decreased from 100 to 60 m, the normalized difference vegetation index of vegetation increased from 0.17 to 0.25 [87]. Therefore, increasing the adaptability and universality of such models will remain a major challenge until the instability of RGB image values caused by changing photographic conditions is resolved.
Two approaches were used to reduce the effect of illumination intensity. The first is to normalize the RGB color space. This method has been widely used and proven effective in reducing the illumination effect [30,66,89,90]. Previous studies showed that the normalized RGB color space is more stable than the RGB color space under changing illumination conditions [73,91], and that models driven by the normalized RGB color space performed better than those driven by the RGB color space in estimating CCI [71,72,73]. The normalized difference between the red and blue bands was the most suitable function for application to different field images and was found not to be influenced by different weather conditions [66]. The second approach is converting the RGB color space to the HSV color space, which separates color from illumination and thereby reduces the illumination effect [80]. Previous studies have indicated that the HSV color space is robust to illumination variations [92].
However, neither of these approaches can entirely eliminate the effect of illumination [90]. Our results showed that flight height and illumination conditions still influenced the CCI estimates derived from the normalized RGB and HSV color spaces, and that the effects of flight height and illumination on the CCI estimates were interrelated. When the illumination was sufficiently intense, the CCI estimates were not influenced by the flight height. This may be because enough reflected radiation was received by the digital camera at different heights when the leaf irradiance was high under intense illumination, whereas the reflected radiation received by the camera varied with flight height when the illumination was weak.
At the same flight height, the CCI estimates increased as the illumination intensity increased and remained stable once the illumination intensity reached a relatively high value. Hence, a sunny day with intense illumination offers the best light conditions for obtaining images with a reduced effect of flight height on RGB values. A previous study indicated that the best flight time is around noon under a clear sky [93,94]. Another study reported that RGB images obtained on a clear day are unsuitable for CCI estimation [66]: it was difficult to discriminate the differences among leaf color plates caused by variations in CCI because the angular distribution of radiation reflected from the leaf color plates tends to be narrowly directional under clear skies with direct solar radiation [66]. However, our study showed that the relationships between MExBR and CCI are significant under relatively high illumination, which implies that the RGB values from images obtained on a clear day can represent the variation in CCI (Figure 9c,d). This is inconsistent with the results of Kawashima and Nakatani (1998).
An effective way to eliminate the effect of illumination on RGB images is not yet available [94,95]. Some measures could help overcome the effect of illumination on RGB images for long flights or several flights conducted on different days, such as flying under stable or homogeneous illumination conditions, adjusting for varying illumination, and normalizing the RGB images [93,94,96]. Because the RGB images in this study were not collected under homogeneous illumination conditions, the models are suitable for accurate estimation of CCI only when the RGB images are obtained at flight heights between 100 and 120 m and illumination levels of 369–546 W/m2. Future studies should focus on increasing the adaptability and universality of models for estimating CCI based on UAV RGB images.

5. Conclusions

Statistical models driven by variables derived from UAV RGB images and calibrated using 38 bamboo samples collected in different months were constructed to estimate the leaf CCI of Moso bamboo forests. Independent images collected under different flight heights and solar intensities were used to test the adaptability and universality of the models. UAV RGB images proved to be a feasible dataset for determining leaf CCI. The linear regression and Erf-BP models can accurately estimate leaf CCI, and the Erf-BP model was not preferable to the linear regression model in this study. During the leaf-changing period, it was challenging to estimate the leaf CCI accurately because the models overestimated low leaf CCI values and underestimated high leaf CCI values. Further analysis revealed that flight height and solar intensity strongly influenced the performance of the models: a high flight height reduces the sensitivity of CCI estimates to solar intensity, and high solar intensity decreases the effect of flight height on CCI estimates. Overall, the generality of the models proposed in this study is still limited. Accurate estimates were only obtained when the models were driven by variables derived from images acquired at a certain flight height (100–120 m) and solar intensity (369–546 W/m2). Future studies should focus on eliminating the effects of solar intensity and flight height on CCI estimation and increasing the models' generality.

Supplementary Materials

The following supporting information can be downloaded at: https://www.mdpi.com/article/10.3390/rs14122864/s1, Table S1: Variables derived from RGB values and used for the chlorophyll content index estimation. Table S2: Connection weights and threshold values from input layer to hidden layer. Table S3: Connection weights and threshold values from hidden layer to output layer.

Author Contributions

Conceptualization, X.X. and Y.Z.; methodology, H.X., Y.Q. and J.W.; software, H.X., J.W. and X.X.; validation, H.X. and J.W.; investigation, H.X., Y.Q., L.H., Y.T., Z.Z. and J.W.; data curation, H.X., Y.Q., L.H., Y.T., Z.Z. and J.W.; writing—original draft preparation, H.X. and X.X.; writing—review and editing, H.X. and X.X.; supervision, X.X. and Y.Z. All authors have read and agreed to the published version of the manuscript.

Funding

This research was funded by the National Natural Science Foundation of China (Grant No. 31870619), Joint Research Fund of Department of Forestry of Zhejiang Province and Chinese Academy of Forestry (Grant No. 2022SY05).

Data Availability Statement

Not applicable.

Acknowledgments

This work was supported by the Overseas Expertise Introduction Project for Discipline Innovation (111 Project D18008).

Conflicts of Interest

The authors declare no conflict of interest.

References

1. Gitelson, A.; Merzlyak, M.N. Spectral Reflectance Changes Associated with Autumn Senescence of Aesculus hippocastanum L. and Acer Platanoides L. Leaves. Spectral Features and Relation to Chlorophyll Estimation. J. Plant Physiol. 1994, 143, 286–292.
2. Sims, D.A.; Gamon, J.A. Relationships between Leaf Pigment Content and Spectral Reflectance across a Wide Range of Species, Leaf Structures and Developmental Stages. Remote Sens. Environ. 2002, 81, 337–354.
3. Croft, H.; Chen, J.M.; Wang, R.; Mo, G.; Luo, S.; Luo, X.; He, L.; Gonsamo, A.; Arabian, J.; Zhang, Y.; et al. The Global Distribution of Leaf Chlorophyll Content. Remote Sens. Environ. 2020, 236, 111479.
4. Niinemets, Ü.; Tenhunen, J.D. A Model Separating Leaf Structural and Physiological Effects on Carbon Gain along Light Gradients for the Shade-Tolerant Species Acer Saccharum. Plant Cell Environ. 1997, 20, 845–866.
5. Croft, H.; Chen, J.M.; Luo, X.; Bartlett, P.; Chen, B.; Staebler, R.M. Leaf Chlorophyll Content as a Proxy for Leaf Photosynthetic Capacity. Glob. Chang. Biol. 2017, 23, 3513–3524.
6. Yoder, B.J.; Pettigrew-Crosby, R.E. Predicting Nitrogen and Chlorophyll Content and Concentrations from Reflectance Spectra (400–2500 nm) at Leaf and Canopy Scales. Remote Sens. Environ. 1995, 53, 199–211.
7. Clevers, J.G.P.W.; Kooistra, L. Using Hyperspectral Remote Sensing Data for Retrieving Canopy Chlorophyll and Nitrogen Content. IEEE J. Sel. Top. Appl. Earth Obs. Remote Sens. 2012, 5, 574–583.
8. Van Evert, F.K.; Booij, R.; Jukema, J.N.; ten Berge, H.F.M.; Uenk, D.; Meurs, E.J.J.B.; van Geel, W.C.A.; Wijnholds, K.H.; Slabbekoorn, J.J.H. Using Crop Reflectance to Determine Sidedress N Rate in Potato Saves N and Maintains Yield. Eur. J. Agron. 2012, 43, 58–67.
9. Croft, H.; Arabian, J.; Chen, J.M.; Shang, J.; Liu, J. Mapping Within-Field Leaf Chlorophyll Content in Agricultural Crops for Nitrogen Management Using Landsat-8 Imagery. Precis. Agric. 2020, 21, 856–880.
10. Abdullah, H.; Darvishzadeh, R.; Skidmore, A.K.; Groen, T.A.; Heurich, M. European Spruce Bark Beetle (Ips Typographus, L.) Green Attack Affects Foliar Reflectance and Biochemical Properties. Int. J. Appl. Earth Obs. Geoinf. 2018, 64, 199–209.
11. Elarab, M.; Ticlavilca, A.M.; Torres-Rua, A.F.; Maslova, I.; McKee, M. Estimating Chlorophyll with Thermal and Broadband Multispectral High Resolution Imagery from an Unmanned Aerial System Using Relevance Vector Machines for Precision Agriculture. Int. J. Appl. Earth Obs. Geoinf. 2015, 43, 32–42.
12. Roosjen, P.P.J.; Brede, B.; Suomalainen, J.M.; Bartholomeus, H.M.; Kooistra, L.; Clevers, J.G.P.W. Improved Estimation of Leaf Area Index and Leaf Chlorophyll Content of a Potato Crop Using Multi-Angle Spectral Data—Potential of Unmanned Aerial Vehicle Imagery. Int. J. Appl. Earth Obs. Geoinf. 2018, 66, 14–26.
13. Zhu, W.; Sun, Z.; Yang, T.; Li, J.; Peng, J.; Zhu, K.; Li, S.; Gong, H.; Lyu, Y.; Li, B.; et al. Estimating Leaf Chlorophyll Content of Crops via Optimal Unmanned Aerial Vehicle Hyperspectral Data at Multi-Scales. Comput. Electron. Agric. 2020, 178, 105786.
14. Houborg, R.; McCabe, M.; Cescatti, A.; Gao, F.; Schull, M.; Gitelson, A. Joint Leaf Chlorophyll Content and Leaf Area Index Retrieval from Landsat Data Using a Regularized Model Inversion System (REGFLEC). Remote Sens. Environ. 2015, 159, 203–221.
15. Darvishzadeh, R.; Skidmore, A.; Schlerf, M.; Atzberger, C.; Corsi, F.; Cho, M. LAI and Chlorophyll Estimation for a Heterogeneous Grassland Using Hyperspectral Measurements. ISPRS J. Photogramm. Remote Sens. 2008, 63, 409–426.
16. Ali, A.M.; Darvishzadeh, R.; Skidmore, A.; Gara, T.W.; O’Connor, B.; Roeoesli, C.; Heurich, M.; Paganini, M. Comparing Methods for Mapping Canopy Chlorophyll Content in a Mixed Mountain Forest Using Sentinel-2 Data. Int. J. Appl. Earth Obs. Geoinf. 2020, 87, 102037.
17. Blackburn, G.A. Quantifying Chlorophylls and Carotenoids at Leaf and Canopy Scales: An Evaluation of Some Hyperspectral Approaches. Remote Sens. Environ. 1998, 66, 273–285.
18. Verrelst, J.; Schaepman, M.E.; Malenovský, Z.; Clevers, J.G.P.W. Effects of Woody Elements on Simulated Canopy Reflectance: Implications for Forest Chlorophyll Content Retrieval. Remote Sens. Environ. 2010, 114, 647–656.
19. Simic, A.; Chen, J.M.; Noland, T.L. Retrieval of Forest Chlorophyll Content Using Canopy Structure Parameters Derived from Multi-Angle Data: The Measurement Concept of Combining Nadir Hyperspectral and off-Nadir Multispectral Data. Int. J. Remote Sens. 2011, 32, 5621–5644.
20. Croft, H.; Chen, J.M.; Zhang, Y. The Applicability of Empirical Vegetation Indices for Determining Leaf Chlorophyll Content over Different Leaf and Canopy Structures. Ecol. Complex. 2014, 17, 119–130.
21. Darvishzadeh, R.; Skidmore, A.; Schlerf, M.; Atzberger, C. Inversion of a Radiative Transfer Model for Estimating Vegetation LAI and Chlorophyll in a Heterogeneous Grassland. Remote Sens. Environ. 2008, 112, 2592–2604.
22. Haboudane, D.; Tremblay, N.; Miller, J.R.; Vigneault, P. Remote Estimation of Crop Chlorophyll Content Using Spectral Indices Derived From Hyperspectral Data. IEEE Trans. Geosci. Remote Sens. 2008, 46, 423–437.
23. Xu, M.; Liu, R.; Chen, J.M.; Liu, Y.; Shang, R.; Ju, W.; Wu, C.; Huang, W. Retrieving Leaf Chlorophyll Content Using a Matrix-Based Vegetation Index Combination Approach. Remote Sens. Environ. 2019, 224, 60–73.
24. Fisher, J.I.; Mustard, J.F. Cross-Scalar Satellite Phenology from Ground, Landsat, and MODIS Data. Remote Sens. Environ. 2007, 109, 261–273.
25. Croft, H.; Chen, J.M.; Zhang, Y.; Simic, A. Modelling Leaf Chlorophyll Content in Broadleaf and Needle Leaf Canopies from Ground, CASI, Landsat TM 5 and MERIS Reflectance Data. Remote Sens. Environ. 2013, 133, 128–140.
26. Colomina, I.; Molina, P. Unmanned Aerial Systems for Photogrammetry and Remote Sensing: A Review. ISPRS J. Photogramm. Remote Sens. 2014, 92, 79–97.
27. Sakamoto, T.; Gitelson, A.A.; Nguy-Robertson, A.L.; Arkebauer, T.J.; Wardlow, B.D.; Suyker, A.E.; Verma, S.B.; Shibayama, M. An Alternative Method Using Digital Cameras for Continuous Monitoring of Crop Status. Agric. For. Meteorol. 2012, 154–155, 113–126.
28. Darvishzadeh, R.; Skidmore, A.; Abdullah, H.; Cherenet, E.; Ali, A.; Wang, T.; Nieuwenhuis, W.; Heurich, M.; Vrieling, A.; O’Connor, B.; et al. Mapping Leaf Chlorophyll Content from Sentinel-2 and RapidEye Data in Spruce Stands Using the Invertible Forest Reflectance Model. Int. J. Appl. Earth Obs. Geoinf. 2019, 79, 58–70.
29. Raymond Hunt, E.; Daughtry, C.S.T.; Eitel, J.U.H.; Long, D.S. Remote Sensing Leaf Chlorophyll Content Using a Visible Band Index. Agron. J. 2011, 103, 1090–1099.
30. Saberioon, M.M.; Amin, M.S.M.; Anuar, A.R.; Gholizadeh, A.; Wayayok, A.; Khairunniza-Bejo, S. Assessment of Rice Leaf Chlorophyll Content Using Visible Bands at Different Growth Stages at Both the Leaf and Canopy Scale. Int. J. Appl. Earth Obs. Geoinf. 2014, 32, 35–45.
31. Singhal, G.; Bansod, B.; Mathew, L.; Goswami, J.; Choudhury, B.U.; Raju, P.L.N. Chlorophyll Estimation Using Multi-Spectral Unmanned Aerial System Based on Machine Learning Techniques. Remote Sens. Appl. Soc. Environ. 2019, 15, 100235.
32. Datt, B. Visible/near Infrared Reflectance and Chlorophyll Content in Eucalyptus Leaves. Int. J. Remote Sens. 1999, 20, 2741–2759.
33. Gitelson, A.A.; Gritz, Y.; Merzlyak, M.N. Relationships between Leaf Chlorophyll Content and Spectral Reflectance and Algorithms for Non-Destructive Chlorophyll Assessment in Higher Plant Leaves. J. Plant Physiol. 2003, 160, 271–282.
34. Zulfa, A.W.; Norizah, K.; Hamdan, O.; Zulkifly, S.; Faridah-Hanum, I.; Rhyma, P.P. Discriminating Trees Species from the Relationship between Spectral Reflectance and Chlorophyll Contents of Mangrove Forest in Malaysia. Ecol. Indic. 2020, 111, 106024.
35. Houborg, R.; Soegaard, H.; Boegh, E. Combining Vegetation Index and Model Inversion Methods for the Extraction of Key Vegetation Biophysical Parameters Using Terra and Aqua MODIS Reflectance Data. Remote Sens. Environ. 2007, 106, 39–58.
36. Chakhvashvili, E.; Siegmann, B.; Muller, O.; Verrelst, J.; Bendig, J.; Kraska, T.; Rascher, U. Retrieval of Crop Variables from Proximal Multispectral UAV Image Data Using PROSAIL in Maize Canopy. Remote Sens. 2022, 14, 1247.
  37. Weiss, M.; Baret, F. Evaluation of Canopy Biophysical Variable Retrieval Performances from the Accumulation of Large Swath Satellite Data. Remote Sens. Environ. 1999, 70, 293–306. [Google Scholar] [CrossRef]
  38. Haboudane, D.; Miller, J.R.; Tremblay, N.; Zarco-Tejada, P.J.; Dextraze, L. Integrated Narrow-Band Vegetation Indices for Prediction of Crop Chlorophyll Content for Application to Precision Agriculture. Remote Sens. Environ. 2002, 81, 416–426. [Google Scholar] [CrossRef]
  39. Cai, Y.; Guan, K.; Lobell, D.; Potgieter, A.B.; Wang, S.; Peng, J.; Xu, T.; Asseng, S.; Zhang, Y.; You, L.; et al. Integrating Satellite and Climate Data to Predict Wheat Yield in Australia Using Machine Learning Approaches. Agric. For. Meteorol. 2019, 274, 144–159. [Google Scholar] [CrossRef]
  40. Guo, Y.; Yin, G.; Sun, H.; Wang, H.; Chen, S.; Senthilnath, J.; Wang, J.; Fu, Y. Scaling Effects on Chlorophyll Content Estimations with RGB Camera Mounted on a UAV Platform Using Machine-Learning Methods. Sensors 2020, 20, 5310. [Google Scholar] [CrossRef]
  41. Yu, K.; Lenz-Wiedemann, V.; Chen, X.; Bareth, G. Estimating Leaf Chlorophyll of Barley at Different Growth Stages Using Spectral Indices to Reduce Soil Background and Canopy Structure Effects. ISPRS J. Photogramm. Remote Sens. 2014, 97, 58–77. [Google Scholar] [CrossRef]
  42. Ma, J.; Li, Y.; Chen, Y.; Du, K.; Zheng, F.; Zhang, L.; Sun, Z. Estimating above Ground Biomass of Winter Wheat at Early Growth Stages Using Digital Images and Deep Convolutional Neural Network. Eur. J. Agron. 2019, 103, 117–129. [Google Scholar] [CrossRef]
  43. Brewer, K.; Clulow, A.; Sibanda, M.; Gokool, S.; Naiken, V.; Mabhaudhi, T. Predicting the Chlorophyll Content of Maize over Phenotyping as a Proxy for Crop Health in Smallholder Farming Systems. Remote Sens. 2022, 14, 518. [Google Scholar] [CrossRef]
  44. Narmilan, A.; Gonzalez, F.; Salgadoe, A.S.A.; Kumarasiri, U.W.L.M.; Weerasinghe, H.A.S.; Kulasekara, B.R. Predicting Canopy Chlorophyll Content in Sugarcane Crops Using Machine Learning Algorithms and Spectral Vegetation Indices Derived from UAV Multispectral Imagery. Remote Sens. 2022, 14, 1140. [Google Scholar] [CrossRef]
  45. González Vilas, L.; Spyrakos, E.; Torres Palenzuela, J.M. Neural Network Estimation of Chlorophyll a from MERIS Full Resolution Data for the Coastal Waters of Galician Rias (NW Spain). Remote Sens. Environ. 2011, 115, 524–535. [Google Scholar] [CrossRef]
  46. Rocha, A.D.; Groen, T.A.; Skidmore, A.K.; Darvishzadeh, R.; Willemen, L. The Naïve Overfitting Index Selection (NOIS): A New Method to Optimize Model Complexity for Hyperspectral Data. ISPRS J. Photogramm. Remote Sens. 2017, 133, 61–74. [Google Scholar] [CrossRef]
  47. Le Maire, G.; François, C.; Soudani, K.; Berveiller, D.; Pontailler, J.Y.; Bréda, N.; Genet, H.; Davi, H.; Dufrêne, E. Calibration and Validation of Hyperspectral Indices for the Estimation of Broadleaved Forest Leaf Chlorophyll Content, Leaf Mass per Area, Leaf Area Index and Leaf Canopy Biomass. Remote Sens. Environ. 2008, 112, 3846–3864. [Google Scholar] [CrossRef]
  48. Piazza, M.; Lobovikov, M.; Paudel, S.; Ren, H.; Wu, J. World Bamboo Resources—A Thematic Study Prepared in the Framework of the Global Forest Resources Assessment 2005; Food and Agriculture Organization of the United Nations: Rome, Italy, 2007. [Google Scholar]
  49. Li, Y.; Zhang, J.; Chang, S.X.; Jiang, P.; Zhou, G.; Fu, S.; Yan, E.; Wu, J.; Lin, L. Long-Term Intensive Management Effects on Soil Organic Carbon Pools and Chemical Composition in Moso Bamboo (Phyllostachys Pubescens) Forests in Subtropical China. For. Ecol. Manag. 2013, 303, 121–130. [Google Scholar] [CrossRef]
  50. Xu, L.; Fang, H.; Deng, X.; Ying, J.; Lv, W.; Shi, Y.; Zhou, G.; Zhou, Y. Biochar Application Increased Ecosystem Carbon Sequestration Capacity in a Moso Bamboo Forest. For. Ecol. Manag. 2020, 475, 118447. [Google Scholar] [CrossRef]
  51. Zhou, J.; Qu, T.; Li, Y.; Van Zwieten, L.; Wang, H.; Chen, J.; Song, X.; Lin, Z.; Zhang, X.; Luo, Y.; et al. Biochar-Based Fertilizer Decreased While Chemical Fertilizer Increased Soil N2O Emissions in a Subtropical Moso Bamboo Plantation. Catena 2021, 202, 105257. [Google Scholar] [CrossRef]
  52. Yen, T.M.; Lee, J.S. Comparing Aboveground Carbon Sequestration between Moso Bamboo (Phyllostachys Heterocycla) and China Fir (Cunninghamia Lanceolata) Forests Based on the Allometric Model. For. Ecol. Manag. 2011, 261, 995–1002. [Google Scholar] [CrossRef]
  53. Xu, X.; Zhou, G.; Liu, S.; Du, H.; Mo, L.; Shi, Y.; Jiang, H.; Zhou, Y.; Liu, E. Implications of Ice Storm Damages on the Water and Carbon Cycle of Bamboo Forests in Southeastern China. Agric. For. Meteorol. 2013, 177, 35–45. [Google Scholar] [CrossRef]
  54. Li, P.; Zhou, G.; Du, H.; Lu, D.; Mo, L.; Xu, X.; Shi, Y.; Zhou, Y. Current and Potential Carbon Stocks in Moso Bamboo Forests in China. J. Environ. Manag. 2015, 156, 89–96. [Google Scholar] [CrossRef] [PubMed]
  55. Yen, T.M. Culm Height Development, Biomass Accumulation and Carbon Storage in an Initial Growth Stage for a Fast-Growing Moso Bamboo (Phyllostachy Pubescens). Bot. Stud. 2016, 57, 10. [Google Scholar] [CrossRef] [Green Version]
  56. Li, L.; Li, N.; Lu, D.; Chen, Y. Mapping Moso Bamboo Forest and Its On-Year and off-Year Distribution in a Subtropical Region Using Time-Series Sentinel-2 and Landsat 8 Data. Remote Sens. Environ. 2019, 231, 111265. [Google Scholar] [CrossRef]
  57. Kleinhenz, V.; Midmore, D.J. Aspects of Bamboo Agronomy; Academic Press: Cambridge, MA, USA, 2001; Volume 74, pp. 99–153. ISBN 0065-2113. [Google Scholar]
  58. Zhou, Y.; Zhou, G.; Du, H.; Shi, Y.; Mao, F.; Liu, Y.; Xu, L.; Li, X.; Xu, X. Biotic and Abiotic Influences on Monthly Variation in Carbon Fluxes in On-Year and off-Year Moso Bamboo Forest. Trees-Struct. Funct. 2019, 33, 153–169. [Google Scholar] [CrossRef]
  59. Xu, X.; Du, H.; Zhou, G.; Mao, F.; Li, X.; Zhu, D.; Li, Y.; Cui, L. Remote Estimation of Canopy Leaf Area Index and Chlorophyll Content in Moso Bamboo (Phyllostachys Edulis (Carrière) J. Houz.) Forest Using MODIS Reflectance Data. Ann. For. Sci. 2018, 75, 33. [Google Scholar] [CrossRef] [Green Version]
  60. Li, X.; Du, H.; Zhou, G.; Mao, F.; Zhang, M.; Han, N.; Fan, W.; Liu, H.; Huang, Z.H.; He, S.; et al. Phenology Estimation of Subtropical Bamboo Forests Based on Assimilated MODIS LAI Time Series Data. ISPRS J. Photogramm. Remote Sens. 2021, 173, 262–277. [Google Scholar] [CrossRef]
  61. Niu, Y.; Zhang, L.; Zhang, H.; Han, W.; Peng, X. Estimating Above-Ground Biomass of Maize Using Features Derived from UAV-Based RGB Imagery. Remote Sens. 2019, 11, 1261. [Google Scholar] [CrossRef] [Green Version]
  62. Meyer, G.E.; Hindman, T.W.; Laksmi, K. Machine Vision Detection Parameters for Plant Species Identification. In Proceedings of the SPIE 3543 Precision Agriculture and Biological Quality, Boston, MA, USA, 14 January 1999; Volume 3543. [Google Scholar]
  63. Guijarro, M.; Pajares, G.; Riomoros, I.; Herrera, P.J.; Burgos-Artizzu, X.P.; Ribeiro, A. Automatic Segmentation of Relevant Textures in Agricultural Images. Comput. Electron. Agric. 2011, 75, 75–83. [Google Scholar] [CrossRef] [Green Version]
  64. Woebbecke, D.M.; Meyer, G.E.; Von Bargen, K.; Mortensen, D.A. Color Indices for Weed Identification under Various Soil, Residue, and Lighting Conditions. Trans. Am. Soc. Agric. Eng. 1995, 38, 259–269. [Google Scholar] [CrossRef]
  65. Camargo Neto, J. A Combined Statistical-Soft Computing Approach for Classification and Mapping Weed Species in Minimum-Tillage Systems; University of Nebraska: Lincoln, NE, USA, 2004; pp. 1–170. [Google Scholar]
  66. Kawashima, S.; Nakatani, M. An Algorithm for Estimating Chlorophyll Content in Leaves Using a Video Camera. Ann. Bot. 1998, 81, 49–54. [Google Scholar] [CrossRef] [Green Version]
  67. Zhuang, S.; Wang, P.; Jiang, B. Vegetation Extraction in the Field Using Multi-Level Features. Biosyst. Eng. 2020, 197, 352–366. [Google Scholar] [CrossRef]
  68. Sabzi, S.; Abbaspour-Gilandeh, Y.; Javadikia, H. Machine Vision System for the Automatic Segmentation of Plants under Different Lighting Conditions. Biosyst. Eng. 2017, 161, 157–173. [Google Scholar] [CrossRef]
  69. Xu, X.; Du, H.; Zhou, G.; Ge, H.; Shi, Y.; Zhou, Y.; Fan, W.; Fan, W. Estimation of Aboveground Carbon Stock of Moso Bamboo (Phyllostachys Heterocycla Var. Pubescens) Forest with a Landsat Thematic Mapper Image. Int. J. Remote Sens. 2011, 32, 1431–1448. [Google Scholar] [CrossRef]
  70. Qiao, L.; Zhang, Z.Y.; Chen, L.S.; Sun, H.; Li, M.Z.; Li, L.; Ma, J. Detection of Chlorophyll Content in Maize Canopy from UAV Imagery. IFAC-PapersOnLine 2019, 52, 330–335. [Google Scholar] [CrossRef]
  71. Dutta Gupta, S.; Ibaraki, Y.; Pattanayak, A.K. Development of a Digital Image Analysis Method for Real-Time Estimation of Chlorophyll Content in Micropropagated Potato Plants. Plant Biotechnol. Rep. 2013, 7, 91–97. [Google Scholar] [CrossRef]
  72. Dutta Gupta, S.; Pattanayak, A.K. Intelligent Image Analysis (IIA) Using Artificial Neural Network (ANN) for Non-Invasive Estimation of Chlorophyll Content in Micropropagated Plants of Potato. Vitr. Cell. Dev. Biol.-Plant 2017, 53, 520–526. [Google Scholar] [CrossRef]
  73. Agarwal, A.; Dutta Gupta, S. Assessment of Spinach Seedling Health Status and Chlorophyll Content by Multivariate Data Analysis and Multiple Linear Regression of Leaf Image Features. Comput. Electron. Agric. 2018, 152, 281–289. [Google Scholar] [CrossRef]
  74. Ciganda, V.; Gitelson, A.; Schepers, J. Vertical Profile and Temporal Variation of Chlorophyll in Maize Canopy: Quantitative “Crop Vigor” Indicator by Means of Reflectance-Based Techniques. Agron. J. 2008, 100, 1409–1417. [Google Scholar] [CrossRef] [Green Version]
  75. Xue, L.; Cao, W.; Luo, W.; Dai, T.; Zhu, Y. Monitoring Leaf Nitrogen Status in Rice with Canopy Spectral Reflectance. Agron. J. 2004, 96, 135–142. [Google Scholar] [CrossRef]
  76. Jay, S.; Gorretta, N.; Morel, J.; Maupas, F.; Bendoula, R.; Rabatel, G.; Dutartre, D.; Comar, A.; Baret, F. Estimating Leaf Chlorophyll Content in Sugar Beet Canopies Using Millimeter- to Centimeter-Scale Reflectance Imagery. Remote Sens. Environ. 2017, 198, 173–186. [Google Scholar] [CrossRef]
  77. Liu, Y.; Hatou, K.; Aihara, T.; Kurose, S.; Akiyama, T.; Kohno, Y.; Lu, S.; Omasa, K. A Robust Vegetation Index Based on Different UAV RGB Images to Estimate SPAD Values of Naked Barley Leaves. Remote Sens. 2021, 13, 686. [Google Scholar] [CrossRef]
  78. Le Maire, G.; François, C.; Dufrêne, E. Towards Universal Broad Leaf Chlorophyll Indices Using PROSPECT Simulated Database and Hyperspectral Reflectance Measurements. Remote Sens. Environ. 2004, 89, 1–28. [Google Scholar] [CrossRef]
  79. Yang, W.; Wang, S.; Zhao, X.; Zhang, J.; Feng, J. Greenness Identification Based on HSV Decision Tree. Inf. Process. Agric. 2015, 2, 149–160. [Google Scholar] [CrossRef] [Green Version]
  80. Suh, H.K.; Hofstee, J.W.; van Henten, E.J. Improved Vegetation Segmentation with Ground Shadow Removal Using an HDR Camera. Precis. Agric. 2018, 19, 218–237. [Google Scholar] [CrossRef] [Green Version]
  81. Castillo-Martínez, M.; Gallegos-Funes, F.J.; Carvajal-Gámez, B.E.; Urriolagoitia-Sosa, G.; Rosales-Silva, A.J. Color Index Based Thresholding Method for Background and Foreground Segmentation of Plant Images. Comput. Electron. Agric. 2020, 178, 105783. [Google Scholar] [CrossRef]
  82. Bhandari, M.; Ibrahim, A.M.H.; Xue, Q.; Jung, J.; Chang, A.; Rudd, J.C.; Maeda, M.; Rajan, N.; Neely, H.; Landivar, J. Assessing Winter Wheat Foliage Disease Severity Using Aerial Imagery Acquired from Small Unmanned Aerial Vehicle (UAV). Comput. Electron. Agric. 2020, 176, 105665. [Google Scholar] [CrossRef]
  83. Sojodishijani, O.; Ramli, A.R.; Rostami, V.; Samsudin, K.; Saripan, M.I. Just-in-Time Outdoor Color Discrimination Using Adaptive Similarity-Based Classifier. IEICE Electron. Express 2010, 7, 339–345. [Google Scholar] [CrossRef] [Green Version]
  84. Teixidó, M.; Font, D.; Pallejà, T.; Tresanchez, M.; Nogués, M.; Palacín, J. Definition of Linear Color Models in the RGB Vector Color Space to Detect Red Peaches in Orchard Images Taken under Natural Illumination. Sensors 2012, 12, 7701–7718. [Google Scholar] [CrossRef] [Green Version]
  85. Florczyk, S. Video Based Indoor Exploration with Autonomous and Mobile Robots. J. Intell. Robot. Syst. Theory Appl. 2005, 41, 245–262. [Google Scholar] [CrossRef]
  86. Ide, R.; Oguma, H. Use of Digital Cameras for Phenological Observations. Ecol. Inform. 2010, 5, 339–347. [Google Scholar] [CrossRef]
  87. Mesas-Carrascosa, F.J.; Torres-Sánchez, J.; Clavero-Rumbao, I.; García-Ferrer, A.; Peña, J.M.; Borra-Serrano, I.; López-Granados, F. Assessing Optimal Flight Parameters for Generating Accurate Multispectral Orthomosaicks by UAV to Support Site-Specific Crop Management. Remote Sens. 2015, 7, 12793–12814. [Google Scholar] [CrossRef] [Green Version]
  88. Avtar, R.; Suab, S.A.; Syukur, M.S.; Korom, A.; Umarhadi, D.A.; Yunus, A.P. Assessing the Influence of UAV Altitude on Extracted Biophysical Parameters of Young Oil Palm. Remote Sens. 2020, 12, 3030. [Google Scholar] [CrossRef]
  89. Tian, L.F.; Slaughter, D.C. Environmentally Adaptive Segmentation Algorithm for Outdoor Image Segmentation. Comput. Electron. Agric. 1998, 21, 153–168. [Google Scholar] [CrossRef]
  90. Hague, T.; Tillett, N.D.; Wheeler, H. Automated Crop and Weed Monitoring in Widely Spaced Cereals. Precis. Agric. 2006, 7, 21–32. [Google Scholar] [CrossRef]
  91. Palus, H. Representations of Colour Images in Different Colour Spaces. In The Colour Image Processing Handbook; Springer: Boston, MA, USA, 1998; pp. 67–90. [Google Scholar] [CrossRef]
  92. Hamuda, E.; Mc Ginley, B.; Glavin, M.; Jones, E. Automatic Crop Detection under Field Conditions Using the HSV Colour Space and Morphological Operations. Comput. Electron. Agric. 2017, 133, 97–107. [Google Scholar] [CrossRef]
  93. Rasmussen, J.; Ntakos, G.; Nielsen, J.; Svensgaard, J.; Poulsen, R.N.; Christensen, S. Are Vegetation Indices Derived from Consumer-Grade Cameras Mounted on UAVs Sufficiently Reliable for Assessing Experimental Plots? Eur. J. Agron. 2016, 74, 75–92. [Google Scholar] [CrossRef]
  94. Sumesh, K.C.; Ninsawat, S.; Som-ard, J. Integration of RGB-Based Vegetation Index, Crop Surface Model and Object-Based Image Analysis Approach for Sugarcane Yield Estimation Using Unmanned Aerial Vehicle. Comput. Electron. Agric. 2021, 180, 105903. [Google Scholar] [CrossRef]
  95. Bendig, J.; Yu, K.; Aasen, H.; Bolten, A.; Bennertz, S.; Broscheit, J.; Gnyp, M.L.; Bareth, G. Combining UAV-Based Plant Height from Crop Surface Models, Visible, and near Infrared Vegetation Indices for Biomass Monitoring in Barley. Int. J. Appl. Earth Obs. Geoinf. 2015, 39, 79–87. [Google Scholar] [CrossRef]
  96. Verger, A.; Vigneau, N.; Chéron, C.; Gilliot, J.M.; Comar, A.; Baret, F. Green Area Index from an Unmanned Aerial System over Wheat and Rapeseed Crops. Remote Sens. Environ. 2014, 152, 654–664. [Google Scholar] [CrossRef]
Figure 1. The study areas in Anji County and Lin’an district, northwestern Zhejiang province, China, and distribution of samples: (a) training samples collected during December 2019 to October 2020; (b) independent samples collected on 19 April 2021 (30.475625°N, 119.675964°E); (c) independent samples collected on 20 April 2021 (30.223582°N, 119.800826°E); (d) independent samples collected on 21 May 2021 (30.476623°N, 119.673754°E); (e) a red marker for sample identification; and (f) individual canopy of a sample.
Figure 2. Relationships between normalized RGB values and leaf chlorophyll content index (CCI): (a) changes in normalized RGB values and mean CCI from December 2019 to October 2020; (b) relationship between normalized green band and measured CCI; and (c) relationship between normalized blue band and measured CCI.
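For readers who want to reproduce the band normalization behind Figure 2, a minimal sketch is given below. It assumes the common chromatic-coordinate form (each band divided by R + G + B); the function name and the assumption that the input is a standard H × W × 3 RGB array are illustrative, not taken from the authors' code.

```python
import numpy as np

def normalized_rgb(image):
    """Convert an RGB image (H x W x 3) to chromatic coordinates r, g, b,
    where each band is divided by the brightness sum R + G + B."""
    rgb = image.astype(np.float64)
    total = rgb.sum(axis=2)
    total[total == 0] = 1.0          # avoid division by zero on black pixels
    r = rgb[..., 0] / total
    g = rgb[..., 1] / total
    b = rgb[..., 2] / total
    return r, g, b

# Example: per-sample means of the normalized bands, as plotted in Figure 2a.
image = np.random.randint(0, 256, size=(100, 100, 3), dtype=np.uint8)
r, g, b = normalized_rgb(image)
print(f"mean r = {r.mean():.3f}, mean g = {g.mean():.3f}, mean b = {b.mean():.3f}")
```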
Figure 3. Correlation coefficients between leaf chlorophyll content index (CCI) and variables derived from the UAV RGB images. The full name for each variable can be found in Table S1.
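Figure 3 amounts to a screening step: each candidate image-derived variable is ranked by its correlation with the measured CCI. A minimal sketch of that step is shown below with purely synthetic values; the variable names and data are hypothetical, and only Pearson's r computed with numpy is assumed.

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic CCI measurements and synthetic image-derived candidate variables.
cci = rng.uniform(8.0, 19.0, size=30)
candidates = {
    "MExBR":  0.010 * cci + rng.normal(0.0, 0.02, size=30),
    "1.4H-S": 0.020 * cci + rng.normal(0.0, 0.05, size=30),
    "r_norm": -0.005 * cci + rng.normal(0.0, 0.01, size=30),
}

# Rank each candidate by the absolute Pearson correlation coefficient with CCI.
for name, values in sorted(candidates.items(),
                           key=lambda kv: -abs(np.corrcoef(cci, kv[1])[0, 1])):
    r = np.corrcoef(cci, values)[0, 1]
    print(f"{name:>7s}: r = {r:+.2f}")
```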
Figure 4. Scatter plots of measured CCI and estimated CCI for the training and validation datasets based on the linear regression model driven by (a) MExBR and (b) 1.4H-S.
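The two predictor variables in Figure 4 can be sketched as follows. The paper's modified excess blue minus excess red index (MExBR) is not fully specified here, so the sketch uses the standard definitions ExB = 1.4b - g and ExR = 1.4r - g on chromatic coordinates, and computes 1.4 × H - S with matplotlib's HSV conversion (H and S scaled to [0, 1]); both choices are assumptions rather than the authors' exact formulas.

```python
import numpy as np
from matplotlib.colors import rgb_to_hsv

def exb_minus_exr(image):
    """Excess blue minus excess red on chromatic coordinates, using the
    standard ExB = 1.4b - g and ExR = 1.4r - g; the paper's modified
    index (MExBR) may differ from this plain form."""
    rgb = image.astype(np.float64)
    total = rgb.sum(axis=2)
    total[total == 0] = 1.0
    r, g, b = (rgb[..., i] / total for i in range(3))
    return (1.4 * b - g) - (1.4 * r - g)      # simplifies to 1.4 * (b - r)

def hue_saturation_index(image):
    """1.4 * H - S from the HSV colour space; matplotlib returns H and S
    scaled to [0, 1], whereas the paper's hue range is not stated here."""
    hsv = rgb_to_hsv(image.astype(np.float64) / 255.0)   # assumes 8-bit RGB
    return 1.4 * hsv[..., 0] - hsv[..., 1]

image = np.random.randint(0, 256, size=(64, 64, 3), dtype=np.uint8)
print(exb_minus_exr(image).mean(), hue_saturation_index(image).mean())
```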
Figure 5. Changes in RMSE for the training and test datasets as the training performance changes.
Figure 6. Scatter plots of measured CCI and estimated CCI for the training and validation datasets based on the Erf-BP model.
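Figures 5 and 6 refer to the backpropagation neural network (Erf-BP) model. The exact architecture and transfer function are not reproduced here; the sketch below uses scikit-learn's MLPRegressor with a tanh activation as a stand-in for an erf-type transfer function, a single small hidden layer, and synthetic training data, so it only illustrates the workflow of fitting and evaluating such a model with R2 and relative RMSE.

```python
import numpy as np
from sklearn.metrics import r2_score
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)

# Synthetic stand-in data: one image-derived index per sample and its CCI.
X_train = rng.uniform(-0.2, 0.2, size=(19, 1))
y_train = 10.0 + 30.0 * X_train.ravel() + rng.normal(0.0, 1.0, size=19)
X_val = rng.uniform(-0.2, 0.2, size=(19, 1))
y_val = 10.0 + 30.0 * X_val.ravel() + rng.normal(0.0, 1.0, size=19)

# Small backpropagation network; tanh stands in for an erf-type activation.
bpnn = MLPRegressor(hidden_layer_sizes=(5,), activation="tanh",
                    solver="lbfgs", max_iter=5000, random_state=0)
bpnn.fit(X_train, y_train)

pred = bpnn.predict(X_val)
rmse = float(np.sqrt(np.mean((y_val - pred) ** 2)))
rmse_r = 100.0 * rmse / y_val.mean()          # relative RMSE in percent
print(f"R2 = {r2_score(y_val, pred):.2f}, RMSEr = {rmse_r:.1f}%")
```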
Figure 7. Scatter plots of measured and predicted CCI from the linear regression model driven by the MExBR index derived from the UAV images obtained under two different illumination conditions at a flight height of 120 m on 19 April 2021.
Figure 8. Scatter plots of measured and predicted CCI from the linear regression model driven by the MExBR index derived from the UAV images obtained at different flight heights under an illumination of 377 W/m2 on 20 April 2021.
Figure 9. Scatter plots of measured and predicted CCI from the linear regression model driven by the MExBR index derived from the UAV images obtained at different flight heights under illumination conditions of (a) 369, (b) 450, (c) 894, and (d) 887 W/m2 on 21 May 2021.
Figure 10. Scatter plots of measured and estimated CCI for the independent sample dataset based on (a) the linear regression model driven by MExBR, (b) the linear regression model driven by 1.4H-S, and (c) the Erf-BP model.
Figure 11. The spatial distribution of leaf CCI estimates for each image using (a) the linear regression model driven by MExBR and (b) the Erf-BP model, and (c) comparisons between the percentages of leaf CCI estimates and measured leaf CCI. M is the mean value and N is the number of pixels or samples.
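The CCI maps in Figure 11 are obtained by applying a fitted model to every pixel of an index image. A minimal sketch of that pixel-wise mapping step is given below; the slope, intercept, and random index image are placeholders for illustration, not the coefficients reported in the paper.

```python
import numpy as np

def map_cci(index_image, slope, intercept, mask=None):
    """Apply a fitted linear CCI model pixel by pixel to an index image
    (e.g. MExBR) to obtain a spatial CCI map; non-canopy pixels can be
    masked out so they do not enter the summary statistics."""
    cci = slope * index_image + intercept
    if mask is not None:
        cci = np.where(mask, cci, np.nan)
    return cci

# Placeholder coefficients and a random index image for illustration only.
index_image = np.random.uniform(-0.2, 0.2, size=(200, 200))
cci_map = map_cci(index_image, slope=35.0, intercept=12.0)
print(f"mean CCI (M) = {np.nanmean(cci_map):.2f}, N = {cci_map.size}")
```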
Table 1. Basic description of CCI samples and corresponding flight height and illumination conditions for UAV images.
Date | CCI | Number of Samples | Flight Heights (m) | Image Acquisition Time | Illumination Condition (W/m2) | Location
3 December 2019 | 12.71 ± 1.94 | 6 | 120 | | Clear sky | Anji
8 January 2020 | 9.71 ± 2.04 | 8 | 120 | | Clear sky | Anji
9 May 2020 | 8.65 ± 0.97 | 8 | 120 | | Clear sky | Anji
31 July 2020 | 18.67 ± 0.99 | 8 | 120 | | Clear sky | Anji
15 October 2020 | 17.30 ± 0.61 | 8 | 120 | | Cloudy | Anji
19 April 2021 | 17.53 ± 2.02 | 36 | 120 | 10:16 | 938 | Anji
 | | | 120 | 15:37 | 546 |
20 April 2021 | 9.16 ± 1.57 | 15 | 80, 100, 120, 140 | 09:41 | 377 | Lin’an
21 May 2021 | 16.11 ± 1.82 | 20 | 80 | 09:00 | 369 | Anji
 | | | 100 | 09:11 | 450 |
 | | | 120 | 10:31 | 894 |
 | | | 140 | 13:16 | 887 |