Article

Estimation of Maize Biomass at Multi-Growing Stage Using Stem and Leaf Separation Strategies with 3D Radiative Transfer Model and CNN Transfer Learning

1 School of Information and Electrical Engineering, Shenyang Agricultural University, Shenyang 110866, China
2 Key Laboratory of Quantitative Remote Sensing in Agriculture of Ministry of Agriculture and Rural Affairs, Information Technology Research Center, Beijing Academy of Agriculture and Forestry Sciences, Beijing 100097, China
3 School of Information Science and Technology, Beijing Forestry University, Beijing 100083, China
4 College of Geological Engineering and Geomatics, Chang’an University, Xi’an 710064, China
5 School of Surveying and Mapping Land Information Engineering, Henan Polytechnic University, Jiaozuo 454000, China
* Author to whom correspondence should be addressed.
Remote Sens. 2024, 16(16), 3000; https://doi.org/10.3390/rs16163000
Submission received: 19 June 2024 / Revised: 12 August 2024 / Accepted: 14 August 2024 / Published: 15 August 2024

Abstract

The precise estimation of above-ground biomass (AGB) is imperative for the advancement of breeding programs. Optical variables, such as vegetation indices (VI), have been extensively employed in monitoring AGB. However, the limited robustness of inversion models remains a significant impediment to the widespread application of UAV-based multispectral remote sensing in AGB inversion. In this study, a novel stem–leaf separation strategy for AGB estimation is delineated. Convolutional neural network (CNN) and transfer learning (TL) methodologies are integrated to estimate leaf biomass (LGB) across multiple growth stages, followed by the development of an allometric growth model for estimating stem biomass (SGB). To enhance the precision of LGB inversion, the large-scale remote sensing data and image simulation framework over heterogeneous scenes (LESS) model, which is a three-dimensional (3D) radiative transfer model (RTM), was utilized to simulate a more extensive canopy spectral dataset, characterized by a broad distribution of canopy spectra. The CNN model was pre-trained on this dataset to gain prior knowledge, which was then transferred to a model re-trained with a subset of field-observed samples. Finally, the allometric growth model was utilized to estimate SGB across various growth stages. To further validate the generalizability, transferability, and predictive capability of the proposed method, field samples from 2022 and 2023 were employed as target tasks. The results demonstrated that the 3D RTM + CNN + TL method performed best in LGB estimation, achieving an R² of 0.73 and an RMSE of 72.5 g/m² for the 2022 dataset, and an R² of 0.84 and an RMSE of 56.4 g/m² for the 2023 dataset. In contrast, the PROSAIL method yielded an R² of 0.45 and an RMSE of 134.55 g/m² for the 2022 dataset, and an R² of 0.74 and an RMSE of 61.84 g/m² for the 2023 dataset. The accuracy of LGB inversion was poor when using only field-measured samples to train a CNN model without simulated data, with R² values of 0.30 and 0.74. Overall, learning prior knowledge from the simulated dataset and transferring it to a new model significantly enhanced LGB estimation accuracy and model generalization. Additionally, the allometric growth model estimated SGB with an R² of 0.87 and an RMSE of 120.87 g/m² for the 2022 dataset, and an R² of 0.74 and an RMSE of 86.87 g/m² for the 2023 dataset, a satisfactory result. Separate estimation of LGB and SGB based on the stem–leaf separation strategy yielded promising results. This method can be extended to the monitoring and inversion of other critical variables.

1. Introduction

Maize is the main food crop in China, which accounts for 23% of global annual production [1,2]. Monitoring the growth status of maize is crucial for food production security [3,4]. Above-ground biomass (AGB) is a key parameter in yield estimation models [5] and serves as an important ecological indicator in ecosystems, used to assess the efficiency of crops in utilizing light and storing carbon [6]. Efficient and accurate monitoring of maize AGB provides essential decision-making information for breeding and enhances crop yield.
Traditional biomass measurement methods rely primarily on destructive laboratory measurements, which are inefficient and time-consuming [7,8], and are therefore not suitable for application in the breeding field. Recently, UAV remote sensing has been widely used to estimate AGB [9,10]. For example, the studies in [11] and [12] estimated AGB for barley and potato using UAV imagery. However, the accuracy of existing methods is insufficient for the breeding field. In the early crop growth stages, leaf biomass constitutes the main proportion of AGB; in the mid-to-late growth stages, stem biomass becomes the predominant component [13]. Therefore, to achieve high-accuracy AGB estimation, it is necessary to adopt a stem and leaf separation strategy for maize AGB estimation.
Most current research has focused on estimating AGB using UAV data, verifying the potential of UAVs to estimate crop traits [12,14,15]. Many scholars estimate biomass by screening optimal bands or VIs; these methods can be divided into statistical regression methods [16,17] and machine learning methods [18,19]. The statistical regression method aims to establish a relationship between the features extracted from UAV images and phenotypic traits. In contrast, machine learning models are well suited to handling collinearity among independent variables. However, these models often lack generalizability to other regions and years due to their high dependence on field-measured data, which are influenced by meteorological conditions [20].
Deep learning exhibits superior generalization capabilities compared to traditional machine learning. CNNs, a class of feedforward neural networks, are widely used in image recognition and regression prediction [21]. The basic structure of a CNN includes the input layer, convolution layer, pooling layer, and fully connected layer. Regression prediction with CNN models necessitates a substantial number of samples, which can be prohibitively expensive to acquire under real-world conditions. RTMs can simulate canopy reflectance based on biophysical variables determined by the geometric, physiological, and biochemical characteristics of a crop [22,23]. Simulated data account for various observation conditions and combinations of leaf optical properties. Incorporating prior knowledge from extensive training datasets can mitigate uncertainties in the model. However, despite the utility of simulated data, model robustness remains a significant bottleneck. When observational conditions vary, most models require re-training from scratch with newly collected data, a process that is both time-consuming and labor-intensive. Therefore, TL emerges as a universal and efficacious modeling technique that reduces the need for extensive field-training samples. By transferring the weights from a large pre-trained model and fine-tuning it with a smaller dataset, TL enhances model robustness, accelerates the modeling process, and improves overall performance [24,25].
Currently developed RTMs mainly include 1D and 3D models. While 1D RTMs have performed well in estimating LAI and chlorophyll [26], their accuracy depends on the model’s ability to realistically simulate canopy reflectance under a fixed canopy structure and optical properties [27]. The 1D RTM assumes that leaves are randomly distributed in a horizontally homogeneous canopy, making it difficult to describe the spectral response of structural changes [28]. To address this limitation, 3D RTMs such as LESS [29] and DART [30] have been developed to accurately simulate remote sensing signals and provide datasets for retrieving physiological and biochemical parameters. Although RTMs can estimate physiological and biochemical variables, few studies have focused on maize LGB estimation. Given that leaves are the main contributors to canopy reflectance, using spectra to estimate leaf biomass is reasonable. However, detecting changes in stem biomass through canopy reflectance alone is challenging. In this study, an obvious allometric growth relationship was observed between the leaf and stem biomass of maize. Estimating stem ground biomass (SGB) via the allometric growth model represents a novel approach.
This study took maize as the research object and aimed to propose a stem–leaf separation strategy for AGB estimation based on UAV multispectral data. This research simulated UAV multispectral data using the LESS model and gained knowledge from the simulated dataset to apply it to a new CNN model for LGB estimation. An allometric model was utilized to estimate SGB. The main objectives of this study are (1) to verify the accuracy of the simulated dataset and select the appropriate VIs for CNN training; (2) to verify the performance of 3D RTM, CNN, and TL models for estimating LGB compared to 1D RTM; and (3) to evaluate the accuracy of stem biomass estimation using the stem and leaf allometric growth model.

2. Materials and Methods

2.1. Experiment Site and Design

This study conducted three experiments at the National Precision Agriculture Research Center (40.17° N, 116.43° E), Beijing, China, across different years. The first maize data-collection campaign was conducted from July to September 2021, the second from June to August 2022, and the third from June to August 2023, with field trials in both 2022 and 2023 (Figure 1). The experimental sites are characterized by a warm temperate continental monsoon climate with abundant light and heat. The annual average temperature is about 13 °C, and the annual average precipitation is about 508 mm, mostly concentrated from June to August.
Five cultivars were planted in each of the experiments conducted in 2021 (Exp. 21), 2022 (Exp. 22), and 2023 (Exp. 23). The sowing dates for the maize were June 11, 2021, May 20, 2022, and May 30, 2023, respectively. Exp. 21 and Exp. 22 included density treatments, with plant densities of 90,000 plants/ha, 67,500 plants/ha, 60,000 plants/ha, and 33,000 plants/ha for Exp. 21, and 90,000 plants/ha, 67,500 plants/ha, 60,000 plants/ha, and 45,000 plants/ha for Exp. 22. Exp. 23 involved different nitrogen stress-gradient treatments: 0 kg/ha, 270 kg/ha, 540 kg/ha, and 810 kg/ha. Exp. 21 had 80 plots, each measuring 3.6 m in width and 2.5 m in length. Both the 2022 and 2023 field experiments had 60 plots, with plots measuring 3.6 m in width and 3.6 m in length. The maize inbred lines used in this study represented various genotypes. Fertilization and irrigation were carried out according to local practices.

2.2. Data Acquisition

2.2.1. Field Measurements

In this study, maize cultivars with different genotypes were used, resulting in variations in the growth process among cultivars. The maize growth stages were primarily defined based on Zhengdan 958. Maize biomass was determined by separating the leaves and stem. During each campaign, two maize plants were randomly selected from each plot, collected, and taken to the laboratory. Each plant was divided into green leaves and stem, which were then bagged, heated in an oven at 105 °C, and dried at 85 °C until reaching a constant weight. These data were collected during each key growth period (Table 1). Leaf ground biomass (LGB) and stem ground biomass (SGB) in each plot were calculated using Equations (1) and (2). Leaf length and maximum leaf width were measured to construct the 3D models, and the leaf area index (LAI) was calculated using Equation (3).
$$\mathrm{LGB} = \frac{DW_1 \times N}{\mathrm{Area}} \times 1000 \quad (1)$$
$$\mathrm{SGB} = \frac{DW_2 \times N}{\mathrm{Area}} \times 1000 \quad (2)$$
where $DW_1$ is the leaf dry matter content, g; $DW_2$ is the stem dry matter content, g; $N$ is the number of maize plants in each plot; and $\mathrm{Area}$ is the covering area of each plot.
$$\mathrm{LAI} = \left( \sum_{i=1}^{n} a_i \times b_i \times 0.75 \right) \times \frac{N}{\mathrm{Area}} \quad (3)$$
where $a_i$ is the length (m) and $b_i$ is the maximum width (m) of the $i$-th leaf; $n$ is the number of leaves on one plant; $N$ is the number of plants in each plot; and $\mathrm{Area}$ is the area of each plot.
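For concreteness, the following minimal Python helpers implement Equations (1)–(3) exactly as written above; the variable names mirror the text, and the ×1000 factor and 0.75 leaf-area coefficient are taken verbatim from the equations, with no further unit assumptions added.

```python
# Minimal helpers for Equations (1)-(3); inputs mirror the definitions above:
# dw1, dw2 are leaf/stem dry matter content (g), n is the number of plants in
# the plot, area is the plot area, and leaf_dims holds (length, max_width)
# pairs in metres for every leaf of one plant.

def lgb(dw1, n, area):
    """Leaf ground biomass, Equation (1)."""
    return dw1 * n / area * 1000

def sgb(dw2, n, area):
    """Stem ground biomass, Equation (2)."""
    return dw2 * n / area * 1000

def lai(leaf_dims, n, area):
    """Leaf area index, Equation (3); 0.75 is the leaf-area coefficient."""
    return sum(a * b * 0.75 for a, b in leaf_dims) * n / area
```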
We acquired 2D maize images using an RGB camera and extracted maize leaf angles using ImageJ 1.53a (https://imagej.nih.gov/ij/, accessed on 15 April 2023), an open-source image processing package whose angle tool determines an angle from any three points. The measurement method follows that of [31]. The measurement positions are shown in Figure 2.

2.2.2. Multispectral Data Collection and Processing

The DJI Phantom 4 Multispectral, equipped with an integrated multispectral imaging system consisting of six CMOS sensors and five bands (blue, green, red, near-infrared, and red-edge), was employed to acquire data. To avoid shadow interference from direct sunlight, data acquisition was performed between 10:00 and 14:00 on the specified dates (Table 1). Before the drone took off, a radiometric calibration panel (MicaSense, Seattle, WA, USA) was placed horizontally on the ground to calibrate the multispectral data. The DJI GS Pro platform was used for precise location data acquisition and flight route planning. During the drone flights, the forward and lateral overlaps were set to 80%, with a flight altitude of 30 m and a speed of 2.1 m/s in Exp. 21, Exp. 22, and Exp. 23. Afterwards, the raw images were processed in DJI Terra to generate orthoimages of the calibrated reflectance for the study area. Finally, ENVI 5.6 software was used to delineate the minimum boundary rectangle of each region of interest, and the average reflectance of each plot (600 plots in total) was calculated for subsequent spectral analysis.

2.3. 3D RTM and 1D RTM Canopy Reflectance Simulations

In this study, we generated two distinct datasets for LGB estimation using RTMs, specifically the LESS and PROSAIL models. First, we constructed 60 3D scenes covering five varieties, four planting densities, and three growth stages. The structural differences among the five varieties were obtained from measured data, yielding 3D scenes with different combinations of structural parameters for canopy spectrum simulation. The 3D scenes were constructed following our previously published method [32]. The parameter ranges of the constructed 3D scenes are shown in Table 2. The total leaf area of each 3D canopy scene was obtained, and trial investigations revealed a high correlation between LGB and LAI, allowing LGB to be calculated for each 3D scene. The LESS model can accurately simulate canopy reflectance for any soil reflectance and leaf properties (mainly related to leaf biochemical contents) given a canopy structure and observational configuration. The PROSPECT-D model, a leaf optical properties model, was used within LESS to generate a large number of simulated spectra by varying its parameters. The names and ranges of the PROSPECT-D variables are shown in Table 3. The simulated data include five bands in the visible, red-edge, and near-infrared regions with central wavelengths of 450 nm, 560 nm, 650 nm, 730 nm, and 840 nm. The view zenith and azimuth angles were set to 0°; the sun zenith angle was set to 40°, 50°, and 60°, and the sun azimuth angle to 180° and 200° (Table 3). In total, we generated 52,565 canopy spectra, referred to as the LR dataset.
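Since the exact sampling scheme used to draw the 52,565 parameter combinations is not detailed here, the sketch below illustrates one plausible way to sample leaf properties and sun geometry within the Table 3 ranges; the uniform random draw and the run_less() call are assumptions for illustration, as LESS is driven through its own interface.

```python
# Illustrative sampling of PROSPECT-D leaf properties and sun geometry within
# the Table 3 ranges; uniform sampling is an assumption, and run_less() is a
# placeholder for invoking a LESS simulation on one 3D scene.
import random

def draw_leaf_and_sun():
    return {
        "Cab": random.uniform(20, 90),         # chlorophyll, ug/cm2
        "Car": random.uniform(4, 18),          # carotenoids, ug/cm2
        "Cm":  random.uniform(0.0025, 0.009),  # dry matter, g/cm2
        "Cw":  random.uniform(0.015, 0.027),   # equivalent water thickness, cm
        "N":   1.5,                            # leaf structure index (fixed)
        "SZA": random.choice([40, 50, 60]),    # sun zenith, degrees
        "SAA": random.choice([180, 200]),      # sun azimuth, degrees
        "VZA": 0, "VAA": 0,                    # nadir view
    }

samples = [draw_leaf_and_sun() for _ in range(1000)]
# for params in samples:
#     spectrum = run_less(scene, params)  # placeholder for the LESS call
```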
PROSAIL simplifies the canopy to a turbid medium, i.e., homogeneous and infinitely extended, and therefore cannot represent organ components such as tassels and stems [27]. In this study, a total of 28,800 spectra were simulated with the PROSAIL model over the wavelength range of 400 to 2500 nm, from which we selected five wavelengths for constructing the dataset: 450 nm, 560 nm, 650 nm, 730 nm, and 840 nm. The input parameters for the PROSAIL model are shown in Table 4.

2.4. CNN Architecture and TL

This study utilized a CNN architecture incorporating deep and shallow features (Figure 3). The network fuses multilayer features, learning complex, non-linear relationships within the data to enhance model inversion performance. The 1D CNN architecture includes input, convolution, dropout, fully connected, and output layers. Each of the four convolution blocks consists of a convolution layer, batch normalization, a rectified linear unit (ReLU), and average pooling. The initial weights of the model were set to zero and then updated using the Adam gradient-descent optimizer. The names and functions of all datasets are shown in Table 5.
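A minimal PyTorch sketch of such a fused shallow-and-deep 1D CNN is given below; the channel counts, kernel size, dropout rate, and fusion by global average pooling are illustrative assumptions, as only the block composition and the 12-VI input are specified here.

```python
# Sketch of a 1D CNN with shallow/deep feature fusion; channel counts, kernel
# size, dropout rate, and the fusion scheme are assumptions, not the paper's
# exact configuration.
import torch
import torch.nn as nn

def conv_block(c_in, c_out):
    # convolution -> batch normalization -> ReLU -> average pooling, as above
    return nn.Sequential(
        nn.Conv1d(c_in, c_out, kernel_size=3, padding=1),
        nn.BatchNorm1d(c_out),
        nn.ReLU(),
        nn.AvgPool1d(kernel_size=2, ceil_mode=True),
    )

class LGBNet(nn.Module):
    def __init__(self, n_features=12):
        super().__init__()
        self.cov1 = conv_block(1, 16)
        self.cov2 = conv_block(16, 32)
        self.cov3 = conv_block(32, 64)
        self.cov4 = conv_block(64, 64)
        self.dropout = nn.Dropout(0.3)
        self.fc = nn.Linear(16 + 32 + 64 + 64, 1)  # fused shallow + deep features

    def forward(self, x):           # x: (batch, 12) vector of vegetation indices
        x = x.unsqueeze(1)          # treat the 12 VIs as a 1-channel 1D signal
        f1 = self.cov1(x)
        f2 = self.cov2(f1)
        f3 = self.cov3(f2)
        f4 = self.cov4(f3)
        # fuse shallow (f1, f2) and deep (f3, f4) features by global average pooling
        fused = torch.cat([f.mean(dim=2) for f in (f1, f2, f3, f4)], dim=1)
        return self.fc(self.dropout(fused)).squeeze(1)

model = LGBNet()
print(model(torch.randn(4, 12)).shape)  # torch.Size([4])
```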
TL, a widely adopted technique for deep neural networks, has found extensive application in various remote sensing domains [33]. It facilitates the rapid application of previously acquired knowledge to novel problems: knowledge learned from a source domain and task serves as a starting point and is reapplied to the predictive function of a target task. This cross-domain capability allows different yet related tasks to serve as source and target. In this study, TL was used to leverage knowledge from simulated datasets, reducing the need for field-observed training data. A simulated dataset (LR dataset, n = 52,565) generated from the 3D RTM was used to pre-train the model, while measured data from 2021 (MR dataset, n = 240) were employed to re-train the CNN model. The prior knowledge obtained from the simulated dataset was subsequently utilized to estimate LGB for 2022 and 2023. The pre-trained and re-trained models were trained for 500 and 300 epochs, respectively, both with a learning rate of 0.001 and with batch sizes of 32 and 16, respectively. The analyses were conducted using Python version 3.7 and Torch version 1.7.1.
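The transfer step can be sketched as follows, reusing the LGBNet class above: load the weights pre-trained on the LR dataset, freeze the early convolution blocks (the final three layers are re-trained, as described in Section 2.7), and fine-tune on the 240 measured samples with the stated settings; the checkpoint file name and the placeholder tensors are assumptions.

```python
# Sketch of the transfer-learning step; the checkpoint path, the exact set of
# frozen blocks, and the random placeholder data are assumptions.
import torch
import torch.nn as nn
from torch.utils.data import DataLoader, TensorDataset

model = LGBNet(n_features=12)
model.load_state_dict(torch.load("cnn_pretrained_lr.pt"))  # weights learned on the LR dataset

for name, param in model.named_parameters():
    if name.startswith(("cov1", "cov2", "cov3")):  # freeze early blocks; re-train the rest
        param.requires_grad = False

optimizer = torch.optim.Adam(
    (p for p in model.parameters() if p.requires_grad), lr=0.001)
loss_fn = nn.MSELoss()

# Placeholders standing in for the MR dataset (n = 240): 12 VIs -> LGB (g/m2)
mr_x, mr_y = torch.randn(240, 12), torch.randn(240)
loader = DataLoader(TensorDataset(mr_x, mr_y), batch_size=16, shuffle=True)

for epoch in range(300):  # re-training schedule stated above
    for xb, yb in loader:
        optimizer.zero_grad()
        loss = loss_fn(model(xb), yb)
        loss.backward()
        optimizer.step()
```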
The experiments were conducted on a system with an Intel Core i7-9700K CPU @ 3.60 GHz, 64 GB of RAM, and an NVIDIA RTX A4000 GPU. The pre-trained model took approximately 1 h to train on the simulated dataset, while re-training on the measured dataset took around 2 min. This configuration ensured efficient processing and facilitated the application of deep learning techniques for accurate LGB estimation.

2.5. Validate the Availability of the Simulated Dataset

The objective of this study was to pre-train a CNN model using a simulated dataset, followed by fine-tuning a new CNN model with the MR dataset. Acquiring high-quality prior knowledge from the pre-trained model is crucial for accurate LGB estimation, necessitating the verification of the accuracy and stability of the simulated dataset. To achieve this, we employed the following strategies to validate the accuracy and stability of the simulated dataset. A well-performing simulated dataset should encompass a broad range of conditions, including spectra from real-world measurements. When generating simulated canopy spectra with the LESS model, the three-dimensional scenarios included maize at three different growth stages. First, we validated whether the simulated spectra closely matched the spectra obtained from field measurements. Subsequently, we compared the value distributions of VIs used to train the CNN model within the simulated dataset to those of the measured VIs. This comparison is essential to ensure that the simulated dataset has a sufficiently broad distribution, allowing for the acquisition of prior knowledge applicable to transfer learning.
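One concrete form of this check is sketched below: overlay kernel-density estimates of each VI in the simulated and measured datasets and test whether the simulated range covers the measured one. The array layout and the KDE-based comparison are assumptions, and the synthetic arrays merely stand in for one VI column.

```python
# Sketch of the VI distribution check: overlay densities and test range coverage.
import numpy as np
from scipy.stats import gaussian_kde
import matplotlib.pyplot as plt

def compare_vi(sim, meas, name):
    xs = np.linspace(min(sim.min(), meas.min()), max(sim.max(), meas.max()), 200)
    plt.plot(xs, gaussian_kde(sim)(xs), label="simulated (LESS)")
    plt.plot(xs, gaussian_kde(meas)(xs), label="measured (UAV)")
    covered = sim.min() <= meas.min() and sim.max() >= meas.max()
    plt.title(f"{name} - simulated range covers measured: {covered}")
    plt.xlabel(name); plt.ylabel("density"); plt.legend(); plt.show()

# toy example with synthetic data standing in for one VI column
rng = np.random.default_rng(0)
compare_vi(rng.normal(0.6, 0.2, 5000), rng.normal(0.6, 0.1, 300), "NDVI")
```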

2.6. Stem and Leaf Separation Strategy for Maize Biomass Estimation

2.6.1. Leaf Biomass Estimation Method

This study employed a CNN combined with the TL methodology to estimate leaf biomass. VIs were utilized as feature variables, with those exhibiting a high correlation with LGB in the simulated dataset selected for training the CNN model (Table 6). To identify the most suitable VIs for LGB estimation, we ranked 20 VIs by their correlation coefficients and then sequentially increased the number of features, observing changes in model accuracy until it stabilized. During the training of the CNN model, 80% of the LR dataset was designated as the training set, while the remaining 20% was allocated for validation. For the re-training of the CNN model, 70% of the MR dataset served as the training set, with 20% reserved for testing. This re-trained model was then employed to estimate leaf biomass for 2022 and 2023.
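The ranking-and-growing logic of this selection procedure can be sketched as follows; an ordinary-least-squares scorer stands in for the full CNN training loop, and the synthetic data are placeholders, so the sketch illustrates only the procedure, not the reported accuracies.

```python
# Sketch of the forward VI-selection procedure: rank all candidate VIs by
# |Pearson r| with LGB, then grow the feature set one VI at a time and record
# the test accuracy until it stabilizes.
import numpy as np

def score_subset(x, y):
    # stand-in scorer (OLS fit) for the CNN training described above
    X = np.hstack([x, np.ones((len(x), 1))])
    coef, *_ = np.linalg.lstsq(X, y, rcond=None)
    pred = X @ coef
    r2 = 1 - ((y - pred) ** 2).sum() / ((y - y.mean()) ** 2).sum()
    return r2, float(np.sqrt(((y - pred) ** 2).mean()))

rng = np.random.default_rng(0)
vi = rng.random((1000, 20))                       # placeholder: 20 candidate VIs
lgb_y = vi[:, :5] @ rng.random(5) + 0.1 * rng.standard_normal(1000)

order = np.argsort([-abs(np.corrcoef(vi[:, j], lgb_y)[0, 1]) for j in range(20)])
for k in range(1, 21):
    r2, rmse = score_subset(vi[:, order[:k]], lgb_y)
    print(k, round(r2, 3), round(rmse, 3))        # pick k where accuracy stabilizes
```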

2.6.2. Stem Biomass Estimation Method

The existing literature has demonstrated an allometric growth relationship between leaf biomass and stem biomass at various growth stages. In this study, we analyzed the measured LGB and SGB values at the jointing, trumpet, and tasseling stages, constructed allometric growth models using Equation (4), and then calculated the total biomass using Equation (5).
$$\mathrm{SGB} = a \cdot \mathrm{LGB}^{\,b} \quad (4)$$
$$\mathrm{AGB} = \mathrm{LGB} + a \cdot \mathrm{LGB}^{\,b} \quad (5)$$
where AGB is maize above-ground biomass; LGB is leaf biomass; SGB is stem biomass; and a and b are fitted coefficients.
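Fitting Equation (4) per growth stage and applying Equation (5) can be done with a standard non-linear least-squares routine, as in the short SciPy sketch below; the sample arrays are illustrative placeholders, not measured values.

```python
# Fit the allometric model SGB = a * LGB**b (Equation 4) for one growth stage
# and combine with estimated LGB to obtain AGB (Equation 5). The arrays below
# are illustrative placeholders, not field measurements.
import numpy as np
from scipy.optimize import curve_fit

def allometric(lgb, a, b):
    return a * np.power(lgb, b)

lgb_obs = np.array([60.0, 120.0, 180.0, 260.0, 330.0])  # measured LGB, g/m2
sgb_obs = np.array([40.0, 110.0, 200.0, 340.0, 480.0])  # measured SGB, g/m2

(a, b), _ = curve_fit(allometric, lgb_obs, sgb_obs, p0=(1.0, 1.0))

lgb_est = np.array([150.0, 280.0])              # LGB predicted by the CNN + TL model
agb_est = lgb_est + allometric(lgb_est, a, b)   # Equation (5)
print(a, b, agb_est)
```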

2.7. CNN Ablation Experiments

This study conducted four ablation experiments to validate the efficacy of fusing shallow and deep features in CNNs to enhance LGB estimation performance. Experiments E1 to E4 employed CNN networks with varying numbers of convolutional layers (Table 7). In all experiments, the LR dataset was utilized to pre-train the model to gain prior knowledge. Subsequently, the MR dataset was used to re-train the final three layers of the new CNN model, and the 2023 dataset was employed to evaluate the CNN models’ performance.

3. Results

3.1. Selection of Vegetation Indices

Figure 4 shows the Pearson correlation coefficients between LGB and 20 selected VIs. As depicted, the majority of these indices demonstrated a robust correlation with LGB, with coefficients generally exceeding 0.6. Incorporating these high-correlation VIs into the CNN enhances the model’s capacity to discern relationships between feature variables and target variables. Many of these indices have also been employed for predicting LAI, which was strongly correlated with LGB, thus rendering them suitable for LGB estimation. The top five VIs exhibiting the highest correlations are VIRed, MSAVI, MTVI2, SAVI, and SIPI, all of which include the near-infrared and red bands. Figure 5 depicts the variations in R2 and RMSE of the testing set as each of the twenty VIs was sequentially inputted into the pre-trained model. The results revealed that the model achieved optimal accuracy and stability when the feature count was twelve. Notably, while the R2 with twelve VIs was not the highest, this configuration was the first to reach a relatively stable accuracy. Moreover, the RMSE of the model using twelve VIs was lower than that of the model using thirteen VIs, and twelve VIs also attained the earliest stable RMSE. Consequently, the selected VIs included MSR, DVI, MTVI2, Inre, EVI, SAVI, MSAVI, SIPI, NDVI, VIRed, MCARI, and CI2. It is important to note that some of these indices, such as EVI, exhibit resistance to saturation, whereas SAVI and MSAVI mitigate soil influence on LGB estimation. The inclusion of the near-infrared band in many of these VIs, which is sensitive to structural changes, further supports their suitability for monitoring LGB, as these structural changes are indicative of crop dry matter accumulation.

3.2. Validation of 3D RTM Datasets

In this study, we validated the stability of the simulated data by comparing the value distributions of the 12 selected VIs between the simulated and measured datasets. Figure 6 shows the value range and density of the 12 VIs in both datasets. Overall, the VI ranges of the simulated data were greater than or equal to those of the measured data. The simulated data exhibited a normal distribution, whereas most VIs from the measured dataset displayed bimodal peaks. This can be attributed to the idealized nature of the simulated data, as the canopy reflectance simulated by the LESS model was unaffected by environmental factors. Notably, the ranges of VIRed, MCARI, and CI2 for the simulated data were significantly greater than those for the measured data (Figure 6c,j,k). A broader dataset allowed the model to learn more comprehensive knowledge, thereby better supporting model transfer. The simulated and measured datasets demonstrated good consistency for the other VIs, as their value ranges were similar. These results confirmed that the simulated dataset was suitable for training the CNN model.
To substantiate the accuracy of the simulated dataset, we conducted a detailed comparison of reflectance across five spectral bands between the simulated and field-measured data, as depicted in Figure 7. This comparative analysis revealed a robust concordance between the simulated and field-measured data during the maize jointing, trumpet, and tasseling stages. Numerically, the reflectance values across these bands were nearly indistinguishable, with exceptionally narrow error margins. Among the bands, the near-infrared band exhibited the highest degree of consistency, while the values in the visible light bands were also closely aligned. The most pronounced divergence is noted in the red-edge band, attributable to the idealized conditions under which the simulated data were generated, as opposed to the field-measured data, which were subject to significant environmental variability and atmospheric perturbations. Results from Section 3.1 further corroborated that the selected VIs predominantly involved the near-infrared and red bands, and the data presented in Figure 7 affirmed the accuracy of the simulated dataset. Consequently, this dataset not only encapsulated the spectra of the field-measured data but also encompassed additional spectra not observable in real-world scenarios. Thus, the extensively representative simulated dataset proves to be exceptionally well-suited for training CNN models and for the acquisition of pre-trained model knowledge.

3.3. 3D RTM Simulated Dataset Improves Estimation of LGB as Compared to 1D RTM

To substantiate the efficacy of our proposed method, we conducted a comparative analysis of datasets simulated by the 3D RTM and PROSAIL models. Figure 8a–f illustrate the outcomes of LGB estimation utilizing datasets from the LESS and PROSAIL models, respectively. Accounting for 3D structure yielded more precise LGB estimations for maize than relying on simplified, uniform canopy assumptions. Specifically, Figure 8a,d depict the model’s performance on the 2021 testing dataset during fine-tuning. Here, the 3D RTM approach enhanced the R2 by 0.12 and reduced the RMSE by 14.99 g/m2 for the testing set, while also reducing the RMSE by 62.05 g/m2 and 5.44 g/m2 in the 2022 and 2023 datasets, respectively. These improvements underscored the superior accuracy of the model trained with 3D RTM-simulated datasets when validated against 2022 and 2023 samples. Conversely, the PROSAIL model exhibited substantial underestimation of LGB in 2022, as illustrated in Figure 8e. Both methods, however, were subject to saturation effects, characterized by underestimation at high values and overestimation at low values—an issue inherent to remote sensing inversion due to canopy closure during later growth stages. Figure 9 depicts the loss function variation during CNN re-training, where the loss consistently decreases and stabilizes with an increasing number of training epochs. To mitigate overfitting, we integrated regularization and dropout layers into the CNN architecture. In conclusion, the results affirmed that the model utilizing the LESS-simulated dataset not only achieved superior estimation performance but also demonstrated enhanced transferability.

3.4. Estimation of Above-Ground Biomass Based on Stem–Leaf Allometric Growth Relationships

In this study, we utilized measured data from 2021 to develop allometric growth models between maize stem and leaf biomass. The allometric growth equations between the maize stem and leaf biomass at key growth stages are shown in Table 8. Stem biomass increased with leaf biomass following a power function whose parameters evolved progressively across growth stages. Figure 10 shows the results of SGB estimation using the allometric growth model, revealing an R2 of 0.87 and an RMSE of 120.87 g/m2 for the 2022 data, and an R2 of 0.74 and an RMSE of 86.87 g/m2 for 2023. Despite relatively high R2 values, significant underestimation occurred in the later growth stages. Particularly for the 2023 SGB estimates, saturation occurred during the maize tasseling stage, leading to estimation errors. This saturation in SGB estimation resulted from the saturation of LGB estimation at this growth stage.

3.5. CNN Ablation Experiment

Table 7 lists the four ablation experiments conducted to estimate LGB. Each experiment used the LR dataset for initial training and then re-trained a new CNN model using the MR dataset. To verify the effectiveness of the different CNN architectures, we estimated the LGB for 2023. The results depicted in Figure 11 indicate that experiment E4 (our method) demonstrated the highest estimation performance. Ranked from best to worst by RMSE, the ablation experiments were E4 (R2 = 0.84, RMSE = 56.4 g/m2), E3 (R2 = 0.81, RMSE = 58.69 g/m2), E2 (R2 = 0.85, RMSE = 60.9 g/m2), and E1 (R2 = 0.84, RMSE = 64.98 g/m2). These findings underscored that integrating features from multiple convolutional layers significantly enhances LGB estimation accuracy. Given that our study employed 12 feature variables, convolution cannot proceed once the post-convolution feature length is reduced to one, indicating that four convolutional layers are optimal. This study successfully amalgamated features extracted from both deep and shallow networks, substantially improving the model’s performance.

4. Discussion

4.1. Comparison with Other LGB Estimation Models

To evaluate the accuracy of our proposed methodology, we contrasted it with two alternative approaches: a CNN model without the TL technique and the partial least squares regression (PLSR) method. Figure 12a–c illustrate the results where models trained exclusively on 2021 measured data were used to predict LGB for 2022 and 2023. These models achieved R2 values of 0.30 for 2022 and 0.74 for 2023. This disparity highlighted the challenge of applying single-year data models across different years, given the limited scope of annual data and the variations in precipitation, fertilization, and temperature that impact model generalization. In contrast, the PLSR method, as shown in Figure 12d–f, demonstrated superior performance compared to the CNN approach, achieving R2 values of 0.65 for 2022 and 0.77 for 2023. This result underscored the benefit of integrating RTM-simulated data into model training. The LR dataset, characterized by its extensive range of observational conditions and canopy reflectance variations, enriched both machine learning and deep learning methodologies, thus improving estimation accuracy. Existing research has shown that increasing the sample size can ensure the robustness of machine learning algorithms during training [49], which is consistent with our results (Figure 8 and Figure 12). Nevertheless, both methods displayed a tendency to underestimate LGB at higher values (>350 g/m2). Overall, our 3D RTM + CNN + TL approach outperformed the others, attaining R2 values of 0.73 and 0.84 and RMSE values of 72.5 g/m2 and 56.4 g/m2 (Figure 8), by leveraging transfer learning to ensure robust cross-year transferability in LGB estimation.

4.2. Advantage of 3D RTM and TL and Allometric Growth Model

The quantitative results presented in Section 3.3 demonstrate that the integration of 3D RTM, CNN, and TL enhances the accuracy of maize LGB estimation across three developmental stages. This improvement can be elucidated through the following considerations: First, datasets generated via RTMs exhibit increased robustness and generalizability [50] due to the diverse 3D scenes that include varying heights, leaf areas, leaf numbers, and planting densities. Second, the TL technique enhanced model generalization [51], while 3D RTMs provided a more accurate simulation of canopy reflectance by incorporating both structural and environmental factors [52]. Following pre-training on a dataset simulated with the LESS model, the acquired weights were transferred to a new CNN model for subsequent fine-tuning. The prediction accuracy for leaf biomass in 2022 and 2023 surpassed that of the PROSAIL dataset, with R2 values increasing by 0.28 and 0.10, respectively, and RMSE decreasing by 62.05 g/m2 and 5.44 g/m2. Lastly, SGB was estimated using an allometric growth model that describes the growth dynamics between leaves and stems, with the allometric growth relationship showing a progressive increase across various stages. Utilizing an allometric growth model based exclusively on one year’s data to predict SGB for subsequent years may introduce inaccuracies (Figure 10), primarily due to errors in LGB estimation and inaccuracies inherent to the allometric growth model construction. In summary, the SGB estimation accuracy over the two-year period achieved R2 values no lower than 0.74 and RMSE values no higher than 120.87 g/m2, which was deemed satisfactory, particularly during the jointing stage where accuracy was notably high. Our findings demonstrated that the integration of CNN with the TL technique yielded superior estimates of leaf biomass compared to methods that do not utilize TL. This aligns with the conclusions of [48], who reported that CNNs exhibit enhanced generalization capabilities relative to traditional machine learning techniques and that TL significantly augments estimation precision. These observations resonated with the results of our study, affirming the efficacy of combining CNN and TL for improved biomass estimation.

4.3. Outlook for Future Work and Limitations

The application of the 3D RTM + CNN + TL methodology is subject to several limitations. Primarily, the approach for constructing 3D scenes is specifically tailored to maize and may not be directly applicable to other crops. Additionally, the LESS model, which simulated 52,565 canopy spectra, imposes significant computational demands, necessitating high-performance graphics processing units (GPUs) to expedite CNN model training. Consequently, a balance must be struck between the accuracy of the data and the speed of the model training. This study compared LGB models utilizing both 3D RTM and PROSAIL methodologies, and future work will investigate the potential of 3D RTM for estimating additional traits. The integration of CNN with TL has enhanced model performance, with TL effectively addressing some saturation issues by utilizing a limited dataset for calibration. Furthermore, this study has introduced a CNN and TL framework for leaf biomass estimation, offering a novel method for estimating stem biomass through allometric growth relationships. Subsequent research should encompass multi-year experiments to further validate the methodology’s feasibility and generalizability.

5. Conclusions

Monitoring biomass provides valuable information for yield prediction. In this study, we developed a comprehensive maize biomass estimation method based on a stem–leaf separation strategy. We employed the 3D RTM + CNN + TL approach to estimate leaf biomass and an allometric growth model to estimate stem biomass. Leveraging the simulated dataset from the LESS model to pre-train the CNN model to gain prior knowledge, we then applied this prior knowledge and fine-tuned a new CNN model using measured data (2021 data: n = 240) for LGB estimation. This approach demonstrated that integrating deep and shallow features within the CNN network significantly enhances the accuracy of LGB estimation. Furthermore, our results demonstrated that the 3D RTM + CNN + TL method achieved superior performance in LGB estimation over two years, outperforming the PLSR, PROSAIL + CNN + TL, and CNN methods. Meanwhile, utilizing allometric growth models for different growth stages also achieved acceptable accuracy for SGB estimation over two years. We advocate for the application of this method to estimate other crop traits, such as those of wheat and soybeans. Future research should further exploit the benefits of 3D modeling to investigate mechanisms influencing crop trait remote sensing inversion and to enhance estimation accuracy.

Author Contributions

D.Z.: Methodology, writing—original draft, writing—review and editing; T.X.: Conceptualization, writing—review and editing, supervision; H.Y.: Resources, funding acquisition; G.Y.: Conceptualization, writing—review and editing; F.Y.: Writing—review and editing; C.Z.: Writing—review and editing; A.T.: Writing—review and editing; R.C.: Writing—review and editing; W.Z.: Writing—review and editing; C.Y.: Writing—review and editing. All authors have read and agreed to the published version of the manuscript.

Funding

This research was supported by the National Key Research and Development Program of China (2021YFD2000102), the Liaoning Applied Basic Research Program (Grant No. 2023JH2/101300120), the Natural Science Foundation of China (42371373), and the Special Fund for Construction of Scientific and Technological Innovation Ability of Beijing Academy of Agriculture and Forestry Sciences (KJCX20230434).

Data Availability Statement

Data will be made available on request.

Acknowledgments

We thank the Information Technology Research Center, Beijing Academy of Agriculture and Forestry Sciences for providing the experimental base. We are grateful to the editors and anonymous reviewers for their constructive and helpful comments, which improved the quality of this paper.

Conflicts of Interest

The authors declare that they have no known competing financial interests or personal relationships that could have appeared to influence the work reported in this paper.

References

  1. Long, N.V.; Assefa, Y.; Schwalbert, R.; Ciampitti, I.A. Maize Yield and Planting Date Relationship: A Synthesis-Analysis for US High-Yielding Contest-Winner and Field Research Data. Front. Plant Sci. 2017, 8, 2106. [Google Scholar] [CrossRef]
  2. Zhuang, S.; Wang, P.; Jiang, B.; Li, M. Learned features of leaf phenotype to monitor maize water status in the fields. Comput. Electron. Agric. 2020, 172, 105347. [Google Scholar] [CrossRef]
  3. Hsiao, T.C.; Heng, L.; Steduto, P.; Rojas-Lara, B.; Raes, D.; Fereres, E. AquaCrop—The FAO crop model to simulate yield response to water: III. Parameterization and testing for maize. Agron. J. 2009, 101, 448–459. [Google Scholar] [CrossRef]
  4. Luo, N.; Meng, Q.; Feng, P.; Qu, Z.; Yu, Y.; Liu, L.; Muller, C.; Wang, P. China can be self-sufficient in maize production by 2030 with optimal crop management. Nat. Commun. 2023, 14, 2637. [Google Scholar] [CrossRef] [PubMed]
  5. Jin, X.; Madec, S.; Dutartre, D.; de Solan, B.; Comar, A.; Baret, F. High-throughput measurements of stem characteristics to estimate ear density and above-ground biomass. Plant Phenomics 2019, 2019, 4820305. [Google Scholar] [CrossRef]
  6. Meiyan, S.; Mengyuan, S.; Qizhou, D.; Xiaohong, Y.; Baoguo, L.; Yuntao, M. Estimating the maize above-ground biomass by constructing the tridimensional concept model based on UAV-based digital and multi-spectral images. Field Crops Res. 2022, 282, 108491. [Google Scholar] [CrossRef]
  7. Su, W.; Zhang, M.; Bian, D.; Liu, Z.; Huang, J.; Wang, W.; Wu, J.; Guo, H. Phenotyping of Corn Plants Using Unmanned Aerial Vehicle (UAV) Images. Remote Sens. 2019, 11, 2021. [Google Scholar] [CrossRef]
  8. Tao, H.; Feng, H.; Xu, L.; Miao, M.; Yang, G.; Yang, X.; Fan, L. Estimation of the Yield and Plant Height of Winter Wheat Using UAV-Based Hyperspectral Images. Sensors 2020, 20, 1231. [Google Scholar] [CrossRef]
  9. Wang, C.; Nie, S.; Xi, X.; Luo, S.; Sun, X. Estimating the Biomass of Maize with Hyperspectral and LiDAR Data. Remote Sens. 2017, 9, 11. [Google Scholar] [CrossRef]
  10. Li, W.; Niu, Z.; Wang, C.; Huang, W.; Chen, H.; Gao, S.; Li, D.; Muhammad, S. Combined Use of Airborne LiDAR and Satellite GF-1 Data to Estimate Leaf Area Index, Height, and Aboveground Biomass of Maize During Peak Growing Season. IEEE J. Sel. Top. Appl. Earth Obs. Remote Sens. 2015, 8, 4489–4501. [Google Scholar] [CrossRef]
  11. Bendig, J.; Bolten, A.; Bennertz, S.; Broscheit, J.; Eichfuss, S.; Bareth, G. Estimating Biomass of Barley Using Crop Surface Models (CSMs) Derived from UAV-Based RGB Imaging. Remote Sens. 2014, 6, 10395–10412. [Google Scholar] [CrossRef]
  12. Li, B.; Xu, X.; Zhang, L.; Han, J.; Bian, C.; Li, G.; Liu, J.; Jin, L. Above-ground biomass estimation and yield prediction in potato by using UAV-based RGB and hyperspectral imaging. ISPRS J. Photogramm. Remote Sens. 2020, 162, 161–172. [Google Scholar] [CrossRef]
  13. Yue, J.; Yang, H.; Yang, G.; Fu, Y.; Wang, H.; Zhou, C. Estimating vertically growing crop above-ground biomass based on UAV remote sensing. Comput. Electron. Agric. 2023, 205, 107627. [Google Scholar] [CrossRef]
  14. Yue, J.; Yang, G.; Li, C.; Li, Z.; Wang, Y.; Feng, H.; Xu, B. Estimation of Winter Wheat Above-Ground Biomass Using Unmanned Aerial Vehicle-Based Snapshot Hyperspectral Sensor and Crop Height Improved Models. Remote Sens. 2017, 9, 708. [Google Scholar] [CrossRef]
  15. Liu, Y.; Feng, H.; Yue, J.; Fan, Y.; Bian, M.; Ma, Y.; Jin, X.; Song, X.; Yang, G. Estimating potato above-ground biomass by using integrated unmanned aerial system-based optical, structural, and textural canopy measurements. Comput. Electron. Agric. 2023, 213, 108229. [Google Scholar] [CrossRef]
  16. Maimaitijiang, M.; Sagan, V.; Sidike, P.; Maimaitiyiming, M.; Hartling, S.; Peterson, K.T.; Maw, M.J.W.; Shakoor, N.; Mockler, T.; Fritschi, F.B. Vegetation Index Weighted Canopy Volume Model (CVMVI) for soybean biomass estimation from Unmanned Aerial System-based RGB imagery. ISPRS J. Photogramm. Remote Sens. 2019, 151, 27–41. [Google Scholar] [CrossRef]
  17. Niu, Y.; Zhang, L.; Zhang, H.; Han, W.; Peng, X. Estimating Above-Ground Biomass of Maize Using Features Derived from UAV-Based RGB Imagery. Remote Sens. 2019, 11, 1261. [Google Scholar] [CrossRef]
  18. Han, L.; Yang, G.; Dai, H.; Xu, B.; Yang, H.; Feng, H.; Li, Z.; Yang, X. Modeling maize above-ground biomass based on machine learning approaches using UAV remote-sensing data. Plant Methods 2019, 15, 10. [Google Scholar] [CrossRef] [PubMed]
  19. Zhang, Y.; Xia, C.; Zhang, X.; Cheng, X.; Feng, G.; Wang, Y.; Gao, Q. Estimating the maize biomass by crop height and narrowband vegetation indices derived from UAV-based hyperspectral images. Ecol. Indic. 2021, 129, 107985. [Google Scholar] [CrossRef]
  20. Corti, M.; Cavalli, D.; Cabassi, G.; Marino Gallina, P.; Bechini, L. Does remote and proximal optical sensing successfully estimate maize variables? A review. Eur. J. Agron. 2018, 99, 37–50. [Google Scholar] [CrossRef]
  21. Gu, J.; Wang, Z.; Kuen, J.; Ma, L.; Shahroudy, A.; Shuai, B.; Liu, T.; Wang, X.; Wang, G.; Cai, J.; et al. Recent advances in convolutional neural networks. Pattern Recognit. 2018, 77, 354–377. [Google Scholar] [CrossRef]
  22. Chen, Q.; Zheng, B.; Chenu, K.; Hu, P.; Chapman, S.C. Unsupervised Plot-Scale LAI Phenotyping via UAV-Based Imaging, Modelling, and Machine Learning. Plant Phenomics 2022, 2022, 9768253. [Google Scholar] [CrossRef] [PubMed]
  23. Duan, S.-B.; Li, Z.-L.; Wu, H.; Tang, B.-H.; Ma, L.; Zhao, E.; Li, C. Inversion of the PROSAIL model to estimate leaf area index of maize, potato, and sunflower fields from unmanned aerial vehicle hyperspectral data. Int. J. Appl. Earth Obs. Geoinf. 2014, 26, 12–20. [Google Scholar] [CrossRef]
  24. Zhuang, F.; Qi, Z.; Duan, K.; Xi, D.; Zhu, Y.; Zhu, H.; Xiong, H.; He, Q. A Comprehensive Survey on Transfer Learning. Proc. IEEE 2021, 109, 43–76. [Google Scholar] [CrossRef]
  25. Weiss, K.; Khoshgoftaar, T.M.; Wang, D. A survey of transfer learning. J. Big Data 2016, 3, 9. [Google Scholar] [CrossRef]
  26. Darvishzadeh, R.; Skidmore, A.; Schlerf, M.; Atzberger, C. Inversion of a radiative transfer model for estimating vegetation LAI and chlorophyll in a heterogeneous grassland. Remote Sens. Environ. 2008, 112, 2592–2604. [Google Scholar] [CrossRef]
  27. Jiang, J.; Weiss, M.; Liu, S.; Baret, F. Effective GAI is best estimated from reflectance observations as compared to GAI and LAI: Demonstration for wheat and maize crops based on 3D radiative transfer simulations. Field Crops Res. 2022, 283, 108538. [Google Scholar] [CrossRef]
  28. Jacquemoud, S.; Verhoef, W.; Baret, F.; Bacour, C.; Zarco-Tejada, P.J.; Asner, G.P.; François, C.; Ustin, S.L. PROSPECT+SAIL models: A review of use for vegetation characterization. Remote Sens. Environ. 2009, 113, S56–S66. [Google Scholar] [CrossRef]
  29. Qi, J.; Xie, D.; Yin, T.; Yan, G.; Gastellu-Etchegorry, J.-P.; Li, L.; Zhang, W.; Mu, X.; Norford, L.K. LESS: LargE-Scale remote sensing data and image simulation framework over heterogeneous 3D scenes. Remote Sens. Environ. 2019, 221, 695–706. [Google Scholar] [CrossRef]
  30. Gastellu-Etchegorry, J.P.; Martin, E.; Gascon, F. DART: A 3D model for simulating satellite images and studying surface radiation budget. Int. J. Remote Sens. 2004, 25, 73–96. [Google Scholar] [CrossRef]
  31. Lei, L.; Li, Z.; Wu, J.; Zhang, C.; Zhu, Y.; Chen, R.; Dong, Z.; Yang, H.; Yang, G. Extraction of Maize Leaf Base and Inclination Angles Using Terrestrial Laser Scanning (TLS) Data. IEEE Trans. Geosci. Remote Sens. 2022, 60, 5701817. [Google Scholar] [CrossRef]
  32. Zhao, D.; Xu, T.; Henke, M.; Yang, H.; Zhang, C.; Cheng, J.; Yang, G. A method to rapidly construct 3D canopy scenes for maize and their spectral response evaluation. Comput. Electron. Agric. 2024, 224, 109138. [Google Scholar] [CrossRef]
  33. Gadiraju, K.K.; Vatsavai, R.R. Comparative analysis of deep transfer learning performance on crop classification. In Proceedings of the 9th ACM SigSpatial International Workshop on Analytics for Big GeoSpatial Data, Seattle, WA, USA, 7 September 2020. [Google Scholar]
  34. Wu, C.; Niu, Z.; Tang, Q.; Huang, W. Estimating chlorophyll content from hyperspectral vegetation indices: Modeling and validation. Agric. For. Meteorol. 2008, 148, 1230–1241. [Google Scholar] [CrossRef]
  35. Rondeaux, G.; Steven, M.; Baret, F. Optimization of soil-adjusted vegetation indices. Remote Sens. Environ. 1996, 55, 95–107. [Google Scholar] [CrossRef]
  36. Haboudane, D.; Miller, J.R.; Pattey, E.; Zarco-Tejada, P.J.; Strachan, I.B. Hyperspectral vegetation indices and novel algorithms for predicting green LAI of crop canopies: Modeling and validation in the context of precision agriculture. Remote Sens. Environ. 2004, 90, 337–352. [Google Scholar] [CrossRef]
  37. Portz, G.; Molin, J.P.; Jasper, J. Active crop sensor to detect variability of nitrogen supply and biomass on sugarcane fields. Precis. Agric. 2011, 13, 33–44. [Google Scholar] [CrossRef]
  38. Huete, A.R.; Liu, H.Q.; Batchily, K.; van Leeuwen, W. A comparison of vegetation indices over a global set of TM images for EOS-MODIS. Remote Sens. Environ. 1997, 59, 440–451. [Google Scholar] [CrossRef]
  39. Huete, A.R. A soil-adjusted vegetation index (SAVI). Remote Sens. Environ. 1988, 25, 295–309. [Google Scholar] [CrossRef]
  40. Qi, J.; Chehbouni, A.; Huete, A.R.; Kerr, Y.H.; Sorooshian, S. A modified soil adjusted vegetation index. Remote Sens. Environ. 1994, 48, 119–126. [Google Scholar] [CrossRef]
  41. Penuelas, J.; Baret, F.; Filella, I. Semi-empirical indices to assess carotenoids/chlorophyll a ratio from leaf spectral reflectance. Photosynthetica 1995, 31, 221–230. [Google Scholar]
  42. Gitelson, A.; Merzlyak, M.N. Quantitative estimation of chlorophyll-a using reflectance spectra: Experiments with autumn chestnut and maple leaves. J. Photochem. Photobiol. B Biol. 1994, 22, 247–252. [Google Scholar] [CrossRef]
  43. Ramoelo, A.; Skidmore, A.K.; Cho, M.A.; Schlerf, M.; Mathieu, R.; Heitkönig, I.M.A. Regional estimation of savanna grass nitrogen using the red-edge band of the spaceborne RapidEye sensor. Int. J. Appl. Earth Obs. Geoinf. 2012, 19, 151–162. [Google Scholar] [CrossRef]
  44. Daughtry, C.S.T.; Walthall, C.L.; Kim, M.S.; de Colstoun, E.B.; McMurtrey, J.E. Estimating Corn Leaf Chlorophyll Concentration from Leaf and Canopy Reflectance. Remote Sens. Environ. 2000, 74, 229–239. [Google Scholar] [CrossRef]
  45. Datt, B. A New Reflectance Index for Remote Sensing of Chlorophyll Content in Higher Plants: Tests using Eucalyptus Leaves. J. Plant Physiol. 1999, 154, 30–36. [Google Scholar] [CrossRef]
  46. Gitelson, A.A.; Merzlyak, M.N. Signature Analysis of Leaf Reflectance Spectra: Algorithm Development for Remote Sensing of Chlorophyll. J. Plant Physiol. 1996, 148, 494–500. [Google Scholar] [CrossRef]
  47. Raper, T.B.; Varco, J.J. Canopy-scale wavelength and vegetative index sensitivities to cotton growth parameters and nitrogen status. Precis. Agric. 2015, 16, 62–76. [Google Scholar] [CrossRef]
  48. Haboudane, D.; Miller, J.R.; Tremblay, N.; Zarco-Tejada, P.J.; Dextraze, L. Integrated narrow-band vegetation indices for prediction of crop chlorophyll content for application to precision agriculture. Remote Sens. Environ. 2002, 81, 416–426. [Google Scholar] [CrossRef]
  49. Li, H.; Li, F.; Xiao, J.; Chen, J.; Lin, K.; Bao, G.; Liu, A.; Wei, G. A machine learning scheme for estimating fine-resolution grassland aboveground biomass over China with Sentinel-1/2 satellite images. Remote Sens. Environ. 2024, 311, 114317. [Google Scholar] [CrossRef]
  50. Yue, J.; Yang, H.; Feng, H.; Han, S.; Zhou, C.; Fu, Y.; Guo, W.; Ma, X.; Qiao, H.; Yang, G. Hyperspectral-to-image transform and CNN transfer learning enhancing soybean LCC estimation. Comput. Electron. Agric. 2023, 211, 108011. [Google Scholar] [CrossRef]
  51. Zhang, Y.; Hui, J.; Qin, Q.; Sun, Y.; Zhang, T.; Sun, H.; Li, M. Transfer-learning-based approach for leaf chlorophyll content estimation of winter wheat from hyperspectral data. Remote Sens. Environ. 2021, 267, 112724. [Google Scholar] [CrossRef]
  52. Qi, J.; Xie, D.; Jiang, J.; Huang, H. 3D radiative transfer modeling of structurally complex forest canopies through a lightweight boundary-based description of leaf clusters. Remote Sens. Environ. 2022, 283, 113301. [Google Scholar] [CrossRef]
Figure 1. Geographical location of the experimental sites and UAV digital images acquired in 2021, 2022, and 2023. Note: (a) the geographical location of all experiments; (b) experiment 1 was conducted with plant density treatments in 2021; (c) experiment 2 was conducted with plant density treatments in 2022; (d) experiment 3 was conducted with nitrogen gradient treatments in 2023; A1–A5, B4, B5, C6, and C7 are Zhengdan 958, Jiyuan 1, Jiyuan 168, Jingjiuqingchu 16, Nongkenuo 336, Dajingjiu 26, Jingnongke 728, Tianci 19, and Jingnuo 2008, respectively.
Figure 2. The measurement position of the leaf angle. The red point is the center position of the leaf.
Figure 3. Proposed 1D CNN architecture for estimating LGB. The convolution layers are named Cov1, Cov2, Cov3, and Cov4. The letter B represents the batch normalization layer; the letter R represents the ReLU layer.
Figure 4. Pearson’s correlation coefficients between 20 vegetation indices and LGB. DVI, EVI, VIRed, MSAVI, MTVI2, NDVI, SAVI, MCARI, and SIPI exhibit high correlation, whereas SCCCI, CI1, NDRE, VIRedge, and VIGreen show low correlation.
Figure 5. R2 and RMSE of CNN model with different number of VIs based on simulated dataset.
Figure 6. Value distributions of the simulated and measured VIs; (a–l) show the data density of 12 VIs from the simulated data and UAV data; the orange line is measured UAV data; the blue line is simulated data; the y-axis represents value density; the x-axis represents VI value.
Figure 7. Spectral reflectance curves for three growth stages from simulated and field-measured datasets. As the growth stages progress, LAI increases accompanied by a corresponding increase in LGB.
Figure 8. Measured and estimated LGB for 2022 and 2023 across three growth stages. (a–c) 3D RTM + CNN + TL method; (d–f) PROSAIL + CNN + TL method.
Figure 9. The loss function value of the training set and testing set during re-training the model.
Figure 10. Measured and estimated stem biomass in 2022 and 2023. (a) Scatter plot between estimated and measured SGB for 2022 using the allometric growth model; (b) scatter plot between estimated and measured SGB for 2023 using the allometric growth model. The blue points in each panel include three growth stages.
Figure 11. Measured and predicted 2023 LGB for the four ablation experiments. The black dashed line represents the 1:1 line. (a) Experiment E1; (b) experiment E2; (c) experiment E3. The result of experiment E4 is shown in Figure 8c.
Figure 12. Measured and estimated LGB from 3D RTM + PLSR, and CNN methods. (ac) scatter plot between estimated LAG and measured LAG using CNN method; (a) 2021 samples; (b) 2022 samples; (c) 2023 samples; (df) scatter plot between estimated LAG and measured LAG using 3D PLSR + PLSR method; (a) 2021 samples; (b) 2022 samples; (c) 2023 samples; The black line is 1:1 line.
Figure 12. Measured and estimated LGB from 3D RTM + PLSR, and CNN methods. (ac) scatter plot between estimated LAG and measured LAG using CNN method; (a) 2021 samples; (b) 2022 samples; (c) 2023 samples; (df) scatter plot between estimated LAG and measured LAG using 3D PLSR + PLSR method; (a) 2021 samples; (b) 2022 samples; (c) 2023 samples; The black line is 1:1 line.
Table 1. The dates of measurement and data collection by the UAV platform. Exp. 21, Exp. 22, and Exp. 23 represent experiments conducted in 2021, 2022, and 2023, respectively. Maize above-ground biomass (AGB) was measured at key growth stages. Min, Mean, and Max are the minimum, average, and maximum AGB within an experiment at a given stage; SD is the standard deviation.
| Experiment | Day after Sowing | Growth Stage | Min (g/m²) | Max (g/m²) | Mean (g/m²) | SD (g/m²) |
|---|---|---|---|---|---|---|
| Exp. 21 | 32 | Jointing | 56.67 | 299.00 | 150.20 | 54.52 |
| Exp. 21 | 47 | Trumpet | 216.66 | 882.00 | 510.08 | 149.79 |
| Exp. 21 | 59 | Tasseling | 300.00 | 1564.33 | 829.60 | 249.79 |
| Exp. 22 | 28 | Jointing | 36.9 | 137.7 | 86.6 | 23.65 |
| Exp. 22 | 44 | Trumpet | 214.5 | 716.0 | 440.9 | 104.38 |
| Exp. 22 | 58 | Tasseling | 550.0 | 1844.2 | 973.5 | 250.03 |
| Exp. 23 | 27 | Jointing | 33.1 | 105.9 | 71.1 | 16.58 |
| Exp. 23 | 44 | Trumpet | 140.8 | 415.9 | 280.8 | 67.66 |
| Exp. 23 | 63 | Tasseling | 399.6 | 1142.0 | 671.0 | 158.35 |
Table 2. Parameters of the 3D maize scene used in this study.
| Variables | Unit | Min | Typical | Max |
|---|---|---|---|---|
| Plant distance | m | 0.18 | 0.28 | 0.36 |
| Leaf area per plant | m² | 0.19 | 0.45 | 0.88 |
| Base angle of largest leaf | ° | 10 | 23 | 50 |
| Maximum plant height | m | 0.74 | 1.6 | 3.2 |
| Maximum number of leaves per plant | – | 6 | 10 | 16 |
Table 3. Distribution of input variables used to generate canopy reflectance with 3D RTM simulations. VZA, VAA, SZA, and SAA correspond to view zenith angle, view azimuth angle, sun zenith angle, and sun azimuth angle. N, Cab, Car, Cm, and Cw represent leaf structure index, leaf chlorophyll per leaf area, leaf carotenoid per leaf area, leaf dry matter, and leaf equivalent water thickness.
| Name | Minimum | Maximum | Interval | Mean | Std |
|---|---|---|---|---|---|
| VZA | 0 | 0 | – | 0 | – |
| VAA | 0 | 0 | – | 0 | – |
| SZA | 40, 50, 60 (discrete) | – | – | – | – |
| SAA | 180, 200 (discrete) | – | – | – | – |
| N | 1.5 | 1.5 | – | – | – |
| Cab | 20 | 90 | 5 | 60 | 30 |
| Car | 4 | 18 | 1 | 12 | 6 |
| Cm | 0.0025 | 0.009 | 0.0003 | 0.005 | 0.0016 |
| Cw | 0.015 | 0.027 | 0.003 | 0.021 | 0.004 |
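To make the sampling scheme concrete, the following minimal Python sketch enumerates the Table 3 ranges into candidate simulation configurations, assuming a full-factorial grid over the listed intervals and the discrete sun angles. Because the table also reports means and standard deviations, the authors may instead have drawn random (e.g., Gaussian) samples; treat this as one plausible reading rather than the paper's exact procedure, with all names illustrative.

```python
import numpy as np

# Leaf-level and illumination ranges transcribed from Table 3.
cab = np.arange(20, 95, 5)               # chlorophyll a+b, ug/cm^2: 20-90, step 5
car = np.arange(4, 19, 1)                # carotenoids, ug/cm^2: 4-18, step 1
cm = np.arange(0.0025, 0.0091, 0.0003)   # dry matter, g/cm^2: 0.0025-0.009
cw = np.arange(0.015, 0.0271, 0.003)     # equivalent water thickness: 0.015-0.027
sza = [40, 50, 60]                       # discrete sun zenith angles, degrees
saa = [180, 200]                         # discrete sun azimuth angles, degrees
# VZA and VAA are fixed at 0 (nadir view) and N is fixed at 1.5, so they do
# not enlarge the grid; each combination below would parameterize one LESS
# canopy-reflectance simulation over the 3D maize scene of Table 2.

n_configs = len(cab) * len(car) * len(cm) * len(cw) * len(sza) * len(saa)
print(f"{n_configs} candidate simulation configurations")
```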
Table 4. Distribution of input variables used in the PROSAIL simulations.
| Name | Minimum | Maximum | Interval | Unit |
|---|---|---|---|---|
| Leaf structure index | 1.5 | 1.5 | – | – |
| Chlorophyll a + b content | 20 | 90 | 10 | µg/cm² |
| Carotenoid content | 4 | 18 | 1 | µg/cm² |
| Dry matter content | 0.0025 | 0.009 | 0.0003 | g/cm² |
| Equivalent water thickness | 0.015 | 0.027 | 0.003 | cm |
| LAI | 1 | 8 | 0.5 | m²/m² |
| Brown pigments | 0 (fixed) | – | – | – |
| Soil coefficient | 0 (fixed) | – | – | – |
| Azimuth angle | 90 (fixed) | – | – | degrees |
| Solar zenith angle | 40 | 50 | 10 | degrees |
| Observer zenith angle | 0 (fixed) | – | – | degrees |
| Average leaf inclination angle | 30 | 50 | 10 | degrees |
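For the PROSAIL baseline, a single forward run over the Table 4 ranges looks roughly as follows. This sketch assumes the open-source `prosail` Python package and its `run_prosail` interface; the hot-spot parameter is not listed in Table 4, so its value here is an assumption, as is the soil parameterization.

```python
import numpy as np
import prosail  # open-source PROSAIL wrapper (pip install prosail)

# One forward simulation with mid-range Table 4 values.
rho = prosail.run_prosail(
    n=1.5,         # leaf structure index (fixed)
    cab=50.0,      # chlorophyll a+b, ug/cm^2 (range 20-90)
    car=10.0,      # carotenoids, ug/cm^2 (range 4-18)
    cbrown=0.0,    # brown pigments (fixed at 0)
    cw=0.021,      # equivalent water thickness, cm (range 0.015-0.027)
    cm=0.005,      # dry matter, g/cm^2 (range 0.0025-0.009)
    lai=3.0,       # leaf area index, m^2/m^2 (range 1-8)
    lidfa=40.0,    # average leaf inclination angle, degrees (range 30-50)
    hspot=0.01,    # hot-spot parameter -- assumed, not listed in Table 4
    tts=45.0,      # solar zenith, degrees (range 40-50)
    tto=0.0,       # observer zenith, degrees (fixed at 0)
    psi=90.0,      # relative azimuth, degrees (Table 4 azimuth angle)
    rsoil=1.0, psoil=0.0,  # assumed soil brightness/moisture coefficients
)
wavelengths = np.arange(400, 2501)  # run_prosail returns 400-2500 nm at 1 nm
print(rho.shape, rho[wavelengths == 800].item())  # canopy reflectance at 800 nm
```

Simulated spectra such as `rho` would then be resampled to the UAV sensor's band responses before computing the VIs in Table 6.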
Table 5. The names and explanations of the two datasets. These datasets are used to train the CNN model and to compare the performance of the proposed method with other methods.
| Dataset | Explanation | Function |
|---|---|---|
| LR dataset | Data simulated from the LESS model | Used for pre-training the CNN model |
| MR dataset | Data obtained in 2021 | Used for re-training the CNN model |
Table 6. Spectral indices derived from the multispectral imagery.
| Spectral Indices | Definition | References |
|---|---|---|
| Modified Simple Ratio | $MSR = \dfrac{R_{nir}/R_{red} - 1}{\sqrt{R_{nir}/R_{red}} + 1}$ | [34] |
| Difference Vegetation Index | $DVI = R_{nir} - R_{red}$ | [35] |
| Modified Triangular Vegetation Index 2 | $MTVI2 = \dfrac{1.5\,[1.2(R_{nir}-R_{green}) - 2.5(R_{red}-R_{green})]}{\sqrt{(2R_{nir}+1)^2 - (6R_{nir} - 5\sqrt{R_{red}}) - 0.5}}$ | [36] |
| INRE | $INRE = 100 \times (R_{nir} - R_{red})$ | [37] |
| Enhanced Vegetation Index | $EVI = 2.5 \times \dfrac{R_{nir} - R_{red}}{R_{nir} + 6R_{red} - 7.5R_{blue} + 1}$ | [38] |
| SAVI | $SAVI = \dfrac{R_{nir} - R_{red}}{R_{nir} + R_{red} + 0.5} \times (1 + 0.5)$ | [39] |
| Modified Soil-Adjusted Vegetation Index | $MSAVI = (1 + 0.1) \times \dfrac{R_{nir} - R_{red}}{R_{nir} + R_{red} + 0.1}$ | [40] |
| Structure-Insensitive Pigment Index | $SIPI = \dfrac{R_{nir} - R_{blue}}{R_{nir} + R_{red}}$ | [41] |
| Normalized Difference Vegetation Index | $NDVI = \dfrac{R_{nir} - R_{red}}{R_{nir} + R_{red}}$ | [42] |
| Ratio between NIR and red bands | $VI_{red} = R_{nir}/R_{red}$ | [43] |
| Modified Chlorophyll Absorption Reflectance Index | $MCARI = [(R_{re} - R_{red}) - 0.2(R_{re} - R_{green})] \times (R_{re}/R_{red})$ | [44] |
| Normalized Difference Red-Edge Index | $NDRE = \dfrac{R_{nir} - R_{re}}{R_{nir} + R_{re}}$ | [42] |
| Ratio between NIR and green bands | $VI_{green} = R_{nir}/R_{green}$ | [43] |
| Normalized Difference Index | $NDI = \dfrac{R_{nir} - R_{re}}{R_{nir} + R_{red}}$ | [45] |
| Red-Edge Chlorophyll Index 2 | $CI2 = R_{re}/R_{green} - 1$ | [46] |
| Ratio between NIR and red-edge bands | $VI_{re} = R_{nir}/R_{re}$ | [43] |
| Optimized SAVI | $OSAVI = (1 + 0.16) \times \dfrac{R_{nir} - R_{red}}{R_{nir} + R_{red} + 0.16}$ | [35] |
| Modified Chlorophyll Absorption Reflectance Index 2 | $MCARI2 = \dfrac{1.5\,[2.5(R_{nir}-R_{re}) - 1.3(R_{nir}-R_{green})]}{\sqrt{(2R_{nir}+1)^2 - (6R_{nir} - 5\sqrt{R_{red}}) - 0.5}}$ | [36] |
| Simplified Canopy Chlorophyll Content Index | $SCCCI = NDRE/NDVI$ | [47] |
| Transformed Chlorophyll Absorption Reflectance Index | $TCARI = 3 \times [(R_{re} - R_{red}) - 0.2(R_{re} - R_{green})(R_{re}/R_{red})]$ | [48] |
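Since each index in Table 6 is simple band arithmetic, a small helper can compute them directly from the five multispectral bands. The sketch below implements a representative subset exactly as defined above; the function name and example reflectances are illustrative.

```python
import numpy as np

def vegetation_indices(blue, green, red, re, nir):
    """Compute a subset of the Table 6 indices from band reflectances (0-1).

    `re` is the red-edge band; formulas follow the definitions in Table 6.
    """
    ndvi = (nir - red) / (nir + red)
    ndre = (nir - re) / (nir + re)
    osavi = (1 + 0.16) * (nir - red) / (nir + red + 0.16)
    msr = (nir / red - 1) / (np.sqrt(nir / red) + 1)
    evi = 2.5 * (nir - red) / (nir + 6 * red - 7.5 * blue + 1)
    sccci = ndre / ndvi
    return {"NDVI": ndvi, "NDRE": ndre, "OSAVI": osavi,
            "MSR": msr, "EVI": evi, "SCCCI": sccci}

# Plausible maize-canopy reflectances (illustrative values only)
print(vegetation_indices(blue=0.03, green=0.07, red=0.05, re=0.20, nir=0.45))
```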
Table 7. Ablation experiments.
| Experiment | CNN Architecture | Input |
|---|---|---|
| E1 | Cov4 | LR, MR dataset |
| E2 | Cov3 + Cov4 | LR, MR dataset |
| E3 | Cov2 + Cov3 + Cov4 | LR, MR dataset |
| E4 | Cov1 + Cov2 + Cov3 + Cov4 | LR, MR dataset |
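The E1–E4 configurations differ only in how many convolutional blocks are retained, and every configuration follows the same two-step schedule: pre-train on the simulated LR dataset, then re-train on the measured MR dataset (Table 5). Below is a minimal PyTorch sketch of that workflow; the channel widths, kernel sizes, head layout, and freeze-the-features fine-tuning recipe are assumptions rather than the paper's exact architecture.

```python
import torch
import torch.nn as nn

class VICNN(nn.Module):
    """1D CNN over a vegetation-index vector. `depth` mirrors the E1-E4
    ablations: 1 keeps only Cov4, ..., 4 keeps Cov1 + Cov2 + Cov3 + Cov4."""
    def __init__(self, n_vis=12, depth=4):
        super().__init__()
        blocks, ch = [], 1
        for width in [16, 32, 64, 128][4 - depth:]:  # keep the deepest blocks
            blocks += [nn.Conv1d(ch, width, kernel_size=3, padding=1), nn.ReLU()]
            ch = width
        self.features = nn.Sequential(*blocks)
        self.head = nn.Sequential(
            nn.Flatten(),
            nn.Linear(ch * n_vis, 64), nn.ReLU(),
            nn.Linear(64, 1),            # regression output: LGB, g/m^2
        )

    def forward(self, x):                # x: (batch, 1, n_vis)
        return self.head(self.features(x))

model = VICNN(depth=4)                   # E4 configuration
# Step 1: pre-train on the simulated LR dataset (training loop elided).
# Step 2: freeze the convolutional blocks and re-train on the MR samples.
for p in model.features.parameters():
    p.requires_grad = False
optimizer = torch.optim.Adam(
    (p for p in model.parameters() if p.requires_grad), lr=1e-3)
print(model(torch.randn(8, 1, 12)).shape)  # torch.Size([8, 1])
```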
Table 8. The allometric growth relationship between leaf biomass and stem biomass at key growth stages of maize. All fitting results are based on measured data from 2021.
| Growth Stage | Allometric Model | R² |
|---|---|---|
| Jointing | $SGB = 1.37\,LGB^{0.918}$ | 0.72 |
| Trumpet | $SGB = 2.95\,LGB^{0.889}$ | 0.81 |
| Tasseling | $SGB = 6.05\,LGB^{0.827}$ | 0.83 |
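Applying a Table 8 model is a single power-law evaluation, and refitting one reduces to linear regression after a log transform, since $\ln SGB = \ln a + b \ln LGB$. The sketch below uses the published coefficients; the arrays in the refitting example are illustrative, not the authors' 2021 measurements.

```python
import numpy as np

# Stage-specific coefficients (a, b) from Table 8: SGB = a * LGB^b.
COEFFS = {"Jointing": (1.37, 0.918),
          "Trumpet": (2.95, 0.889),
          "Tasseling": (6.05, 0.827)}

def stem_biomass(lgb_g_m2, stage):
    """Estimate SGB (g/m^2) from CNN-estimated LGB at a given growth stage."""
    a, b = COEFFS[stage]
    return a * lgb_g_m2 ** b

print(round(stem_biomass(300.0, "Tasseling"), 1))  # ~676.6 g/m^2

# Refitting: ln(SGB) = ln(a) + b * ln(LGB) is linear in log space.
lgb = np.array([150.0, 220.0, 310.0])  # illustrative leaf biomass, g/m^2
sgb = np.array([390.0, 520.0, 700.0])  # illustrative stem biomass, g/m^2
b, ln_a = np.polyfit(np.log(lgb), np.log(sgb), 1)
print(np.exp(ln_a), b)                 # fitted a and b
```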