
Computers and Electronics in Agriculture 125 (2016) 184–199

Contents lists available at ScienceDirect

Computers and Electronics in Agriculture


journal homepage: www.elsevier.com/locate/compag

Review

A survey of image processing techniques for plant extraction and segmentation in the field

Esmael Hamuda *, Martin Glavin, Edward Jones
National University of Ireland, University Road, Galway, Ireland

Article history:
Received 14 October 2015
Received in revised form 20 April 2016
Accepted 25 April 2016
Available online 19 May 2016

Keywords:
Colour index-based segmentation
Threshold-based segmentation
Learning-based segmentation
Segmentation quality
Plant pixels
Plant extraction

Abstract

In this review, we present a comprehensive and critical survey on image-based plant segmentation techniques. In this context, "segmentation" refers to the process of classifying an image into plant and non-plant pixels. Good performance in this process is crucial for further analysis of the plant such as plant classification (i.e. identifying the plant as either crop or weed), and effective action based on this analysis, e.g. precision application of herbicides in smart agriculture applications.

The survey briefly discusses pre-processing of images, before focusing on segmentation. The segmentation stage involves the segmentation of plant against the background (identifying plant from a background of soil and other residues). Three primary plant extraction algorithms, namely (i) colour index-based segmentation, (ii) threshold-based segmentation, and (iii) learning-based segmentation, are discussed. Based on its prevalence in the literature, this review focuses in particular on colour index-based approaches. Therefore, a detailed discussion of the segmentation performance of colour index-based approaches is presented, based on studies from the literature conducted in the recent past, particularly from 2008 to 2015. Finally, we identify the challenges and some opportunities for future developments in this space.

© 2016 Elsevier B.V. All rights reserved.

Contents

1. Introduction ..... 185
   1.1. Background and motivation ..... 185
   1.2. Image processing challenges ..... 186
   1.3. Paper organisation ..... 186
2. Image processing overview ..... 186
   2.1. Pre-processing ..... 187
   2.2. Segmentation ..... 187
3. Colour index-based approaches ..... 187
   3.1. Normalised Difference Index (NDI) ..... 187
   3.2. Excess Green Index (ExG) ..... 187
   3.3. Excess Red Index (ExR) ..... 187
   3.4. Colour Index of Vegetation Extraction (CIVE) ..... 188
   3.5. Excess Green minus Excess Red Index (ExGR) ..... 188
   3.6. Normalised Green–Red Difference Index (NGRDI) ..... 188
   3.7. Vegetative Index (VEG) ..... 188
   3.8. Combined Indices 1 (COM1) ..... 188
   3.9. Modified Excess Green Index (MExG) ..... 188
   3.10. Combined Indices 2 (COM2) ..... 188
4. Evaluation of plant extraction based on colour indices ..... 188
5. Other segmentation approaches ..... 194
   5.1. Threshold-based approaches ..... 194
   5.2. Learning-based approaches ..... 196
6. Discussion and conclusions ..... 197
References ..... 198

* Corresponding author.
E-mail address: E.Hamuda2@nuigalway.ie (E. Hamuda).

http://dx.doi.org/10.1016/j.compag.2016.04.024
0168-1699/© 2016 Elsevier B.V. All rights reserved.

1. Introduction

1.1. Background and motivation

Weeds are one of the big challenges in agriculture because they appear everywhere randomly and compete with the plant for resources. As a result of this competition for resources, crop yields suffer. Yield losses depend on factors such as weed species, population density, and relative time of emergence and distribution, as well as on the soil type, soil moisture levels, pH and fertility (Papamichail et al., 2002). Numerous researchers have identified a strong link between weed competition and crop yield loss, with a wide range of crop varieties. For example, according to the study by Stall (2009), an annual loss of 146 million pounds of fresh market sweet corn and 18.5 million pounds of sweet corn for processing occurred in the United States from 1975 to 1979 due to weed competition, which corresponds to revenue losses of $13,165,000 and $9,155,000 respectively. In addition, the dry and head weights of crop yield are measured to evaluate losses. Based on a study carried out in 1996/1997 and repeated in 1997/1998 in central Jordan (Qasem, 2009), it was found that the average reductions in shoot dry weight and head yield were 81% and 89% respectively. An effective and efficient weed management system is necessary to minimise yield losses in valuable crops. The critical period for weed control must be taken into account to enhance weed management strategies (Swanton and Weise, 1991), as the duration of co-existence of weed and crop is an important indicator of yield losses due to weed competition (Kropff et al., 1992).

Zimdahl (1988, 1993) defined the critical period of weed control (CPWC) as "a span of time between that period after seeding or emergence when weed competition does not reduce crop yield and the time after which weed competition will no longer reduce crop yield". A more quantitative definition is the number of weeks after crop emergence during which a crop must be weed-free in order to prevent yield losses greater than 5% (Hall et al., 1992; Van Acker et al., 1993; Knezevic et al., 1994).

A number of studies have been carried out in many different locations, under different environmental conditions, in an attempt to establish the CPWC. The studies are generally conducted by keeping the crop free from weeds for a fixed period of time and then allowing the weeds to infest. Another approach used is growing weeds with the crop for certain predetermined durations, after which all weeds are removed until the growing season ends (Nieto et al., 1968). Some studies have reported that weeds that emerge at the same time as the crop, or slightly after, cause greater yield loss than weeds emerging later in the growth cycle of the crop (Dew, 1972; O'Donovan et al., 1985; Swanton et al., 1999). Most studies recommended that crops should be kept weed-free within the CPWC in order to minimise yield loss (e.g. Karkanis et al., 2012).

Manual methods for weed control include hand weeding and the use of simple hand tools. Hand weeding is a conventional weed removal method that has been successfully used to control weeds for many centuries, before any other methods existed, but it is not practical for large scale commercial farms because it is extremely labour intensive, costly, tedious, and time consuming (USDA, 1996).

Mechanical methods for weed control (by tillage or cultivation of the soil) are mostly applied in large areas for row crops such as sugar beet, wheat, and corn for inter-row weed control. A number of studies have been carried out to evaluate the efficacy of mechanical weed control methods. Forcella (2000) reported that rotary hoeing alone yielded approximately 50% weed control, without using other weed control methods such as herbicides and manual labour. Donald (2007) found that inter-row mowing systems for controlling both winter annual and summer annual weeds may reduce the use of herbicides by approximately 50%. Mechanical weeding is particularly suited to organic fields for weed control and can also be helpful in conventional fields. On the other hand, the use of machinery may also have negative effects on crops and the environment by causing damage and erosion (Nelson and Giles, 1986; Eyre et al., 2011).

Chemical weeding has been the most widely used method for weed control in agriculture since the introduction of synthetic organic chemicals in the late 1940s, and farmers now rely heavily on herbicides for effective weed control in crops (Gianessi and Reigner, 2007; Grichar and Colburn, 1993; Bridges, 1992), particularly on large scale commercial farms. Many studies have documented that the use of herbicides is a more economical method for controlling weeds compared to hand and mechanical weeding. With the help of herbicides, farmers in Mississippi were estimated to have saved $10 million per year compared to the cost of labour (Gianessi and Reigner, 2007). Demand for chemicals by farmers has increased the market size; according to a 2014 report (BCC Research Chemical Report, 2014), the biopesticide and synthetic pesticide markets are expected to reach $83.7 billion by 2019.

Although herbicides are very effective at controlling weeds, they have negative impacts on both the environment (through pollution) and plant biology (development of resistance). Groundwater and surface water pollution has been reported in many cases in recent decades, and excessive use of herbicide has often been found to be the cause (Liu and O'Connell, 2002; Spliid and Koeppen, 1998). To counteract these catastrophic environmental effects, most European countries have introduced legislative directives to restrict the use of herbicides in agriculture (Lotz et al., 2002). If there are means to accurately detect and identify weed spatial distribution (weed patches), it is possible to limit herbicide quantities by applying them only where weeds are located (e.g. Lindquist et al., 1998; Manh et al., 2001; Berge et al., 2012; Christensen et al., 2009; Jeschke et al., 2011). Heisel et al. (1999) demonstrated a potential herbicide saving of 30–75% through the use of appropriate spraying technology and a decision support system for precision application of herbicides. This drives the need for systems for more accurate identification of weed patches, and has provided one motivation for the development of image processing methods for identification of weeds. Colour index-based segmentation methods have demonstrated particular utility for weed identification, and hence are a particular focus of this paper.

Besides identification of weeds to permit precision weeding, plant segmentation is also useful for other purposes, and is applied in several applications such as plant species recognition (Lei et al., 2008), growing phase determination (Kataoka et al., 2003), and plant disease detection (Camargo and Smith, 2009). While weeding remains the most important motivator at present, these other applications are growing in importance with increasing interest in smart agriculture.

1.2. Image processing challenges

Most recent studies have focused on chemical technology and its applications for targeting weeds at close range to avoid disturbing crop plants, and these studies have demonstrated that it is feasible to accurately target weeds within 1 cm of crop plants. Slaughter et al. (2008) considered image processing techniques for the detection and discrimination of plants and weeds in some detail. The plant has to be segmented from the background soil under all field conditions, because mis-segmentation could seriously affect the accuracy of plant/weed detection. Among other things, Slaughter et al. concluded that natural illumination plays a crucial role in effective plant segmentation, and poor illumination contributes to poor plant segmentation. They also found that most of the available machine vision techniques are not robust under real-time conditions. High segmentation performance is required for precision chemical application; with good performance, the volume of herbicides applied to the fields can be minimised.

In this survey, we focus on recent studies of image processing techniques used for plant extraction and segmentation under various field conditions, and we consider their performance. Fig. 1 shows a block diagram of a general scheme for segmentation, including a broad framework for evaluation of segmentation algorithms. This typically includes a pre-processing stage, followed by the core segmentation stage, which can be done using a variety of approaches (indicated by the "Algorithms" box on the left hand side of Fig. 1). Evaluation is typically carried out by comparing the output of the segmentation algorithm with a reference image that is treated as a "gold standard", and by using a suitable performance or quality metric. These steps will be described in the following sections of this paper.

1.3. Paper organisation

This survey is organised as follows: a brief overview of image processing approaches and a discussion of the pre-processing stage are given in Section 2; Section 3 describes colour index approaches, the most prevalent approach in the literature thus far; a comparison of segmentation performance for colour index-based approaches, based on recent studies from the literature, is given in Section 4. Section 5 briefly discusses threshold-based and learning-based approaches. An overall discussion and conclusions are given in Section 6, which also considers remaining challenges, limitations, and recommendations.

2. Image processing overview

Machine vision technology has been widely used and studied in agriculture to identify and detect plants (crops and weeds). It has shown potential for success in a number of case studies in robotic weed control systems, despite some serious challenges that will be discussed below.

Fig. 1. General scheme for segmentation and its evaluation.
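The general scheme of Fig. 1 (pre-processing, segmentation, evaluation against a gold standard) can be illustrated with a minimal sketch. The function names here are hypothetical, and ExG with a fixed zero threshold (Section 3.2) stands in for the "Algorithms" box; this is an illustration of the pipeline shape, not the paper's specific method.

```python
import numpy as np

def preprocess(rgb):
    # Illustrative pre-processing: scale each channel to [0, 1] to
    # reduce sensitivity to overall brightness (see Section 2.1).
    return rgb.astype(np.float64) / 255.0

def segment_excess_green(rgb01, threshold=0.0):
    # Core segmentation stage: a colour index (ExG, Section 3.2)
    # followed by a fixed threshold yields a binary plant mask.
    total = rgb01.sum(axis=2) + 1e-9
    r, g, b = (rgb01[..., i] / total for i in range(3))
    exg = 2.0 * g - r - b
    return exg > threshold

def evaluate(mask, gold):
    # Evaluation stage: fraction of pixels agreeing with a manually
    # annotated "gold standard" reference mask.
    return float((mask == gold).mean())
```

For example, a bright green pixel is classified as plant while a brownish soil pixel is not, and a mask compared against itself scores a perfect 1.0.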



After many decades of study, machine vision has improved the quality management of weed control systems (Meyer et al., 1998; Onyango and Marchant, 2003; Søgaard, 2005; Schuster et al., 2007). Machine vision technology has also been applied in other agricultural applications such as grading and harvesting fruits (Slaughter and Harrel, 1989; Van Henten et al., 2003; Abbasgholipour et al., 2011). As summarised in Slaughter et al. (2008), many researchers have developed image processing methods as guidance for machine vision, working in different fields and environments (under controlled and uncontrolled conditions). Image-based segmentation techniques mostly involve two main stages: pre-processing and pixel classification.

2.1. Pre-processing

Pre-processing involves some important initial processing on the original image from the camera, such as contrast enhancement and noise removal. Image enhancement is one of the important steps in computer vision, and it has played a significant role in various applications such as medical imaging, industrial inspection, remote sensing, and plant disease detection. Image enhancement is a process used for enhancing and adjusting the contrast of the acquired image to address variability in luminance caused by, for example, sunlight and shadow (Jeon, 2014). Colour conversion is used to address lighting problems in the scene of an image. For example, Perez et al. (2000) applied the Normalised Difference Index (using only the green and red channels) to reduce the illumination effect and discriminate between plants and background. Filtering is also an important part of image enhancement; in agricultural applications, colour conversion and histogram equalisation are used for plant leaf disease detection (Thangadurai and Padmavathi, 2014). For instance, homomorphic filtering is a technique that has the ability to minimise illumination issues and has been successfully applied to outdoor images under various environmental conditions (Pajares et al., 2005).

2.2. Segmentation

The initial goal in almost all image processing plant detection approaches is to segment the pixels which appear in the image into two classes: plant (crops and weeds) and background (soil and residues). Background removal is an essential stage, and it has to be done in an appropriate way to avoid any mis-classification. Several methods have been developed for segmenting crop canopy images. The common segmentation technologies used for this purpose are: colour index-based segmentation, threshold-based segmentation, and learning-based segmentation. The next two sections consider colour index-based methods, while Section 5 discusses threshold-based and learning-based approaches.

3. Colour index-based approaches

Colour is one of the most common cues used to discriminate plants from background clutter in computer vision. Several researchers have used colour to separate plant from soil; e.g. colour characteristics were used to distinguish green plants from soil and estimate the leaf area (Rasmussen et al., 2007; Meyer and Camargo-Neto, 2008; Kirk et al., 2009). The colour of a region of interest can be accentuated, so that the undesired region (soil background) is attenuated. For the majority of conventional visible spectrum cameras, the images are output in the conventional RGB colour space. According to Tian and Slaughter (1998), converting the RGB values into greyscale did not result in good segmentation because plant and soil background pixels had similar greyscale values. Therefore, in order to achieve good segmentation, the RGB space is often converted to alternative colour spaces. Several common green indices (listed according to date of publication) are as follows.

3.1. Normalised Difference Index (NDI)

The Normalised Difference Index was proposed by Woebbecke et al. (1992). They tested three methods to distinguish plant material from soil background in an RGB image. A range of difference indices based on the R, G and B channels was evaluated, e.g. G − R, G − B, and (G − R)/(G + R), with the third one demonstrating the best separation of plant from background. This index is applied to all pixels in the image, providing values ranging between −1 and +1, but to display the image, these values must range between 0 and 255. Therefore, the index was further processed by adding 1 to it and then multiplying by a factor of 128 to provide a greyscale image (0–255). Thus, the final formula for NDI is as follows:

NDI = 128 × ((G − R)/(G + R) + 1)    (1)

The NDI index produces a near-binary image.

3.2. Excess Green Index (ExG)

Woebbecke et al. (1995) examined several colour vegetation indices that were derived using chromatic coordinates and modified hue for separating green plant from bare soil (corn residue and wheat straw residue). The colour vegetation indices that were used include:

r − g    (2)

g − b    (3)

(g − b)/(r − g)    (4)

2g − r − b    (5)

where r, g, and b are the chromatic coordinates:

r = R*/(R* + G* + B*),  g = G*/(R* + G* + B*),  b = B*/(R* + G* + B*)    (6)

where R*, G* and B* are the normalised RGB values ranging from 0 to 1, computed as follows:

R* = R/Rmax,  G* = G/Gmax,  B* = B/Bmax    (7)

where R, G and B are the actual pixel values from the images for each channel and Rmax = Gmax = Bmax = 255 for a 24-bit colour image (3 × 8-bit channels).

Among the selected colour vegetation methods, Woebbecke et al. found that the modified hue (2g − r − b), referred to as the Excess Green Index (ExG), was the best choice for separating plants from bare soil. This is because ExG provided a clear contrast between plants and soil, and produced near-binary images. The ExG index has been widely used and has performed very well in separating plants from non-plants (Meyer et al., 1998; Lamm et al., 2002; Ribeiro et al., 2005; Guerrero et al., 2012).

3.3. Excess Red Index (ExR)

Meyer et al. (1998), inspired by the fact that there are 4% blue and 32% green cones, compared with 64% red cones, in the retina of the human eye, introduced the ExR method and compared it with ExG in an experiment to segment leaf regions from the background. The Excess Red Index was able to separate the plant pixels from background pixels, but it was not as accurate as ExG.
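The indices of Eqs. (1)–(7) can be written as a short numpy sketch. The function names are illustrative; note that Eq. (8) below writes ExR in terms of R and G, and whether raw channels or chromatic coordinates are intended varies across the original papers, so chromatic coordinates are assumed here for consistency with ExG.

```python
import numpy as np

def ndi(rgb):
    # Normalised Difference Index, Eq. (1): greyscale output.
    R = rgb[..., 0].astype(np.float64)
    G = rgb[..., 1].astype(np.float64)
    return 128.0 * ((G - R) / (G + R + 1e-9) + 1.0)

def chromatic_coords(rgb):
    # Eqs. (6)-(7): normalise each channel by 255, then divide by the
    # per-pixel sum to obtain chromatic coordinates r, g, b.
    norm = rgb.astype(np.float64) / 255.0
    total = norm.sum(axis=2) + 1e-9
    return (norm[..., 0] / total, norm[..., 1] / total, norm[..., 2] / total)

def exg(rgb):
    # Excess Green Index, Eq. (5): ExG = 2g - r - b.
    r, g, b = chromatic_coords(rgb)
    return 2.0 * g - r - b

def exr(rgb):
    # Excess Red Index, Eq. (8), on chromatic coordinates (assumption).
    r, g, _ = chromatic_coords(rgb)
    return 1.3 * r - g
```

For a pure green pixel, ExG reaches its maximum of 2 and ExR its minimum of −1; for a grey pixel, ExG is 0 and NDI is the mid-grey value 128.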

The formula for ExR is defined as follows:

ExR = 1.3R − G    (8)

3.4. Colour Index of Vegetation Extraction (CIVE)

The Colour Index of Vegetation Extraction (CIVE) was proposed by Kataoka et al. (2003) based on a study carried out in soya bean and sugar beet fields. This method was proposed to separate green plants from the soil background in order to evaluate crop growing status. The formula for CIVE is as follows:

CIVE = 0.441R − 0.811G + 0.385B + 18.78745    (9)

Kataoka et al. found that CIVE gives better plant segmentation than the near-infrared (NIR) method because it provides greater emphasis of the green areas.

3.5. Excess Green minus Excess Red Index (ExGR)

This method was introduced by Meyer et al. (2004), and combines two colour indices, namely the Excess Green Index (ExG) and the Excess Red Index (ExR). These are applied simultaneously to separate plants from the soil and residue, with ExG used to extract the plant region and ExR used to eliminate the background noise (soil and residue) where green–red material (stems, branches, or petioles) may exist. ExGR is defined as follows:

ExGR = ExG − ExR    (10)

where ExG and ExR are as previously defined.

3.6. Normalised Green–Red Difference Index (NGRDI)

The Normalised Green–Red Difference Index (NGRDI) was proposed by Hunt et al. (2005) and tested on digital photographs of crops such as corn, alfalfa, and soybeans, captured by a digital camera mounted on the bottom of an aircraft fuselage. NGRDI was used to overcome differences in the exposure settings selected by the digital camera when acquiring aerial photography of the field; these differences may cause large differences in colour bands that have the same reflectance. The formula for NGRDI is as follows:

NGRDI = (G − R)/(G + R)    (11)

The G − R component is used to discriminate between green plants and soil, and G + R is used to normalise for variations in light intensity between different images.

3.7. Vegetative Index (VEG)

This was proposed by Hague et al. (2006) to separate plant (cereal and weeds) pixels from soil pixels. The study was conducted under field conditions, and the images were captured by a CCD camera. To achieve segmentation, an RGB image was converted to greyscale by using the following formula:

VEG = G/(R^a B^(1−a))    (12)

where a is a constant equal to 0.667. Hague found that this transformation demonstrated good contrast between plant and soil. In addition, VEG has a significant advantage in that it is robust to lighting change.

[Fig. 2: bar chart. Mean segmentation quality: ExGR 88%, ExG + Otsu 53%, NDI 54%.]
Fig. 2. Comparison of the performance of selected colour indices: ExGR, ExG + Otsu, and NDI + Otsu under greenhouse conditions. SD is indicated by error bars in the plot (Meyer and Camargo-Neto, 2008). (For interpretation of the references to colour in this figure legend, the reader is referred to the web version of this article.)

3.8. Combined Indices 1 (COM1)

Guijarro et al. (2011) selected four green indices: ExG, CIVE, ExGR, and VEG. These methods were applied simultaneously rather than individually to improve segmentation quality:

COM1 = ExG + CIVE + ExGR + VEG    (13)

Guijarro showed that the combined method demonstrated better results than when the approaches were applied separately. The method has been tested in barley and corn fields and demonstrated high reliability under various illumination conditions in outdoor environments.

3.9. Modified Excess Green Index (MExG)

The Modified Excess Green (MExG) Index was developed by Burgos-Artizzu et al. (2011) and is defined as follows:

MExG = 1.262G − 0.884R − 0.311B    (14)

Burgos-Artizzu conducted experiments under uncontrolled lighting in real time. The proposed method successfully converted the colour image into a greyscale image, which was very easy to binarise with a fast automatic threshold method. The discrimination between plant and soil regions was effective because the MExG method was very robust to changing illumination conditions. Burgos-Artizzu found that the MExG method demonstrated better segmentation results than ExG.

3.10. Combined Indices 2 (COM2)

This was introduced by Guerrero et al. (2012) for the analysis of maize plants, and is quite similar to COM1 in combining three colour indices: ExG, CIVE, and VEG. ExGR was excluded because it classified the shadow of the maize plant as part of the plant, and the Normalised Difference Index was also excluded because it may segment soil regions as plant. The contribution of each selected method is controlled by a weighting factor, with the weights summing to 1. The combined method is defined as follows:

COM2 = 0.36ExG + 0.47CIVE + 0.17VEG    (15)

4. Evaluation of plant extraction based on colour indices

Colour index-based segmentation methods have been widely used as a benchmark by other researchers to evaluate the performance of their proposed methods for improving plant segmentation quality under various lighting conditions and complex backgrounds.

100.00% 45.00%

Misclassification rate
Mean and standard deviation

90.00% 40.00%
35.00%
80.00%
30.00%
70.00% 25.00%
60.00% 20.00%
50.00% 15.00%
10.00%
40.00%
5.00%
30.00% 0.00%
CIVE ExG CIVE ExG
20.00%
NGV NGVS
10.00%
Min 20.12% 0.94% 7.25% 0.98%
0.00%
ExGR ExG+Otsu NDI+Otsu Med 33.35% 1.61% 13.86% 1.25%
Mean 88% 88% 25% Max 38.47% 4.91% 19.38% 3.26%
Colour index-based method Colour index-based method
Fig. 3. Comparison of the performance of selected colour indices: ExGR, ExG + Otsu, Fig. 5. Comparison of mis-classification of CIVE and ExG for both image types (NGV
and NDI + Otsu under actual field conditions. SD is indicted by error bar in the plot & NGVS) (Zheng et al., 2009).
(Meyer and Camargo-Neto, 2008). (For interpretation of the references to colour in
this figure legend, the reader is referred to the web version of this article.)

their proposed methods for improving plant segmentation quality
against various lighting conditions and complex backgrounds. This
section draws together various recent studies that have used
colour index-based methods, and in particular focuses on
the comparative performance of these methods. The discussion is
constrained by the selection of colour indices that have been
selected by the authors of the different studies considered;
however, all of the colour index methods have been evaluated in
at least one study.

In previous studies, evaluation has generally been done by calculating
the mean and standard deviation of an appropriately defined
segmentation quality factor. A high mean value and a low standard
deviation of the segmentation quality factor correspond to high
segmentation performance; a value of 1 for the mean, and 0 for the
standard deviation, represents perfect plant segmentation. Meyer and
Camargo-Neto (2008) compared three green indices, namely, Excess
Green minus Excess Red Index (ExGR), Excess Green Index (ExG), and
Normalised Difference Index (NDI). The segmentation quality was tested
and compared for both greenhouse and actual field images of soybean.
In addition, various backgrounds (bare soil, corn stalks, and wheat
residue) were considered. The segmentation quality for each applied
method was evaluated according to the approach described in Bhanu and
Jones (1993). The input parameters of the evaluation method are two
different binary images: one extracted manually using Photoshop as the
annotated "gold standard", and the other extracted by the colour
index-based method under evaluation. Thus, the evaluation method
measures the segmentation accuracy based on how similar the segmented
image is to the annotated image, and gives the result as a ratio of
correct classification of pixels. The greenhouse sets were analysed
using ExGR with a fixed threshold of zero, NDI with a threshold of
zero, and ExG with a threshold calculated according to Otsu's method
(Otsu, 1979). The field images were examined with ExGR with a
threshold of zero, NDI with an Otsu threshold, and ExG with an Otsu
threshold. The mean and standard deviation of the quality factor for
the colour indices considered for the greenhouse and field sets are
shown in Figs. 2 and 3 respectively.

From Fig. 2, it can be seen that ExGR with a zero threshold presented
the best segmentation performance, with a mean quality factor of
almost 90% and low standard deviation, while the segmentation quality
for ExG + Otsu and NDI + Otsu was quite similar, around 50%. According
to the results in Fig. 3, the performance for ExGR and ExG + Otsu was
similar at approximately 90%, while NDI + Otsu had the lowest
performance.

Overall, ExGR demonstrated very good segmentation quality for the
images that were taken under different environments (greenhouse and
field conditions), with various backgrounds. In addition, ExGR was
superior in plant separation over ExG and NDI. ExG also performed well
in segmentation of plants in field conditions. NDI gave the lowest
accuracy.

[Fig. 4. Comparison of mis-classification of CIVE and ExG for GV and GVS image types (Zheng et al., 2009). Chart data (Min, Med, Max): GV: CIVE 1.87%, 6.34%, 18.50%; ExG 3.35%, 13.14%, 17.82%. GVS: CIVE 30.21%, 65.58%, 70.26%; ExG 18.32%, 50.69%, 61.09%.]

[Fig. 6. The average segmentation quality for CIVE, ExGR, and NDI (Zheng et al., 2010). Chart data (mean segmentation rate): CIVE 81.41%, ExGR 80.92%, NDI 80.71%.]

Zheng et al. (2009) proposed an algorithm using a Mean-Shift method
and Back Propagation Neural Network (MS-BPNN) to improve the
segmentation quality of plants. To assess the performance of the
algorithm, two index-based methods (ExG and CIVE) were used as a
benchmark. The study was conducted in outdoor environments, including
a variety of plant species, different
illuminations, and different soil types. Before evaluating the
segmentation performance, each region of size 2 × 2 pixels in the test
image was labelled by hand with '1' for green regions and '0' for
background, and then compared with the segmented image. The
segmentation performance was assessed based on the mis-segmentation
rate for green and background regions; in particular, the minimum
(Min), median (Med), and maximum (Max) values were evaluated.
Moreover, the mean running time per image (T) was used to evaluate the
speed of the MS-BPNN method. Four different image types were tested,
including: green vegetation with shadow (GVS), green vegetation
without shadow (GV), non-green vegetation with shadow (NGVS), and
non-green vegetation without shadow (NGV).

For GV and GVS images, the MS-BPNN method gave the lowest Min, Med,
and Max mis-segmentation rates, for GV: 0.19%, 2.53%, and 5.34%
respectively, and for GVS: 15.72%, 18.23%, and 34.13% respectively. Of
interest here are the performance figures for CIVE and ExG shown in
Fig. 4.

CIVE has a lower mis-segmentation rate for GV images than ExG. Both
ExG- and CIVE-based methods have very high mis-segmentation rates for
GVS images. In general, ExG is more accurate for extracting plants
with shadow than CIVE. For NGV images, the MS-BPNN method gives Min,
Med, and Max mis-segmentation values of 2.12%, 3.81%, and 5.26%
respectively. These values are higher than those for NGVS, where the
algorithm gives values of 0.12%, 1.28%, and 4.93%. Again, what is of
interest here is the performance for the colour indices considered;
the mis-classification performance for CIVE and ExG for NGV and NGVS
images is shown in Fig. 5.

ExG demonstrated good segmentation results for NGVS images, whereas
CIVE exhibited poorer performance. In addition, ExG showed a lower
mis-segmentation rate for NGV than the proposed MS-BPNN method.

Of further interest is the fact that although Zheng's proposed method
demonstrated better segmentation performance overall than both ExG and
CIVE, it gave the longest average computing times for both types of
tested images: 91.9 s and 10.8 s. Comparing the mean running times for
the colour index-based methods, ExG was slightly faster than CIVE for
both types of tested images: (3.8 s and 0.5 s) versus (3.9 s and
0.6 s) respectively.

Zheng et al. (2010) introduced another method to improve the quality
of crop image segmentation. The proposed method was based on the
combination of two methods: one based on Mean Shift (MS) and another
based on Fisher Linear Discriminant (FLD). The images that were used
in the study were taken from different soybean fields, under actual
field conditions, and at different times of day. As a benchmark, three
colour index-based methods (NDI, ExGR, and CIVE) were compared with
the MS-FLD method to evaluate its performance. The method of Otsu was
used to determine a threshold for binarisation of all colour index
images. In addition, an averaging filter was used to remove noise. To
evaluate the performance of the MS-FLD method and the three colour
index-based methods, each of the test images was labelled manually
with '1' (white) for the green region, and with '0' (black) for the
background region. The study showed that MS-FLD obtained the highest
average segmentation rate, at 97.98%. The performance of the colour
index methods (in terms of the average value of correct segmentation
rate) is presented in Fig. 6.

It can be seen that all three colour indices performed well, and
average performance is quite high at approximately 80%. However,
according to Zheng, the colour index-based methods were not stable for
all tested images; some resulting images showed that NDI and CIVE gave
better segmentation than ExGR, whereas others showed that ExGR
produced better segmentation than NDI and CIVE.

Again, it is of interest to compare the computation time of the MS-FLD
method with the simpler colour index methods. While MS-FLD
demonstrated better segmentation performance than the colour
index-based methods, its average running time was higher than that
obtained by the vegetation index-based methods: 3.3906 s and 0.0156 s
(averaged over the three colour indices), respectively.

Guijarro et al. (2011) tested four green colour indices (ExG, CIVE,
ExGR, and VEG) individually and simultaneously (using the COM1
combined index described above) to assess their performance for better
automatic segmentation of plants. As noted earlier, the study by
Guijarro found that when used individually, these indices may create
either over-segmentation or under-segmentation results; when combined
through COM1, these problems can be overcome. The study was conducted
in barley and corn fields under various illumination conditions. Two
scenes were taken into account: one scene contained plants and soil
without sky, and the other contained plants, soil, and sky. The
combination method was proposed to increase the contrast between plant
and soil, so that the probability of distinguishing between plant and
background is increased. In order to accomplish this goal, the
contrast was measured based on the grey level histogram (minimum
uniformity). The uniformity was computed for each green image (U_Gk),
and a weight was obtained for each one (W_Gk), where k = {ExG, CIVE,
ExGR, VEG}. In addition, the combined greenness (G) was computed, and
the mean threshold was chosen instead of the Otsu threshold to
separate the plant region from the background. The average error in
pixel classification for segmentation of green areas for each colour
index-based method is displayed in Fig. 7.

The results of this study showed that the combination method
(COM1) provided the lowest percentage of average error for

[Fig. 7. Comparison of average error of greenness segmentation for COM1, CIVE, ExGR, ExG, and VEG (Guijarro et al., 2011). Chart data (mean average error): COM1 8.31%, CIVE 10.37%, ExGR 10.71%, ExG 11.10%, VEG 18.23%.]

[Fig. 8. Comparison of the mean and standard deviation of vegetation extraction for ExG, ExGR, VEG, and CIVE. SD is indicated by error bar in the plot (Yu et al., 2013a). Chart data (mean): ExG 89.98%, ExGR 87.79%, VEG 87.43%, CIVE 68.93%.]

greenness segmentation, whereas VEG showed the highest. CIVE, ExGR,
and ExG each gave similar values of average error.

Guijarro calculated the average weight of the four selected indices
over the set of 240 images to find out their contributions to the
average percentage error. The average weights were: 0.12, 0.25, 0.30,
and 0.33 for W_G(VEG), W_G(ExG), W_G(ExGR), and W_G(CIVE)
respectively. He found that there was an inverse correlation between
the obtained average weights and the percentage of error for
greenness. For example, CIVE was given the highest average weight
(0.33) and resulted in the lowest average percentage error, whereas
VEG was given the lowest average weight (0.12) and caused the highest
percentage error of the methods considered.

In conclusion, if the colour indices are applied simultaneously, they
produce better greenness segmentation quality than when they are
applied separately. CIVE's contribution to the combined method is
greater than that of any other method, while VEG's was the lowest.
Both ExG and ExGR have nearly the same contribution.

[Fig. 10. Comparison of the segmentation quality (Qseg & Sr) of plant extraction for ExGR, MExG, and ExG for two different data sets under sunny conditions (Guo et al., 2013). Chart data (Qseg, Sr): Data-2011: ExGR 71.40%, 75.40%; MExG 69.50%, 73.00%; ExG 54.20%, 56.00%. Data-2012: ExGR 68.20%, 76.70%; MExG 63.10%, 69.40%; ExG 49.50%, 52.50%.]
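As a rough illustration of the combination scheme described above, the sketch below forms a weighted sum of per-index greenness images and binarises the result with the mean threshold. It is a minimal sketch, not code from Guijarro et al.: images are plain nested lists, the function names are illustrative, and in the actual study the weights are derived from histogram uniformity rather than supplied by hand.

```python
def combined_greenness(index_images, weights):
    """Pixel-wise weighted sum of greenness images (a COM1-style combination)."""
    rows, cols = len(index_images[0]), len(index_images[0][0])
    combined = [[0.0] * cols for _ in range(rows)]
    for image, w in zip(index_images, weights):
        for i in range(rows):
            for j in range(cols):
                combined[i][j] += w * image[i][j]
    return combined

def mean_threshold(image):
    """Binarise a greenness image using its mean grey level as the threshold."""
    values = [v for row in image for v in row]
    t = sum(values) / len(values)
    return [[1 if v > t else 0 for v in row] for row in image]

# Two toy 2 x 2 greenness images, equally weighted.
g1 = [[1.0, 0.0],
      [0.0, 1.0]]
g2 = [[1.0, 1.0],
      [0.0, 0.0]]
combined = combined_greenness([g1, g2], [0.5, 0.5])  # [[1.0, 0.5], [0.0, 0.5]]
print(mean_threshold(combined))                       # [[1, 0], [0, 0]]
```

In the study itself there would be four input images (ExG, CIVE, ExGR, VEG), with weights along the lines of the averages reported above (e.g. 0.33 for CIVE, 0.12 for VEG).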
There was one primary disadvantage associated with the combined
method: increased computational time.

Yu et al. (2013a) proposed a new method for crop segmentation based on
colour segmentation, called Affinity Propagation-Hue Intensity
(AP-HI). Five other algorithms were compared with it to judge its
performance. Among these, three colour index methods, namely ExG,
CIVE, and ExGR, were used with an Otsu threshold, and a fourth (VEG)
was used with the mean threshold method. The fifth method was a
supervised learning algorithm called the Environmentally Adaptive
Segmentation Algorithm (EASA) (Tian and Slaughter, 1998). Two
experiments were carried out in two maize fields in China under
different circumstances to identify the growth stages of maize. The
image samples were acquired under various illumination conditions such
as overcast, cloudy, and sunny days. Difficult backgrounds such as
shadow, straws, pipes, and other equipment were included. The
efficiency of each algorithm was evaluated by computing the mean and
standard deviation (SD) of the quality factor defined in Xiao et al.
(2011), based on mis-classification error.

The results of the study showed that AP-HI gave the highest
performance, at 96.68%. The performance of EASA was in second place;
it outperformed the colour index-based algorithms with a mean plant
extraction rate of 93.20%. The performance of the colour index
approaches can be seen in Fig. 8.

It can be seen that ExG demonstrated the highest mean greenness
segmentation over the remainder of the selected colour indices,
whereas CIVE showed the lowest, at 68.9%. ExGR and VEG showed similar
performance.

In conclusion, all colour indices demonstrated good adaptability in
conditions of changing illumination (up to a certain degree of
illumination) and complex environments; however, CIVE did not perform
as well as the other indices.

The AP-HI method was effective in dealing with various environmental
conditions and complex backgrounds up to a certain degree. However,
this method has limitations, especially during daylight, where some
surfaces of the maize leaves acted like mirrors and reflected light.

Guo et al. (2013) introduced a new approach called the Decision Tree
based Segmentation Model (DTSM) for effective segmentation of
vegetation from plant images. The study was conducted in wheat fields
in Japan and the test images were taken under various light
conditions. Three colour indices (ExG, MExG, and ExGR) were used as a
benchmark to evaluate the performance of the proposed method. The
accuracy of the segmentation methods was assessed by the same method
as used in Meyer and Camargo-Neto (2008). Moreover, two measures of
segmentation quality were adopted in the study: one was based on both
plant and background regions (including plant and background pixels),
denoted Qseg, and another was based only on the plant region
(including only plant pixels), denoted Sr. The training process was
carried out based on acquired images which were taken over a period of
two years, under different illumination conditions (sunny and
non-sunny), in 2011 and in 2012 (henceforth referred to as Data-2011
and Data-2012). Otsu's method was used with ExG and ExGR images for
thresholding, while a zero threshold was applied with MExG.
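The two measures can be read as simple pixel counts against a hand-labelled reference mask: Qseg over all pixels, Sr over the plant region only. The sketch below is one plausible implementation of that reading, not the exact definitions from Guo et al. (2013); masks are nested lists with 1 for plant and 0 for background.

```python
def qseg_and_sr(reference, segmented):
    """Qseg-like accuracy over all pixels, and Sr-like accuracy over the
    plant region only, for two same-sized binary masks (1 = plant)."""
    total = correct = plant_total = plant_correct = 0
    for ref_row, seg_row in zip(reference, segmented):
        for ref, seg in zip(ref_row, seg_row):
            total += 1
            correct += (ref == seg)          # both classes count towards Qseg
            if ref == 1:                     # Sr looks at plant pixels only
                plant_total += 1
                plant_correct += (seg == 1)
    qseg = correct / total
    sr = plant_correct / plant_total if plant_total else 0.0
    return qseg, sr

reference = [[1, 1, 0],
             [0, 1, 0]]
segmented = [[1, 0, 0],   # one plant pixel missed...
             [0, 1, 1]]   # ...and one background pixel mislabelled
qseg, sr = qseg_and_sr(reference, segmented)
print(round(qseg, 3), round(sr, 3))  # 0.667 0.667
```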

[Fig. 9. Comparison of the segmentation quality (Qseg & Sr) of plant extraction for ExGR, MExG, and ExG for two different data sets under non-sunny conditions (Guo et al., 2013). Chart data (Qseg, Sr): Data-2011: ExGR 74.50%, 78.40%; MExG 79.50%, 84.40%; ExG 66.30%, 69.40%. Data-2012: ExGR 74.60%, 83.60%; MExG 73.80%, 81.17%; ExG 59.20%, 63.30%.]

[Fig. 11. Comparison of the mean and standard deviation of segmentation for ExGR, ExG + Otsu, and CIVE based on the ATRWG metric. SD is indicated by error bar in the plot (Bai et al., 2013). Chart data (mean): ExGR 62.00%, ExG + Otsu 74.00%, CIVE 77.10%.]
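Several of the benchmarks compared in this section binarise a colour-index image either with a zero threshold or with Otsu's method (e.g. the "ExG + Otsu" combination above). The sketch below, written against plain Python lists rather than any particular image library, shows ExG on normalised chromatic coordinates together with a standard histogram-based Otsu threshold; it illustrates the general technique and is not code from any of the studies surveyed.

```python
def excess_green(R, G, B):
    """ExG = 2g - r - b on chromatic coordinates r + g + b = 1."""
    s = R + G + B
    if s == 0:
        return 0.0
    r, g, b = R / s, G / s, B / s
    return 2 * g - r - b

def otsu_threshold(values, levels=256):
    """Otsu's method: pick the grey level that maximises the between-class
    variance of the histogram (equivalently, minimises the weighted
    within-class variance). `values` are integers in [0, levels)."""
    hist = [0] * levels
    for v in values:
        hist[v] += 1
    total = len(values)
    sum_all = sum(i * hist[i] for i in range(levels))
    w_bg = 0          # background pixel count so far
    sum_bg = 0.0      # background grey-level sum so far
    best_t, best_var = 0, -1.0
    for t in range(levels):
        w_bg += hist[t]
        if w_bg == 0:
            continue
        w_fg = total - w_bg
        if w_fg == 0:
            break
        sum_bg += t * hist[t]
        mean_bg = sum_bg / w_bg
        mean_fg = (sum_all - sum_bg) / w_fg
        between = w_bg * w_fg * (mean_bg - mean_fg) ** 2
        if between > best_var:
            best_var, best_t = between, t
    return best_t

print(excess_green(0, 255, 0))                 # pure green pixel -> 2.0
print(otsu_threshold([10] * 50 + [200] * 50))  # splits the two modes at 10
```

With the convention above, pixels whose grey level is at or below the returned threshold fall in the background class; a zero-threshold scheme simply keeps pixels with a positive index value instead.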

DTSM outperformed the three colour indices on segmentation quality.
The mean value of Qseg was 80.6% for the Data-2011 data set and 76.7%
for the Data-2012 data set. In addition, DTSM gave the best mean green
segmentation quality (Sr) compared to the colour index methods, with
83.3% for Data-2011 and 83.1% for Data-2012. The Qseg and Sr values
for the applied colour indices under non-sunny and sunny conditions
are presented in Figs. 9 and 10 respectively.

According to the results presented in Fig. 9, the means of Qseg and Sr
for ExGR for the Data-2012 set are slightly higher than those of MExG,
whereas the means of Qseg and Sr of MExG were higher than those of
ExGR for the Data-2011 data set. The ExG method produced the lowest
segmentation quality (Qseg & Sr) compared to ExGR and MExG in both
years.

According to the results presented in Fig. 10, the means of Qseg and
Sr for ExGR in both data sets under sunny conditions were higher than
those of the other colour indices.

In conclusion, the three colour indices considered have better
segmentation qualities for Sr than Qseg under both conditions.
Comparing the results in Fig. 9 to those in Fig. 10, the colour
indices performed better quality segmentation under non-sunny
conditions than under sunny conditions. This suggests that colour
index methods may in general perform more poorly under sunny
conditions. The advantage of the DTSM algorithm proposed in Guo et al.
(2013) is that no threshold adjustments are required for plant
segmentation, unlike colour index methods. However, a disadvantage of
DTSM is that it relies on training data.

Bai et al. (2013) introduced a new method for crop segmentation based
on the CIE Lab colour space, using morphological modelling. The study
was conducted in a rice paddy field in China, and the images were
taken under various conditions for different growth stages of the rice
plant. To verify the robustness of crop segmentation using the method
under complex illumination, it was compared with six plant
segmentation methods, which included: three colour index-based methods
(ExG with Otsu threshold, ExGR, and CIVE); the Environmentally
Adaptive Segmentation Algorithm (EASA) (Tian and Slaughter, 1998); a
colour image segmentation method using a Genetic Algorithm with the
HSI colour space (GAHSI) (Abbasgholipour et al., 2011); and a colour
image segmentation method based on Affinity Propagation-Hue Intensity
(AP-HI) (Yu et al., 2013a). Two well-known skin segmentation methods
(segmenting colour pixels as either skin or non-skin classes) were
also applied: Gaussian Mixture Modeling (GMM) (Bergasa et al., 2000;
Jones and Rehg, 2002) and the Hue–Saturation–Intensity and B-Spline
curve fitting method (HSI&B-Spline) (Kim et al., 2008). Two approaches
were used to measure the segmentation quality for the applied methods:
one defined by ATRWG (Neto, 2004), and the other as given in Xiao
et al. (2011).

The segmentation performance of the referred methods was evaluated in
three ways. Firstly, the test images were taken under different light
conditions and the ATRWG metric was used to measure performance. The
study showed that Bai's algorithm demonstrated the highest
segmentation performance, with a mean value of 87.2% and standard
deviation of 3.8%. The second highest segmentation quality was given
by the GMM method, with a mean of 83.9% and standard deviation of
7.2%. The performance of the three colour indices considered is shown
in Fig. 11.

It can be seen that CIVE demonstrated the highest mean segmentation
quality, whereas ExGR gave the lowest. The method of ExG & Otsu also
demonstrated good segmentation quality.

Secondly, the test images were sorted based on their imaging
conditions: cloudy, overcast, and sunny. Each set of images was
evaluated separately using the ATRWG metric. The experiment showed
that Bai's method also gave better segmentation quality under
different sky conditions than the other methods, with mean
segmentation quality of 85.7%, 86.0%, and 88.6% for cloudy, overcast,
and sunny conditions, respectively. The segmentation quality
performance for ExGR, ExG, and CIVE is displayed in Fig. 12.

As can be seen from Fig. 12, the best overall segmentation quality
under the three conditions was obtained by CIVE, whereas the worst was
obtained by ExGR. The ExG method demonstrated reasonably good
segmentation quality under all three conditions.

[Fig. 13. Comparison of the mean and standard deviation of segmentation for ExGR, ExG + Otsu, and CIVE based on the evaluation method defined in Xiao et al. (2011). SD is indicated by error bar in the plot (Bai et al., 2013). Chart data (mean): ExG + Otsu 91.80%, ExGR 80.60%, CIVE 94.30%.]
[Fig. 12. Comparison of the mean plant extraction for ExGR, ExG + Otsu threshold, and CIVE under cloudy, overcast, and sunny conditions based on the ATRWG metric (Bai et al., 2013). Chart data (cloudy, overcast, sunny): CIVE 83.10%, 74.50%, 74.20%; ExG + Otsu 77.70%, 69.00%, 73.50%; ExGR 62.40%, 58.00%, 63.20%.]

[Fig. 14. Comparison of mean and standard deviation of plant extraction for ExGR and ExG + Otsu. SD is indicated by error bar in the plot (Bai et al., 2014). Chart data (mean): ExGR 62.30%, ExG + Otsu 76.20%.]

Thirdly, the test images were taken under different lighting
conditions and the method defined in Xiao et al. (2011) was used to
measure performance. In this evaluation, the proposed method improved
the mean segmentation quality, reaching up to 96.0% with a standard
deviation of only 1.5%. The EASA, GAHSI, AP-HI, GMM, and HSI&B-Spline
methods gave mean segmentation qualities of 93.9%, 92.4%, 92.5%,
95.2%, and 87.7% respectively. The performance of the three colour
indices considered is shown in Fig. 13.

It can be seen that CIVE demonstrated the highest mean segmentation
quality among the applied colour indices; CIVE also gave better
performance than the other methods such as EASA, GAHSI, AP-HI, and
HSI&B-Spline. ExGR gave the lowest mean segmentation quality. The
method of ExG with Otsu threshold also demonstrated very good
segmentation quality, at 91.80%.

Bai et al. (2014) proposed a new plant segmentation approach based on
the Lab colour space and a clustering method, namely Particle Swarm
Optimization (PSO) based k-means. The images that were used in the
study were captured under real conditions, in rice and cotton fields.
Three segmentation approaches (ExG and Otsu, ExGR, and EASA) were used
for benchmark purposes. Also, two methods that had been previously
applied for segmenting human skin (GMM and ColourHist) were applied to
assess the performance of Bai's method. The ATRWG method was applied
to evaluate the quality of segmentation for each segmented image, and
means and standard deviations of the segmentation accuracies were
calculated. According to the results of the study, Bai's method
obtained the highest segmentation quality of all, achieving 88.1% for
the mean and 4.7% for the standard deviation.

The GMM method demonstrated very good performance, close to Bai's
method, with a mean of 86.9% and standard deviation of 6.9%. The
ColourHist method demonstrated good performance, with 82.1% for the
mean and 6.4% for the standard deviation. The EASA method also
provided good segmentation results, with a mean of 80.2% and standard
deviation of 7.8%. For the two colour index methods considered, ExGR
and ExG with Otsu threshold, the means and standard deviations are
shown in Fig. 14.

As can be seen from Fig. 14, ExG with Otsu threshold demonstrated a
higher mean segmentation quality than ExGR.

The performance of the colour index-based methods was poorer than the
other algorithms in the study. Bai et al. suggested this poor
performance was because ExGR and ExG with Otsu threshold usually
resulted in over-segmentation or under-segmentation. A disadvantage of
Bai's method is that it requires a number of processing steps, which
may affect real time application.

Torres-Sánchez et al. (2014) measured the accuracy of vegetation
fraction (VF) mapping for wheat fields at different numbers of growing
days after sowing, from 35 to 75. The images of the fields were taken
by a camera mounted on a UAV at different flight altitudes (30 m,
60 m). Six colour indices (ExG, ExGR, CIVE, the Woebbecke Index (WI)
as given in Eq. (4) (Woebbecke et al., 1995), NGRDI, and VEG) and two
combined colour indices, COM1 (Guijarro et al., 2011) and COM2
(Guerrero et al., 2012), were applied to evaluate the VF mapping. The
VF is the percentage of pixels classified as vegetation in a given
area. The mean accuracy (A) and standard deviation (SD) were
calculated for every index based on three factors: threshold, flight
date, and altitude. In addition, the coefficient of variation was
calculated to get the best average accuracies of every vegetation
index along the six tested flight dates. According to the results of
the study, the highest mean accuracy was obtained from the images that
were captured at 30 m flight altitude, so only results obtained at
that altitude are considered here. The mean and standard deviation of
all colour indices are shown in Fig. 15.

ExG gave the highest mean accuracy over the other Vegetation Index
(VI) methods, at 90.20%; however, most of the other indices gave quite
similar levels of performance. CIVE had the lowest mean accuracy of
the VI methods, at 77.16%.

[Fig. 15. Comparison of the mean and standard deviation of plant extraction for ExG, VEG, COM1, COM2, ExGR, NGRDI, WI, and CIVE for images captured at 30 m flight altitude (Torres-Sánchez et al., 2014). Chart data (mean): ExG 90.20%, VEG 89.65%, COM1 89.19%, COM2 88.09%, ExGR 89.09%, NGRDI 89.40%, WI 85.73%, CIVE 77.16%.]

[Fig. 16. Comparison of mean and standard deviation of crop extraction for ExG, NDI, VEG, and CIVE. SD is indicated by error bar in the plot (Ye et al., 2015). Chart data (mean): ExG 73.93%, NDI 79.35%, VEG 75.92%, CIVE 86.84%.]

Ye et al. (2015) introduced a novel method to improve the quality of
crop image extraction under strong illumination conditions such as
shadow and highlighted regions due to sunshine. Ye suggested reasons
for misclassification of crop extraction under a variety of
illumination conditions such as cloudy, sunny, and over-sunny weather.
In cloudy weather, there are two factors that cause soil pixels to be
classified as crop. One is the reduction in the red component in the
image because of lack of illumination; the other is that the colour of
the soil is close to dark green. In sunny weather, shadows generated
depending on the relative position of the sun and the object cause
classification of shadow pixels as plant pixels. In over-sunny
weather, the dense sunshine produces specular reflection (white light
spots) on the leaf or soil, which leads to mis-classification of those
pixels. Ye proposed a segmentation method based on a Probabilistic
Superpixel Markov Random Field (PFMRF). This was based on the
assumption that hue intensity changes gradually between highlighted
areas of crops and neighbouring non-highlighted areas.

The images that were used in the experiment were taken from two
different crops (cotton and corn) at different stages of growth, under
actual field conditions including on dark and bright days. To evaluate
the performance of the PFMRF method, seven common algorithms were
selected for comparison. Among them, four colour index-based methods
(ExG, NDI, VEG, and CIVE) were applied. In addition, two
learning-based segmentation methods (EASA and HI-AP) were applied. The
Hue Intensity and Probabilistic Super-Pixel Markov Random Field
(HI-MRF) method proposed by Yu et al. (2013b)

Table 1
Comparison of plant segmentation methods based on colour indices.

Woebbecke et al. (1992): NDI (Normalised Difference Index)
Advantages: (1) Easy to compute. (2) Somewhat robust to lighting, except for extreme values.
Disadvantages: (1) Does not perform well when the light is very high or very low. (2) Many false positives.

Woebbecke et al. (1995): ExG (Excess Green Index)
Advantages: (1) Easy to compute. (2) Widely used. (3) Low sensitivity to background errors and lighting conditions. (4) Showed good adaptability in outdoor environments.
Disadvantages: (1) Does not perform well when the light is high or low.

Meyer et al. (1998): ExR (Excess Red Index)
Advantages: (1) Easy to compute. (2) Although it relies only on the red component, it still extracts green pixels. (3) Segments soil texture.
Disadvantages: (1) Does not perform well when the light is high or low. (2) It is not as accurate as ExG.

Kataoka et al. (2003): CIVE (Colour Index of Vegetation Extraction)
Advantages: (1) Low running time. (2) Showed good adaptability in outdoor environments.
Disadvantages: (1) Performs poorly when light is weak or strong. (2) Has poor adaptability with shadow.

Neto (2004): ExGR (Excess Green minus Excess Red Index)
Advantages: (1) Showed good adaptability in outdoor environments. (2) Can do two tasks: extracting green by ExG and eliminating background noise by ExR.
Disadvantages: (1) Does not perform well when the light is high or low. (2) Segments shadow pixels as plants (over-segmentation).

Hunt et al. (2005): NGRDI (Normalised Green–Red Difference Index)
Advantages: (1) Reduces the differences in exposure settings selected by the digital camera. (2) Consists of two components (Eq. (8)): one is used to discriminate between green plants and soil, and the other is used to normalise for variations in light intensity between different images.
Disadvantages: (1) Does not perform well when the light is high or low. (2) Limited use.

Hague et al. (2006): VEG (Vegetative Index)
Advantages: (1) Invariant to the colour temperature of a black body illuminant. (2) Insensitive to the amplitude of the illumination. (3) Requires a single threshold.
Disadvantages: (1) Does not perform well when the light is high or low. (2) Complex to implement.

Guijarro et al. (2011): COM1 (Combined ExG, ExGR, CIVE, and VEG indices)
Advantages: (1) Showed very good adaptability in outdoor environments.
Disadvantages: (1) Increase of computational time. (2) Does not perform well when the light is high or low. (3) Segments shadow as part of the plant because of CIVE.

Burgos-Artizzu et al. (2011): MExG (Modified Excess Green Index)
Advantages: (1) Showed very good adaptability in outdoor environments.
Disadvantages: (1) Does not perform well when the light is high or low.

Guerrero et al. (2012): COM2 (Combined ExG, CIVE, and VEG indices)
Advantages: (1) Showed very good adaptability in outdoor environments.
Disadvantages: (1) Increased computational time, but less than COM1. (2) Does not perform well when the light is high or low.
was also applied. A performance measure (k) (Xiang and Tian, 2011)
based on the misclassification error was used.

The results of the study showed that the proposed PFMRF method gave
the highest performance of all the applied algorithms, with a mean of
92.29% and the lowest SD, at 4.65%. The performance of HI-AP was in
second place, at 88.52%, while the performance of EASA was almost the
same, at 88.42%. The performance of HI-MRF was also high, at 87.74%.
The performance of the four colour index-based methods can be seen in
Fig. 16.

As can be seen from Fig. 16, CIVE demonstrated the best performance
among the colour index-based methods, at 86.8%, whereas ExG showed the
lowest. Both NDI and VEG demonstrated good performance. All colour
index-based methods demonstrated good adaptability to changing light,
but failed when shadow and highlight conditions occurred.

A summary of the established colour index-based segmentation methods,
highlighting their primary advantages and disadvantages, is presented
in Table 1.

5. Other segmentation approaches

The previous two sections have considered colour index-based
segmentation methods in some detail, including their performance. This
section briefly discusses some of the other segmentation approaches
that have been recently proposed, in particular those based on
thresholding and machine learning.

5.1. Threshold-based approaches

Threshold techniques that are applied in plant/weed detection based on
image segmentation have generally assumed a two-class problem, namely,
a plant vegetation class and a soil background class. Thresholding is
generally applied to a transformation of the original image in order
to determine the class; for example, many of the colour index-based
approaches considered earlier used either a zero threshold or a
threshold based on Otsu's method. However, other more sophisticated
approaches for threshold selection exist. Choosing the proper
threshold plays an important role in segmentation. For example, if the
threshold value is set too high, some important regions (plant pixels)
may be merged with other regions (background pixels), which leads to
under-segmentation, while a threshold that is set too low may lead to
over-segmentation. Thus, numerous researchers have applied different
threshold techniques to address these problems. These techniques are
given as follows. Dynamic thresholding was applied in Reid and Searcy
(1987). Hysteresis thresholding was applied in Marchant et al. (1998).
A fixed threshold is also a technique which was utilised in many
studies, such as Hemming and Rath (2001) and Aitkenhead et al. (2003).
Tellaeche et al. (2008) applied the entropy of a histogram to
E. Hamuda et al. / Computers and Electronics in Agriculture 125 (2016) 184–199 195
Table 2
Comparison of threshold-based segmentation methods.

Reid and Searcy (1987) — Dynamic threshold
  Description: thresholds are set dynamically according to local rather than global characteristics; the image is partitioned into sub-images of m x m pixels and a threshold is chosen for each sub-image.
  Advantages: (1) insensitive to shading or gradually changing illumination.
  Disadvantages: (1) increased computation time, since several steps are required.

Marchant et al. (1998) — Hysteresis threshold
  Description: uses two thresholds, high and low, creating three classes: below the low threshold (removed), above the high threshold (retained), and between the two (retained only if connected to a pixel above the high threshold).
  Advantages: (1) effective in handling overlap between the modes in the histogram of an image.
  Disadvantages: (1) various morphological operations were required to improve the segmentation, increasing computation.

Hemming and Rath (2001); Aitkenhead et al. (2003) — Fixed threshold
  Description: empirical threshold selection.
  Advantages: (1) simple.
  Disadvantages: (1) sensitive to light changes.

Tellaeche et al. (2008) — Entropy of a histogram
  Description: the threshold can be chosen from the peaks of the grey-level histogram of an image.
  Advantages: (1) easy to choose a threshold value when grey-level histograms are bimodal (plant and soil).
  Disadvantages: (1) hard to choose a threshold value when the peaks vary significantly in size and the distance between modes is relatively large.

Otsu (1979) — Otsu threshold
  Description: based on finding the threshold that minimises the weighted within-class variance.
  Advantages: (1) automatic; (2) widely used.
  Disadvantages: (1) can produce under-segmentation, i.e. some green pixels are not identified in some circumstances; (2) slower than the mean intensity method.

Gebhardt et al. (2006); Gebhardt and Kaühbauch (2007) — Homogeneity threshold
  Description: local homogeneity is calculated for each image pixel and used to obtain a homogeneity threshold value from which binary images are derived.
  Advantages: (1) helpful in recognising small objects; (2) since local information is considered, may be useful for addressing light changes.
  Disadvantages: (1) increased computation time, since several steps are required.

Kirk et al. (2009) — Automatic threshold
  Description: the threshold value is selected from Gaussian distribution functions of intensities; the Gaussian with the lower mean represents the soil and the one with the higher mean represents plant vegetation.
  Advantages: (1) good at handling light changes; (2) automatic.
  Disadvantages: (1) increased computation time, since several steps are required.

Jeon et al. (2011) — Automatic threshold
  Description: the threshold is determined by dividing the pixel distribution of the image into two groups at a pixel value between 1 and 255; the value that minimises the sum of the variances of the two groups is used as the threshold for each image.
  Advantages: (1) provides adaptive segmentation; (2) automatic.
  Disadvantages: (1) requires threshold adjustment to update the segmentation limit, especially for high plant density; (2) high computation time, since several steps are required.
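To make the first two Table 2 entries concrete, the block-wise dynamic threshold of Reid and Searcy (1987) and the two-level hysteresis rule of Marchant et al. (1998) can be sketched as follows. This is an illustrative NumPy sketch, not the cited authors' implementations; the block size, the per-block mean rule, and 4-connectivity are assumptions:

```python
import numpy as np

def dynamic_threshold(grey, m=8):
    """Binarise a greyscale (e.g. greenness) image block by block: the
    image is partitioned into m x m sub-images and each block is
    thresholded at its own mean, so a gradual illumination gradient
    does not shift one global threshold."""
    h, w = grey.shape
    out = np.zeros((h, w), dtype=bool)
    for r in range(0, h, m):
        for c in range(0, w, m):
            block = grey[r:r + m, c:c + m]
            out[r:r + m, c:c + m] = block > block.mean()
    return out

def hysteresis_threshold(grey, low, high):
    """Keep pixels above `high`; keep pixels in (low, high] only if they
    are 4-connected (directly or through other weak pixels) to a pixel
    above `high`; discard everything at or below `low`."""
    strong = grey > high
    weak = grey > low                     # includes the strong pixels
    keep = strong.copy()
    stack = list(zip(*np.nonzero(strong)))
    h, w = grey.shape
    while stack:                          # flood fill from strong seeds
        r, c = stack.pop()
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            rr, cc = r + dr, c + dc
            if 0 <= rr < h and 0 <= cc < w and weak[rr, cc] and not keep[rr, cc]:
                keep[rr, cc] = True
                stack.append((rr, cc))
    return keep
```

Hysteresis replaces the single hard decision of a fixed threshold with a connectivity test, which is how dim plant pixels that touch confidently classified ones can still be retained.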
Otsu's method is a threshold technique widely used in many image segmentation applications. According to a survey carried out by Sahoo et al. (1988) comparing the segmentation accuracy of nine threshold methods, Otsu's method demonstrated the highest accuracy of all. This has inspired numerous researchers to use it, particularly in plant and weed segmentation. Otsu's method was applied by Ling and Ruzhitsky (1996) to segment tomato seedlings from the background. It was also applied in Shrestha et al. (2004) to separate plant vegetation from the background; it was preferred over morphological dilation for removing noise pixels because it does not require as much computation as dilation operations. Gebhardt et al. (2006) and Gebhardt and Kaühbauch (2007) introduced an algorithm to segment weed leaves from grassland by converting RGB images into greyscale intensity images, calculating local homogeneity images, and obtaining a homogeneity threshold value from which binary images are derived; finally, morphological opening was used to eliminate the remaining blades of grass in the binary images. Kirk et al. (2009) introduced a new algorithm for pixel classification (plant or soil pixels) designed to work under a variety of illuminations. The algorithm is based on greenness and intensity values derived from a combination of the R and G pixel values, and an automatic threshold was applied based on the assumption of two Gaussian distribution functions of intensities: the Gaussian distribution with the lower mean represented the soil distribution and the one with the higher mean represented the plant vegetation distribution. Jeon et al. (2011) applied another threshold technique to automatically segment plant pixels from soil pixels based on a transformed RGB image (nearly a greyscale image).

Meyer and Camargo-Neto (2008) examined the segmentation quality of some colour indices using the automatic Otsu threshold and zero-threshold methods; in particular, ExG and NDI were tested with an Otsu threshold, and ExGR was tested with a zero threshold. The results showed that the fixed zero threshold was sufficient for binarisation of ExGR images, so Otsu's method was not required. Two different automatic threshold approaches were evaluated for vegetation segmentation in Guijarro et al. (2011): one was Otsu's method and the other was based on mean intensity. The results showed that the Otsu threshold produced under-segmentation, i.e. some green pixels were not identified; it was also slower than the mean intensity method. Therefore, the automatic threshold adjustment approach (the mean intensity value) was adopted in that study, as it produced fast and robust segmentation. On the other hand, the mean intensity value was not found suitable in Burgos-Artizzu et al. (2011) for binarising a grey image generated from the combined colour indices, because its value was lower than the threshold obtained with Otsu's method; the combination of Otsu's method and a morphological operation was therefore used instead.
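The two automatic rules compared by Guijarro et al. (2011) — Otsu's method, which maximises the between-class variance (equivalently, minimises the weighted within-class variance), and the faster mean-intensity rule — can both be written in a few lines. The sketch below is illustrative NumPy operating on an 8-bit greyscale or greenness image, not code from the cited studies:

```python
import numpy as np

def otsu_threshold(grey):
    """Return the 8-bit threshold maximising the between-class variance."""
    hist = np.bincount(grey.ravel(), minlength=256).astype(float)
    prob = hist / hist.sum()
    omega = np.cumsum(prob)                  # class-0 probability P(x <= t)
    mu = np.cumsum(prob * np.arange(256))    # cumulative mean
    mu_t = mu[-1]                            # global mean
    with np.errstate(divide="ignore", invalid="ignore"):
        sigma_b = (mu_t * omega - mu) ** 2 / (omega * (1.0 - omega))
    sigma_b = np.nan_to_num(sigma_b)         # guard empty classes (0/0)
    return int(np.argmax(sigma_b))

def mean_intensity_threshold(grey):
    """Faster alternative: threshold at the image's mean intensity."""
    return int(grey.mean())
```

Pixels above the returned value are then labelled as vegetation (`grey > t`); on a cleanly bimodal greenness histogram both rules land between the soil and plant modes.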
Table 3
Comparison of learning-based segmentation methods.

Tian and Slaughter (1998) — EASA (Environmentally Adaptive Segmentation Algorithm)
  Colour model: RGB. Task: detect plants.
  Advantages: (1) adapts to most daytime conditions in outdoor fields.
  Disadvantages: (1) only 45–66% of all the cotyledons were recognised under partially cloudy and overcast conditions; (2) requires sufficient training data to obtain good segmentation results.

Meyer et al. (2004) — FC (Fuzzy Clustering)
  Colour model: RGB. Task: extract the plant region of interest from ExG and ExR images.
  Advantages: (1) identifies green plants from soil and residue.
  Disadvantages: (1) when plant pixel coverage is less than 10% of the image, there is apparently not enough colour information to cluster the plant pixels.

Ruiz-Ruiz et al. (2009) — EASA (Environmentally Adaptive Segmentation Algorithm)
  Colour model: hue–saturation (HS) and hue only (H). Task: plant image segmentation under complex field conditions.
  Advantages: (1) reduced computation time; (2) more robust to a variety of illumination than the EASA of Tian and Slaughter (1998).
  Disadvantages: (1) not effective for segmenting plants at the early growing stage, when the cotyledons start to appear.

Zheng et al. (2009) — MS-BPNN (mean-shift algorithm with Back Propagation Neural Network)
  Colour model: RGB and HSI. Task: classify plant and non-plant regions.
  Advantages: (1) good segmentation performance under different illuminations.
  Disadvantages: (1) long run time; (2) low segmentation rate on green parts in shadow.

Zheng et al. (2010) — MS-FLD (mean-shift algorithm with Fisher Linear Discriminant)
  Colour model: LUV. Task: separate green from non-green vegetation.
  Advantages: (1) no longer suffers from a low segmentation rate on green parts in shadow.
  Disadvantages: (1) long run time.

Guerrero et al. (2012) — SVM (Support Vector Machines)
  Colour model: RGB. Task: classify masked (soil and other materials) and unmasked (plant) regions.
  Advantages: (1) able to identify plants (weeds and crops) even when they have been contaminated with materials coming from the soil, due to artificial irrigation or natural rainfall.
  Disadvantages: (1) relies on other steps (thresholding).

Guo et al. (2013) — DTSM (Decision Tree based Segmentation Model)
  Colour model: RGB. Task: segment the vegetation from the background.
  Advantages: (1) addresses illumination problems such as shadow and specularly reflected regions; (2) does not require a threshold adjustment for each image.
  Disadvantages: (1) relies on training data.

Yu et al. (2013a) — AP-HI (Affinity Propagation-Hue Intensity)
  Colour model: hue–intensity (HI). Task: separate crop pixels from the background under complex light conditions and complex environments.
  Advantages: (1) robust and not sensitive to challenging variation of outdoor luminosity and complex environmental elements.
  Disadvantages: (1) misclassifies highlighted regions on leaves.

Bai et al. (2013) — MM (Morphology Modelling)
  Colour model: Lab. Task: distinguish crop and background pixels under complex illumination conditions.
  Advantages: (1) robust to the variation of illumination in the field.
  Disadvantages: (1) despite utilising different sizes of structuring elements in the training phase, no significant improvement was obtained; the mean segmentation quality of MM was 87.2%.

Bai et al. (2014) — PSO-MM (Particle Swarm Optimisation clustering and Morphology Modelling)
  Colour model: Lab. Task: distinguish crop and background pixels under complex illumination conditions.
  Advantages: (1) robust to the variation of illumination in the field.
  Disadvantages: (1) long run time, as it depends on many processing steps.
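The fuzzy-clustering (FC) entry in Table 3 rests on the fuzzy c-means algorithm, in which every pixel receives a soft membership in each cluster rather than a hard label. The following is a minimal illustrative NumPy version, not Meyer et al.'s intensified variant; the fuzziness exponent m = 2 and random membership initialisation are assumptions:

```python
import numpy as np

def fuzzy_c_means(x, c=2, m=2.0, iters=50, seed=0):
    """Minimal fuzzy c-means on samples x of shape (n, d).

    Returns (centres, u): c cluster centres and a (c, n) membership
    matrix whose columns sum to 1. m > 1 controls partition fuzziness.
    """
    rng = np.random.default_rng(seed)
    u = rng.random((c, x.shape[0]))
    u /= u.sum(axis=0)                      # memberships sum to 1 per sample
    for _ in range(iters):
        um = u ** m
        centres = um @ x / um.sum(axis=1, keepdims=True)
        d = np.linalg.norm(x[None, :, :] - centres[:, None, :], axis=2)
        d = np.fmax(d, 1e-12)               # avoid division by zero
        inv = d ** (-2.0 / (m - 1.0))       # standard membership update
        u = inv / inv.sum(axis=0)
    return centres, u
```

Run on pixel colour vectors, the cluster whose centre is greenest is taken as the plant class; the soft memberships are what fail when plant coverage is very low, as noted in Table 3.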
The advantages and disadvantages of threshold-based approaches are summarised in Table 2.

5.2. Learning-based approaches

Although the colour-based approaches have demonstrated promising segmentation results, there are cases where they do not perform well, particularly in sunny and overcast conditions. As a result, several studies have investigated more sophisticated approaches, applying supervised and unsupervised machine learning, either with simple transformations of colour features such as HSI, LUV, and Lab or with colour indices, to extract the plant pixels from the background and to improve segmentation under a variety of illumination conditions. For instance, Meyer et al. (2004) applied an unsupervised learning approach called fuzzy clustering to extract the area of interest from ExG and ExR images. Several supervised learning approaches have also been proposed. Tian and Slaughter (1998) proposed the Environmentally Adaptive Segmentation Algorithm (EASA) and applied it to normalised RGB images of outdoor fields to detect plants. Later, Ruiz-Ruiz et al. (2009) applied the EASA with hue–saturation (HS), and with hue only (H), instead of the RGB colour space, to produce robust and fast plant image segmentation under complex field conditions. Zheng et al. (2009) proposed a supervised mean-shift algorithm with a Back Propagation Neural Network to classify images into plant and non-plant regions; the features used in the algorithm were the RGB and HSI colour spaces. Zheng et al. (2010) applied a supervised mean-shift algorithm, but with a Fisher Linear Discriminant, to separate green from non-green vegetation; the colour space used in the algorithm was LUV instead of RGB and HSI.
Table 4
Suggested segmentation algorithms for use in different conditions.

Colour index-based approach
  Relative complexity: simple. Real-time performance: effective. Accuracy: low if the light is strong or poor.
  Suitable application fields — cloudy: effective; overcast: poor segmentation results; sunny: poor segmentation results.
  Suggested algorithms: for a cloudy day, CIVE or COM1; for an overcast day, CIVE or ExGR; for a sunny day, COM2 or ExG, because both are good for addressing shadow.

Threshold-based approach
  Relative complexity: fairly simple. Real-time performance: somewhat effective. Accuracy: fairly good.
  Suitable application fields — cloudy: effective; overcast and sunny: threshold adjustments are required.
  Suggested algorithms: for a cloudy day, Otsu; for overcast and sunny days, dynamic threshold or homogeneity threshold.

Learning-based approach
  Relative complexity: complex. Real-time performance: expensive. Accuracy: high.
  Suitable application fields — cloudy: effective; overcast and sunny: several training steps are required.
  Suggested algorithms: for a cloudy day, EASA; for an overcast day, AP-HI; for a sunny day, DTSM, because it is good at addressing problems such as shadow and specularly reflected regions.
Support vector machines (SVM) were applied as the learning method by Guerrero et al. (2012) to classify between masked and unmasked plant regions. To address illumination problems such as shadow and specularly reflected regions, Guo et al. (2013) introduced a new learning approach based on a decision tree model to segment the plant region from the background in RGB images. The advantages and disadvantages of learning-based approaches are summarised in Table 3.

6. Discussion and conclusions

According to some of the studies considered above, colour index-based methods have some limitations: they may result in over-segmentation (excessive green) in one application and under-segmentation in another, especially when a single index is applied by itself. This varies considerably with imaging conditions, and the fact that the same test data are not used in all studies makes direct comparison more difficult. Few comparative studies have been carried out using a common set of test data; one relatively recent example was carried out by Meyer and Camargo-Neto (2008), comparing three green indices, namely ExGR, ExG, and NDI. The advantages and disadvantages of colour index-based methods can be summarised as follows:

Advantages:

- Simple methods that are easy to understand and implement.
- Their formulas are easy to modify to create a new colour index.
- They generally do not require training.
- They generally require little computation, which makes them suitable for real-time use.
- They are effective in normal conditions, where the light is neither very strong nor very poor.
- Some of the colour index-based methods have shown results that are comparable to other, more sophisticated methods (e.g. see the study by Bai et al. (2013)).

Disadvantages:

- They require threshold optimisation to meet the particular target for final segmentation.
- They generally cannot perform good segmentation when the light is strong or poor.
- They are only suitable for segmentation where the dominant plant colour is green.

Threshold-based methods require several adjustments under different lighting conditions; once a change occurs, the segmentation error may increase. Moreover, some threshold techniques may be suitable for one case but not for others.

The learning-based approaches demonstrated better performance than colour index-based methods under a variety of illumination conditions because they rely on a training phase, but this increases computation time, which is not desirable in real-time applications. Moreover, substantial training samples are required to obtain reliable segmentation results.

While good segmentation performance has been achieved with the methods considered, several challenges remain:

- Lighting conditions: cloudy, overcast, and sunny conditions affect segmentation quality. For example, when the light is strong, as on a sunny day, the surface of some leaf types, such as corn leaves, acts as a mirror (specular reflection); as a result, leaf pixels may be segmented into the wrong category.
- Shadow, whether cast by the plant itself or by other objects (cast shadow), may be extracted as foreground (plant vegetation); as a result, the mis-segmentation rate increases.
- Complex backgrounds (the scene of the image), including straw, stones, soil colour, water pipes, and other residues, can affect segmentation quality, particularly if a background element is green, such as a green pipe; as a result, it may be mis-segmented as plant.

These factors remain serious challenges for the available segmentation approaches. Therefore, further research is required to fully optimise computer vision technology for the complex conditions that may occur in commercial agricultural fields.

In addition to the development of specific algorithms for processing colour images, a number of studies have also considered other factors associated with acquiring images, and the issues that need to be considered in order to obtain good performance. Woebbecke et al. (1994) considered the detection of plants using a range of sensors (thermal and optical) and determined that the location and coverage of target plant components within the field of view of the sensor can significantly influence performance and must be taken into consideration. This work was extended by Criner et al. (1999), who specifically considered the detection of bindweed and determined the maximum field of view for a given target size on bare soil. Both of these studies used the Normalised Difference Vegetation Index (NDVI), the ratio of the difference between the near infra-red and red reflectance components to their sum. Criner et al. also emphasised the value of being able to configure the detection algorithm for specific conditions; for example, knowledge of field conditions or soil moisture can be used to adapt detection thresholds to maximise performance.
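The NDVI used in these sensor studies is a one-line computation on co-registered near infra-red and red bands. The sketch below is illustrative; the band arrays and the epsilon guard are assumptions, not details from Woebbecke et al. or Criner et al.:

```python
import numpy as np

def ndvi(nir, red, eps=1e-9):
    """NDVI = (NIR - R) / (NIR + R), in [-1, 1]; eps avoids divide-by-zero."""
    nir = nir.astype(float)
    red = red.astype(float)
    return (nir - red) / (nir + red + eps)
```

Vegetation reflects strongly in the near infra-red and absorbs red, so plant pixels push NDVI toward +1 while bare soil stays near zero; a field-specific cut-off (adapted, for instance, to soil moisture, as Criner et al. suggest) then separates the two classes.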
Later studies reflected increasing use of digital visible-spectrum cameras and relied on the indices based on the R, G and B channels and their derivatives discussed in this paper. Meyer et al. (2004) described a system based on the use of a number of clustering algorithms using colour indices; images were acquired using a camera that automatically set parameters such as focus, exposure time and white balance. More recent studies have also examined the specifics of the imaging sensor. For example, Dworak et al. (2013) used a low-cost single-chip camera, again using NDVI, and compared it to a much more expensive specialised imaging device; good performance was achieved by appropriately reconfiguring camera filters, coupled with algorithmic modifications. While the topic of optimal camera parameters (such as field of view) is beyond the scope of this survey, previous studies suggest that the ability to adapt detection algorithm parameters such as thresholds can provide an advantage in ensuring optimal performance.

By way of conclusion, Table 4 summarises the key conclusions from the review and, in particular, suggests specific algorithms that may perform well in particular conditions, based on analysis of their performance in studies from the literature.

Based on its prevalence in the literature, the survey focused on colour index-based methods. While performing well in their own right, these methods are also widely used as a reference to evaluate the performance of other proposed methods. A detailed discussion of the performance of colour index-based methods, based on a number of recent studies, was presented. Threshold-based approaches were briefly discussed and their advantages and disadvantages were presented. In addition, the advantages and disadvantages of learning-based segmentation methods were briefly considered. The challenges and limitations that continue to hold for segmentation approaches were also highlighted. Finally, suggested segmentation algorithms for use in different conditions were given.

References

Abbasgholipour, M., Omid, M., Keyhani, A., Mohtasebi, S.S., 2011. Color image segmentation with genetic algorithm in a raisin sorting system based on machine vision in variable conditions. Expert Syst. Appl. 38, 3671–3678.
Aitkenhead, M.J., Dalgetty, I.A., Mullins, C.E., McDonald, A.J.S., Strachan, N.J.C., 2003. Weed and crop discrimination using image analysis and artificial intelligence methods. Comput. Electron. Agric. 39 (3), 157–171.
Bai, X.D., Cao, Z.G., Wang, Y., Yu, Z.H., Zhang, X.F., Li, C.N., 2013. Crop segmentation from images by morphology modeling in the CIE L*a*b* color space. Comput. Electron. Agric. 99 (November), 21–34. http://dx.doi.org/10.1016/j.compag.2013.08.022.
Bai, Xiaodong, Cao, Zhiguo, Wang, Yu, Hu, Zhu, Zhang, Xuefen, Li, Cuina, 2014. Vegetation segmentation robust to illumination variations based on clustering and morphology modelling. Biosyst. Eng. 125 (September), 80–97. http://dx.doi.org/10.1016/j.biosystemseng.2014.06.015.
BCC Research Chemical Report, Jun 2014. CHM029E. Global Markets for Biopesticides. Available at: <http://www.bccresearch.com/market-research/chemicals/biopesticides-chm029e.html>.
Bergasa, L.M., Mazo, M., Gardel, A., Sotelo, M.A., Boquete, L., 2000. Unsupervised and adaptive Gaussian skin-color model. Image Vision Comput. 18, 987–1003.
Berge, T.W., Goldberg, S., Kaspersen, K., Netland, J., 2012. Towards machine vision based site-specific weed management in cereals. Comput. Electron. Agric. 81, 79–86.
Bhanu, B., Jones, T.L., 1993. Image understanding research for automatic target recognition. Aerospace Electron. Syst. (IEEE) 8 (10), 15–23.
Bridges, D.C., 1992. Crop Losses due to Weeds in the United States-1992. Weed Science Society of America, Champaign, IL.
Burgos-Artizzu, X.P., Ribeiro, A., Guijarro, M., Pajares, G., 2011. Real-time image processing for crop/weed discrimination in maize fields. Comput. Electron. Agric. 75 (2), 337–346.
Camargo, A., Smith, J.S., 2009. An image-processing based algorithm to automatically identify plant disease visual symptoms. Biosyst. Eng. 102, 9–21. http://dx.doi.org/10.1016/j.biosystemseng.2008.09.030.
Christensen, S., Søgaard, H.T., Kudsk, P., Nørremark, M., Lund, I., Nadimi, E.S., Jørgensen, R., 2009. Site-specific weed control technologies. Weed Res. 49, 233–241.
Criner, B.R., Solie, J.B., Stone, M.L., Whitney, R.W., 1999. Field-of-view determination for a bindweed detection sensor. Trans. ASAE 42 (5), 1485–1491.
Slaughter, D.C., Giles, D.K., Downey, D., 2008. Autonomous robotic weed control systems: a review. Comput. Electron. Agric. 61, 63–78.
Dew, D.A., 1972. An index of competition for estimating crop loss due to weeds. Can. J. Plant Sci. 52 (6), 921–927.
Donald, W.W., 2007. Between-row mowing systems control summer annual weeds in no-till grain sorghum. Weed Technol. 21 (2), 511–517.
Dworak, V., Selbeck, J., Dammer, K.H., Hoffmann, M., Zarezadeh, A.A., Bobda, C., 2013. Strategy for the development of a smart NDVI camera system for outdoor plant detection and agricultural embedded systems. Sensors (Switzerland) 13, 1523–1538. http://dx.doi.org/10.3390/s130201523.
Eyre, M.D., Critchley, C.N.R., Leifert, C., Wilcockson, S.J., 2011. Crop sequence, crop protection and fertility management effects on weed cover in an organic/conventional farm management trial. Eur. J. Agron. 34, 153–162.
Forcella, F., 2000. Rotary hoeing substitutes for two-thirds rate of soil-applied herbicide. Weed Technol. 14 (2), 298–303.
Gebhardt, S., Kaühbauch, W.A., 2007. A new algorithm for automatic Rumex obtusifolius detection in digital image using colour and texture features and the influence of image resolution. Precision Agric. 8 (1), 1–13.
Gebhardt, S., Schellberg, J., Lock, R., Kaühbauch, W.A., 2006. Identification of broadleaved dock (Rumex obtusifolius L.) on grassland by means of digital image processing. Precision Agric. 7 (3), 165–178.
Gianessi, L.P., Reigner, N.P., 2007. The value of herbicides in U.S. crop production. Weed Technol. 21 (2), 559–566.
Grichar, W.J., Colburn, A.E., 1993. Effect of dinitroaniline herbicides upon yield and grade of five runner cultivars. Peanut Sci. 20, 126–128.
Guerrero, J.M., Pajares, G., Montalvo, M., Romeo, J., Guijarro, M., 2012. Support vector machines for crop/weeds identification in maize fields. Exp. Syst. Appl. 39, 11149–11155.
Guijarro, M., Pajares, G., Riomoros, I., Herrera, P.J., Burgos-Artizzu, X.P., Ribeiro, A., 2011. Automatic segmentation of relevant textures in agricultural images. Comput. Electron. Agric. 75, 75–83. http://dx.doi.org/10.1016/j.compag.2010.09.013.
Guo, W., Rage, U.K., Ninomiya, S., 2013. Illumination invariant segmentation of vegetation for time series wheat images based on decision tree model. Comput. Electron. Agric. 96, 58–66.
Hague, T., Tillet, N., Wheeler, H., 2006. Automated crop and weed monitoring in widely spaced cereals. Precision Agric. 1 (1), 95–113.
Hall, M.R., Swanton, C.J., Anderson, G.W., 1992. The critical period of weed control in grain corn (Zea mays). Weed Sci. 40, 441–447.
Heisel, T., Christensen, S., Walter, A.M., 1999. Whole-field experiments with site-specific weed management. Proceedings of the Second European Conference on Precision Agriculture, vol. 2, pp. 759–768.
Hemming, J., Rath, T., 2001. PA-precision agriculture. Computer-vision-based weed identification under field conditions using controlled lighting. J. Agric. Eng. Res. 78 (3), 233–243.
Hunt, E.R., Cavigelli, M., Daughtry, C.S.T., McMurtrey, J.E., Walthall, C.L., 2005. Evaluation of digital photography from model aircraft for remote sensing of crop biomass and nitrogen status. Precision Agric. 6, 359–378.
Jeon, Gwanggil, 2014. Color image enhancement by histogram equalization in heterogeneous color space. Int. J. Multimedia Ubiquitous Eng. 9 (7), 309–318. http://dx.doi.org/10.14257/ijmue.2014.9.7.26.
Jeon, Hong Y., Tian, Lei F., Zhu, Heping, 2011. Robust crop and weed segmentation under uncontrolled outdoor illumination. Sensors 11 (12), 6270–6283. http://dx.doi.org/10.3390/s110606270.
Jeschke, M.R., Stoltenberg, D.E., Kegode, G.O., Sprague, C.L., Knezevic, S.Z., Hock, S.M., Johnson, G.A., 2011. Predicted soybean yield loss as affected by emergence time of mixed-species weed communities. Weed Sci. 59, 416–423.
Jones, M.J., Rehg, J.M., 2002. Statistical color models with application to skin detection. Int. J. Comput. Vision 46, 81–96.
Karkanis, Anestis, Bilalis, Dimitrios, Efthimiadou, Aspasia, Katsenios, Nikolaos, 2012. The critical period for weed competition in parsley (Petroselinum crispum (Mill.) Nyman ex A.W. Hill) in Mediterranean areas. Crop Prot. 42 (December), 268–272. http://dx.doi.org/10.1016/j.cropro.2012.07.003.
Kataoka, T., Kaneko, T., Okamoto, H., Hata, S., 2003. Crop growth estimation system using machine vision. Proc. 2003 IEEE/ASME Int. Conf. Adv. Intell. Mechatronics (AIM 2003), vol. 2, pp. 1079–1083. http://dx.doi.org/10.1109/AIM.2003.1225492.
Kim, C., You, B.-J., Jeong, M.-H., Kim, H., 2008. Color segmentation robust to brightness variations by using B-spline curve modeling. Pattern Recogn. 41, 22–37.
Kirk, K., Andersen, H.J., Thomsen, A.G., Jørgensen, J.R., 2009. Estimation of leaf area index in cereal crops using red–green images. Biosyst. Eng. 104, 308–317.
Knezevic, S.Z., Weise, S.F., Swanton, C.J., 1994. Interference of redroot pigweed (Amaranthus retroflexus) in corn (Zea mays). Weed Sci. 42, 568–573.
Kropff, M.J., Weaver, S.E., Smits, M.A., 1992. Use of ecophysiological models of crop-weed interference: relation amongst weed density, relative time of weed emergence, relative leaf area and yield loss. Weed Sci. 40, 296–301.
Lamm, R.D., Slaughter, D.C., Giles, D.K., 2002. Precision weed control for cotton. Trans. ASAE 45, 231–238.
Lei, Z., Jun, K., Xiaoyun, Z., Jiayue, R., 2008. Plant species identification based on neural network. Proc. – 4th Int. Conf. Nat. Comput. ICNC 2008, vol. 5, pp. 90–94. http://dx.doi.org/10.1109/ICNC.2008.253.
Lindquist, J.L., Dieleman, A.J., Mortensen, D.A., Johnson, G.A., Wyse-Pester, D.Y., 1998. Economic importance of managing spatially heterogeneous weed populations. Weed Technol. 12, 7–13.
Ling, P.P., Ruzhitsky, V.N., 1996. Machine vision techniques for measuring the canopy of tomato seedling. J. Agric. Eng. Res. 65 (2), 85–95.
Liu, F.H., O'Connell, N.V., 2002. Off-site movement of surface-applied simazine from a citrus orchard as affected by irrigation incorporation. Weed Sci. 50 (5), 672–676.
Lotz, L.A.P., van der Weide, R.Y., Horeman, G.H., Joosten, L.T.A., 2002. Weed management and policies: from prevention and precision technology to certification of individual farms. In: Proceedings of the 12th EWRS Symposium, 2–3. Wageningen, Papendal, the Netherlands.
Manh, A.G., Rabatel, G., Assemat, L., Aldon, M.J., 2001. Weed leaf image segmentation by deformable templates. Automatic and Emerging Technologies, Silsoe Research Institute, pp. 139–146.
Marchant, J.A., Tillett, R.D., Brivot, R., 1998. Real-time segmentation of plants and weeds. Real-Time Imaging 4 (4), 243–253.
Meyer, G.E., Camargo-Neto, J., 2008. Verification of color vegetation indices for automated crop imaging applications. Comput. Electron. Agric. 63, 282–293.
Meyer, G.E., Camargo-Neto, J., Jones, D.D., Hindman, T.W., 2004. Intensified fuzzy clusters for classifying plant, soil, and residue regions of interest from color images. Comput. Electron. Agric. 42, 161–180.
Meyer, G.E., Hindman, T.W., Lakshmi, K., 1998. Machine vision detection parameters for plant species identification. In: Meyer, G.E., DeShazer, J.A. (Eds.), Precision Agriculture and Biological Quality, Proceedings of SPIE, vol. 3543, Bellingham, WA, pp. 327–335.
Nelson, D.C., Giles, J.F., 1986. Implication of post emergence tillage on root injury and yields of potatoes. Am. Potato J. 63, 445.
Neto, J.C., 2004. A Combined Statistical-Soft Computing Approach for Classification and Mapping Weed Species in Minimum-Tillage Systems. Unpublished Ph.D. Dissertation. University of Nebraska, Lincoln, NE, 117 pp.
Nieto, H.J., Brondo, M.A., Gonzales, J.T., 1968. Critical period of crop growth cycles for competition from weeds. Pest Articles News Summaries (C) 14, 159–166.
O'Donovan, J.T., de St. Remy, E.A., O'Sullivan, P.A., Dew, D.A., Sharma, A.K., 1985. Influence of the relative time of emergence of wild oat (Avena fatua) on yield loss of barley (Hordeum vulgare) and wheat (Triticum aestivum). Weed Sci. 33, 498–503.
Onyango, C.M., Marchant, J.A., 2003. Segmentation of row crop plants from weeds using colour and morphology. Comput. Electron. Agric. 39, 141–155.
Otsu, N., 1979. A threshold selection method from gray-level histogram. IEEE Trans. Syst. Man Cybern. 9, 62–66.
Pajares, G., Ruz, J.J., Cruz, J.M., 2005. Performance analysis of homomorphic systems for image change detection. In: Marques, J.S., de la Blanca, N.P., Pins, P. (Eds.), Pattern Recognition and Image Analysis, vol. 3522. Springer-Verlag, Berlin, pp. 563–570.
Papamichail, D., Eleftherohorinos, I., Froud-Williams, R., Gravanis, F., 2002. Critical periods of weed competition in cotton in Greece. Phytoparasitica 30, 105–111.
Shrestha, D.S., Steward, B.L., Birrell, S.J., 2004. Video processing for early stage maize plant detection. Biosyst. Eng. 89 (2), 119–129.
Slaughter, D., Harrel, R., 1989. Discriminating fruit for robotic harvest using color in natural outdoor scenes. Trans. ASAE 32 (2), 757–763.
Søgaard, H.T., 2005. Weed classification by active shape models. Biosyst. Eng. 91, 271–281.
Spliid, N.H., Koeppen, B., 1998. Occurrence of pesticides in Danish shallow ground water. Chemosphere 37 (7), 1307–1316.
Stall, W.M., 2009. Weed Control in Sweet Corn. University of Florida, Gainesville, HS 197. Available at: <http://edis.ifas.ufl.edu/pdffiles/WG/WG03800.pdf>.
Swanton, C.J., Weise, S.F., 1991. Integrated weed management: the rationale and approach. Weed Technol. 5, 657–663.
Swanton, C.J., Weaver, S., Cowan, P., Van Acker, R., Deen, W., Shreshta, A., 1999. Weed thresholds: theory and applicability. J. Crop Prod. 2, 9–29.
Tellaeche, A., Burgos-Artizzu, X.P., Pajares, G., Ribeiro, A., 2008. A vision-based method for weeds identification through the Bayesian decision theory. Pattern Recogn. 41 (2), 521–530.
Thangadurai, K., Padmavathi, K., 2014. Computer vision image enhancement for plant leaves disease detection. IEEE, pp. 173–175. http://dx.doi.org/10.1109/WCCCT.2014.39.
Tian, L.F., Slaughter, D.C., 1998. Environmentally adaptive segmentation algorithm for outdoor image segmentation. Comput. Electron. Agric. 21, 153–168.
Torres-Sánchez, J., Peña, J.M., de Castro, A.I., López-Granados, F., 2014. Multi-temporal mapping of the vegetation fraction in early-season wheat fields using images from UAV. Comput. Electron. Agric. 103 (April), 104–113. http://dx.doi.org/10.1016/j.compag.2014.02.009.
U.S. Department of Agriculture, 1996. Chart of U.S. Hired Farm Workers and Wage Rates. Available at: <www.usda.gov/nass/aggraphs/fl-hired.htm>.
Van Acker, R.C., Swanton, C.J., Weise, S.F., 1993. The critical period of weed control in soybean (Glycine max (L.) Merr). Weed Sci. 41, 194–200.
Van Henten, E.J., Van Tuijl, B.A.J., Hemming, J., Kornet, J.G., Bontsema, J., Van Os, E.A., 2003. Field test of an autonomous cucumber picking robot. Biosyst. Eng. 86, 305–313.
Woebbecke, D.M., Meyer, G.E., Von Bargen, K., Mortensen, D., 1992. Plant species identification, size, and enumeration using machine vision techniques on near-binary images. SPIE Opt. Agric. Forestry 1836, 208–219.
Woebbecke, D., Meyer, G., Von Bargen, K., Mortensen, D., 1995. Color indices for weed identification under various soil, residue, and lighting conditions. Trans. ASAE 38 (1), 271–281.
Woebbecke, D.M., Al-Faraj, A., Meyer, G.E., 1994. Calibration of large field of view thermal and optical sensors for plant and soil. Trans. ASAE 37 (2), 669–677.
Xiang, H., Tian, L., 2011. An automated stand-alone in-field remote sensing system (SIRSS) for in-season crop monitoring. Comput. Electron. Agric. 78, 1–8.
Xiao, Y., Cao, Z., Zhuo, W., 2011. Type-2 fuzzy thresholding using GLSC histogram of human visual nonlinearity characteristics. Opt. Express 19, 10656–10672.
http://dx.doi.org/10.1007/BF02983976. Ye, Mengni, Cao, Zhiguo, Yu, Zhenghong, Bai, Xiaodong, 2015. Crop feature
Perez, A.J., Lopez, F., Benlloch, J.V., Christensen, Svend., 2000. Colour and shape extraction from images with probabilistic superpixel Markov random field.
analysis techniques for weed detection in cereal fields. Comput. Electron. Agric. Comput. Electron. Agric. 114 (June), 247–260. http://dx.doi.org/10.1016/
25 (3), 197–212. j.compag.2015.04.010.
Qasem, J.R., 2009. Weed competition in cauliflower (Brassica oleracea L. Var. Yu, Z., Cao, Z., Wu, X., Bai, X., Qin, Y., Zhuo, W., Xiao, Y., Zhang, X., Xue, H., 2013a.
Botrytis) in the Jordan Valley. Sci. Hortic. 121 (3), 255–259. http://dx.doi.org/ Automatic image-based detection technology for two critical growth stages of
10.1016/j.scienta.2009.02.010. maize. Emergence and three-leaf stage. Agric. Forest Meteorol. 174–175, 65–84.
Rasmussen, J., Nørremark, M., Bibby, B., 2007. Assessment of leaf cover and crop soil Yu, Z., Cao, Z., Ye, M., Bai, X., Li, Y., Wang, Y., 2013b. Specularity-invariant crop
cover in weed harrowing research using digital images. Weed Res. 47, 299–310. extraction with probabilistic superpixel Markov random field. In: Eighth
Reid, J., Searcy, S., 1987. Vision-based guidance of an agricultural tractor. IEEE International Symposium on Multispectral Image Processing and Pattern
Control Syst. Mag. 7 (2), 39–43. Recognition. International Society for Optics and Photonics, pp. 891806–
Ribeiro, A., Fernández-Quintanilla, C., Barroso, J., García-Alegre, M.C., 2005. 891808.
Development of an image analysis system for estimation of weed. In: Zheng, L., Zhang, J., Wang, Q.y., 2009. Mean-shift-based color segmentation of
Proceedings of the 5th European Conference on Precision Agriculture (5ECPA), images containing green vegetation. Comput. Electron. Agric. 65, 93–98.
pp. 169–174. Zheng, L., Shi, D., Zhang, J., 2010. Segmentation of green vegetation of crop canopy
Ruiz-Ruiz, G., Gómez-Gil, J., Navas-Gracia, L.M., 2009. Testing different color spaces images based on mean shift and Fisher linear discriminate. Pattern Recogn. Lett.
based on hue for the environmentally adaptive segmentation algorithm (EASA). 31 (9), 920–925.
Comput. Electron. Agric. 68, 88–96. Zimdahl, R.L., 1988. The concept and application of the critical weed-free period. In:
Sahoo, R.K., Soltani, S., Wong, K.C., Chen, Y.C., 1988. A survey of thresholding Altieri, M.A., Eibman, F.M.L. (Eds.), Weed Management in Agroecosystems:
techniques. Comput. Vision, Graph., Image Process. 41, 233–260. Ecological Approaches. CRC Press, Boca Raton, FL, USA, pp. 145–155.
Schuster, I., Nordmeyer, H., Rath, T., 2007. Comparison of vision-based and manual Zimdahl, R.L., 1993. Fundamentals of Weed Science. Academic Press, San Diego, CA,
weed mapping in sugar beet. Biosyst. Eng. 98, 17–25. USA.