Abstract
Recognizing rice images captured by unmanned aerial vehicle (UAV) is important for monitoring rice growth and for preventing diseases and pests. Using UAV rice images as the data source, this paper builds a capsule network (CapsNet) to recognize them. The images are preprocessed by histogram equalization into grayscale images and by a superpixel algorithm into superpixel segmentation results; both results are fed into the CapsNet, whose function is to perform reverse analysis of the rice images. The CapsNet consists of five layers: an input layer, a convolution layer, a primary capsule layer, a digit capsule layer and an output layer. The network is trained for classification and predicts the output vector based on the routing-by-agreement protocol, so the features of UAV rice images can be extracted precisely and efficiently. The method is more convenient than traditional manual recognition and provides scientific support and a reference for decision-making in precision agriculture.
Acknowledgement
This work was supported by the National Natural Science Foundation of China (Grant No. 51502209), the Government Support Enterprise Development Funding of Hubei Province (Grant No. 16441), the Three-dimensional Textiles Engineering Research Center of Hubei Province, and the Anqing Technology Transfer Center of Wuhan Textile University.
Additional information
Yu Li, Meiyu Qian, Pengfeng Liu, Qian Cai and Xiaoying Li are co-first authors.
Cite this article
Li, Y., Qian, M., Liu, P. et al. The recognition of rice images by UAV based on capsule network. Cluster Comput 22 (Suppl 4), 9515–9524 (2019). https://doi.org/10.1007/s10586-018-2482-7