YOLOv8 Model for Weed Detection in Wheat Fields Based on a Visual Converter and Multi-Scale Feature Fusion
Abstract
1. Introduction
2. Materials and Methods
2.1. Wheat Weed Dataset
2.1.1. Weed Image Acquisition
2.1.2. Image Annotation and Dataset Construction
2.1.3. Dataset Expansion
2.2. Deep Learning Network Construction
2.2.1. The YOLOv8 Network Model
2.2.2. Residual Module Combined with Vision Converter
2.2.3. Feature Fusion Networks
2.2.4. Bounding Box Regression Loss Function (MPDIoU)
2.3. Experimental Environment
2.4. Assessment of Indicators
3. Results and Discussion
3.1. Analysis of Ablation Experiments
3.2. Visualisation of Target Area Model Features
3.3. Experimental Analysis of Different Modelling Algorithms
3.4. Weed Detection Effect of YOLOv8-MBM Algorithm
4. Conclusions
Author Contributions
Funding
Institutional Review Board Statement
Informed Consent Statement
Data Availability Statement
Acknowledgments
Conflicts of Interest
References
| Model | Input Size/px | mAP50 | mAP50–95 | Parameters/×10⁶ | Weight Size/MB | FPS |
|---|---|---|---|---|---|---|
| YOLOv8n | 640 | 0.694 | 0.653 | 3.2 | 5.9 | 42 |
| YOLOv8s | 640 | 0.707 | 0.759 | 11.7 | 21.4 | 37.9 |
| YOLOv8m | 640 | 0.712 | 0.734 | 26.0 | 49.5 | 32.3 |
| YOLOv8l | 640 | 0.708 | 0.721 | 43.7 | 83.5 | 26.1 |
| YOLOv8x | 640 | 0.703 | 0.719 | 68.2 | 130 | 20.9 |
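The table above compares the five YOLOv8 scales and underpins the choice of YOLOv8s as the baseline for the ablation that follows. As a rough illustration of how such a per-scale evaluation can be reproduced with the `ultralytics` package, the sketch below trains and validates each scale on a custom dataset; the dataset YAML path and training settings are assumptions, not the authors' exact configuration.

```python
# Hedged sketch: evaluating each YOLOv8 scale on a custom weed dataset with
# the ultralytics package. "wheat_weeds.yaml", epochs and batch size are
# illustrative assumptions, not the authors' exact settings.
from ultralytics import YOLO

SCALES = ["yolov8n", "yolov8s", "yolov8m", "yolov8l", "yolov8x"]

for scale in SCALES:
    model = YOLO(f"{scale}.pt")                       # load pretrained weights
    model.train(data="wheat_weeds.yaml",              # assumed dataset config
                imgsz=640, epochs=100, batch=16)
    metrics = model.val(data="wheat_weeds.yaml", imgsz=640)
    print(scale,
          f"mAP50={metrics.box.map50:.3f}",
          f"mAP50-95={metrics.box.map:.3f}")
```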
| Configuration | Parameters |
|---|---|
| CPU | Intel Core i5-12600KF, 3.70 GHz |
| GPU | NVIDIA GeForce RTX 4060 Ti |
| Operating system | Windows 10 |
| Accelerated environment | CUDA 11.6, cuDNN 8.9.7 |
| Development environment (IDE) | PyCharm 2020.2.1 |
| RAM | 32 GB |
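A quick way to confirm that an acceleration environment like the one in the table above (CUDA 11.6, cuDNN 8.9.7, RTX 4060 Ti) is visible to PyTorch is the short check below; it is a generic verification snippet, not part of the authors' pipeline.

```python
# Minimal environment check mirroring the configuration table: prints the
# detected GPU and the CUDA/cuDNN versions PyTorch was built against.
import torch

print("CUDA available:", torch.cuda.is_available())
if torch.cuda.is_available():
    print("GPU:", torch.cuda.get_device_name(0))          # e.g. NVIDIA GeForce RTX 4060 Ti
print("CUDA version:", torch.version.cuda)                # e.g. 11.6
print("cuDNN version:", torch.backends.cudnn.version())   # e.g. 8907 for cuDNN 8.9.7
```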
| Model | C2f-MobileViTv3 | BiFPN | MPDIoU | P/% | R/% | mAP50/% | mAP50–95/% | Weight Size/MB |
|---|---|---|---|---|---|---|---|---|
| YOLOv8s | × | × | × | 82.1 | 80.8 | 80.0 | 75.9 | 21.4 |
|  | √ | × | × | 88.7 | 76.4 | 84.9 | 82.3 | 22.8 |
|  | × | √ | × | 87.5 | 74.9 | 82.6 | 78.9 | 22.6 |
|  | × | × | √ | 85.6 | 80.0 | 83.8 | 80.0 | 22.5 |
|  | √ | √ | × | 92.6 | 82.8 | 83.7 | 82.8 | 23.0 |
|  | √ | × | √ | 91.1 | 82.9 | 88.7 | 81.5 | 22.8 |
|  | × | √ | √ | 91.0 | 82.8 | 89.8 | 82.5 | 22.6 |
|  | √ | √ | √ | 92.7 | 89.7 | 89.7 | 85.2 | 23.0 |

All rows build on the YOLOv8s baseline; √ indicates that the corresponding module is enabled.
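One of the three ablated components is the MPDIoU bounding-box regression loss. For reference, the following is a minimal sketch of the standard MPDIoU formulation (IoU penalised by the squared distances between matching top-left and bottom-right corners, normalised by the squared input-image diagonal); the corner box format and variable names are illustrative assumptions, and this is not the authors' exact implementation.

```python
# Hedged sketch of the MPDIoU bounding-box regression loss used in the ablation.
# Boxes are assumed to be in (x1, y1, x2, y2) pixel coordinates; this is an
# illustrative re-implementation, not the authors' code.
import torch

def mpdiou_loss(pred, target, img_w, img_h, eps=1e-7):
    # Intersection over union
    ix1 = torch.max(pred[..., 0], target[..., 0])
    iy1 = torch.max(pred[..., 1], target[..., 1])
    ix2 = torch.min(pred[..., 2], target[..., 2])
    iy2 = torch.min(pred[..., 3], target[..., 3])
    inter = (ix2 - ix1).clamp(0) * (iy2 - iy1).clamp(0)
    area_p = (pred[..., 2] - pred[..., 0]) * (pred[..., 3] - pred[..., 1])
    area_t = (target[..., 2] - target[..., 0]) * (target[..., 3] - target[..., 1])
    iou = inter / (area_p + area_t - inter + eps)

    # Squared distances between matching top-left and bottom-right corners
    d1 = (pred[..., 0] - target[..., 0]) ** 2 + (pred[..., 1] - target[..., 1]) ** 2
    d2 = (pred[..., 2] - target[..., 2]) ** 2 + (pred[..., 3] - target[..., 3]) ** 2
    diag2 = img_w ** 2 + img_h ** 2        # squared diagonal of the input image

    mpdiou = iou - d1 / diag2 - d2 / diag2
    return 1.0 - mpdiou                    # loss = 1 - MPDIoU
```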
| Model | P/% | R/% | mAP50/% | mAP50–95/% | Weight Size/MB | FPS |
|---|---|---|---|---|---|---|
| YOLOv3 | 78.0 | 71.0 | 73.1 | 69.4 | 219.9 | 34.3 |
| YOLOv4-tiny | 78.4 | 74.6 | 75.8 | 75.1 | 27.6 | 30.8 |
| YOLOv5s | 79.0 | 71.2 | 74.0 | 70.7 | 14.1 | 36.7 |
| YOLOv7 | 81.9 | 73.2 | 78.3 | 72.1 | 71.2 | 20.5 |
| YOLOv9 | 79.6 | 70.4 | 78.4 | 74.3 | 116.0 | 11.6 |
| YOLOv8-C2f_M3-BiFPN-MPDIoU (YOLOv8-MBM) | 92.7 | 87.6 | 89.7 | 85.2 | 23.0 | 35.5 |
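Since the comparison above also reports FPS, a simple way to time a trained detector with the `ultralytics` API is sketched below; the weight file name and test images are placeholders, and measured throughput depends on hardware and batching, so values will not match the table exactly.

```python
# Illustrative FPS measurement for a trained detector. The weight file name
# and the image list are hypothetical placeholders; timing covers the full
# predict call per image.
import time
from ultralytics import YOLO

model = YOLO("yolov8_mbm_best.pt")            # hypothetical trained weights
images = ["sample_weed_image.jpg"] * 100      # placeholder test images

start = time.perf_counter()
for img in images:
    model.predict(img, imgsz=640, verbose=False)
elapsed = time.perf_counter() - start
print(f"FPS: {len(images) / elapsed:.1f}")
```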