

Volume 9, Issue 9, September – 2024 International Journal of Innovative Science and Research Technology

ISSN No:-2456-2165 https://doi.org/10.38124/ijisrt/IJISRT24SEP1159

IoT Based Precision Farming


Elvin Paul K S1; Sudha B2; Abhishek S3; Supreeth Raj M4; Nagababau A V5
UG Scholar-ETE1,3,4,5; Assistant Professor-ETE2
Bangalore Institute of Technology
Bangalore, Karnataka

Abstract:- This project focuses on leveraging drones equipped with advanced sensors to capture images of pests in crops, combined with image processing methods to identify diseases. The ultimate goal is to enhance crop health and productivity through timely and targeted pesticide application. Image processing techniques are used to detect signs of diseases and pests in the captured images, and a machine learning CNN algorithm enhances the system's ability to accurately classify and diagnose crop health issues. Upon detection of pests, the IoT platform triggers a response mechanism to deploy a precision pesticide spraying system. This ensures targeted and localized treatment, reducing the overall use of pesticides and minimizing environmental impact. The project involves capturing images of pests using a camera, then processing these images to extract key features using various image processing techniques. The extracted features are analyzed using algorithms, primarily Convolutional Neural Networks (CNNs), to detect variations in color and other dominant characteristics in the images. By comparing these features across samples, the system can identify pests and plant diseases more efficiently. This approach aims to provide a quicker and more cost-effective solution for pest detection and disease management.

Keywords:- CNN, IoT, Sprayer Robot, Image Processing, ZigBee Module, Precision.

I. INTRODUCTION

In recent years, robotics in the agricultural sector, driven by the principles of precision agriculture, has emerged as an innovative and rapidly growing technology. The primary drivers of farming process automation are the reduction of time and energy spent on monotonous farming operations and the enhancement of yield productivity through the application of precision farming principles to each crop on an individual basis. The design of such robots is based on specific approaches and takes into account the agricultural environment in which they will operate. Additionally, a prototype of an autonomous agricultural robot is introduced, specifically designed for tasks like spraying pesticides efficiently and autonomously. Robotic systems have a significant impact across various sectors, including society, organizations, and industries. This research focuses on creating an automated device designed for farm operations, such as pest identification using machine learning [1] and precise pesticide spraying at predetermined distances and depths, incorporating sensors for monitoring humidity and temperature.

This system has two main sections, a monitoring station and a control station, which intercommunicate over wireless ZigBee or Wi-Fi links. The control station as well as the robotic station possesses amenities such as a soil moisture sensor, an ultraviolet sensor, a robotic system with motors, an ARM microcontroller, and a power supply. Next, the sick plants are categorized, and a camera equipped with Internet of Things technology is used to take pictures of the afflicted areas of the plants [2]. After that, pre-processing, modification, and grouping are applied to these pictures. The processor then receives these images as input and, using the CNN algorithm, compares them with the set of tested and pre-trained pests [3].

An automated pesticide sprayer targets specific areas of the leaf for pesticide application if the image captured by the UAV [4] shows signs of damage. This ensures precise and efficient use of pesticides. If not, the image is automatically discarded by the processor, and the spraying robot [5] moves further. This project focuses on making an automated system accessible to farmers for early detection of plant diseases. The system integrates robotics, where a drone captures images and an onboard processor analyzes them. Once the disease is evaluated, real-time results are sent to the farmer via a Bluetooth HC-05 Android app and displayed on an LCD screen for quick access. The process of disease detection involves several key steps: digital image capture, image pre-processing (including noise removal, color transformation, and histogram equalization), segmentation using the K-means algorithm, feature extraction, and classification using a support vector machine (SVM), a supervised learning algorithm. The processing done with these components has two stages. The first is the training phase, often known as the offline phase. During this stage, an image analyzer examines a set of input photographs of leaves (both damaged and healthy) and extracts specific features. The classifier then receives these attributes as input, together with information identifying whether each image depicts a healthy or diseased leaf, and it learns the relationship between the extracted features and the conclusion about the existence of the disease [7]. The system is trained as a result.


II. LITERATURE SURVEY

As summarized in Table 1, Paper 1 discusses the key components of IoT-based smart agriculture systems, such as sensor nodes, communication protocols, data analytics platforms, and control mechanisms. It highlights the importance of each component and its functioning in optimizing agricultural practices. Paper 2 addresses the role of OpenCV in analyzing images or video feeds captured from agricultural fields to identify pests or signs of pest infestation; it encompasses a range of image processing techniques, including segmentation, feature extraction, and classification.

Paper 3 covers the integration of AI technology with UAVs for pest recognition. It discusses the hardware and software components of the AI drone system, including the UAV platform, onboard cameras, computational resources, and software algorithms for image analysis. Paper 4 describes the process of training a deep learning model on the dataset and fine-tuning its parameters to optimize performance, and it presents results from performance evaluations, including metrics such as accuracy and precision. These papers have respective drawbacks, such as connectivity issues, improper detection, outdated technology, and problems in retrieving the given data.

A. Problems Recognized

From the literature survey carried out, several problems were identified in the existing technology of drone-based pest detection and spraying methods. They are as follows:

 Pest Infestation in Crops
Farmers often face challenges in identifying and monitoring pest infestations in their crops efficiently. Traditional methods of pest detection can be time-consuming and may not provide real-time information.

 Inefficient Pesticide Use
Conventional methods of pesticide application may result in overuse or underuse, leading to increased costs and potential environmental harm. Lack of precision in pesticide application can harm non-target organisms and soil health.

 Manual Monitoring and Treatment
Monitoring and treating crops manually is labor-intensive, and the effectiveness of pest management may vary. Timely intervention is crucial, and delays in identifying and addressing pest issues can lead to crop losses.

Table 1: Comparison of Various IEEE Papers

B. Suggested Solution

Following the difficulties noted in the current system, the following solutions are proposed:

Creation and advancement of an efficient and automatic system to detect diseases in affected plants by image processing of the pests, which minimizes the professional interference required and thus reduces the expenditure involved in spraying the pests compared with traditional methods.

The classification software helps in classifying various pests, while the hardware consists of a sprayer bot communicating through a wireless module.


This in turn increases efficiency and precision as required; the efficiency on the software side comes from the CNN machine learning, and the precision from the nozzle of the sprayer.

C. Hardware and Software Components:
The hardware and software requirements are depicted in the Component Diagram of Figure 1.

D. The Project's Hardware Requirements are:

 Arduino Uno
 DC motor
 Water Pump
 Power Supply Module
 Relay
 ZigBee Module
 Ultrasonic Sensor
 Bluetooth Module
 Soil Moisture Sensor
 Solar Panel
 Battery
 LCD (16×2)
 Motor Drive Circuit

E. The Project's Software Requirements are:

 Arduino IDE
 TensorFlow
 OpenCV
 Flask
 Python
 Embedded C

Fig 1: Connection Diagram

III. METHODOLOGY

A. Hardware

 Proposed Methodology:
The primary goal of this project is to provide precise farming equipment. The hardware component helps to minimize the use of pesticides and aids in precise spraying, since it uses a full-cone nozzle, while the software component gives an accurate and precise classification of the pests. The project's block diagram, depicted in Figure 1, is the connection diagram, which includes the different motors and sensors needed for movement and sensing.

The Arduino Uno controls the DC motors through the motor drive circuit.


The motor drive circuit is responsible for shaft direction. Fig 2 below shows the practical model implementation, where the long yellow vertical pole is the shaft responsible for the movement of the pesticide spraying nozzle; the practical model is built as shown in the connection diagram.

Fig 2: Practical Model

 All the sensing and motion control of the robot is processed by the Arduino UNO microcontroller.
 The Arduino UNO is connected to the soil moisture sensor and the ultrasonic sensor: the soil moisture sensor is employed to ascertain the percentage of water content in the soil, and the ultrasonic sensor is employed to ascertain the amount of water present in the pesticide tank.
 Power is supplied to the Arduino UNO through an external rechargeable 12V battery source.
 The battery is then connected to the power module; the power module is used in this project because the various electronic components require different voltages (for example, the Arduino UNO requires 5V, while the DC motors require 12V).
 The solar panel is connected to the battery; on a sunny day the panel produces around 12V, which helps recharge the battery.
 A Zener diode is connected between the solar panel and the battery to prevent current from flowing in the reverse direction, from the battery back into the solar panel.
 The 16×2 LCD screen is connected to the microcontroller to provide visible output to the user.
 Motor drivers are used to control the DC motors. Because the DC motors require a 12V supply, the Arduino board is connected to a motor driver, which enables the Arduino to control motors that operate on 12V (the motor drive controller uses an L293D integrated circuit).
 The L293D IC is responsible for power amplification and also enables bidirectional movement of the robot.
 The ZigBee module is the device used for wireless communication between the robot and the laptop (a laptop-side sketch of this command link is given after this list).
 Three DC motors are utilized in the project; two of them are employed to move the robot, while the third moves the sprayer's shaft, allowing the sprayer's height to be changed.
 The Bluetooth module is integrated with the Arduino Uno for movement control of the robot; it enables wireless control of the robot from the user's phone.


 The relay is connected to the Arduino in order to maintain control of the water pump; the relay is used here to overcome the voltage difference between the Arduino and the water pump. The pump end is connected to the nozzle for spraying.

B. Software:

 Dataset Collection:

 The dataset is collected by capturing images with a digital camera mounted on a drone, which is connected to a laptop.
 The captured images undergo additional pre-processing.
 Data augmentation is also part of the dataset collection, where the captured images are augmented to the resolution required by the user (a minimal augmentation sketch follows this list).
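A minimal sketch of such augmentation, using TensorFlow/Keras preprocessing layers, is given below. The 224×224 working resolution, the directory name, and the particular flip/rotation/zoom settings are illustrative assumptions; the paper only states that the images are augmented to the resolution the user requires.

```python
# Minimal augmentation/resizing sketch with TensorFlow/Keras.
# Target resolution and augmentation settings are illustrative assumptions.
import tensorflow as tf

augment = tf.keras.Sequential([
    tf.keras.layers.Resizing(224, 224),          # resize to the working resolution
    tf.keras.layers.RandomFlip("horizontal"),    # mirror images
    tf.keras.layers.RandomRotation(0.1),         # small random rotations
    tf.keras.layers.RandomZoom(0.1),             # slight zoom in/out
])

# Example: augment batches of images loaded from a folder of pest photos
# (the "pest_images" directory name is a placeholder).
dataset = tf.keras.utils.image_dataset_from_directory(
    "pest_images", image_size=(256, 256), batch_size=8)
augmented = dataset.map(lambda x, y: (augment(x, training=True), y))
```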
 Image Pre-Processing:

 The images captured by the camera are pre-processed to enhance their quality.
 Pre-processing steps can include color transformation, noise reduction, histogram equalization, and green masking (a minimal sketch of these steps follows this list).
 Color transformation is employed to improve image quality by converting RGB images to grayscale and HSI formats for better clarity.
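The sketch below walks through these pre-processing steps with OpenCV. Note that OpenCV exposes HSV rather than HSI, so HSV stands in for the paper's HSI conversion; the green-mask thresholds and file name are illustrative assumptions.

```python
# Sketch of the pre-processing steps named above, using OpenCV.
# HSV stands in for the paper's HSI; thresholds are assumptions.
import cv2
import numpy as np

img = cv2.imread("leaf.jpg")                       # BGR image from the camera

gray = cv2.cvtColor(img, cv2.COLOR_BGR2GRAY)       # color transformation
denoised = cv2.GaussianBlur(gray, (5, 5), 0)       # noise reduction
equalized = cv2.equalizeHist(denoised)             # histogram equalization

hsv = cv2.cvtColor(img, cv2.COLOR_BGR2HSV)         # HSV (stand-in for HSI)
lower_green = np.array([35, 40, 40])               # assumed hue/sat/val bounds
upper_green = np.array([85, 255, 255])
green_mask = cv2.inRange(hsv, lower_green, upper_green)  # green masking
leaf_only = cv2.bitwise_and(img, img, mask=green_mask)   # keep leaf pixels
```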

 Feature Extraction:

 An image has several key features, primarily color, texture, and shape. In this context, three main features are considered: the color histogram, texture representing color patterns, and the shape and texture characteristics of the object (an illustrative extraction sketch follows this list).
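The following sketch shows one plausible way to extract these three feature families with OpenCV. The specific descriptors (an 8-bin-per-channel color histogram, Laplacian variance as a texture proxy, and contour area/perimeter for shape) are assumptions, since the paper does not fix exact descriptors; it also assumes the thresholded image contains at least one contour.

```python
# Sketch of the three feature families mentioned above: a color histogram,
# a simple texture statistic, and shape descriptors from the largest contour.
import cv2
import numpy as np

img = cv2.imread("leaf.jpg")
gray = cv2.cvtColor(img, cv2.COLOR_BGR2GRAY)

# Color: a 3D BGR histogram, flattened into a feature vector.
color_hist = cv2.calcHist([img], [0, 1, 2], None,
                          [8, 8, 8], [0, 256, 0, 256, 0, 256]).flatten()

# Texture: variance of the Laplacian as a crude texture/detail measure.
texture = float(np.var(cv2.Laplacian(gray, cv2.CV_64F)))

# Shape: area and perimeter of the largest contour in a thresholded image.
_, binary = cv2.threshold(gray, 0, 255, cv2.THRESH_BINARY + cv2.THRESH_OTSU)
contours, _ = cv2.findContours(binary, cv2.RETR_EXTERNAL,
                               cv2.CHAIN_APPROX_SIMPLE)
largest = max(contours, key=cv2.contourArea)
shape = [cv2.contourArea(largest), cv2.arcLength(largest, True)]

features = np.concatenate([color_hist, [texture], shape])
```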

 Classification Using CNN:

 In this module, the images are classified using a Convolutional Neural Network (CNN) classifier (a minimal model sketch follows this list).
 To assess the relevant elements and identify distinguishing characteristics for pest identification, a number of features are combined.
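A minimal Keras CNN of the kind described is sketched below. The layer sizes, the 224×224 input, and the number of pest classes are illustrative assumptions; the paper does not publish its exact architecture.

```python
# Minimal Keras CNN sketch; layer sizes, input shape, and NUM_CLASSES
# are illustrative assumptions, not the paper's published architecture.
import tensorflow as tf

NUM_CLASSES = 5  # assumed number of pest categories

model = tf.keras.Sequential([
    tf.keras.layers.Rescaling(1.0 / 255, input_shape=(224, 224, 3)),
    tf.keras.layers.Conv2D(32, 3, activation="relu"),
    tf.keras.layers.MaxPooling2D(),
    tf.keras.layers.Conv2D(64, 3, activation="relu"),
    tf.keras.layers.MaxPooling2D(),
    tf.keras.layers.Flatten(),
    tf.keras.layers.Dense(128, activation="relu"),
    tf.keras.layers.Dense(NUM_CLASSES, activation="softmax"),
])
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
# model.fit(train_ds, validation_data=val_ds, epochs=10)  # with tf.data sets
```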

 Results and Analysis:

 The outcomes, which come from suitable training of the CNN model, are far more effective than those of the earlier models.
 The study shows the kinds of pests that are found, and the model's effectiveness is gauged by how well it can detect the pests (a sketch of the metrics computation follows this list).
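Assuming the ground-truth labels and model predictions are available as integer class IDs, the accuracy and precision figures used to gauge the model could be computed as in the sketch below; scikit-learn is used here for brevity, which is an assumption, as the paper does not name its metrics library.

```python
# How accuracy and precision might be computed for the trained classifier;
# y_true and y_pred are placeholders, not results from the paper.
from sklearn.metrics import accuracy_score, precision_score

y_true = [0, 1, 1, 2, 0]   # ground-truth pest classes (placeholder)
y_pred = [0, 1, 2, 2, 0]   # classes predicted by the CNN (placeholder)

print("accuracy :", accuracy_score(y_true, y_pred))
print("precision:", precision_score(y_true, y_pred, average="macro"))
```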

Fig 3: Steps Involved in Image Processing

 Software Applications

 Arduino IDE:
The Arduino IDE software is used to program the hardware components of the project, which includes the coding for the sensors, the movement of the sprayer bot, and the shaft movement of the sprayer.

The software is programmed using Embedded C, with the Arduino IDE as the main interface between the hardware and the user; the user can program the required behaviour into the hardware through the Arduino IDE.


Fig 4: Arduino IDE Software Sketch Screen

 TensorFlow:

 TensorFlow is a free and open-source software library designed for artificial intelligence and machine learning. Although it can be used for a variety of tasks, its primary emphasis is on training and inference with deep neural networks.
 Developed by the Google Brain team for internal use in Google's manufacturing and research, TensorFlow was first released in 2015 under the Apache License 2.0. An enhanced version, TensorFlow 2.0, was released by Google in September 2019.
 TensorFlow supports a variety of programming languages, including Python, JavaScript, C++, and Java, making it versatile for a broad spectrum of applications across different industries.


Fig 5: TensorFlow Program Snippet

Fig 5 shows the Python code for the machine learning CNN, where many libraries are imported as required.

 OpenCV:

 Programming functions for real-time computer vision are predominantly available in OpenCV (the Open Source Computer Vision Library), which was originally developed by Intel and later supported by Itseez (a company later acquired by Intel) and Willow Garage.
 The cross-platform library is distributed as free and open-source software under the Apache License 2. OpenCV has included GPU acceleration for real-time operations since 2011.

 Among the Application Areas of OpenCV are:

 Toolkits for 2D and 3D features
 Assessment of egomotion
 Facial recognition systems


Fig 6 shows the code snippet for the required OpenCV usage; it shows the application running in real time when the program is executed, and the smaller code snippet shows how OpenCV is implemented (a generic real-time capture sketch is given after the figure caption).

Fig 6: OpenCV Program Snippet
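Since Fig 6 itself is not reproduced here, the sketch below shows a generic real-time OpenCV capture loop of the kind the figure describes; the camera index and the grayscale step standing in for the detection stage are assumptions.

```python
# Generic real-time OpenCV loop: grab frames from the camera, run a
# processing step (grayscale here, standing in for pest detection),
# and show the result until 'q' is pressed. Camera index is assumed.
import cv2

cap = cv2.VideoCapture(0)             # default camera; index is an assumption
while cap.isOpened():
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)   # stand-in for detection
    cv2.imshow("feed", frame)
    if cv2.waitKey(1) & 0xFF == ord("q"):
        break
cap.release()
cv2.destroyAllWindows()
```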

 Flask:

 Flask is a Python-based micro web framework known for its minimalistic design, as it doesn't require any specific libraries or tools to function.
 Lacking built-in features like form validation or database abstraction layers, Flask relies on third-party libraries to provide common functionality, though it supports extensions that add capabilities as if they were native to Flask.
 Extensions in Flask support various functions, including open authentication protocols, object-relational mappers, form validation, file upload processing, and utilities for shared frameworks.

 The Flask Features Include:

 Debugging and development server
 Jinja templating
 Comprehensive support for unit testing
 RESTful request dispatching
 Support for secure cookies

Fig 7 gives the code snippet of the web-driven application built in Flask for the project (a minimal illustrative app of this kind is sketched below).
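Since Fig 7 is likewise not reproduced here, the following is a minimal illustrative Flask app for the kind of web front end described: it accepts an uploaded pest image and returns a classification. The route name, form field, and the classify() stub are hypothetical stand-ins for the project's actual inference code.

```python
# Minimal illustrative Flask app: accept an uploaded pest image and
# return a classification. classify() is a hypothetical stand-in for
# the project's CNN inference; route and field names are assumptions.
from flask import Flask, request, jsonify

app = Flask(__name__)

def classify(image_bytes: bytes) -> str:
    """Placeholder for CNN inference on the uploaded image."""
    return "aphid"  # dummy label

@app.route("/predict", methods=["POST"])
def predict():
    image = request.files["image"].read()   # image field from the upload form
    return jsonify({"pest": classify(image)})

if __name__ == "__main__":
    app.run(debug=True)  # Flask's built-in development server
```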


Fig 7: Code Snippet for Web Application of Flask


 Flow Chart
A flow chart depicting the stages involved in IoT Based Precision Farming is illustrated in Fig 8. This flowchart gives a detailed explanation of the working of the model.

Fig 8: Flow Chart of the Process of IoT Based Precision Farming

IV. RESULTS

Figure 9(a) depicts the software page of the project, where the pest images are inserted for testing. Fig 9(b) depicts the output of the given image along with the precision required, while Fig 9(c) depicts the sprayer level, with the ultrasonic sensor reporting the tank level. Fig 9(d) shows the pest detection, and Fig 9(e) shows the soil moisture level with the help of the sensor.


Fig 9(a): Web Application Page

Fig 9(b): Software Result in Classification

Fig 9(c): Shows the Sprayer Tank Level

Fig 9(d): Shows Pest Detected by Hardware

Fig 9(e): Shows the Soil Moisture Level

In this model, IoT Based Precision Farming is carried out by both hardware and software methods. The software parts and their results are depicted in Fig 9(a) and Fig 9(b), where the system identifies and classifies the pests, while Figs 9(c), 9(d), and 9(e) depict the hardware results; the hardware model itself is shown in Fig 9(f).

Fig 9(f): The Hardware Practical Model


V. CONCLUSION

The prototype is non-invasive, low-cost, and user-friendly.

 This prototype may be substantially less expensive than commercially sold products. The prototype's efficacy and efficiency result in improved power control and minimal energy loss.
 This self-powered device uses solar power and is connected to the central gateway using the communication protocol.
 The experiment effectively illustrates how precision agriculture and image processing may revolutionize pest detection and management by applying pesticides precisely and intervening early.
 The automated and data-driven method encourages environmental sustainability in addition to improving efficiency and cost-effectiveness.

ACKNOWLEDGEMENT

We would like to convey our heartfelt appreciation to our Head of Department, Dr. M Rajeswari, our Project Guide, Prof. Sudha B, and Dr. Boraiah, for their invaluable assistance and advice throughout this project.

REFERENCES

[1]. Dipti D. Desai, Priyanka B. Patil, "A review of the literature on IoT based smart agriculture monitoring and control system on IoT," 2023.
[2]. Aishwarya M S, Karthik K, Rachana N, Nandan D, "A Survey on Pest Detection System," on OpenCV, IoT, 5G Technology, 2022.
[3]. Abhishek Kamal, Adarsh, Aviral Kumar Gopal, "Pest Recognition on UAV: AI Drone," on cutting-edge recognition technology, 2022.
[4]. A. G. Mazare, L. M. Ionescu, D. Visan, "Pest detection system for agricultural crops using intelligent image analysis," on deep learning artificial neural networks, 2021.
[5]. Ankit Singh, Abhishek Gupta, Akash Bhosale, Sumeet Poddar, "Agribot: An Agriculture Robot," International Journal of Advanced Research in Computer and Communication Engineering, Vol. 4, Issue 1, January 2015.
[6]. N. Firthous Begum, P. Vignesh, "Design and Implementation of Pick and Place Robot with Wireless Charging Application," International Journal of Science and Research (IJSR), 2013.
[7]. Buniyamin N., Wan Ngah W. A. J., Sariff N., Mohamad Z., "A Simple Local Path Planning Algorithm for Autonomous Mobile Robots," International Journal of Systems Applications, Engineering & Development, Issue 2, Volume 5, 2011.

