IoT Based Precision Farming
Abstract: - This project focuses on leveraging drone images of the pests, equipped with advanced sensors for pest detection in crops, combined with image processing methods to identify diseases. The ultimate goal is to enhance crop health and productivity through timely and targeted pesticide application. Image processing techniques are used to detect signs of diseases and pests in the captured images, and a machine learning CNN algorithm enhances the system's ability to accurately classify and diagnose crop health issues. Upon detection of pests, the IoT platform triggers a response mechanism to deploy a precision pesticide spraying system. This ensures targeted and localized treatment, reducing the overall use of pesticides and minimizing environmental impact. The project involves capturing images of pests using a camera, followed by processing these images to extract key features using various image processing techniques. The extracted features are analyzed using algorithms, primarily Convolutional Neural Networks (CNNs), to detect variations in color and other dominant characteristics in the images. By comparing these features across samples, the system can identify pests and plant diseases more efficiently. This approach aims to provide a quicker and more cost-effective solution for pest detection and disease management.

Keywords:- CNN, IoT, Sprayer Robot, Image Processing, ZigBee Module, Precision.

I. INTRODUCTION

In recent years, robotics in the agricultural sector, driven by the principles of precision agriculture, has emerged as an innovative and rapidly growing technology. The primary drivers of farming process automation are the reduction of time and energy spent on monotonous farming operations and the enhancement of yield productivity through the application of precision farming principles to each crop on an individual basis. The design of such robots is based on specific approaches and takes into account the agricultural environment in which they will operate. Additionally, a prototype of an autonomous agricultural robot is introduced, specifically designed for tasks like spraying pesticides efficiently and autonomously. Robotic systems have a significant impact across various sectors, including society, organizations, and industries. This research focuses on creating an automated device designed for farm operations, such as pest identification using machine learning [1] and precise pesticide spraying at predetermined distances and depths, incorporating sensors for monitoring humidity and temperature.

This system has two main sections, a monitoring station and a control station, which intercommunicate using wireless Zigbee or Wi-Fi communication technologies. The control station as well as the robotic station possesses amenities including a soil moisture sensor, an ultraviolet sensor, a robotic system with motors, an ARM microcontroller, and a power supply. Next, the sick plants are categorized, and a camera equipped with Internet of Things technology is used to take pictures of the afflicted areas of the plants [2]. After that, pre-processing, modification, and grouping are applied to these pictures. The processor then receives these images as input and, using the CNN algorithm, compares them against the set of tested and pre-trained pests [3].

An automated pesticide sprayer is used to target specific areas of the leaf for pesticide application if the UAV-captured image [4] shows signs of damage. This ensures precise and efficient use of pesticides. If not, the image is automatically discarded by the processor, and the spraying robot [5] moves further. This project focuses on making an automated system accessible to farmers for early detection of plant diseases. The system integrates robotics, where a drone captures images and an onboard processor analyzes them. Once the disease is evaluated, real-time results are sent to the farmer via a Bluetooth HC-05 Android app and displayed on an LCD screen for quick access. The process of disease detection involves several key steps: digital image capture; image pre-processing (including noise removal, color transformation, and histogram equalization); segmentation using the K-means algorithm; feature extraction; and classification using a support vector machine (SVM), a supervised learning algorithm. The processing done with these components has two stages. The training phase, often known as the offline phase, is the first processing stage. During this stage, an image analyzer examines a set of input photographs of leaves (both damaged and healthy), and specific features are extracted. The classifier then receives these attributes as input together with information identifying whether each image depicts a healthy or a diseased leaf. The classifier thereby ascertains the relationship between the retrieved features and the potential conclusion about the existence of the disease [7]. The system is trained as a result.

A further aim is the creation and advancement of an efficient and automatic system to detect diseases in affected plants by use of image processing of the pests, which minimizes the required … The classification software helps in classifying the various pests, while the hardware consists of a sprayer bot communicating through a wireless module.
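The offline training phase described above (feature extraction followed by SVM classification) can be sketched in Python with scikit-learn. This is a minimal illustration, not the paper's actual code: the per-channel color statistics and the synthetic healthy/diseased images below are stand-ins for the real extracted features and the real leaf dataset.

```python
import numpy as np
from sklearn.svm import SVC

def extract_features(image):
    """Toy feature extraction: mean and std of each color channel."""
    # image: H x W x 3 uint8 array
    return np.concatenate([image.mean(axis=(0, 1)), image.std(axis=(0, 1))])

rng = np.random.default_rng(0)
# Synthetic stand-ins: healthy leaves skew green, diseased leaves skew brown.
healthy = [rng.integers(0, 80, (32, 32, 3)).astype(np.uint8)
           + np.array([0, 120, 0], dtype=np.uint8) for _ in range(20)]
diseased = [rng.integers(0, 80, (32, 32, 3)).astype(np.uint8)
            + np.array([120, 60, 0], dtype=np.uint8) for _ in range(20)]

X = np.array([extract_features(im) for im in healthy + diseased])
y = np.array([0] * 20 + [1] * 20)   # 0 = healthy, 1 = diseased

clf = SVC(kernel="rbf").fit(X, y)   # the offline training stage
print(clf.score(X, y))              # training accuracy
```

In the online phase, the same `extract_features` would be applied to a newly captured leaf image, and `clf.predict` would report healthy or diseased.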
III. METHODOLOGY

A. Hardware

Proposed Methodology:

The primary goal of this project is to provide precise farming equipment. The hardware component helps to minimize the use of pesticides and aids in precise spraying, since it uses a full-cone nozzle, while the software component gives an accurate and precise classification of the pests. The project's block diagram, depicted in Fig 1, is the connection diagram, which includes the different motors and sensors needed for movement and sensing.

The motor drive circuit is responsible for the shaft direction. Fig 2 below shows the practical model implementation, where the long yellow vertical pole is the shaft responsible for the movement of the pesticide spraying nozzle. The practical model is built as shown in the connection diagram.

- The sensors and all the motion of the robot are sensed and processed by the Arduino UNO microcontroller.
- The Arduino UNO is connected to a soil moisture sensor and an ultrasonic sensor: the soil moisture sensor is employed to ascertain the percentage of water content in the soil, and the ultrasonic sensor to ascertain the amount of water present in the pesticide tank.
- Power is supplied to the Arduino UNO through an external rechargeable battery source with a 12V supply.
- The battery is connected to a power module, which is used because the various electronic components require different voltages (for example, the Arduino UNO requires 5V, while the DC motors require 12V).
- A solar panel is connected to the battery; it produces a good 12V output on sunny days, which helps recharge the battery.
- A Zener diode is connected between the solar panel and the battery to prevent current from flowing in the reverse direction, from the battery to the solar panel.
- A 16x2 LCD screen is connected to the microcontroller to provide visible output to the user.
- The Arduino UNO controls the DC motors through the motor drive circuit.
- Motor drivers are used to control the DC motors because the motors require a 12V supply: the Arduino board is connected to the motor driver, which enables the Arduino to control the DC motors operating on 12V (the motor drive controller uses an L290D integrated circuit).
- The L290D IC is responsible for power amplification and also helps in the bidirectional movement of the robot.
- The Zigbee module is the device used for wireless communication between the robot and the laptop.
- Three DC motors are utilized in the project; two of them are employed to move the robot, while the third moves the sprayer's shaft so that the sprayer's height can be changed.
- The Bluetooth module is integrated with the Arduino UNO for movement control of the robot, enabling wireless control from the user's phone.

B. Software Applications

Arduino IDE:

The Arduino IDE software is used to program the hardware components of the project, which includes the coding for the sensors, the movement of the sprayer bot, and the shaft movement of the sprayer.

Feature Extraction:

Fig 3: Steps Involved in Image Processing
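The laptop side of the wireless link described above (movement and sprayer commands sent to the robot) could be driven from Python with the pyserial package, since the Zigbee/Bluetooth adapter appears to the laptop as a serial port. The single-byte command protocol and the port path below are assumptions for illustration; the paper does not specify its command framing.

```python
# Hypothetical single-byte command protocol for the sprayer robot.
COMMANDS = {"forward": b"F", "backward": b"B", "left": b"L",
            "right": b"R", "stop": b"S", "spray": b"P"}

def encode_command(name):
    """Map a movement name to the byte sent over the wireless serial link."""
    try:
        return COMMANDS[name]
    except KeyError:
        raise ValueError(f"unknown command: {name}")

def drive(port_path="/dev/ttyUSB0", moves=("forward", "spray", "stop")):
    """Send a sequence of commands over the (assumed) Zigbee serial port."""
    import serial  # pyserial; only needed when actually talking to the robot
    with serial.Serial(port_path, 9600, timeout=1) as link:
        for move in moves:
            link.write(encode_command(move))

print(encode_command("spray"))  # → b'P'
```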
TensorFlow:

TensorFlow is a free and open-source software library designed for artificial intelligence and machine learning. Although it can be used for a variety of tasks, its primary emphasis is on training and inference with deep neural networks.

Developed by the Google Brain team for internal use in Google's production and research, TensorFlow was first released in 2015 under the Apache License 2.0. An enhanced version, TensorFlow 2.0, was released by Google in September 2019. TensorFlow supports a variety of programming languages, including Python, JavaScript, C++, and Java, making it versatile for a broad spectrum of applications across different industries.
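As a sketch of how such a network is assembled with TensorFlow's Keras API: the layer sizes, input resolution, and two-class output below are assumptions, since the paper does not list the architecture it uses.

```python
import tensorflow as tf
from tensorflow.keras import layers, models

def build_pest_classifier(num_classes=2, input_shape=(128, 128, 3)):
    """A small illustrative CNN for healthy-vs-diseased leaf images."""
    model = models.Sequential([
        layers.Input(shape=input_shape),
        layers.Conv2D(16, 3, activation="relu"),  # low-level edge/color filters
        layers.MaxPooling2D(),
        layers.Conv2D(32, 3, activation="relu"),  # texture-level features
        layers.MaxPooling2D(),
        layers.Flatten(),
        layers.Dense(64, activation="relu"),
        layers.Dense(num_classes, activation="softmax"),  # class probabilities
    ])
    model.compile(optimizer="adam",
                  loss="sparse_categorical_crossentropy",
                  metrics=["accuracy"])
    return model

model = build_pest_classifier()
print(model.output_shape)  # (None, 2)
```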
Fig 5 shows the Python code for the machine learning CNN, where many libraries are imported as required.

OpenCV:

Programming functions for real-time computer vision are predominantly available in OpenCV (the Open Source Computer Vision Library), which was originally developed by Intel and later supported by Itseez (a company later acquired by Intel) and Willow Garage. The cross-platform library is distributed as free and open-source software under the Apache License 2. OpenCV has included GPU acceleration for real-time operations since 2011.

Among the application areas of OpenCV are:
- Toolkits for 2D and 3D features
- Assessment of egomotion
- Facial recognition systems

Fig 6 shows the code snippet for the required OpenCV; it shows how the real-time application of the program is run, and the smaller code snippet shows how OpenCV is implemented.
Flask:

Flask is a Python-based micro web framework known for its minimalistic design, as it doesn't require any particular libraries or tools to function. Lacking built-in features like form validation or database abstraction layers, Flask relies on third-party libraries to provide common functionalities, though it supports extensions that add capabilities as if they were native to Flask. Extensions in Flask support various functions including open authentication protocols, object-relational mappers, form validation, file upload processing, and various common framework-related utilities.

Flask's features include:
- Debugging and development server
- Jinja templating
- Comprehensive support for unit testing
- RESTful request dispatching
- Support for secure cookies

Fig 7 gives the code snippet of the web-driven application built in Flask for the project.
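A web endpoint of the kind Fig 7 depicts can be reduced to a few lines of Flask. The route name, upload field, and fixed "healthy" response below are placeholders for illustration, not the project's actual code:

```python
from flask import Flask, request, jsonify

app = Flask(__name__)

@app.route("/predict", methods=["POST"])
def predict():
    # "image" is an assumed field name for the uploaded leaf photo.
    upload = request.files.get("image")
    if upload is None:
        return jsonify(error="no image uploaded"), 400
    # A real deployment would run the trained CNN here; a fixed label stands in.
    return jsonify(filename=upload.filename, label="healthy")

# To serve locally with Flask's built-in development server:
# app.run(debug=True)
```

Flask's `test_client` (part of its unit-testing support) lets the route be exercised without starting the server.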
Flow Chart
A flow chart depicting the stages involved in IoT Based Precision Farming is illustrated in Fig 8. This flowchart gives a detailed explanation of the working of the model.
IV. RESULTS

Fig 9(a) depicts the software page of the project, where the pest images are inserted for testing. Fig 9(b) depicts the output for the given image along with the precision achieved, while Fig 9(c) depicts the sprayer tank level measured with the ultrasonic sensor. Fig 9(d) shows the pesticide detection, and Fig 9(e) shows the soil moisture level measured with the sensor.

Fig 9(c): Shows the Sprayer Tank Level
Fig 9(f): The Hardware Practical Model
V. CONCLUSION
ACKNOWLEDGEMENT
REFERENCES