
International Journal of Trend in Scientific Research and Development (IJTSRD)
Volume 6, Issue 4, May-June 2022 | Available Online: www.ijtsrd.com | e-ISSN: 2456-6470

Real-Time Hand Gesture Recognition Based Control of Arduino Robot

Dr. Sunil Chavan1, Prof. Revti Jadhav2, Jayesh Mankar3, Suyash Thakur3, Shahid Ansari3, Vatsal Panchal3
1Principal, 2Professor, 3Student
1,2,3Smt. Indira Gandhi College of Engineering, Navi Mumbai, Maharashtra, India

ABSTRACT
The main objective of this project is to address one of the many applications of the Internet of Things domain. It combines several domains, such as image processing and robotics, for development in the field of the Internet of Things (IoT), now also termed the Internet of Everything (IoE) given its involvement in almost everything happening in our day-to-day lives. With increased dependency on home automation and wirelessly controlled equipment and gadgets, this project also has some scope in home automation. To improve the functionality of gadgets and to increase their efficiency, we decided to create a device to cater to these needs and to explore our own interest in the field of the Internet of Things. This paper presents the design, functioning, and successful testing of a rover controlled wirelessly with the help of hand gestures. Gesture recognition has played an important role in the field of Human-Computer Interaction (HCI). Vision-based hand gesture recognition provides a great solution to various machine-vision applications by providing an easy interaction channel. For various automated machine control applications, an effective real-time communication approach is required. In this paper, a vision-based hand gesture approach is presented for real-time control of an Arduino-based robot, combining all of these concepts into a single device that can recognize hand gestures and send data wirelessly to remote devices for surveillance and other purposes.

KEYWORDS: Computer vision, Human-computer interface (HCI), Hand Gesture Recognition (HGR), Image processing, IoT, Arduino Uno

How to cite this paper: Dr. Sunil Chavan | Prof. Revti Jadhav | Jayesh Mankar | Suyash Thakur | Shahid Ansari | Vatsal Panchal, "Real-Time Hand Gesture Recognition Based Control of Arduino Robot", published in International Journal of Trend in Scientific Research and Development (IJTSRD), ISSN: 2456-6470, Volume-6, Issue-4, June 2022, pp. 79-83 (Paper ID IJTSRD49934), URL: www.ijtsrd.com/papers/ijtsrd49934.pdf

Copyright © 2022 by author(s) and International Journal of Trend in Scientific Research and Development Journal. This is an Open Access article distributed under the terms of the Creative Commons Attribution License (CC BY 4.0) (http://creativecommons.org/licenses/by/4.0)

INTRODUCTION
With an increase in digitalization and automation in many industries, various problems and jobs which were once considered laborious for humans are now becoming effortless. Robots' functionality and accuracy have proven to be efficient, clean, and productive in various industrial and security sectors. This, along with the help of other domains such as image processing and wireless connectivity over the internet, is becoming a major contribution to the betterment of people in the coming future. The main objective of this project is to deliver on the above. It combines various domains, such as computer vision, HCI, and robotics, for development in the field of the Internet of Things (IoT), now also termed the Internet of Everything (IoE) given its involvement in almost everything happening in our day-to-day life. With increased dependency on home automation and wirelessly controlled equipment and gadgets, this project also has some scope in home automation.

To improve the functionality of gadgets and to increase their efficiency, we decided to create a device to cater to these needs and to explore our interest in the field of the Internet of Things. This paper covers human-computer interaction using hand gesture recognition. It presents the design, functioning, and successful testing of a rover controlled wirelessly with the help of hand gestures.

With wireless connectivity and Bluetooth support for easy data transmission, this rover can be monitored and controlled from multiple devices. The camera module captures and sends real-time footage to users remotely as an IP address link for live tracking of the environment surrounding the rover. Adding motor control gave us maneuverability, and the Arduino functions as the brain of the rover. To instruct the rover, we used serial communication to coordinate tasks between the Arduino and the Python programming language.

Literature Survey:
In the past, a lot of research work has been done on recognizing hand gestures efficiently. Systems were developed with applications in robotic control, gaming applications, and many more. There has been much research in the field of hand gesture recognition for controlling robots.

Rafiqul Zaman Khan et al. [2] discussed the issues in hand gesture recognition and provided a literature review of the different techniques used for hand gesture recognition.

R. Pradipa et al. [4] performed an analysis of different hand gesture recognition algorithms and provided their advantages and disadvantages.

A computer-vision-based approach for gesture recognition commonly includes four phases: image acquisition, pre-processing, feature extraction, and classification, as shown in Fig. 1. Firstly, the image of the user providing gestures is collected through a webcam and is followed by some preprocessing. Then, the hand portion is extracted and classified into the category of a particular gesture [5].

F. Arce et al. [6] illustrated the use of a 3-axis accelerometer to recognize hand gestures.
Proposed System:
The system hardware includes an Arduino Uno microcontroller, DC motors, an L298N motor driver, an HC-05 Bluetooth module, an LED, and an ESP32-CAM. The ESP32 camera is a small, low-power camera module based on the ESP32. It is widely used in IoT applications such as wireless video monitoring, WiFi image upload, and QR identification. A battery is required to power everything in this project, from the Arduino to the motors.

The system software includes the Arduino IDE and the Python IDE PyCharm, with the PyQt5 Designer (GUI toolkit) used for designing the user interface.

Libraries used: PySerial, OpenCV, Matplotlib, NumPy, cvlib, urllib, and the SoftwareSerial library.

MediaPipe: MediaPipe is an open-source framework from Google used for building machine learning pipelines. The MediaPipe framework is beneficial for cross-platform development since the framework is built using statistical data. The framework is multimodal and is often applied to varied audio and video.

Methodology:
A. Hand gesture recognition
The proposed system presented in this paper performs real-time hand gesture recognition using various image processing techniques. The overall architecture of the hand gesture recognition system using MediaPipe is shown in the accompanying figure.
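The paper does not include its implementation code, but a minimal sketch of the kind of MediaPipe-based finger counting described above might look as follows. The landmark indices and the simple "fingertip above its PIP joint" heuristic (with an approximate x-based check for the thumb) are illustrative assumptions, not the authors' exact method.

```python
# Minimal finger-counting sketch using OpenCV and MediaPipe Hands.
# The counting heuristic ("fingertip above the joint two landmarks beneath
# it", plus a rough x-based thumb check) is an assumption for illustration;
# the paper does not give the authors' exact logic.
import cv2
import mediapipe as mp

mp_hands = mp.solutions.hands
FINGER_TIPS = [8, 12, 16, 20]          # index, middle, ring, pinky fingertips

def count_fingers(hand_landmarks):
    lm = hand_landmarks.landmark
    count = 0
    for tip in FINGER_TIPS:
        # Image y grows downwards, so "tip above the PIP joint" means smaller y.
        if lm[tip].y < lm[tip - 2].y:
            count += 1
    # Thumb: rough x comparison of tip (4) and IP joint (3); assumes a
    # right hand shown palm towards the camera.
    if lm[4].x < lm[3].x:
        count += 1
    return count

cap = cv2.VideoCapture(0)              # laptop webcam
with mp_hands.Hands(max_num_hands=1, min_detection_confidence=0.7) as hands:
    while cap.isOpened():
        ok, frame = cap.read()
        if not ok:
            break
        results = hands.process(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))
        if results.multi_hand_landmarks:
            fingers = count_fingers(results.multi_hand_landmarks[0])
            cv2.putText(frame, f"Fingers: {fingers}", (10, 40),
                        cv2.FONT_HERSHEY_SIMPLEX, 1.2, (0, 255, 0), 2)
        cv2.imshow("Hand gesture recognition", frame)
        if cv2.waitKey(1) & 0xFF == ord('q'):
            break
cap.release()
cv2.destroyAllWindows()
```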

B. System Architecture
In the block diagram, we can observe the serial communication between Python and the Arduino Uno. The ESP32 camera works independently here; its live stream can be accessed by any available device if the IP address is known.

The ESP32 camera can be accessed remotely using its IP address for monitoring the robot's surroundings. This can be used for surveillance, monitoring, and a wide range of other applications.
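As an illustration of viewing the ESP32-CAM stream by its IP address from Python, a short OpenCV sketch is given below. The address and stream path are placeholders; the common ESP32-CAM web-server firmware serves an MJPEG stream on port 81 at /stream, but the paper does not state which firmware or URL was used.

```python
# Viewing the ESP32-CAM live stream from Python with OpenCV.
# The URL is a placeholder: the common ESP32-CAM web-server firmware serves
# an MJPEG stream at http://<board-ip>:81/stream, but any reachable stream
# URL exposed by the camera works the same way.
import cv2

STREAM_URL = "http://192.168.1.50:81/stream"   # replace with the camera's IP

cap = cv2.VideoCapture(STREAM_URL)
if not cap.isOpened():
    raise RuntimeError("Could not open the ESP32-CAM stream")

while True:
    ok, frame = cap.read()
    if not ok:
        break                                  # stream dropped or ended
    cv2.imshow("Rover camera", frame)
    if cv2.waitKey(1) & 0xFF == ord('q'):
        break

cap.release()
cv2.destroyAllWindows()
```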
The hand gesture recognition process captures the hand gestures of the user. The video of the user is captured through the webcam of a laptop or computer. This process involves converting the video into static frames which can be used for image processing. These frames are obtained through the functions of the OpenCV library.

Before running the Python code, first enable Bluetooth and pair the HC-05 module with the PC or laptop. When the Python program is run, it first establishes a serial connection between the PC and the Arduino Uno. It checks for an available connection and, after pairing, starts capturing video from the webcam for live interaction between the user and the robot.

It then detects the hand landmarks and, when they are detected successfully, counts the number of raised fingers and displays the count accordingly; if two fingers are raised, it shows a count of 2. Depending on the number of fingers counted, the corresponding command is sent: go forward, backward, left, right, or stop.
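A sketch of the PC-side serial link described here might look like the following. The COM port name, baud rate, and single-character command encoding are illustrative assumptions; the paper specifies the finger-count-to-direction mapping but not the byte protocol.

```python
# PC-side sketch: map a recognized finger count to a drive command and send
# it to the Arduino over the HC-05 Bluetooth serial port with PySerial.
# Port name, baud rate and single-character encoding are assumptions; the
# finger-count-to-direction mapping follows the paper's Result section.
import time
import serial

COMMANDS = {
    5: b'F',   # forward
    4: b'B',   # backward
    2: b'R',   # right
    1: b'L',   # left
    0: b'S',   # stop
}

# On Windows the paired HC-05 usually appears as an outgoing COM port;
# on Linux it is typically /dev/rfcomm0. Adjust to the actual setup.
link = serial.Serial(port="COM5", baudrate=9600, timeout=1)
time.sleep(2)                          # give the Bluetooth link time to settle

def send_command(finger_count):
    """Send the drive command for a recognized finger count, if defined."""
    cmd = COMMANDS.get(finger_count)
    if cmd is not None:
        link.write(cmd)

# Example: a gesture with two raised fingers turns the rover right.
send_command(2)
link.close()
```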
The Arduino Uno then receives the commands through the serial link between the PC and the HC-05 Bluetooth module. In the Arduino sketch, libraries such as SoftwareSerial are included to read the received data. We first initialize the pins of the Arduino Uno and then begin serial communication; the sketch reads the received instruction and executes the corresponding command: Forward, Backward, Left, Right, or Stop.
Result:

The movement of the robot is controlled by these five gestures. The captured video goes through image processing to detect multi-hand landmarks. When landmarks are detected, the number of raised fingers is counted. When 5 fingers are raised, a forward command is sent. When 4 fingers are raised, a backward command is sent. When 2 fingers are raised, a right command is sent. When 1 finger is raised, a left command is sent. When 0 fingers are raised, a stop command is sent.

Advantages:
The main advantage of this system is that it does not require any additional hardware, as the laptop's web camera can be used to receive gestures from the user. It requires only a web camera on the laptop and hence can be used with any system for gesture recognition. There is no need for physical contact with the computer, and the user can communicate with the system from a few meters' distance. Robotic systems controlled by hand gestures provide many useful applications in various fields. Real-time vision-based gestures can also be used in gaming technology for designing interactive gaming applications.

Applications:
In recent years, HGR technology has made its way into various industries as advances in deep learning, computer vision, machine learning, cameras, and sensors have made it more available, reliable, and accurate. The top four fields actively adopting hand tracking and gesture recognition are given below.

Automotive and Robotics Industry
A gesture recognition solution from Sony Depth Sensing Solutions features a time-of-flight sensor that measures the time it takes for an infrared signal to "travel" from the sensor to the object and back. The AI is trained to differentiate main gestures from gestural noise and to work under any lighting conditions. The BMW 7 Series features a built-in HGR system that recognizes five gestures and can control music and incoming calls, among other things. Less interaction with the touchscreen makes the driving experience safer and more convenient.

Machine learning methods, artificial intelligence, and complex algorithms are used by robotic control systems to perform particular tasks, enabling natural communication of robotic systems with the environment and autonomous decision making. Research studies suggest that combining computer vision technology with a robot can be used to create assistive devices for the elderly. Another study uses computer vision to allow a robot to ask a human for a proper path within a house. The hand gesture approach can be used to design a surveillance robot to spy on enemies in areas where humans are restricted. The mobility of the surveillance robot increases the range of the area under surveillance, and its small size and portability allow it to reach inconvenient places easily. Search and rescue operations can be performed using hand-gesture-controlled robots in areas affected by natural calamities; such robots play a very important role in performing rescue operations in areas which are not safe for humans.

Healthcare Industry
Operating rooms and emergency rooms can be chaotic, with lots of noisy machines and personnel. In such environments, voice commands can be less effective than gestures. Touchscreens are not an option because there is a strict boundary between what is and is not sterile, but access to information and images during surgery or other procedures is possible with HGR technology. Various companies have proved this, and much more research on such systems is ongoing. Gestures can give doctors the ability to check imagery, MRI, and CT scans with simple hand movements while performing their task. Patient monitoring in hospitals can be done using robots which can be moved from one ward to another using hand gestures; the robot performs surveillance and keeps an eye on the patients wherever they are present.

Virtual reality

Leap Motion (acquired by Ultrahaptics) presented updated HGR software that allows users to track gestures in virtual reality in addition to controlling a PC. The Leap Motion controller is a USB device that observes an area of about one meter with the help of two IR camera sensors and three infrared LEDs. A hand tracking application developed by ManoMotion recognizes gestures in three dimensions using just a smartphone camera (on both Android and iOS) and can be applied in VR and AR environments. The use cases for this technology include IoT devices, VR gaming, and interaction with virtual simulations.

Consumer electronics
Home automation is another vast field within the consumer electronics domain in which gesture recognition can be used for various applications. uSens has developed hardware and software to make smart TVs sense finger movements and hand gestures. Gestoos is an AI platform in which gestures can be created and assigned from a smartphone or another device, and one gesture can be used to trigger various commands. It also offers touchless control over lighting and audio systems.

The consumer market is open to accepting and trying new HMI technologies, and hand gesture recognition can be seen as an evolution from touchscreens. Demand for smoother means of interaction with devices, as well as concern for driver safety, is pushing the adoption of HGR.

Future scope:
A limited number (five) of hand gestures have been considered here. Though the number of gestures can be extended by adding different algorithms, the computation will increase. The OpenCV library is preferred here as it is suitable for real-time use and executes quickly. The system may be used on a variety of robotic platforms with a set of gesture commands appropriate to each platform. Further, the robot can be designed to perform various other tasks, and gestures can be designed accordingly. The algorithm can be improved further to work in various lighting conditions. The performance of the algorithm will be analysed by taking images of gestures under different conditions, such as non-uniform backgrounds and poor lighting.

Conclusion:
The proposed system was achieved by using a webcam or a built-in camera which detects the hand gestures. The main advantage of the system is that it does not require any physical contact with humans and provides a natural way of communicating with robots. The algorithm implementation does not require any additional hardware. The proposed system can be extended further to a wide range of applications. Movement of the robot is controlled using the five gestures which were implemented; the Arduino robot moves in the respective direction with the help of wheels attached to the DC motors.

References:
[1] T. Jha, B. Singh, A. Sharma, S. Sharma and R. Kumar, "Real Time Hand Gesture Recognition for Robotic Control," 2018 Second International Conference on Green Computing and Internet of Things (ICGCIoT), 2018, pp. 371-375, doi: 10.1109/ICGCIoT.2018.8752966.
[2] Rafiqul Zaman Khan and Noor Adnan Ibraheem, "Hand Gesture Recognition: A Literature Review," International Journal of Artificial Intelligence & Applications (IJAIA), July 2012.
[3] N. Mohamed, M. B. Mustafa and N. Jomhari, "A Review of the Hand Gesture Recognition System: Current Progress and Future Directions," IEEE Access, vol. 9, pp. 157422-157436, 2021, doi: 10.1109/ACCESS.2021.3129650.
[4] R. Pradipa and S. N. Kavitha, "Hand Gesture Recognition - Analysis of Various Techniques, Methods and Their Algorithms," 2014.
[5] Ebrahim Aghajari and Damayanti Gharpure, "Real Time Vision-Based Hand Gesture Recognition for Robotic Application," International Journal, vol. 4, no. 3, 2014.
[6] F. Arce and J. M. G. Valdez, "Accelerometer-Based Hand Gesture Recognition Using Artificial Neural Networks," Soft Computing for Intelligent Control and Mobile Robotics, Studies in Computational Intelligence, 2011.
[7] J. Hossain Gourob, S. Raxit, and A. Hasan, "A Robotic Hand: Controlled With Vision-Based Hand Gesture Recognition System," 2021 International Conference on Automation, Control and Mechatronics for Industry 4.0 (ACMI), 2021, pp. 1-4, doi: 10.1109/ACMI53878.2021.9528192.
[8] S. A. Shaikh, R. Gupta, I. Shaikh, and J. Borade, "Hand Gesture Recognition Using OpenCV," 2016.
[9] Joseph Howse, OpenCV Computer Vision with Python, Packt Publishing, Birmingham, 2013.
[10] Sudhakar Kumar, Manas Ranjan Das, Rajesh Kushalkar, Nirmala Venkat, Chandrashekhar Gourshete, and Kannan M. Moudgalya, Microcontroller Programming with Arduino and Python, 2021.
[11] Google, MediaPipe, https://ai.googleblog.com/2019/08/on-device-real-time-hand-tracking-with.html
[12] https://learnopencv.com/introduction-to-mediapipe
