Hand Gesture Control System Using OpenCV
Abstract:- All around the world, researchers are working to improve the interactivity of digital devices and to make them operate with the least amount of physical interaction possible. The interactive computer system proposed in this study can function without a keyboard or mouse. Its implementation benefits everyone, including paralyzed users who find it difficult to use a mouse or keyboard. Hand gestures are the easiest and most natural way to communicate, and this Hand Gesture Control project therefore demonstrates an innovative method of controlling a computer through a real-time web camera. The approach uses a camera and computer vision techniques to handle tasks and perform jobs without a physical mouse or keyboard. A webcam of normal resolution tracks the user's hand in two dimensions, and the system is built with Python and OpenCV. The camera output is displayed on the monitor, and the tasks associated with the recognized hand gestures are performed. The project thereby demonstrates an easy-to-use gesture control application.

Keywords:- Interactive Computer System, Hand Gesture, Webcam, Keyboard, Mouse.

I. INTRODUCTION

Nowadays, gesture control applications are becoming an area of wide interest with evolving technology. Keyboards and mice have been the primary input methods for computers for many years. However, as ubiquitous and ambient technologies (like PlayStations) and tools that let users hold virtual items become more and more common, hand and body gestures are becoming increasingly important. Gesture controllers are now an integral part of the human-computer interaction system, and computers can grasp human body language with the help of gesture recognition. Instead of relying only on simple text-based or graphical user interfaces (GUIs), a gesture recognition system helps create a stronger connection between humans and technology. Gesture recognition is an advancement over the Touch User Interface (TUI), a type of interaction in which a user controls a computer-based device by physically touching its screen. As the name implies, gesture recognition aims to identify the physical motions, or "gestures," made by people. For example, one could instruct a device to start a specific app and carry out other operations by waving a hand in a precise pattern in front of it. In this gesture recognition and control project, the computer's camera interprets the movements of the human hand, and this information is subsequently used as input to operate various computer activities and applications. Gesture recognition is the foundation of gesture control technology. You may think of gesture recognition as a means for computers to start deciphering body language. It creates a richer connection between computers and people than simple user interfaces such as the keyboard and mouse. Gesture control systems interpret and recognize human body gestures to enable user interaction with a computer system.

Gesture recognition is a topic of active research in Human Computer Interaction (HCI) technology. In the realm of HCI, gesture recognition is a popular and in-demand analytical technique, with applications in a variety of tasks, including controlling robots, managing virtual environments, translating sign language for medical applications, automating homes, etc. The consumer electronics, automobile, healthcare, and educational sectors are among the industries that have embraced gesture control technology, and it has taken on special significance in recent HCI research. Owing to its dexterity, the hand is the most useful communication tool of all the body parts. The term "gesture" is used in many situations involving human motion, but only certain human motions, particularly those of the hands, arms, and face, which are referred to as gestures, are instructive. Hand gesture identification gives machines the ability to recognize and categorize hand gestures and is regarded as a vital component of HCI. With the development of current technology and the widespread use of hand movements in daily communication, hand gesture identification is treated as a crucial part of HCI, and the study of hand gestures has recently seen an increase in interest and popularity. Moreover, it permits a straightforward and natural method of interaction.

The goal of this project is to create an interface that dynamically captures human hand gestures and uses the hand movements to control various functions of the computer. This will improve the entire experience of using a computer by replacing the mouse and keyboard with hand gestures for the fundamental operations. As humans, we always look for ways to make our lives easier through technology; moreover, with the help of gesture control, even those with disabilities will be able to operate computers with ease.

II. LITERATURE SURVEY

In terms of popularity, gesture control technologies have eclipsed more traditional mechanical communication techniques. The domain market is divided into many categories depending on a variety of criteria, such as location, assistive robots, sign language recognition technology, immersive gaming, smart televisions, virtual controllers, virtual mice, etc. Myron W. Krueger initially proposed the method of hand gesture-based computer interaction.
III. METHODOLOGY

The methodology of the project is explained stepwise in Fig 1. The process is initiated when hand movement is recognized through the camera. OpenCV captures the hand gestures frame by frame, and the video accessed through the camera is flipped (mirrored). The hand gesture is detected using the MediaPipe library, the fingertip coordinates are located, and the distances between the key points on the hand are calculated. The final gesture made by the hand is identified from these distances. If an application such as a Word document is open, the associated gestures are performed on the running application; if no application is open, the gestures are performed on the operating system itself.

Initially, the video input is obtained from the computer's primary camera, each frame is converted into RGB form, and the processing of the image is completed. The primary camera's video feed serves as the input for the MediaPipe Hand module, which is used to detect hands. The initialization and configuration of the detected hands are handled by MpHands.Hands(), and the connections and landmarks are drawn on the detected hand using mp.solutions.drawing_utils.
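This capture-and-detection loop can be sketched as follows. The listing is a minimal illustration using the public OpenCV and MediaPipe APIs; the variable names are ours, not taken from the paper's source code:

import cv2
import mediapipe as mp

mpHands = mp.solutions.hands
mpDraw = mp.solutions.drawing_utils

# Initialize and configure the hand detector (the MpHands.Hands() step above)
hands = mpHands.Hands(max_num_hands=1, min_detection_confidence=0.7)

cap = cv2.VideoCapture(0)  # the computer's primary camera
while cap.isOpened():
    ok, frame = cap.read()
    if not ok:
        break
    frame = cv2.flip(frame, 1)                    # mirror the video feed
    rgb = cv2.cvtColor(frame, cv2.COLOR_BGR2RGB)  # MediaPipe expects RGB input
    results = hands.process(rgb)                  # detect hands in the frame
    if results.multi_hand_landmarks:
        for handLms in results.multi_hand_landmarks:
            # draw the detected landmarks and their connections on the frame
            mpDraw.draw_landmarks(frame, handLms, mpHands.HAND_CONNECTIONS)
    cv2.imshow("Hand Gesture Control", frame)
    if cv2.waitKey(1) & 0xFF == ord('q'):         # press q to quit
        break
cap.release()
cv2.destroyAllWindows()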
The MediaPipe hand model uses a neural network to predict the locations of 21 key points on the hand: the wrist (point 0) and four points along each finger, ending at the fingertip. Points 1-4 represent the thumb, points 5-8 the index finger, points 9-12 the middle finger, points 13-16 the ring finger, and points 17-20 the little finger.
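For later reference, the fingertip indices of this 21-point numbering can be kept in a small lookup table; the snippet below is purely illustrative:

# MediaPipe hand-landmark numbering: 0 is the wrist; each finger ends at
# its tip (thumb 1-4, index 5-8, middle 9-12, ring 13-16, little 17-20)
FINGERTIPS = {"thumb": 4, "index": 8, "middle": 12, "ring": 16, "little": 20}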
The list of hand key points detected by the MediaPipe Hand module is initially produced as an empty list, and the module also checks whether several hands are present in the input. The detected key points are then stored in a list named lmList, so that the x and y coordinates of a key point can be read out as, for example, x, y = lmList[8][1], lmList[8][2] for the index fingertip. After obtaining the coordinates of a key point, a miniature opaque circle is drawn at those coordinates using the cv2.circle() function, and the circles are then connected by drawing solid lines with the cv2.line() function.
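A sketch of how lmList and the fingertip markers might be produced, assuming frame and handLms come from the detection loop shown earlier; scaling by the frame size is the usual MediaPipe idiom, since landmarks are returned as normalized coordinates:

import cv2

def find_positions(frame, handLms):
    # Build the landmark list: starts empty, then one [id, x, y] entry per
    # key point; MediaPipe coordinates are normalized to [0, 1], so they
    # are scaled by the frame width and height to get pixel positions.
    h, w, _ = frame.shape
    lmList = []
    for idx, lm in enumerate(handLms.landmark):
        lmList.append([idx, int(lm.x * w), int(lm.y * h)])
    return lmList

lmList = find_positions(frame, handLms)  # frame, handLms from the loop above
x, y = lmList[8][1], lmList[8][2]        # index fingertip coordinates
cv2.circle(frame, (x, y), 8, (255, 0, 255), cv2.FILLED)  # opaque circle
x1, y1 = lmList[4][1], lmList[4][2]      # thumb tip coordinates
cv2.line(frame, (x1, y1), (x, y), (255, 0, 255), 3)      # solid line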
Further, the distance between the coordinates is calculated using the hypot() function imported from the math library. The distance is calculated in order to differentiate between gestures and to identify each gesture distinctively. The system then detects whether an application is in the running state. If an application is running, for example a Word document is open, a set of gestures can be used to perform operations on the file such as print, save, and close. If no application is running, the same set of gestures can be used to perform system-level operations such as volume up and down, system shutdown, and screenshot.
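With lmList built as above, the distance computation reduces to a few lines (a sketch; the key-point indices follow the MediaPipe numbering given earlier):

from math import hypot

# Euclidean distance between the thumb tip (point 4) and the index
# fingertip (point 8), using the pixel coordinates stored in lmList
x1, y1 = lmList[4][1], lmList[4][2]
x2, y2 = lmList[8][1], lmList[8][2]
distance = hypot(x2 - x1, y2 - y1)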
As different hand gestures produce different distances between the key-point coordinates, these distances can be used to differentiate between the gestures. For example, if a fist is made, the distances between the fingertips become very close to zero, so a range such as 0-15 pixels is defined for the fingertip distances. If a 'yo' gesture is made, the distances between the fingertips of the thumb, middle, and ring fingers become close to zero while the distance between the little and index fingertips becomes very large, giving that gesture a higher range.
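A sketch of this range-based comparison, reusing the 0-15 pixel threshold quoted in the text; the exact thresholds and gesture set are illustrative assumptions, not the paper's tuned values:

from math import hypot

def classify(lmList):
    # Distance between two key points in the landmark list
    def dist(a, b):
        return hypot(lmList[b][1] - lmList[a][1], lmList[b][2] - lmList[a][2])

    tips = [4, 8, 12, 16, 20]  # fingertip indices
    # Fist: every fingertip is close to every other one (0-15 pixel range)
    if all(dist(a, b) < 15 for a in tips for b in tips if a < b):
        return "fist"
    # 'Yo' gesture: thumb, middle and ring tips nearly touch, while the
    # index and little fingertips are spread far apart
    if dist(4, 12) < 15 and dist(4, 16) < 15 and dist(8, 20) > 100:
        return "yo"
    return "unknown"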
Once the gesture is identified, the pycaw library is used for volume control and the pyAutoGUI library is used to perform the other operations on the operating system or the currently running application.
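A sketch of how these two libraries might be wired to the recognized gestures; the pycaw initialization below is the library's standard Windows setup, while the gesture-to-action mapping is our illustrative assumption:

from ctypes import cast, POINTER
from comtypes import CLSCTX_ALL
from pycaw.pycaw import AudioUtilities, IAudioEndpointVolume
import pyautogui

# Standard pycaw handle to the master volume endpoint (Windows only)
devices = AudioUtilities.GetSpeakers()
interface = devices.Activate(IAudioEndpointVolume._iid_, CLSCTX_ALL, None)
volume = cast(interface, POINTER(IAudioEndpointVolume))

def perform(gesture):
    # Hypothetical gesture-to-action mapping; the paper's own set may differ
    if gesture == "volume_up":
        level = volume.GetMasterVolumeLevelScalar()
        volume.SetMasterVolumeLevelScalar(min(level + 0.1, 1.0), None)
    elif gesture == "screenshot":
        pyautogui.screenshot("screenshot.png")
    elif gesture == "save":
        pyautogui.hotkey("ctrl", "s")   # save in the active application
    elif gesture == "print":
        pyautogui.hotkey("ctrl", "p")   # print from the active application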
Details of the hardware and software used for the implementation of the project are as follows:

A. Hardware requirements:
Pentium i3 Processor System
4 GB RAM
500 GB Hard Disk Space
LED Monitor
Input Devices like Keyboard, Camera, Mouse

B. Software requirements:
Python 3.8 Programming Language
Windows 10 Operating System

IV. RESULTS AND DISCUSSIONS

The project succeeded in identifying the hand gestures appropriately and performing the associated activities, such as volume control, taking a screenshot, closing a file, save and print operations on a file, and device sleep and shutdown. The implementation of the hand gesture control system along with the screenshots can be seen below. Actions associated with specific gestures were captured by the device camera, and the operations for the various tasks were performed.
V. CONCLUSION

Gesture control technology has advanced greatly since the 1980s, when the first research was conducted. Based on the project's findings, it can be concluded that hand gesture identification and control can be developed with Python and OpenCV, utilizing the theories of hand segmentation and hand detection that apply the Haar-cascade classifier. To sum up, the system has achieved the project goals: (1) to establish a comprehensive system for hand gesture detection, interpretation, and recognition using Python and OpenCV, and (2) to create the number and sign-language hand gestures displayed in the system, fulfilling the project's title.
The implementation of additional gestures can be included in the system's future recommendations, so that users with varying skin tones and hand sizes will be able to effortlessly complete more tasks. The current system uses only the right hand, within a specified region of interest (ROI), to make gestures. The technique may therefore be improved to allow both of the user's hands to execute various signs in conjunction with computer processes; a recognition system for both hands working simultaneously can be implemented to take the project to a higher scale. This can further be used in gaming technologies, enhancing the user experience. Algorithms for background subtraction can also be employed for better detection and tracking of the hand, resulting in better performance. In the future, a Graphical User Interface (GUI) can be implemented with which users will be able to translate a gesture's meaning into a sign or number and vice versa. Nevertheless, technology is constantly evolving, so implementing a more advanced version of the same technology would undoubtedly be advantageous and would result in better interaction between humans and computer devices, elevating the Touch User Interface (TUI).

ACKNOWLEDGMENT

We would like to extend our sincere gratitude to all the people who played a crucial role in the completion of our project. We would like to express our deepest gratitude to our coordinator, Prof. Arathi Boyanapalli, for her continuous support throughout the duration of the project. We would also like to express our sincere gratitude to our colleagues, whose efforts were instrumental in the completion of the project. Through the exchange of interesting ideas and thoughts, we were able to present a project with correct information. We also want to express our gratitude to our parents for their personal support and encouragement to pursue our own paths. In addition, we would like to express our sincere gratitude to our Director, Dr. Atul Kemkar, and our Head of Department, Dr. Aparna Bannore. We also thank our other faculty members for their invaluable contribution in providing us with the required resources and references for the project.

DECLARATION

We confirm that the contents of this written submission reflect our own ideas and opinions. Where we have incorporated ideas or words from other sources, we have provided appropriate citations and sources to confirm the original source. We also confirm our commitment to academic integrity and honesty. We have not