

https://doi.org/10.22214/ijraset.2023.50351
International Journal for Research in Applied Science & Engineering Technology (IJRASET)
ISSN: 2321-9653; IC Value: 45.98; SJ Impact Factor: 7.538
Volume 11 Issue IV Apr 2023- Available at www.ijraset.com

Gesture Detection and Its Applications Using Python and OPENCV

Dr. M. Saraswathi¹, K R Kaasyap², P V Sai Vamsi³
¹Assistant Professor, ²,³B.E., 4th Year, CSE Branch, SCSVMV University, Enathur, Kanchipuram

Abstract: In our world, time and effort are worth more than money. With everything around us constantly becoming easier to do, we still dream of getting work done while doing next to nothing. Making things happen with the snap of a finger (quite literally, in our paper) is so close and yet so far from our grasp. In this paper we aim to use such hand gestures to make complex actions happen. Our aim is to create a system that can take in and understand commands and then use those commands in various static and dynamic environments.
Keywords: Gesture Detection, Computer Vision, 2D-3D scene modelling.

I. INTRODUCTION
Evolution is the backbone of mankind. Ever since the world began to take the shape we know now, communication has been an important part of the human experience. Before languages, words or even sounds were used to communicate, signs and gestures were present, making them the earliest form of humans reaching out to fellow humans. The world as we know it has been shaped by communication and gestures. Now, as we reach the pinnacle of human innovation, the world has come full circle. With people becoming more accustomed to getting their work done easily with the help of technology, artificial intelligence and computer vision are shaping up to be the two most important parts of the future of technology and mankind. Sensors, cameras and other reception devices have taken center stage in the tech world, with artificial assistants, voice assistants, vision-oriented vehicles, sensor-based machinery and automation making up the lion's share of the world's dependencies. OpenCV applications that use a device as simple as a webcam have been very interesting developments in the world of coding, making things happen with the flick of a wrist or the lift of a finger, all without the need to even type.

II. LITERATURE SURVEY


In paper [1], background subtraction was used to extract the hand region. A camera records live video, from which frames are taken as gesture boards. The performance is efficient but underwhelming.
In paper [2], hand gestures were recognized from live video. The method used to filter and accurately pixelate the image was greyscaling. This method was inefficient and had trouble recognizing gestures when the skin tone of the hand was on the lighter side.
In paper [3], dynamic hand gestures were used to signal alphabets and thereby build words and sentences in sign language. Real-time video capture was performed, and a visual grid was used to recognize the hand position. This application is effective but limited to alphabets; the visual grid is also a time-consuming method.
In paper [4], human-computer interaction techniques and hand gesture recognition systems were examined and evaluated. Methods included hand signal segmentation, the RGB color scheme and hidden Markov models.
In paper [5], various three-dimensional hand recognition and input methods were examined, including conventional cameras, webcams and depth sensors. Static, dynamic, 2D and 3D recognition as well as trajectory models were covered.

III. PROBLEM STATEMENT


Many image processing programs rely on heavy machinery with very specific high-level requirements and very low ease of access. Our aim is to create an efficient and effective hand gesture recognition system that does not require a large amount of processing power or a very powerful visual sensor. We also aim to apply the input gestures to applications in both 2- and 3-dimensional simulations in a hassle-free manner.

IV. PROPOSED SYSTEM


Hand gesture recognition uses a method known as Gaussian blur to capture a clear image of a hand making a gesture and convert it into the necessary contours.

©IJRASET: All Rights are Reserved | SJ Impact Factor 7.538 | ISRA Journal Impact Factor 7.894 | 1820

The recognized gestures drive three applications:
1) A 2-dimensional static application.
2) A 2-dimensional dynamic application.
3) A 3-dimensional dynamic application using OpenCV, Python and C#.

A. Advantages Of Proposed System


1) A new and dynamic approach to using OpenCV and Python for hand gesture recognition. Gaussian blur improves efficiency and lets the program work effectively in any kind of environment.
2) The applications provide a gateway for using the recognized gestures in a wide range of environments and situations. With both two- and three-dimensional applications, perceived statically and dynamically, our paper covers all the major domains that can be accessed using Python and computer vision with a basic camera (webcam).

B. System Architecture
The system architecture below shows, step by step, the work we did while implementing our project, from hand gesture intake to the various applications.

[Architecture flow: Hand Gesture Input, Intake and Recognition → Gaussian Blur for Image Modification → 2-Dimensional Static Application (Calculator) / 2-Dimensional Dynamic Application (Ping Pong) / 3-Dimensional Dynamic Application (Unity Environment)]

Fig 1. System Architecture

This process involves five major steps:
1) Hand Gesture Input, Intake and Recognition: Hand gestures are taken in through the webcam attached to the system. The background ideally should not be spotty or inconsistent. Python and OpenCV are used to pick the hand out of the picture frame and recognize its contours and structure.
2) Gaussian Blur for Image Modification: Gaussian blur, or Gaussian smoothing, is a technique in which images are intentionally blurred to reduce noise and detail. It is an image preprocessing method used to enhance image structures and scale representations.


3) 2-Dimensional Static Application (Calculator): The gesture is taken in as an image; an image is captured every time the user makes a snapping gesture with two fingers. If the snap occurs inside the calculator's display matrix on the screen, the grid location of the fingers is taken as the input. Multiple finger clicks can be used to perform simple arithmetic operations.
4) 2-Dimensional Dynamic Application (Ping Pong Game): Gestures are constantly monitored in video, and the gesture grid moves along with the hand. This movement is used to move the pong paddles to the desired position in order to strike the ball. Two hand gestures (left and right) are constantly monitored and can be used to play a two-player game with a scorecard and an end-of-game screen.
5) 3-Dimensional Dynamic Application (Unity Simulation): An environment is built using the Unity engine, and in this environment we introduced blocks as weighted 3D objects held in place by placeholders, which the user can manipulate with constantly monitored hand gestures.

V. RESULTS
The hand gesture as input by the user is shown in Fig 2. This is automatically darkened to bring out the fingers and the spaces between them. The program identifies the hand and determines its setpoints (contours), as shown in Fig 3. Finger-placement-based selection is used to input the desired numbers into the calculator (created in a 2D simulation), as shown in Fig 4, and the output is shown in Fig 5. The same hand contours are used to control the paddles in the mini pong game (a 2D dynamic environment designed by us), as shown in Fig 6. Fig 7 shows a 3D block environment developed using Unity. The blocks have weight and act like real-life objects. The hand gestures are used to control a 3D hand whose movements result in impact, as shown in Fig 8.

Fig 2. Hand gesture as inputted by the user.

Fig 3 Contour Determination

Fig 4 Calculator Input


Fig 5 Calculator Output

Fig 6 Mini Pong Game

Fig 7 Hand Positioning

Fig 8 Impact


VI. CONCLUSION
1) In this paper we created a system that can take in hand gestures, understand them, and assign them to various commands in static and dynamic environments.
2) We have presented the implementation and its outputs in this paper.

VII. FUTURE SCOPE


1) This system can be enhanced further by using better image-capturing equipment, such as depth-based motion sensors or professional frame-by-frame cameras.
2) Computer vision can be used in a vast multitude of applications, ranging from remote sensing and microscopy all the way to space exploration and astronomy.

REFERENCES
[1] A. D. Harale and K. J. Karande, "Literature Review on Dynamic Hand Gesture Recognition," AIP Conference (2021).
[2] J. Kaur, N. Mittal and S. Kaur, "A Scientometric Analysis on Hand Gesture Recognition," AIP Conference (2022).
[3] A. N. Aziz and A. Kurniawardhani, "The Development of Hand Gesture Recognition Research: A Review," International Journal of AI (2021).
[4] C. N. Nyaja and R. D. Wario, "A Review of Sign Language Hand Gesture Recognition Algorithms," Advances in AI (2020).
[5] A. P. Ismail, F. A. Abd Aziz and N. Mohamat Kasim, "Hand Gesture Recognition on Python and OpenCV," IOP Conference (2021).
[6] B. K. Devangan and P. Mishra, "Hand Gesture Recognition Through Depth Sensors," Challenges in Hand Gesture Applications (2022).
[7] N. Kadam, P. Kalambarkar, S. Joshi and R. Shamalik, "Hand Gesture Recognition System," ResearchGate (2021).
[8] M. AlHamadi, G. Mohammed, M. Abdul and M. Al-Suleiman, "Hand Gesture Recognition for Sign Language Using 3DCNN," IEEE Access (2020).
[9] H. K. Sharma and T. Chowdhury, "Applications of Hand Gesture Recognition," Challenges in Hand Gesture Applications (2022).
