
International Journal of Scientific Research in Engineering and Management (IJSREM)

Volume: 06 Issue: 11 | November - 2022 Impact Factor: 7.185 ISSN: 2582-3930

Gesture Recognition Based Virtual Mouse and Keyboard

Naresh Thoutam, Bhumi Chavan, Kabir Patole, Sakshi Jadhav, Vidhya Bagal
Computer Engineering, Sandip Institute of Technology and Research Centre, Nashik
naresh.thoutam@sitrc.org, Bhumichauhan4321@gmail.com, Patole1719@gmail.com, sakshi608104@gmail.com, vidhyabagal19@gmail.com

Abstract - In this project, an optical mouse and keyboard are created using hand motions and computer vision. The computer's camera scans images of the various hand gestures a user makes, and the mouse pointer moves in accordance with the movements of the hand. Users can even perform right and left clicks using different gestures. Similarly, several gestures can operate the keyboard, such as a one-finger gesture for selecting a letter and a four-finger motion for swiping left and right. Without wires or other external devices, the system functions as a virtual mouse and keyboard. The project's sole piece of hardware is a webcam, and the code is written in Python on the Anaconda platform. Convex hull defects are first constructed, and an algorithm then maps the mouse and keyboard functions to those defects using the defect calculations. By mapping a few of the defects to mouse and keyboard actions, the computer recognises the user's gesture and responds appropriately.

Index Terms - Optical Mouse, Face Recognition, Virtual Mouse and Keyboard, Anaconda Platform

I. INTRODUCTION
A small green box appears in the centre of the screen while the computer's webcam records video of the person seated in front of it. The code processes the objects displayed in that green box and compares them with the tracked object. If they match, a red border appears, indicating that the computer has located the object; the mouse pointer can then be moved by dragging the object. This contributes to the computer's security as well as to the creation of a virtual computing environment. Here, hand gestures move the cursor in place of various objects: one gesture is used for a right click and a different gesture for a left click. In a similar manner, keyboard functions that would typically be performed on a physical keyboard can be emulated with a single gesture. When a gesture is detected, a red border appears if the gesture does not match the box, which otherwise displays only a green border.

A. Motivation
The goal is to develop a virtual human-computer interface and an object-tracking application for computer interaction, and to build such an AI-related application.

B. Problem Definition
On PCs and laptops we usually use a physical mouse or touchpad. In this project, the need for external hardware is eliminated entirely by employing HCI technology that detects hand and eye gestures as well as mouse movements and events.

II. RELATED WORK
Paper 1: "Research on Deep Learning-Based Hand Gesture Recognition"
Authors: Shu-Bin Zhang, Ting-Ting Ji, and Jing-Hao Sun
The need for interaction between humans and machines grows as computer vision technology advances. Hand gesture recognition is widely used in robot control, smart furniture, and other areas because hand motions can convey rich information. In this study, hand gestures are segmented by building a skin-colour model and a Haar-based AdaBoost classifier to account for the specific characteristics of skin colour in hand motions. Hand movements are also denoised, with a single frame of video extracted for analysis. In this way, the human hand is separated from the complex background, and the CamShift algorithm makes it possible to track hand gestures in real time. A convolutional neural network then recognises the detected hand-motion region in real time to achieve recognition of 10 common digits, with a reported accuracy of 98.3%.

Paper 2: "Personalized and Dynamic Keyboard for Eye Tracker Typing"
Authors: Kadir Akdeniz and Zehra Çataltepe
Stroke and amyotrophic lateral sclerosis (ALS) patients are often unable to speak or convey their daily needs. Since they can still move their heads and use their eyes, they can communicate via eye trackers. This study offers fresh ideas for improving the speed and usability of eye-tracking software. First, letter prediction is used to increase speed; second, a novel design eliminates the need for blinking while using eye trackers, allowing for more comfortable and extended writing sessions.
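The letter-prediction idea described in Paper 2 can be sketched in a few lines. This is a hypothetical illustration, not code from the cited paper: the word list, ranking rule, and function name are all assumptions.

```python
from collections import Counter

# Hypothetical sketch of prefix-based letter prediction: rank the letters
# that can follow the typed prefix by how many vocabulary words continue
# with them. The vocabulary below is an illustrative assumption.
VOCAB = ["hello", "help", "held", "hero", "herd", "house"]

def next_letters(prefix, vocab=VOCAB):
    """Rank candidate next letters for `prefix`, most likely first."""
    counts = Counter(
        w[len(prefix)] for w in vocab
        if w.startswith(prefix) and len(w) > len(prefix)
    )
    return [ch for ch, _ in counts.most_common()]

print(next_letters("he"))  # 'l' ranks first (hello, help, held), then 'r'
```

An eye-tracking keyboard could enlarge or highlight the top-ranked keys, reducing the dwell time needed to select each letter.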

© 2022, IJSREM | www.ijsrem.com DOI: 10.55041/IJSREM16986 | Page 1



Paper 3: "Visual gesture decoding algorithm for an assisted virtual keyboard"
Authors: Rafael Augusto Da Silva, IEEE Member, and Antonio Claudio Paschoarelli Veiga, IEEE Member
One of the most common computer tasks is creating text, a simple task that can be difficult for those with severe neuromotor illnesses such as amyotrophic lateral sclerosis, which can cause locked-in syndrome. Since these people may only be able to communicate and engage with the outside world through eye movements, they require augmentative and alternative communication tools. This study explores eye-movement-based interaction techniques and introduces a virtual keyboard that accepts text input via gaze detection.

Paper 4: "Virtual Mouse Control Using Colored Finger Tips and Hand Gesture Recognition"
Authors: Galla Vamsi Krishna, Thumma Dhyanchand, and Vantukala VishnuTeja Reddy
This study in human-computer interaction uses a virtual mouse based on fingertip recognition and hand-motion tracking in live video. It proposes fingertip identification and hand-motion detection for virtual mouse control; the two fingertip-tracking techniques used are hand gesture detection and coloured caps.

III. SYSTEM DESIGN

Fig. 1 System Architecture

Module 1: GUI: We designed our GUI in Tkinter. Tkinter is a Python binding to the Tk GUI toolkit; it is the standard Python interface to Tk and Python's de facto standard GUI.

Module 2: Registration/Login System: Users need to register before using the application. User data is stored in the database and fetched when the user logs in; only registered users can log in to the system.

Module 3: Database Module: A database stores the users' data. We used the SQLite database.

Module 5: Mouse Functioning: The system is a mouse-like eye-based interface that converts eye movements such as blinking, staring, and squinting into mouse cursor actions. The software requirements for this technique include Python, OpenCV, NumPy, a Haar cascade face-detection algorithm, and a basic camera. Python was used to design the mouse system, and the following Python modules were imported to make it operate. NumPy is a Python extension module that offers fast and efficient operations on arrays of homogeneous data. SciPy is an open-source Python library used in technical and scientific computing. OpenCV is a library of programming tools with a primary focus on real-time computer vision. PyAutoGUI is a cross-platform Python GUI automation module; it allows you to automate computer tasks by controlling the mouse and keyboard and performing simple image recognition. The device uses the webcam to identify a person's pupil, so the user can move the mouse cursor by moving their eyes; the cursor movement is visible on the computer's home screen. The machine first finds the face; after the system identifies and captures the eyes, it locates the pupils. Moving the mouse cursor by tracking pupil movements is described in the final module.

We followed these steps to type on the virtual keyboard using a fingertip:
Step 1: Use the computer's webcam to record live video.
Step 2: Process every image frame from the recorded video.
Step 3: Convert the image frame.
Step 4: Display the virtual keyboard.
Step 5: Recognise hand gestures using hand landmarks.
Step 6: Find the fingertip's position over the virtual keyboard and trigger the input.
Step 7: Print the character identified by the convolutional neural network.

Module 6: Keyboard Functioning: Keyboard functioning is managed by gesture methods. We use the forefinger and middle finger for gestures, locating coordinates at the top, middle, and base of each finger; as a finger moves, the corresponding keyboard action is issued.
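Locating the fingertip over the virtual keyboard (Step 6 above) reduces to a simple hit test once a fingertip pixel coordinate is available. The sketch below is an assumption about how such a mapping could work; the key size, layout, and function name are not taken from the paper.

```python
# Hypothetical hit test: map a fingertip pixel coordinate to the key it
# hovers over on an assumed on-screen keyboard grid.
KEY_W, KEY_H = 60, 60  # assumed key size in pixels
ROWS = ["QWERTYUIOP", "ASDFGHJKL", "ZXCVBNM"]

def key_at(x, y, origin=(0, 0)):
    """Return the key under pixel (x, y), or None if outside the grid."""
    col = (x - origin[0]) // KEY_W
    row = (y - origin[1]) // KEY_H
    if 0 <= row < len(ROWS) and 0 <= col < len(ROWS[row]):
        return ROWS[row][col]
    return None

print(key_at(5, 5))    # 'Q': first row, first key
print(key_at(65, 65))  # 'S': second row, second key
```

In the full system, the (x, y) coordinate would come from the detected forefinger landmark, and the returned character would be sent to the operating system with a module such as PyAutoGUI.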

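The skin segmentation used to isolate the hand's colour from the background (discussed in the conclusion below) amounts to thresholding pixels in a colour space. The sketch below is an illustrative assumption: the HSV ranges and function name are not taken from the paper.

```python
# Hypothetical skin segmentation by HSV thresholding: mark a pixel as
# skin when its hue and saturation fall inside assumed skin-tone ranges.
def skin_mask(hsv_pixels, h_range=(0, 20), s_range=(40, 255)):
    """Return a binary mask (1 = skin) for an iterable of (h, s, v) pixels."""
    return [
        1 if h_range[0] <= h <= h_range[1] and s_range[0] <= s <= s_range[1] else 0
        for (h, s, v) in hsv_pixels
    ]

pixels = [(10, 120, 200), (100, 60, 180), (15, 50, 90)]
print(skin_mask(pixels))  # → [1, 0, 1]
```

In practice the same thresholding is applied to whole frames with OpenCV's cv2.inRange, typically followed by morphological clean-up before contour extraction.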

CONCLUSION
This proposal suggests a system that recognises hand gestures and takes the place of the keyboard and mouse. This covers mouse cursor movement, drag-and-click actions, and other keyboard functionality such as printing letters. The skin segmentation technique is used to isolate the hand's colour and image from the background, and the problem of the full body being captured by the camera can be resolved using the arm-removal technique. The suggested method can generally detect and understand hand gestures, allowing it to control keyboard and mouse functions and provide a real-world user interface, with further applications in 3D printing, architectural rendering, and even performing medical procedures remotely. This project is simple to construct, and it has a wide range of potential applications in the field of medicine, where computing is necessary but has not yet been fully realised due to a lack of human-computer connection.

FUTURE WORK
The technique already works well for fundamental pointing and pinching; nonetheless, there is still much that can be improved. Currently the system's background is static, but deploying this hand-tracking device in an augmented reality environment, where a user interacts with virtual 3D objects through a head-mounted display, would be highly advantageous. Since an additional layer of capture capability is needed in this case, multiple camera angles are required to record the hand motions.

REFERENCES
1. S. Sadhana Rao, "Sixth Sense Technology," Proceedings of the International Conference on Communication and Computational Intelligence, 2010, pp. 336-339.
2. P. M. Game, A. R. Mahajan, "A gestural user interface to interact with computer system," International Journal on Science and Technology (IJSAT), Vol. II, Issue I, Jan.-Mar. 2011, pp. 018-027.
3. International Journal of Latest Trends in Engineering and Technology, Vol. 7, Issue 4, pp. 055-062.
4. Imperial Journal of Interdisciplinary Research (IJIR), Vol. 3, Issue 4, 2017.
5. A. Christy, S. Vaithyasubramanian, V. A. Mary, J. Naveen Renold, "Artificial intelligence based automatic decelerating vehicle control system to avoid misfortunes," International Journal of Advanced Trends in Computer Science and Engineering, Vol. 8, Issue 6, 2019, pp. 3129-3134.
6. G. M. Gandhi and Salvi, "Artificial Intelligence Integrated Blockchain For Training Autonomous Cars," 2019 Fifth International Conference on Science Technology Engineering and Mathematics (ICONSTEM), Chennai, India, 2019, pp. 157-161.
7. I. Grishchenko, V. Bazarevsky, "MediaPipe Holistic: Simultaneous Face, Hand and Pose Prediction on Device," Google Research, USA, 2020, accessed 2021.
8. Indriani et al., "Applying Hand Gesture Recognition for User Guide Application Using MediaPipe," 2021.
9. Dipankar Gupta et al., "An Interactive Computer System with Gesture-Based Mouse and Keyboard," 2020.
10. Kabid Hassab Shibly et al., "Design and Development of Hand Gesture Based Virtual Mouse," 2019.
11. Jagannathan MJ et al., "Finger recognition and gesture based augmented keyboard," 2018.
12. G. Sahu, S. Mittal, "Controlling mouse pointer using web cam," 2016.
13. Vidya Itsna Saraswati, "Eye gaze system to operate virtual keyboard," 2016.
14. Rakesh D. Desale et al., "A Study on Wearable Gesture Interface," 201.
15. S. S. Rautaray, A. Agrawal, "Real Time Gesture Recognition System for Interaction in Dynamic Environment," Procedia Technology.
16. S. Sadhana Rao, "Sixth Sense Technology," Proceedings of the International Conference on Communication and Computational Intelligence, 2010, pp. 336-339.
17. "International Journal for Research in Applied Science and Engineering Technology" (IJSAT), Vol. II, Issue I, Jan.-Mar. 2011, pp. 018-027.
18. Imperial Journal of Interdisciplinary Research (IJIR), Vol. 3, Issue 4, 2017.
19. "Face photo recognition using sketch image for security system," Vol. 8, Issue 9S2, July 2019.
20. A. Christy, S. Vaithyasubramanian, V. A. Mary, J. Naveen Renold, "Artificial intelligence based automatic decelerating vehicle control system to avoid misfortunes," International Journal of Advanced Trends in Computer Science and Engineering, Vol. 8, Issue 6, 2019, pp. 3129-3134.
