International Conference on Communication and Signal Processing, July 28 - 30, 2020, India

Eyeball based Cursor Movement Control


Sivasangari.A, Deepa.D, Anandhi.T, Anitha Ponraj and Roobini.M.S

Abstract—A human-computer interaction system is introduced. Traditionally, the mouse and keyboard have been the input devices of human-computer interaction, and people suffering from certain diseases or illnesses are unable to operate computers in this way. The idea of controlling the computer with the eyes is of great use to handicapped and disabled persons, and this type of control also eliminates the help otherwise required from another person to handle the computer. It is most useful for people without hands, who can operate the computer through their eye movements. The movement of the cursor is directly associated with the center of the pupil, so our first step is to detect the center point of the pupil. Pupil detection is implemented using a Raspberry Pi and OpenCV. The Raspberry Pi has an SD/MMC card slot that holds the SD card, which is flashed with the operating system required to start up the Raspberry Pi. The Raspberry Pi executes the application program once it is loaded.

Index Terms—Human Computer Interaction (HCI), Eyeball movement, Computer, OpenCV, Support Vector Machine.

Sivasangari.A, Deepa.D, Anandhi.T, Anitha Ponraj and Roobini.M.S are with the Sathyabama Institute of Science and Technology, Chennai (email: sivasangarikavya@gmail.com, deepa21me@gmail.com, anandhi.cse@sathyabama.ac.in, anisainosoft@gmail.com, roobinims@gmail.com).

I. INTRODUCTION

As computer technologies grow rapidly, the importance of human-computer interaction becomes highly notable. Some persons with disabilities are unable to use computers, and eyeball movement control is aimed mainly at them: incorporating an eye-controlling system into the computer lets them work without the help of another individual. Human-Computer Interface (HCI) research focuses on the use of computer technology to provide an interface between the computer and the human. A suitable technology is needed to make communication between human and computer effective, and human-computer interaction plays the important role here. There is thus a need for a method that offers an alternate way of communicating with the computer to individuals who have impairments, giving them an equivalent place in the Information Society [1-5].

In recent years, human-computer interfaces have been attracting the attention of researchers across the globe. The human-computer interface presented here is a vision-based system for eye movement detection for disabled people.

In the proposed system, we include face detection, face tracking, eye detection and the interpretation of a sequence of eye blinks in real time to control a non-intrusive human-computer interface. The conventional method of interacting with the computer through the mouse is replaced by human eye movements. This technique helps paralyzed and physically challenged people, especially people without hands, to use the computer efficiently and with ease. First, the camera captures an image and focuses on the eye in it, using OpenCV code for pupil detection. This yields the center position of the human eye (the pupil). The center position of the pupil is then taken as a reference, and based on it the user controls the cursor by moving the eye left and right [6-9].

The rest of this paper is organized as follows. Section II describes existing solutions that derive cursor movement from 3D models. Section III presents how the cursor is controlled using only eyeball movement with the OpenCV methodology. Section IV shows how the cursor moves with the eyeball, with examples and the obtained results. The conclusion is presented in Section V.

II. RELATED WORK

The basic actions of a mouse are the mouse click and mouse movement. Recent technology replaces mouse movement with eye motion with the help of OpenCV, and the mouse button click is implemented by a facial expression such as blinking the eyes, opening the mouth or moving the head. One such model introduces a novel camera mouse driven by a 3D-model-based face tracking technique. On a personal computer (PC) with a standard configuration it achieves human-machine interaction through fast visual face tracking and provides a feasible solution for hands-free control. The face tracker used there is based on a 3D model to control the mouse and carry out mouse operations.

Gaze estimation can be used in head-mounted display (HMD) environments, since it affords important natural computer-interface cues. One such gaze estimation method is based on a 3D analysis of the human eye, and various commercial products use gaze detection technology. In this method the user has to look at only one point for calibration, after which the gaze points are estimated. Facial features such as the eyes and the nose tip are recognized and tracked so that the human face replaces traditional mouse movements for interaction with the computer. The method can be applied to faces over a wide range of scales.



A Six-Segmented Rectangular (SSR) filter and a support vector machine are used for fast extraction of face candidates and for face verification, respectively; this forms the basic detection strategy. Using Java (J2ME) for face candidate detection, a scale-adaptive face detection and tracking system is implemented that performs left/right mouse click events when the left/right eye blinks. The camera mouse has been used by disabled people to interact with the computer [1].

The camera mouse replaces all the roles of traditional mouse and keyboard actions: the system can generate all mouse click events and keyboard functions. In this method, the camera mouse system together with a timer acts as the left-click event, and blinking acts as the right-click event. A real-time eye-gaze estimation system is used as an eye-controlled mouse for assisting the disabled. It follows a methodology in which a general low-resolution webcam detects the eyes and tracks gaze accurately, at low cost and without special equipment. A PIR sensor is used specifically for detecting human movement. Another work introduces a camera mouse driven by visual face tracking based on a 3D model; the camera has a standard configuration for PCs, offers increased computation speed, and provides a feasible solution for hands-free control through visual face tracking. Human facial motions can be classified as rigid and non-rigid: the rigid motions are rotation and translation, whereas the non-rigid motions are opening, closing and stretching of the mouth.

In [2], a virtual eyeball model based on the 3D characteristics of the human eyeball is used first. Second, the 3D position of the virtual eyeball and the gaze vector are calculated using a camera and three collimated IR-LEDs. Third, the 3D eye position and the gaze position on an HMD monitor are computed. This simplifies the complex 3D coordinate conversions among the three reference frames (the camera, the monitor and the eye reference frames). Fourth, a simple user-dependent calibration method based on kappa compensation is proposed, requiring the user to gaze at only one position.

In our work, we try to meet the needs of people who have hand disabilities and cannot use computer resources without another individual's help. Our application mainly uses facial features to interact with the computer, so there is no need for hands to operate the mouse [3].

Paralysis is a special case in which muscle function is lost in part of the body. It happens when something goes wrong with the way messages pass between the brain and the muscles. When this occurs, the person's ability to control movement may be limited to the muscles around the eyes [4]. Blinking and eye movement are then the only means of communication. The assistance offered for such communication impairments is often intrusive, requiring special hardware or a device. The alternative is a non-intrusive communication system such as EyeKeys, which works without special lighting: the eye direction is detected when the person looks at the camera and can be used to control various applications [5].

III. PROPOSED WORK

The proposed system uses a Raspberry Pi 3 board attached to a monitor, a PIR sensor and a camera; these components are connected through USB adaptors. The Raspberry Pi plays the central role in the working module, tracking eye movement together with the sensors. It uses an SD card on which the Raspbian OS is installed along with the program code. The PIR sensor is used to detect human movement.

A. Face Detection

Face detection is the computer technology, used in a variety of applications, that identifies human faces in digital images. The proposed method detects features from the face, and a simple face tracking system was developed. Face images can be analyzed without requiring any interaction from the user. Facial recognition can also serve as an important measure for tracking attendance and time information, and the human face provides information that can be used in many applications such as emotion recognition and human-computer interfaces. The local binary pattern (LBP) algorithm is used for feature extraction: a 3×3 pixel neighbourhood is taken from the web-camera image, and an encoding operation transforms the pixel values into the binary values 0 or 1. The face image is divided into N blocks. The thresholding function is described below:

s(x) = 1 if x >= 0, otherwise s(x) = 0    (1)

Weight values are calculated for every neighbour and used to compute the LBP value.
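The LBP feature extraction described above can be sketched in a few lines of Python with NumPy. This is a minimal illustration rather than the authors' code: the neighbour ordering and the block count are assumptions, and in practice a library routine such as scikit-image's local_binary_pattern could be used instead.

import numpy as np

def lbp_image(gray):
    """Compute 3x3 local binary pattern codes for a grayscale image (2-D uint8 array)."""
    g = gray.astype(np.int32)
    center = g[1:-1, 1:-1]
    # Offsets of the 8 neighbours, ordered clockwise from the top-left pixel.
    offsets = [(-1, -1), (-1, 0), (-1, 1), (0, 1), (1, 1), (1, 0), (1, -1), (0, -1)]
    codes = np.zeros_like(center)
    for p, (dy, dx) in enumerate(offsets):
        neighbour = g[1 + dy : g.shape[0] - 1 + dy, 1 + dx : g.shape[1] - 1 + dx]
        # Thresholding function (1): 1 if neighbour - center >= 0, else 0,
        # weighted by 2^p for neighbour p.
        codes += (neighbour >= center).astype(np.int32) << p
    return codes

def lbp_features(gray, blocks=4):
    """Divide the LBP image into blocks x blocks regions and concatenate their histograms."""
    codes = lbp_image(gray)
    h, w = codes.shape
    feats = []
    for by in range(blocks):
        for bx in range(blocks):
            block = codes[by * h // blocks:(by + 1) * h // blocks,
                          bx * w // blocks:(bx + 1) * w // blocks]
            hist, _ = np.histogram(block, bins=256, range=(0, 256))
            feats.append(hist / max(block.size, 1))
    return np.concatenate(feats)

The concatenated block histograms form the feature vector for the face (or eye) region that is later passed to the classifier.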
B. Eye Region Detection

The exact position of the pupil is found using vertical and horizontal integral projections, which divide the whole picture into homogeneous subsets. An arbitrary threshold is used in the proposed method. Noise is removed with a Gaussian filter. Strong pixels are selected from the minimum-gradient points, and the lower threshold protects against splitting edges in low-contrast regions. The circular Hough transform is then used to find the inner and outer boundaries: it checks all edge points against candidate center coordinates.
C. Eye Movement Classification

The different eye motions are classified with a support vector machine (SVM) classifier. The eye movements, namely eye open, eye close, eyeball left and eyeball right, are captured by the web camera. An SVM can analyze data for both classification and regression; it is a set of associated supervised learning methods, and a multi-class model is used here. The PIR sensor is connected to a general-purpose I/O (GPIO) port of the Raspberry Pi board; when it detects movement in front of the system, it makes the camera start capturing images. The sensor covers a range of up to 5 cm.
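A minimal sketch of the multi-class SVM training step is shown below. The paper does not name a library, so scikit-learn is assumed here, and the feature vectors are taken to be the LBP histograms (or flattened pixel vectors) extracted in Section III-A.

import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC

# Assumed label set for the four eye movements described in the text.
CLASSES = ["eye_open", "eye_close", "eyeball_left", "eyeball_right"]

def train_eye_movement_classifier(features, labels):
    """Train a multi-class SVM on eye-image feature vectors."""
    X_train, X_test, y_train, y_test = train_test_split(
        np.asarray(features), np.asarray(labels), test_size=0.2, random_state=0)
    clf = SVC(kernel="rbf", C=1.0, decision_function_shape="ovr")  # one-vs-rest multi-class
    clf.fit(X_train, y_train)
    print("held-out accuracy:", clf.score(X_test, y_test))
    return clf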

Once the sensor detects the movement of the eye pupil, the camera starts to capture images and sends them to the Raspberry Pi board through a USB cable. The camera used here is a USB web camera of affordable cost. The transferred image is then processed and the result is monitored. The function of the SD card is to store the Raspbian Jessie operating system and the program. The Raspberry Pi board is programmed in the Python language. The SD card capacity is up to 8 GB.
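A sketch of this sensor-triggered capture loop is given below. It assumes the PIR output is wired to GPIO pin 17 (the paper does not state the pin) and that the USB web camera is available as device 0.

import cv2
import RPi.GPIO as GPIO

PIR_PIN = 17  # assumed GPIO pin for the PIR sensor output

GPIO.setmode(GPIO.BCM)
GPIO.setup(PIR_PIN, GPIO.IN)
camera = cv2.VideoCapture(0)  # USB web camera

try:
    while True:
        # Block until the PIR sensor reports movement in front of the system.
        GPIO.wait_for_edge(PIR_PIN, GPIO.RISING)
        ok, frame = camera.read()
        if ok:
            gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
            # The grayscale frame is handed to the face/eye detection stage of Section III.
            cv2.imwrite("/tmp/last_capture.png", gray)
finally:
    camera.release()
    GPIO.cleanup()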
The monitor input is taken from the HDMI port of the Raspberry Pi board. HDMI (High-Definition Multimedia Interface) is the port used to carry the uncompressed video data from the board to the monitor. The camera is used to capture the movement of the eye: the center of the eye as well as the position of the person is identified in order to recover the various eye movements. The process is implemented on the Raspberry Pi. Fig. 1 shows the flow of the proposed work, in which the pupil reference has the coordinates (x, y): the Raspberry Pi is combined with the USB camera, an SD card is used and Raspbian OS and OpenCV are installed on it, the first image is captured by the USB camera, the eye in the image is located, and the center position of the pupil is detected by the OpenCV code.

Fig. 1. Flow of Proposed Work

The support vector machine algorithm is used for classification. An eye image is denoted as a vector I that represents its pixel values. The training images are divided into two sets, and the task of the SVM is to determine whether new data belongs to the positive or the negative set. Principal component analysis is applied to the training data to reduce its dimensionality.
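A minimal sketch of this training pipeline, assuming scikit-learn (not named in the paper) and flattened eye images as the input vectors, is given below. The number of principal components is a placeholder.

import numpy as np
from sklearn.decomposition import PCA
from sklearn.pipeline import make_pipeline
from sklearn.svm import SVC

def train_eye_verifier(images, labels, n_components=50):
    """Train a two-class SVM (positive vs. negative eye images) on PCA-reduced pixel vectors.

    images: array of shape (n_samples, height * width), each eye image flattened to a vector I.
    labels: array of 0/1 labels for the negative/positive sets.
    """
    model = make_pipeline(
        PCA(n_components=n_components),  # reduce the dimensionality of the pixel vectors
        SVC(kernel="linear"))            # separate the positive and negative sets
    model.fit(np.asarray(images), np.asarray(labels))
    return model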

IV. PERFORMANCE ANALYSIS

After the operating system has booted, operation starts. First, the PIR sensor is used to detect the presence of an individual in front of the system. If a person is detected, the camera is switched on and an image is captured by the USB camera. The focus on the eye in the image is shown in Fig. 2. The center position of the pupil is then detected with OpenCV. The exact position of the pupil is taken as the reference, and different values of the X and Y coordinates relative to this reference are mapped to the corresponding cursor commands.

Fig. 2. Detection of Eye
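The mapping from the pupil reference point to cursor commands can be sketched as follows. The paper does not name a library for moving the host cursor, so pyautogui is used here only as one common option, and the dead-zone and gain values are assumptions to be tuned.

import pyautogui

DEAD_ZONE = 5   # pixels of pupil offset ignored as jitter (assumed)
GAIN = 10       # cursor pixels moved per pixel of pupil offset (assumed)

def move_cursor(pupil_x, pupil_y, ref_x, ref_y):
    """Move the cursor according to the pupil's offset from its reference position."""
    dx, dy = pupil_x - ref_x, pupil_y - ref_y
    if abs(dx) < DEAD_ZONE:
        dx = 0
    if abs(dy) < DEAD_ZONE:
        dy = 0
    if dx or dy:
        pyautogui.moveRel(dx * GAIN, dy * GAIN)

Here (ref_x, ref_y) is the pupil center recorded while the user looks straight at the camera, and (pupil_x, pupil_y) is the center returned by the detection step of Section III-B.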
(2)

Detection error is one of the parameters used to evaluate the accuracy of the proposed work. Images are collected from the BioID database. The proposed method achieves better accuracy than existing methods; Fig. 3 shows the efficiency at different error levels.

Fig. 3. Efficiency Analysis
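For eye-localization evaluations on the BioID database, the detection error is commonly computed as the larger of the two eye-center errors normalized by the true inter-ocular distance. The sketch below assumes that convention; the paper's own definition in (2) may differ.

import numpy as np

def relative_detection_error(pred_left, pred_right, true_left, true_right):
    """Worst-eye localization error normalized by the inter-ocular distance.

    Each argument is an (x, y) eye-center coordinate; a value below 0.25 is the
    threshold conventionally used to count a detection as correct.
    """
    d_left = np.linalg.norm(np.subtract(pred_left, true_left))
    d_right = np.linalg.norm(np.subtract(pred_right, true_right))
    interocular = np.linalg.norm(np.subtract(true_left, true_right))
    return max(d_left, d_right) / interocular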

V. CONCLUSION

The implemented process makes it clear that the cursor can be controlled by eyeball movement, that is, without using the hands on the computer. This is helpful for people whose disability prevents them from using the physical parts of the computer to control the cursor, because the cursor can be operated by moving the eyeballs; disabled people can thus use computers without the help of others. The technology can be enhanced in the future with further techniques, such as clicking events and the full range of mouse movements, and with human-interface systems driven by eye blinks. It can also be extended to combine eyeball movement and eye blinking to obtain efficient and accurate cursor control.

REFERENCES

[1] Jilin Tu, Thomas Huang and Hai Tao, "Face as Mouse through Visual Face Tracking", IEEE, 2005.
[2] Eui Chul Lee and Kang Ryoung Park, "A robust eye gaze tracking method based on a virtual eyeball model", Springer, pp. 319-337, Apr. 2008.
[3] John J. Magee, Margrit Betke, James Gips, Matthew R. Scott and Benjamin N. Waber, "A Human-Computer Interface Using Symmetry Between Eyes to Detect Gaze Direction", IEEE Trans., vol. 38, no. 6, pp. 1248-1259, Nov. 2008.
[4] Sunita Barve, Dhaval Dholakiya, Shashank Gupta and Dhananjay Dhatrak, "Facial Feature Based Method For Real Time Face Detection and Tracking I-CURSOR", International Journal of Engg. Research and App., vol. 2, pp. 1406-1410, Apr. 2012.
[5] Yu-Tzu Lin, Ruei-Yan Lin, Yu-Chih Lin and Greg C. Lee, "Real-time eye-gaze estimation using a low-resolution webcam", Springer, pp. 543-568, Aug. 2012.
[6] Samuel Epstein, Eric Missimer and Margrit Betke, "Using Kernels for a video-based mouse-replacement interface", Springer Link, Nov. 2012.
[7] Zakir Hossain, Md Maruf Hossain Shuvo and Prionjit Sarker, "Hardware and software implementation of real time electrooculogram (EOG) acquisition system to control computer cursor with eyeball movement", in 2017 4th International Conference on Advances in Electrical Engineering (ICAEE), pp. 132-137, IEEE, 2017.
[8] Jun-Seok Lee, Kyung-hwa Yu, Sang-won Leigh, Jin-Yong Chung and Sung-Goo Cho, "Method for controlling device on the basis of eyeball motion, and device therefor", U.S. Patent 9,864,429, issued January 9, 2018.
[9] Po-Lei Lee, Jyun-Jie Sie, Yu-Ju Liu, Chi-Hsun Wu, Ming-Huan Lee, Chih-Hung Shu, Po-Hung Li, Chia-Wei Sun and Kuo-Kai Shyu, "An SSVEP-actuated brain computer interface using phase-tagged flickering sequences: a cursor system", Annals of Biomedical Engineering, vol. 38, no. 7, pp. 2383-2397, 2010.
