Abstract: The population of impaired individuals grows each year, most likely for two reasons: some people are born with a disability, while others acquire one through serious accidents or illness. Many paralysed people cannot speak, even though speech is central to everyday life. Although they cannot speak or use their hands to communicate, they may still have good vision and retain control over their eye movements. We therefore propose an affordable eye-gaze communication system based on eye movement. The Eye Gaze project aims to let a person operate a computer interface through eye gaze alone. The phrase "eye gaze communication" describes a computer system that tracks a person's eye movements; the underlying method is eye tracking.
I. INTRODUCTION
The ability to communicate is one of a living being's most crucial abilities, and life becomes far more challenging without it. People with paralysis or motor neuron disease often cannot communicate the way most of us do. Paralysed patients who have lost sensation cannot move their limbs at all. Many also lose the ability to speak, which makes it difficult for them to express even their basic needs and leaves them dependent on others for everything. For some disabled people, eye movement is the only voluntary motion they can still control. An estimated 150,000 people with severe disabilities can reliably control only their eye muscles. For them, an eye-tracking device offers an alternative means of interaction. We can therefore build a system that monitors a disabled person's eye movement for communication purposes.
To facilitate such communication, we propose a low-cost eye gaze communication system. Using a variety of image processing techniques, the position of the iris is used to track the movement of the eyeball. Once eye tracking is established, the eye movements useful for communication can be integrated into a graphical user interface. Normally the eyes are used for observation rather than control, and eye movements occur naturally and very quickly. By moving their eyes, users of an eye gaze communication system can control the system through gaze tracking.
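As a rough illustration of this iris-based tracking step, the Python sketch below locates the iris centre in a single cropped eye image using OpenCV's Hough circle transform. The file name, blur kernel and Hough parameters are illustrative assumptions, not values taken from this work.

# Minimal sketch: locating the iris centre in a cropped eye image with OpenCV.
# The file name, blur size and Hough parameters are illustrative assumptions.
import cv2
import numpy as np

def find_iris_center(eye_image_path="eye.jpg"):
    gray = cv2.imread(eye_image_path, cv2.IMREAD_GRAYSCALE)
    if gray is None:
        raise FileNotFoundError(eye_image_path)

    # Smooth the image so spurious edges do not confuse the circle detector.
    blurred = cv2.medianBlur(gray, 7)

    # The iris/pupil boundary is usually the strongest dark circle
    # in a cropped eye image.
    circles = cv2.HoughCircles(blurred, cv2.HOUGH_GRADIENT, 1, 50,
                               param1=100, param2=30,
                               minRadius=10, maxRadius=60)
    if circles is None:
        return None

    x, y, r = np.round(circles[0, 0]).astype(int)
    return (x, y), r   # iris centre (pixels) and approximate radius

if __name__ == "__main__":
    print("Iris centre and radius:", find_iris_center())

In practice the detected centre would be tracked across video frames and passed on to the gaze-estimation stage described in the methodology.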
II. LITERATURE SURVEY
Kyung-Nam Kim et al. [1] explored a range of techniques for estimating and monitoring eye movement, including eyelid tracking, occluded circular edge matching, and longest line scanning. Computer vision and image processing techniques are used to measure eye gaze. Two approaches to gaze estimation are compared: geometry-based estimation and adaptive-based estimation, with geometry-based estimation performing better. To obtain the gaze, the relationship between the face model and the camera image point is determined, and the tracking method is non-intrusive. Gaze-tracking systems at 3x3, 4x5, and 8x10 screen resolutions estimate the gazing point using both adaptive-based and geometry-based estimation, and at an 8x10 screen resolution the techniques appear fairly effective. The authors also studied non-intrusive, vision-based tracking and measurement of eye movements.
Manu Kumar [2] covered a range of methods for using eye movement data in everyday tasks. The primary goal of that research is to provide flexible alternatives that users can adopt according to their physical capabilities or preferences, rather than to replace present methodologies. The author also argues for the use of eye gaze in interaction.
Linda Sibert and others compared the effectiveness of eye gaze interaction with a common, general-purpose input device, the mouse, to illustrate the advantages of human eye gaze as a computer input. Two experiments comparing eye gaze interaction with the commonly used mouse are described; in each, the time needed to complete simple computer tasks with the mouse and with eye gaze is measured. In the first experiment, respondents had to quickly select a highlighted circle from a three-by-four grid of circles.
III. METHODOLOGY
A customized video camera situated underneath the monitor watches one of the user's eyes as they sit in front of the eye-gaze monitor. The Eye-Gaze System's computer uses sophisticated image-processing software to continuously analyze the video image of the user's eye and pinpoint where they are looking on the screen.
The Eye-Gaze System uses the pupil-center/corneal-reflection approach to determine where the user is gazing on the screen. To capture images of the user's eye, a 60-frame-per-second infrared-sensitive video camera is placed beneath the system's monitor. The eye is illuminated by a low-power infrared light-emitting diode (LED) positioned in the center of the camera's lens. A small amount of light is reflected off the cornea of the eye by the LED. Additional light enters the pupil, reflects off the retina at the back of the eye, and gives the pupil a bright appearance.
This bright-pupil effect improves the camera's depiction of the pupil and makes it easier to locate the pupil's center during image processing. Based on the relative positions of the pupil center and the corneal reflection within the video image of the eye, the computer determines the user's gaze point, that is, the coordinates of where they are looking on the screen.
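The following Python sketch illustrates the pupil-centre/corneal-reflection idea described above: the vector between the pupil centre and the corneal glint is mapped to screen coordinates through a polynomial fitted during calibration. The feature set, the handful of calibration samples and the coordinate values are assumptions made for illustration; a real system would calibrate on more targets (for example a 3x3 grid).

# Sketch of the pupil-centre/corneal-reflection idea: the gaze point on the
# screen is estimated from the vector between pupil centre and corneal glint.
# The polynomial mapping and the calibration samples are illustrative
# assumptions, not the paper's actual calibration.
import numpy as np

def features(pupil, glint):
    """Pupil-glint difference vector plus simple polynomial terms."""
    dx, dy = pupil[0] - glint[0], pupil[1] - glint[1]
    return np.array([1.0, dx, dy, dx * dy, dx * dx, dy * dy])

def calibrate(pupils, glints, screen_points):
    """Least-squares fit mapping pupil-glint vectors to screen coordinates."""
    A = np.array([features(p, g) for p, g in zip(pupils, glints)])
    B = np.array(screen_points, dtype=float)
    coeffs, *_ = np.linalg.lstsq(A, B, rcond=None)
    return coeffs                      # shape (6, 2)

def gaze_point(pupil, glint, coeffs):
    return features(pupil, glint) @ coeffs   # estimated (x, y) on the screen

# Hypothetical calibration: the user fixates a few known screen targets.
pupils  = [(310, 240), (350, 242), (312, 270), (352, 268)]
glints  = [(300, 250), (300, 250), (300, 250), (300, 250)]
targets = [(100, 100), (1800, 100), (100, 980), (1800, 980)]

coeffs = calibrate(pupils, glints, targets)
print("Estimated gaze point:", gaze_point((330, 255), (300, 250), coeffs))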
3.1 Electro-oculography
With this technique, sensors are affixed to the skin close to the eye to detect the electric field created as the eyes rotate. The position of the eye can be determined by recording the minute variations in the skin potential around it. By positioning the electrodes precisely, both horizontal and vertical movements can be recorded. The procedure is quick and simple but intrusive: it requires close contact between the electrodes and the skin, which makes it unsuitable for everyday use. It also requires careful electrode alignment and is prone to data artifacts or errors caused by head motion or by electrical noise from other sources.
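As a simple illustration of the electro-oculography idea, the sketch below converts horizontal and vertical electrode voltages into approximate gaze angles. The gain and offset constants are illustrative assumptions that would normally come from a per-user calibration.

# Sketch of the electro-oculography (EOG) idea: the corneo-retinal potential
# measured by skin electrodes changes roughly linearly with gaze angle over a
# limited range.  The gain and offsets are illustrative assumptions from a
# hypothetical calibration, not values from the paper.

def eog_to_gaze_angle(v_horizontal_uV, v_vertical_uV,
                      gain_uV_per_deg=16.0, h_offset_uV=0.0, v_offset_uV=0.0):
    """Convert EOG channel voltages (microvolts) to approximate gaze angles (degrees)."""
    azimuth   = (v_horizontal_uV - h_offset_uV) / gain_uV_per_deg
    elevation = (v_vertical_uV   - v_offset_uV) / gain_uV_per_deg
    return azimuth, elevation

# Example: +160 uV on the horizontal channel corresponds to about 10 degrees.
print(eog_to_gaze_angle(160.0, -80.0))   # -> (10.0, -5.0)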
3.2 Infrared Oculography
Infrared oculography is an approach to measuring eye movements with infrared light. Tiny infrared cameras or sensors are positioned near the eyes to monitor how they move in response to various stimuli or objects. Because infrared light is invisible to the human eye, it does not disturb normal eye movement and causes no discomfort. One benefit of infrared oculography is its high spatial and temporal resolution, which enables accurate detection of eye movements and rapid data collection. It is non-invasive and can be used to study eye movements in a variety of populations, including infants, children, and people with specific disabilities.
Infrared oculography does have limitations, however, including the need for exact calibration and alignment of the cameras or sensors and the possibility of artifacts or errors caused by head movements or by eye movements that fall outside the camera's or sensor's field of view.
SPECIFICATIONS
This device is dependable and very simple to calibrate. The system specifically accounts for several common causes of gaze-point tracking inaccuracy. The subject's eye is observed remotely and discreetly by a video camera hidden beneath the computer screen; nothing needs to be attached to the head. A tiny light-emitting diode (LED) in the center of the camera lens illuminates the eye. The strong pupil image and corneal reflection produced by the LED improve the camera's ability to capture the pupil.
THE TYPEWRITER PROGRAM
The Typewriter application can be used for straightforward word processing. The user types by glancing at keys on visual keyboards; four keyboard layouts are available, from basic to complex. Typed text appears on the screen above the keyboard display. The user can either print what they have typed or have the system "say" it, and can also save the text to a file for later retrieval. Retrieved text can be spoken, edited, or printed.
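A toy sketch of this dwell-to-type interaction is shown below using Tkinter. The mouse pointer stands in for the gaze point, since the real coordinates would come from the eye-gaze tracker; the layout and the one-second dwell time are illustrative assumptions rather than the Typewriter program's actual settings.

# Toy sketch of a dwell-to-type on-screen keyboard in Tkinter.  The mouse
# pointer stands in for the gaze point; in the real system the coordinates
# would come from the eye-gaze tracker.  Layout and dwell time are assumed.
import tkinter as tk

DWELL_MS = 1000  # how long the gaze must rest on a key to select it

class TypewriterDemo:
    def __init__(self, root):
        self.root = root
        self.text = tk.StringVar(value="")
        self.pending = {}  # key label -> scheduled dwell callback id
        tk.Label(root, textvariable=self.text, font=("Arial", 18),
                 anchor="w").grid(row=0, column=0, columnspan=10, sticky="we")
        for i, ch in enumerate("ABCDEFGHIJKLMNOPQRST"):
            btn = tk.Label(root, text=ch, font=("Arial", 16),
                           relief="raised", width=4, height=2)
            btn.grid(row=1 + i // 10, column=i % 10)
            btn.bind("<Enter>", lambda e, c=ch, b=btn: self.start_dwell(c, b))
            btn.bind("<Leave>", lambda e, c=ch, b=btn: self.cancel_dwell(c, b))

    def start_dwell(self, ch, btn):
        btn.configure(bg="yellow")  # highlight the key being "looked at"
        self.pending[ch] = self.root.after(DWELL_MS, lambda: self.select(ch, btn))

    def cancel_dwell(self, ch, btn):
        btn.configure(bg=btn.master.cget("bg"))
        if ch in self.pending:
            self.root.after_cancel(self.pending.pop(ch))

    def select(self, ch, btn):
        self.text.set(self.text.get() + ch)  # append the dwelled-on character
        btn.configure(bg="lightgreen")
        self.pending.pop(ch, None)

if __name__ == "__main__":
    root = tk.Tk()
    root.title("Dwell typewriter sketch")
    TypewriterDemo(root)
    root.mainloop()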
IV. IMPLEMENTATION
4.1 IR Sensor
It uses infrared light to detect a person within its field of view; when a person is detected, the camera module is activated to take pictures.
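A possible wiring of this trigger on a Raspberry Pi is sketched below. The GPIO pin number, polling interval, output path and the use of the legacy picamera library are assumptions for illustration, not details taken from the implementation.

# Sketch of the IR-sensor trigger: when the sensor output goes high, the
# Raspberry Pi camera captures a frame for the gaze-tracking pipeline.
# The GPIO pin, polling interval and output path are assumptions, and the
# legacy `picamera` library is assumed to be available.
import time
import RPi.GPIO as GPIO
from picamera import PiCamera

IR_PIN = 17  # assumed BCM pin wired to the IR sensor's digital output

GPIO.setmode(GPIO.BCM)
GPIO.setup(IR_PIN, GPIO.IN)
camera = PiCamera()

try:
    while True:
        if GPIO.input(IR_PIN):           # subject detected in front of the unit
            filename = "/tmp/eye_%d.jpg" % int(time.time())
            camera.capture(filename)     # frame handed to the iris detector
            print("Captured", filename)
            time.sleep(1.0)              # simple debounce between captures
        time.sleep(0.1)
finally:
    GPIO.cleanup()
    camera.close()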
4.3 Monitor
The monitor is interfaced with the HDMI port of the Raspberry Pi and is used to display the Graphical User Interface. It is set up in front of the subject and shows soft keys that the subject can navigate with iris movement, assisting them in communicating.
Fig-7: Block diagram for integration of Iris Detection with Graphical User Interface
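One plausible sketch of the integration step in Fig-7 is shown below: the iris centre reported by the detector is normalised against the eye's bounding box and classified into a direction, which the GUI can use to move the highlight between soft keys. The dead-zone threshold is an illustrative assumption.

# Sketch of the Fig-7 integration step: the iris centre from the detector is
# normalised against the eye's bounding box and classified into a direction,
# which the GUI uses to move the highlight between soft keys.  The 0.15
# dead-zone threshold is an illustrative assumption.

def gaze_direction(iris_center, eye_box, dead_zone=0.15):
    """eye_box = (x, y, width, height) of the detected eye region."""
    ex, ey, ew, eh = eye_box
    # Offset of the iris from the eye-region centre, scaled by the box size.
    dx = (iris_center[0] - (ex + ew / 2.0)) / ew
    dy = (iris_center[1] - (ey + eh / 2.0)) / eh
    if abs(dx) < dead_zone and abs(dy) < dead_zone:
        return "CENTER"                      # dwell / select
    if abs(dx) >= abs(dy):
        return "RIGHT" if dx > 0 else "LEFT"
    return "DOWN" if dy > 0 else "UP"

# Example: iris sitting well to the left of the eye-region centre.
print(gaze_direction(iris_center=(110, 60), eye_box=(100, 40, 80, 40)))  # LEFT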
VI. CONCLUSION
The human eye gaze can now be recorded using standard methods thanks to modern technology. Eye-gaze-based user interfaces are attractive because they offer a potential window into cognitive processes and enable communication with the eyes, which is faster than any other form of human communication. As the quality and accuracy of eye-tracking systems increase and their price decreases, they will attract more users. Eye movement should nevertheless be used with caution, because it involves both voluntary and involuntary cognitive processes in the human brain. The application's primary goal is to reduce the physical strain patients endure when trying to communicate, resulting in a practical and modern method of communication for patients with diverse abilities. The eye-gaze system presented here for patients with paralysis could also incorporate emerging technologies such as brain-computer interfaces (BCIs), virtual reality (VR), artificial intelligence (AI), and wireless communication. These developments could considerably improve the usability and functionality of eye-gaze technology, enabling paralyzed people to engage with their surroundings and communicate with others in a more natural and intuitive way. As the technology matures, eye-gaze systems are expected to grow and become more widely used, opening up new possibilities for people with impairments to lead more active, independent lives.
REFERENCES
[1] Kyung-Nam Kim, R. S. Ramakrishna, "Vision-Based Eye-Gaze Tracking for Human Computer Interaction", Department of Information and Communication, Kwangju Institute of Science and Technology, Kwangju, 300-712, Korea (ROK).
[2] Manu Kumar, "Gaze-Enhanced User Interface Design", Dissertation submitted to the Department of Computer Science and the Committee on Graduate Studies, Stanford University, 2007.
[3] https://en.m.wikipedia.org/wiki/Eye_tracking
[4] https://tech.tobii.com/technology/what-is-eye-tracking/
[6] Kristie Huda, Mohiuddin Ahmad and Shazzad Hossain, "Command the Computer with Your Eye: An Electro-ocular-Based Approach", International Conference on Software, Knowledge, Information, Industrial Management and Applications (SKIMA), 2015.
[7] Alex Poole and Linden J. Ball, "Eye Tracking in Human-Computer Interaction and Usability Research: Current Status and Future Prospects", Psychology Department, Lancaster University, UK.