INDEX
1. Abstract
2. Aim
3. Scope
4. Introduction
5. History
9. Methodology
Technological Aspects
Conclusion
ABSTRACT
Touch screen displays are deployed everywhere. They offer great flexibility to the end user, but over time a touch screen becomes less responsive, which eventually causes touches to fail. Even when a screen protector is used, smudges remain on the display. To avoid these problems, a simple user interface for touchless control of electrically operated equipment is being developed. This paper addresses the limitations of the touch screen display by proposing a touchless display; it also presents a study of touchless displays, the history of the touch screen, and the working of touchless technology along with its applications.
It was the touch screen that initially created a great breakthrough. Gone are the days when you had to fiddle with a touch screen and end up scratching it. Touch screen displays are ubiquitous worldwide. Frequent touching of a touchscreen display with a pointing device such as a finger, or scratches caused by everyday use, can result in the gradual de-sensitization of the touchscreen to input and can ultimately lead to its malfunction. To avoid this, a simple user interface for touchless control of electrically operated equipment is being developed. Elliptic Labs' innovative technology lets you control gadgets such as computers, MP3 players, or mobile phones without touching them. Unlike other systems, which depend on the distance to the sensor or on sensor selection, this system depends on hand and finger motions: a hand wave in a certain direction, a flick of the hand in one area, holding the hand in one area, or pointing with one finger, for example. The device is based on optical pattern recognition using a solid-state optical matrix sensor with a lens to detect hand motions. This sensor is connected to a digital image processor, which interprets the patterns of motion and outputs the results as signals to control fixtures, appliances, machinery, or any device controllable through electrical signals.
INTRODUCTION
1. The touchless touch screen was developed by TouchKo, White Electronics Designs, and Group 3D. It works by detecting gestures as a form of input and has no need for touching the screen. This is a high-end technology that uses hand waves and hand flicks. The objective behind building such technology is to make it even more comfortable and convenient for users to operate their devices. It does not require touching the screen; instead, the system detects hand movements in front of it with the help of various sensors. The technology looks visually fascinating and has been depicted in several sci-fi movies such as Minority Report and Matrix Revolutions.
The touchless touchscreen technology is divided into the following parts for its working.
a. Movement detection – This is itself a technology that detects a change in the position of an object with respect to its surroundings, as well as a change in the position of the surroundings with respect to the object. Motion detection can be either mechanical or electronic in nature and is carried out by a system called a motion detector, which registers changes in the positions of objects or their surroundings (a minimal frame-differencing sketch is given after this list).
d. Screen pointing – The screen-pointing mechanism allows users to point at items and icons present on the screen without physically touching it. This mechanism interprets several human hand gestures to decide what task the user wants to perform.
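Movement detection of the kind described in item a. is often implemented with simple frame differencing. The sketch below assumes OpenCV, a connected webcam at index 0, and placeholder threshold values; none of these are specified in the report, so treat it only as an illustration of the idea.

```python
import cv2

cap = cv2.VideoCapture(0)                     # assumed: default webcam
ok, prev = cap.read()                         # assumes a camera is connected
prev_gray = cv2.GaussianBlur(cv2.cvtColor(prev, cv2.COLOR_BGR2GRAY), (5, 5), 0)

while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.GaussianBlur(cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY), (5, 5), 0)

    # Pixels that changed between consecutive frames indicate movement.
    diff = cv2.absdiff(prev_gray, gray)
    _, mask = cv2.threshold(diff, 25, 255, cv2.THRESH_BINARY)
    if cv2.countNonZero(mask) > 500:          # assumed sensitivity threshold
        print("movement detected")

    prev_gray = gray
    cv2.imshow("motion", mask)
    if cv2.waitKey(1) & 0xFF == ord('q'):     # press q to quit
        break

cap.release()
cv2.destroyAllWindows()
```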
4. A touch screen is an important input/output device, normally layered on top of an electronic visual display. A user gives input or controls information processing through single- or multi-touch gestures by touching the screen, which lets the user interact directly with what is displayed rather than through any intermediate device. Development of touch screens started in the second half of the 1960s; early work was done at IBM, by the University of Illinois, and in Ottawa, Canada. By 1971, several different techniques had been disclosed. Touch screens have subsequently become familiar in everyday life and are widely used in many facets of society. They can be found in public information displays, the retail sector, restaurant systems, POS terminals, ATMs, computer-based training systems, customer self-service aids, traffic control systems, assistive technology for the disabled, tourism kiosks, GPS systems, phones, tablets, and game consoles, and they continue to appear in newer technologies. Touch screens have reached every industry, every product type, every size, and every application at every price point.
6. Touch technologies are classified into four main types: resistive, capacitive, acoustic, and optical. Each type has its own specific technologies.
CAPACITIVE
In a capacitive system, a layer that stores electric charge, constructed from materials such as copper or indium tin oxide, is used. Sensors at the corners and a protective casing complete the setup. A minute amount of voltage is applied to all four corners of the touch screen.
SAW
Surface acoustic wave systems detect fingers using sound instead of light. Ultrasonic waves, which are too high-pitched for humans to hear, are reflected back and forth across the surface. When the screen is touched, the user interrupts the sound beam and the location of the touch is calculated.
OPTICAL (INFRARED)
Infrared touch screens are the less common and less precise type. They consist of LEDs and light-detecting photocells arranged on opposite sides of the screen. The LEDs shine infrared light in front of the screen, a bit like an invisible spider's web.
(a) Front Panel or Bezel. Bezels are the border between a device's display and its physical frame, and they have become one of the most important considerations in designing electronics. Depending on the device, bezels can be made of glass, metal, or hard plastic, and they can make a device easier to grip without accidentally touching the screen. They can also improve a device's durability and protect glass displays from damage.
Touch screen devices also have a number of disadvantages:
(a) Compared to other devices, touchscreen devices have low battery life; you need to recharge the device often to keep using it.
(b) They usually consume a lot of energy, which can add to your electricity bills.
(c) Most touchscreen devices have an inbuilt battery and other integrated components, so repairs may be costly.
(d) It is almost impossible to use a touchscreen device in direct sunlight, which greatly reduces screen visibility, although brightness and contrast can be adjusted to compensate.
(e) You need to be close to the display to use a touchscreen device, which can be inconvenient at times.
(f) These devices require extra care: the screen, being made of glass, can break or scratch easily, and the functionality of the device can be limited if the screen is cracked or broken.
(g) There is no tactile click when you touch the screen, unless a key tone is set.
(h) Touch screen devices are more expensive than push-button devices.
(i) The user's hand may obscure the screen.
(j) Touchscreen devices usually have no additional keys, so when an application crashes without crashing the OS, the whole screen becomes unresponsive and you cannot get back to the main menu.
Figure: optical sensing pipeline. Photodiodes inside the pixel convert the incoming signal into electric charge, the sensor generates an electric signal, and the signal is processed by a DIP to provide output to the device.
METHODOLOGY
We first obtain a background image (BG) that excludes the operator. When the operator interacts with the projector screen, we obtain the foreground image (FG), which includes the operator's hand. A blur filter with a 5x5 kernel is then applied to BG and FG to reduce noise caused by the camera lens. Since BG and FG are matrices, we perform the following matrix operation to isolate the hand into image B under indoor lighting conditions: B = (BG - FG) x 4. B is then converted into a grey-scale image and thresholded at an intensity level of 100 to obtain a binary image. The largest contour in the resulting image can be considered the operator's hand.
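These steps map directly onto standard image operations. The sketch below assumes OpenCV and NumPy; the 5x5 blur, the factor of 4, and the threshold of 100 come from the text, while the function name and everything else is illustrative.

```python
import cv2
import numpy as np

def isolate_hand(bg_bgr, fg_bgr):
    """Return the contour of the operator's hand, or None if not found."""
    bg = cv2.blur(bg_bgr, (5, 5))              # 5x5 blur to reduce lens noise
    fg = cv2.blur(fg_bgr, (5, 5))

    # B = (BG - FG) x 4, clipped to the valid 8-bit range
    diff = cv2.subtract(bg, fg)                # saturating subtraction
    b = np.clip(diff.astype(np.int32) * 4, 0, 255).astype(np.uint8)

    gray = cv2.cvtColor(b, cv2.COLOR_BGR2GRAY)
    _, binary = cv2.threshold(gray, 100, 255, cv2.THRESH_BINARY)

    contours, _ = cv2.findContours(binary, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    if not contours:
        return None
    return max(contours, key=cv2.contourArea)  # largest contour = the hand
```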
14. Let the scan line on the actual screen be L and the corresponding scan line on the distorted screen be Ld. L is the independent variable, while Ld is the dependent variable. As we move L up the actual screen, we record the position of Ld against the position of L. Ld's position is obtained by an algorithm that performs a simple background subtraction and finds the longest horizontal line on the distorted screen. Once the recording process is complete, we have a data set that reveals the relation between L and Ld.
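A minimal way to obtain Ld's position is sketched below, assuming OpenCV and NumPy. Counting bright pixels per row is used here as a stand-in for "finding the longest horizontal line", and the threshold value is an assumption rather than a figure from the report.

```python
import cv2
import numpy as np

def find_scan_line_y(bg_gray, frame_gray, thresh=50):
    """Return the y-coordinate of the projected scan line on the distorted screen."""
    diff = cv2.absdiff(frame_gray, bg_gray)
    _, binary = cv2.threshold(diff, thresh, 255, cv2.THRESH_BINARY)

    # The row with the most bright pixels approximates the longest
    # horizontal line left by the projected scan line.
    row_counts = np.count_nonzero(binary, axis=1)
    return int(np.argmax(row_counts))          # y-coordinate of Ld
```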
15. The data set is then smoothed by running it through a cosine interpolation algorithm. Given the y-coordinate of a fingertip on the distorted screen, we can now obtain the analogous Y-coordinate on the actual screen using this relation. As Yd typically spans a smaller range than Y, units on Yd have to be mapped to Y on a one-to-many basis. This causes spatial discretization on the virtual touch screen, which leads to a loss of precision.
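As a rough illustration of how the recorded (Ld, L) pairs could be smoothed and queried with cosine interpolation (the report does not give its exact formulation, so the data layout and names below are assumptions):

```python
import math
import bisect

def cosine_interpolate(y0, y1, mu):
    """Smoothly blend y0 -> y1 as mu goes from 0 to 1."""
    mu2 = (1.0 - math.cos(mu * math.pi)) / 2.0
    return y0 * (1.0 - mu2) + y1 * mu2

def ld_to_l(ld, samples):
    """Map a distorted-screen coordinate Ld back to the actual screen.

    samples: list of (ld_i, l_i) pairs recorded during calibration,
    sorted by ld_i.
    """
    lds = [p[0] for p in samples]
    i = bisect.bisect_right(lds, ld)
    if i <= 0:
        return samples[0][1]
    if i >= len(samples):
        return samples[-1][1]
    (ld0, l0), (ld1, l1) = samples[i - 1], samples[i]
    mu = (ld - ld0) / (ld1 - ld0)
    return cosine_interpolate(l0, l1, mu)
```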
17. The boundary data a, b, and Ymax are already learned by the system in the calibration phase. Note that as the level L increases, WL (the relative width of the projector screen at L) decreases; L and WL have an inverse linear relationship and can be written as a linear function of L.
18. The system can then obtain the x-coordinate of the computer screen from WL, where Wc is the width of the computer screen.
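The report omits the two formulas themselves, so the following is only a plausible reconstruction under the stated facts: WL shrinks linearly from a at L = 0 to b at L = Ymax, and the x-coordinate is then scaled by Wc/WL. The exact meanings of a, b, and the measured x are assumptions made for illustration.

```python
def relative_width(level, a, b, y_max):
    """WL: assumed relative width of the projector screen at scan level `level`."""
    return a - (a - b) * (level / y_max)

def to_screen_x(x_d, level, a, b, y_max, w_c):
    """Map a distorted x-coordinate to the computer screen of width w_c.

    x_d is assumed to be measured from the left edge of the projected
    (trapezoidal) screen at the given level.
    """
    w_l = relative_width(level, a, b, y_max)
    return w_c * (x_d / w_l)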
21. A dynamic gesture is expected to change over a time frame, whereas a static gesture is seen at a single instant of time. A waving hand that implies farewell is an example of a dynamic gesture, and the stop sign is an example of a static gesture. In this paper, we primarily concentrate on close-by gestures. The objective is to build a framework which can recognize particular hand gestures and use them to convey information to other systems.
24. In this stage, we mapped the hand to gestures. When the user holds a hand above the camera, the camera sends frames to the system, which recognizes the number of folded fingers of the hand and identifies the pattern as a gesture. Each combination of fingers acts as a unique gesture. For example, if the system recognizes a hand gesture with four fingertips, it is treated as a left mouse-click event; a gesture with three fingertips is recognized as a right mouse-click event. In the same way, we were able to map different gestures to unfolded-fingertip patterns.
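A minimal dispatch for this mapping might look like the sketch below; the fingertip count would come from the convex-hull analysis described in the following paragraphs, and the action bodies are placeholders rather than the report's actual mouse-event code.

```python
def handle_gesture(unfolded_fingertips: int) -> None:
    # Mapping taken from the text: 4 tips -> left click, 3 -> right click,
    # 2 -> zoom in, 1 -> zoom out (the zoom gestures use both hands).
    actions = {
        4: "left mouse click",
        3: "right mouse click",
        2: "zoom in (two-hand gesture)",
        1: "zoom out (two-hand gesture)",
    }
    action = actions.get(unfolded_fingertips)
    if action is None:
        return                                   # unmapped pattern: ignore frame
    print(f"gesture recognized: {action}")       # replace with real mouse events
```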
Figure: swipe-right (next) and swipe-left (previous) gestures.
Later we calculate the convex hull of the hand contour; the convex hull wraps around the convexity of the contour.
25. At this point we have the points of the convex hull as well as the fingertip points given by the Tipy algorithm. When the fingers are unfolded, the fingertips lie on the border of the convex hull, but when a finger is folded, its tip falls below the border of the convex hull, so the distance from that fingertip to the convex hull is larger than for the others. In this way, we can accurately count how many fingertips are folded on one hand.
26. We then keep track of all five fingers of the hand, so we can monitor which fingers have been folded, and based on the folded fingertips we define gestures as below. The system also recognizes zoom-in and zoom-out gestures to zoom pictures or relevant areas; the user must use both hands to perform zoom gestures. Two fingertips perform the zoom-in action and one fingertip performs the zoom-out gesture.
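The folded-finger count itself can be sketched as follows, assuming OpenCV; the fingertip candidates would come from the Tipy algorithm mentioned above (not reproduced here), and the distance threshold is an assumption.

```python
import cv2

def count_folded_fingers(hand_contour, fingertips, fold_dist=20.0):
    """Count fingertips lying well inside the convex hull (i.e. folded).

    hand_contour: contour of the hand (e.g. from cv2.findContours)
    fingertips:   list of (x, y) fingertip candidates
    fold_dist:    assumed pixel threshold separating folded from unfolded
    """
    hull = cv2.convexHull(hand_contour)
    folded = 0
    for (x, y) in fingertips:
        # Signed distance to the hull border: positive inside, negative outside.
        d = cv2.pointPolygonTest(hull, (float(x), float(y)), True)
        if d > fold_dist:          # far inside the hull -> folded finger
            folded += 1
    return folded
```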
28. Virtual Touch Panel (V Touch). A virtual touch panel can replace the touch panels attached to traditional displays. It consists of a 3D camera, a board, and AI software. Simply placing a virtual-touch camera on top of a display allows touchless control of any type of display up to 65 inches at a distance of 0 m to 1.2 m. The virtual touch panel is a next-generation technology that makes the front space (3D) of the display the control area, going beyond the existing touch panel, where only the touchable surface (2D) is the control area.
TECHNOLOGICAL ASPECTS
32. Virtual touch screens can be used in the military environment to simplify and speed up the handling of technical applications. Military technical applications are constantly exposed to dirty or dusty environments and must not suffer any damage as a result. Missions in desert regions in particular require a screen that must not get scratched by sand, so eliminating the physical screen is a better option; a 3D image of the war zone is also easier to understand, which particularly helps armed forces on deployment. Virtual touch screens are not only extremely quick and easy to operate, they also do not require special care concerning their storage and operation.
CONCLUSION
33. A method was developed in which gestures are detected and used to interact with the screen with the help of IR sensors and computer systems. The accuracy of the virtual touch screen relied significantly on hardware components such as the camera sensor and on constant lighting conditions. Under ideal test conditions, we noticed that the accuracy is inversely proportional to the screen size; nevertheless, the system responded acceptably to touch gestures with an error margin of 15%-20%. The Nettle Box hologram table and the projection keyboard are identified as applications of touchless technology. This technology has many future prospects and can be used effectively in computers, cell phones, webcams, and laptops. Virtual touch screen technology is still developing.
34. A few years down the line, our body itself could be transformed into a virtual mouse or virtual keyboard, turning the human body into an input device.
REMARKS BY ADS
REMARKS BY STUDENT DS
REMARKS BY DS
REMARKS BY HOD
REMARKS BY DEAN
REMARKS BY COMDT