
INDEX

Ser No  Contents                                    Page No

1.      Abstract
2.      Aim
3.      Scope
4.      Introduction                                1
5.      History                                     2
6.      Anatomy of a Touch Screen                   3
7.      Working Principle of Virtual Touch Screen   5
8.      Block Diagram                               5
9.      Methodology                                 6
10.     Technological Aspects                       12
11.     Military Uses and Advantages                13
12.     Conclusion                                  13

LIST OF FIGURES

Figure No 1.   Virtual Touch Screen
Figure No 2.   Touch Screen Technology
Figure No 3.   Structure of a Touch Screen
Figure No 4.   Basic Block Diagram of Virtual Touch Screen
Figure No 5.   Calculating the Convex Hull of Hand Contour
Figure No 6.   X and Y Axis Scan Line
Figure No 7.   Mapping Result
Figure No 8.   Virtual Touch Panel (V Touch)
Figure No 9.   Projection Keyboard
Figure No 10.  Nettle Box Hologram Table

VIRTUAL TOUCH SCREEN

ABSTRACT - Touch screen displays are now deployed everywhere. They give the end user great flexibility, but with repeated use a touch screen gradually becomes less responsive, which leads to failed touches. Even when a screen protector is used, smudges still accumulate on the display. To avoid these problems, a simple user interface for touchless control of electrically operated equipment is being developed. This paper addresses the limitations of the touch screen display by proposing a touchless display, and presents a study of touchless displays, the history of touch screens, and the working of touchless technology along with its applications.

Abstract - It was touch screens that initially created a great stir. Gone are the days when you had to fiddle with a touch screen and end up scratching it. Touch screen displays are ubiquitous worldwide, but frequently touching a touchscreen with a pointing device such as a finger, or scratches caused by rough use, can result in the gradual de-sensitization of the touchscreen to input and can ultimately lead to its malfunction. To avoid this, a simple user interface for touchless control of electrically operated equipment is being developed. Elliptic Labs' innovative technology lets you control gadgets such as computers, MP3 players, or mobile phones without touching them. Unlike other systems, which depend on distance to the sensor or on sensor selection, this system depends on hand or finger motions: a hand wave in a certain direction, a flick of the hand in one area, holding the hand in one area, or pointing with one finger, for example. The device is based on optical pattern recognition using a solid-state optical matrix sensor with a lens to detect hand motions. This sensor is connected to a digital image processor, which interprets the patterns of motion and outputs the results as signals to control fixtures, appliances, machinery, or any device controllable through electrical signals.
INTRODUCTION

1. This screen, made by TouchKo, White Electronics Designs, and Group 3D, works by detecting gestures as a form of input; there is no need to touch the screen. This is a high-end technology that uses hand waves and hand flicks. The objective behind building such technology is to make it even more comfortable and convenient for users to operate their devices. Rather than requiring a touch, the system detects hand movements in front of the screen by making use of various sensors. The technology is visually fascinating and is depicted in various sci-fi movies such as Minority Report and Matrix Revolutions.

The working of touchless touchscreen technology is divided into the following parts:
a. Movement detection –
Movement detection is itself a technology that detects a change in the position of an object with respect to its surroundings, or a change in the surroundings with respect to the object. Motion detection can be either mechanical or electronic in nature, and is carried out by a system called a motion detector, which registers changes in the positions of objects or their surroundings. The most important element of a motion detector is the sensor located near its screen: it registers changes through its line of sight, which is interrupted by any sort of motion.
b. Optical pattern recognition –
Optical pattern recognition is a system based on optical patterns, using a solid-state optical matrix sensor to understand and detect motions.

c. Motion pattern interpretation –
The motion sensor is connected to a digital image processor that interprets the motion patterns. The digital image processor sends signals to devices, machinery, or appliances, which in turn are controlled by means of these electrical signals.

d. Screen pointing –
The screen-pointing mechanism allows users to point at items and icons on the screen without physically touching it. The mechanism interprets several human hand gestures to decide what task the user wants to perform.

2. A popular virtual touch screen system is the Kinect, made commercially available by Microsoft. The Kinect is primarily used for consumer entertainment, such as playing games and navigating the menus of the Xbox console.

Figure No 1. Virtual Touch Screen

Virtual Touch Screen Interfaces.

3. A virtual touch screen is perceived by Mixed Reality (MR) users as several Graphical User Interface (GUI) elements floating in front of them. MR users can interact with these elements with their bare hands, clicking with their fingers on windows, icons, menus, etc., in the same way that a popular GUI such as MS Windows can be operated via a real touch screen. Some advantages of these types of interfaces are:

(a) Projection and Mobility. The virtual touch screen can be projected and moved wherever necessary.

(b) Economical Advantage. It can be comparatively cheaper than screens, keyboards, mice, or data gloves for the amount of functionality such a system can offer.

(c) Suitability in Harsh Working Conditions. The virtual touch screen is suitable for use in harsh working conditions, for example outdoors or in other environments potentially harmful to a keyboard, mouse, or screen, such as dusty, humid, or underwater environments, and even in situations where physical contact with the user is not appropriate. In a hospital operating theatre, for instance, a doctor in the middle of an operation might need to check information or select other functions of a medical AR system without the risk of contaminating the hands.

HISTORY OF TOUCH SCREEN TECHNOLOGY

4. A touch screen is an important input/output device, normally layered on top of an electronic visual display. A user gives input or controls information processing through single- or multi-touch gestures by touching the screen, interacting directly with what is displayed rather than through any intermediate device. Development of touch screens started in the second half of the 1960s; early work was done at IBM, by the University of Illinois, and in Ottawa, Canada. By 1971, several different techniques had been disclosed. Touch screens have subsequently become familiar in everyday life and are widely used in many facets of society: public information displays, the retail sector, restaurant systems, POS, ATMs, computer-based training systems, customer self-service aids, traffic control systems, assistive technology for the disabled, tourism kiosks, GPS systems, phones, tablets, and game consoles, and they continue to appear in newer technologies. Touch screens have reached every industry, every product type, every size, and every application at every price point.
6. Touch technologies are classified into four main types, viz Resistive, Capacitive, Acoustic, and Optical, each with its own specific technologies.

Resistive Touch Screens

One of the most basic systems, mostly used in ATMs, is the resistive touch screen. It consists of two electrically conductive layers, one resistive and the other conductive, separated by spacers which keep them apart until the screen is touched. A scratch-resistant layer on top completes the setup.
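As a rough illustration of the principle, when the two layers meet at the touch point each axis behaves like a voltage divider, so the position can be estimated from the ratio of the measured voltage to the drive voltage. The function name, reference voltage, and screen dimensions below are illustrative assumptions, not values from this report:

```python
def resistive_touch_position(v_x, v_y, v_ref=3.3, width=320, height=240):
    # Each axis acts as a voltage divider: the ADC reading is
    # proportional to how far along the axis the layers were pressed together.
    x = (v_x / v_ref) * width
    y = (v_y / v_ref) * height
    return x, y
```

A reading of half the drive voltage on both axes would place the touch at the centre of the panel.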

Capacitive Touch Screens

In a capacitive system, a layer that stores electric charge, constructed from a material such as copper or indium tin oxide, is used. Sensors at the corners and a protective casing complete the setup. A minute amount of voltage is applied to all four corners of the touch screen.

Surface Acoustic Wave (SAW) Touch Screens

A surface acoustic wave screen detects fingers using sound instead of light. Ultrasonic waves, too high-pitched for humans to hear, are reflected back and forth across its surface. When the screen is touched, the user interrupts the sound beam and the location of the touch is calculated.

Optical (Infrared) Touch Screens

Infrared touch screens are less common and less precise. They consist of LEDs and light-detecting photocells arranged on opposite sides of the screen. The LEDs shine infrared light in front of the screen - a bit like an invisible spider's web.

Figure No 2. Touch Screen Technology

Anatomy of a Touch Screen

7. Knowing what you need is an important first step in designing a touch screen product. Vendors in the touch screen supply chain frequently offer different pieces of the puzzle, oftentimes combining several to create a value chain for the end customer. There are four key elements.

Figure No 3. Structure of a Touch screen

(a) Front Panel or Bezel. Bezels are the border between a device's display and its physical frame, and they have become one of the most important considerations in designing electronics. Depending on the device, bezels can be made of glass, metal, or hard plastic, and can serve the function of making a device easier to grip without accidentally touching the screen. They can also improve a device's durability and protect glass displays from damage.

(b) Controller. The touch controller is generally a small microcontroller-based IC that sits between the touch sensor and the embedded system controller. This IC can either be located on a controller board inside the system or on a flexible printed circuit (FPC) affixed to the glass touch sensor. The touch controller takes information from the touch sensor and translates it into information that the PC or embedded system controller can understand.

(c) Touch Sensor. A touch screen "sensor" is a clear glass panel with a touch-responsive surface. This sensor may be placed over an LCD (as in resistive and capacitive systems) or on the frame (as in SAW and infrared systems) so that the touch area of the panel covers the viewable area of the video screen. There are many different touch-sensor technologies on the market today, each using a different method to detect touch input. Fundamentally, most technologies use an electrical current running through the panel that, when touched, causes a voltage or signal change. This voltage change is sensed by the touch controller to determine the location of the touch on the screen.

(d) System Software. This software allows the touch screen sensor and system controller to work together and tells the product's operating system how to interpret the touch-event information that is sent from the controller.

8. Disadvantages of Touch Screen.

(a) Power and Battery Life. Touch screen devices usually consume a lot of energy and have comparatively short battery life, so they need frequent recharging and can add to electricity bills.

(b) Cost. Touch screen devices are more expensive than push-button devices, and because the battery and other components are built in, repairs can be costly.

(c) Sunlight Visibility. Exposure to direct sunlight greatly reduces screen visibility; adjusting brightness and contrast only partly compensates.

(d) Proximity and Obscuring. The user needs to be close to the display to operate it, which can be inconvenient, and the user's hand may obscure the screen.

(e) Fragility. Since the screen is made of glass, it can break or scratch easily, and the device's functionality can be limited if the screen is cracked or broken, so these devices require extra care.

(f) No Tactile Feedback. The user feels no click when touching the screen, unless a touch tone is enabled.

(g) Ergonomics. Screens need to be installed at a lower position and tilted to reduce arm fatigue.

(h) Image Quality. A reduction in image brightness may occur.

(j) Computing Power. These devices require substantial computing power, which contributes to slow performance and low battery life.

(k) No Hardware Keys. Touch screen devices usually have no additional keys, so when an application crashes without crashing the OS, the whole screen becomes unresponsive and the user cannot return to the main menu.

WORKING PRINCIPLE OF VIRTUAL TOUCH SCREEN

9. The system is capable of detecting movements in three dimensions without the user ever having to put a finger on the screen. Sensors are mounted around the screen in use; when the user interacts with the line of sight of these sensors, the motion is detected and interpreted into on-screen movements. The device is based on optical pattern recognition using a solid-state optical matrix sensor with a lens to detect hand motions. Due to this arrangement, the camera receives a distorted view of the projector screen.

10. The sensor is connected to a digital image processor, which interprets the patterns of motion and outputs the results as signals to control fixtures, appliances, machinery, or any device controllable through electrical signals. You just point at the screen (from as far as 5 feet away) and you can manipulate objects in 3D.
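The sense-and-interpret chain described above can be sketched in miniature. This toy example (the function names, frame format, and thresholds are illustrative assumptions, not the actual sensor firmware) treats a frame as a grid of pixel intensities, flags the pixels that changed between frames, and reduces the change pattern to a crude left/right command:

```python
def sense(frame_prev, frame_curr, threshold=30):
    # The optical matrix sensor stage: report which pixels changed between frames.
    return [(i, j)
            for i, (row_p, row_c) in enumerate(zip(frame_prev, frame_curr))
            for j, (p, c) in enumerate(zip(row_p, row_c))
            if abs(p - c) > threshold]

def interpret(changed_pixels):
    # The digital image processor stage: reduce the motion pattern to a command.
    if not changed_pixels:
        return "IDLE"
    cols = [j for _, j in changed_pixels]
    # crude direction estimate for a 4-pixel-wide frame: motion weighted
    # toward the right half maps to one command, the left half to the other
    return "SWIPE_RIGHT" if sum(cols) / len(cols) > 1.5 else "SWIPE_LEFT"
```

A real system would of course run this per video frame and emit electrical control signals instead of strings.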

The basic block diagram reads as a chain: a moving image in front of the sensors; light enters the sensors and hits them; the motion is detected by the sensors; photodiodes inside each pixel convert the incoming signal into electric charge; the sensor generates an electric signal; and the signal is processed by a DIP to provide output to the device.

Figure No 4. Basic block diagram of virtual touch screen



METHODOLOGY

11. Pre-Processing. First, it is essential to isolate the operator's hand from the scene, as it provides the key features for calculating the touched coordinates. Assuming there are no moving objects in the scene (e.g., ceiling fans), we obtain a background image (BG) excluding the operator. When the operator interacts with the projector screen, we obtain the foreground image (FG), which includes the operator's hand. A blur filter with a 5x5 kernel is then applied to BG and FG to reduce the noise caused by the camera lens. As BG and FG are matrices, we perform the following matrix operation to isolate the hand into image B under indoor lighting conditions: B = (BG - FG) x 4. B is then converted into a grey-scale image and thresholded at an intensity level of 100 to obtain a binary image. The largest contour in the resulting image can be considered the operator's hand.
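A minimal sketch of this pre-processing step, assuming single-channel (grey-scale) frames, with a plain box blur standing in for the unspecified 5x5 blur filter; the absolute difference is thresholded because the report does not state how negative values of BG - FG are handled:

```python
import numpy as np

def box_blur(img, k=5):
    # 5x5 box blur as a stand-in for the camera-noise filter
    pad = k // 2
    padded = np.pad(img.astype(np.float32), pad, mode="edge")
    h, w = img.shape
    out = np.zeros((h, w), dtype=np.float32)
    for dy in range(k):
        for dx in range(k):
            out += padded[dy:dy + h, dx:dx + w]
    return out / (k * k)

def isolate_hand(bg, fg, threshold=100):
    # B = (BG - FG) x 4, then threshold at intensity 100 -> binary mask
    b = (box_blur(bg) - box_blur(fg)) * 4
    return (np.abs(b) > threshold).astype(np.uint8)
```

The largest connected region of 1s in the returned mask would then be taken as the hand contour.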

12. Calibration. Calibration is performed to establish a relation between the two screens S and Sd. Each screen has two dimensions: the actual screen's dimensions X and Y should be mapped to their respective dimensions Xd and Yd on the distorted screen. Once the mapping is complete, touchpoints on the distorted screen can be converted into touchpoints on the actual screen.

13. Mapping Y-dimensions. Y on the actual screen is a linear dimension. However, Yd on the distorted screen becomes non-linear due to the position of the camera. To map these two dimensions, we use a solid horizontal scan line, drawn on a white background and translated from bottom to top on the actual screen at a constant speed. The translation is also visible on the distorted screen.

14. Let the scan line on the actual screen be L and the scan line on the distorted screen be Ld. L is the independent variable, while Ld is the dependent variable. As we elevate L on the actual screen, we record the position of Ld against the position of L, using an algorithm that performs a simple background subtraction and finds the longest horizontal line on the distorted screen. Once the recording is complete, we have a data set that reveals the relation between L and Ld.

15. The data set is then smoothed by running it through a cosine interpolation algorithm. Given the y-coordinate of a fingertip on the distorted screen, we can now obtain the analogous Y-coordinate on the actual screen using this relation. As Yd is typically a smaller dimension than Y, units on Yd must be mapped to Y on a one-to-many basis. This causes spatial discretization on the virtual touch screen, which leads to a loss of precision.
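The smoothing step can be illustrated with standard cosine interpolation; the report does not give its exact algorithm, so this is a generic sketch, and `smooth_mapping` and its parameters are illustrative:

```python
import math

def cosine_interpolate(y1, y2, mu):
    # mu in [0, 1]: 0 returns y1, 1 returns y2, with cosine easing between
    f = (1 - math.cos(mu * math.pi)) / 2
    return y1 * (1 - f) + y2 * f

def smooth_mapping(samples, mu_steps=4):
    # densify a recorded (L -> Ld) data set by interpolating between samples
    out = []
    for a, b in zip(samples, samples[1:]):
        out.extend(cosine_interpolate(a, b, s / mu_steps) for s in range(mu_steps))
    out.append(samples[-1])
    return out
```

Cosine easing avoids the visible kinks that plain linear interpolation would leave at each recorded sample.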

16. Estimating the X-coordinate. Estimating the X-coordinate on the computer screen is rather straightforward. The following quantities are used:

a - Relative width of the top edge of the projector screen.
b - Relative width of the bottom edge of the projector screen.
WL - Relative width of the projector screen at the given y-coordinate L.
q - Relative distance to the fingertip from the left edge of the projector.
Lmax - Projector screen's top edge level relative to the Y dimension.
L - Level of the fingertip relative to the Y dimension.

17. The boundary data a, b, and Lmax are already learned by the system in the calibration phase. Note that when the level L increases, WL (the relative width of the projector screen at L) decreases; L and WL have an inverse linear relationship and can be written as a function:

WL = b + mL

where the gradient m = (a - b) / Lmax.

18. Then the system can obtain the x-coordinate of the computer screen from the following function:

X = Wc x (q / WL)

where Wc is the width of the computer screen.

19. Substituting WL, the final function for obtaining X-coordinates can be written as:

X = Wc x q / (b + (a - b) x L / Lmax)
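Under the linear-taper reading of the relationship above, the X estimate can be sketched as follows. The variable names mirror the definitions in paragraph 16, but this is a reconstruction, not the authors' code:

```python
def estimate_x(q, L, a, b, L_max, Wc):
    # WL shrinks linearly with height L (a = top width < b = bottom width)
    m = (a - b) / L_max          # the gradient from the calibration data
    WL = b + m * L               # relative projector-screen width at level L
    # map the fingertip's relative offset q into computer-screen pixels
    return Wc * q / WL
```

For example, with a = 80, b = 100, Lmax = 100, a fingertip at L = 50 sees WL = 90, so q = 45 (halfway across) maps to the middle of an 1800-pixel-wide screen.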

20. Gesture Recognition. Gestures are meaningful movements made by people to convey information to others. They are also signals that computers can perceive. A gesture can be narrowed down into two particular classes: dynamic and static.

21. A dynamic gesture changes over a time frame, whereas a static gesture is observed at an instant of time. A waving hand meaning farewell is an example of a dynamic gesture, and the stop sign is an example of a static gesture. In this paper we primarily concentrate on nearby gestures. The objective is to build a framework that can recognize particular hand gestures and use them to convey information to other systems.

22. Gesture Recognition Algorithm. Through this research we propose a novel technique of hand gesture recognition and fingertip tracking based on the detection of shape-based features, implemented in an algorithm called "Tipy". Tipy can track up to 10 fingertips. First, the algorithm separates the largest contour, which is most probably the hand. Then the image is scanned for white pixels, counting consecutive white pixels; when the number of consecutive white pixels exceeds a certain threshold, the algorithm counts the run as a fingertip. Going back to the initial white pixel, the algorithm travels to the topmost pixel, where the Y value is minimum. The starting point and the ending point of the white run are stored in a range table, and the remaining fingertips are scanned in the same manner. Before the algorithm attempts to scan a previously scanned fingertip again, it checks the entries in the range table: if the new range is a superset of any previous range, the algorithm ignores that fingertip and adds the new range to the table. In this way we obtain the points of all fingertips.
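A simplified sketch of this scan follows. This is not the authors' implementation of Tipy: the run-length threshold and the superset test come from the description above, and everything else is an assumption. It walks a binary image top-down, records each sufficiently long run of white pixels whose column range does not contain an already-recorded range, and reports the first row of each such run as a fingertip:

```python
def find_fingertips(binary, min_run=3):
    # binary: list of rows of 0/1 pixels, scanned top-down so the first
    # occurrence of each run is its topmost (minimum-Y) pixel.
    tips, ranges = [], []          # ranges plays the role of the range table
    for y, row in enumerate(binary):
        x = 0
        while x < len(row):
            if row[x] == 1:
                start = x
                while x < len(row) and row[x] == 1:
                    x += 1
                # superset rule: ignore a run whose column range contains
                # (or equals) a range already in the table
                covered = any(start <= s and e <= x for s, e in ranges)
                if (x - start) >= min_run and not covered:
                    tips.append((y, (start + x - 1) // 2))
                    ranges.append((start, x))
            else:
                x += 1
    return tips
```

The superset rule both skips a finger as it continues down the image and rejects the wide palm run, which contains every finger's range.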

23. Gestures. After tracking and recognizing fingertips through the Tipy algorithm, the second task is to recognize different gestures and perform an action for each. In this approach, the user keeps a hand above the camera; it is not essential to touch the projector screen to perform primary gestures. When the hand starts moving, its movement is replicated by the cursor. The size of the virtual touch screen did not affect gesture recognition.

24. In this stage we mapped hand poses to gestures. When the user holds a hand above the camera, the camera sends frames to the system, which recognizes the number of folded fingers of the hand and identifies the pose as a gesture; each combination of fingers acts as a unique gesture. For example, if the system recognizes a hand gesture with 4 fingertips, it is treated as a left mouse click event, while a gesture with 3 fingertips is recognized as a right mouse click event. Likewise, we were able to map different gestures to unfolded-fingertip patterns.
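The fingertip-count-to-event mapping can be expressed as a small lookup table. The event names here are placeholders, not an actual operating-system API:

```python
# Map visible fingertip counts to gesture events, per the examples above:
# 4 fingertips -> left click, 3 fingertips -> right click.
GESTURES = {
    4: "LEFT_CLICK",
    3: "RIGHT_CLICK",
}

def gesture_for(fingertip_count):
    # Unmapped counts produce no action.
    return GESTURES.get(fingertip_count, "NO_ACTION")
```

Keeping the mapping in a table makes it trivial to assign further actions to the remaining fingertip patterns.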

(Gestures: swipe right = next, swipe left = previous)

Later we calculate the convex hull of the hand contour; the convex hull wraps the convexity of the contour.

Figure No 5. Calculating The Convex Hull of Hand Contour

25. At this point we have the points of the convex hull as well as the fingertip points given by the Tipy algorithm. When the fingers are unfolded, the fingertips lie on the border of the convex hull, but if a finger is folded, its tip falls inside the border of the convex hull, so the distance from that fingertip to the convex hull is larger than for the others. We can therefore accurately calculate how many fingertips have been folded on one hand.
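The folded-finger test can be sketched with plain point-to-segment geometry, assuming the hull is given as an ordered list of vertices; the report presumably uses a library routine for the hull, so this standalone version and its distance threshold are illustrative:

```python
import math

def point_segment_distance(p, a, b):
    # shortest distance from point p to the segment a-b
    (px, py), (ax, ay), (bx, by) = p, a, b
    dx, dy = bx - ax, by - ay
    if dx == 0 and dy == 0:
        return math.hypot(px - ax, py - ay)
    t = max(0.0, min(1.0, ((px - ax) * dx + (py - ay) * dy) / (dx * dx + dy * dy)))
    return math.hypot(px - (ax + t * dx), py - (ay + t * dy))

def folded_fingers(fingertips, hull, max_border_dist=5.0):
    # A fingertip on (or near) the hull border is unfolded; one lying well
    # inside the hull, far from every hull edge, is counted as folded.
    edges = list(zip(hull, hull[1:] + hull[:1]))
    folded = 0
    for tip in fingertips:
        d = min(point_segment_distance(tip, a, b) for a, b in edges)
        if d > max_border_dist:
            folded += 1
    return folded
```

In practice the threshold would be scaled to the hand's size in pixels rather than fixed.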

26. Since we keep track of all 5 fingers of the hand, we can now monitor which fingers have been folded, and based on the folded fingertips we define gestures as below. The system also recognizes zoom-in and zoom-out gestures for zooming pictures or relevant areas; the user must use both hands to perform zoom gestures. Two fingertips perform the zoom-in action and one fingertip performs the zoom-out gesture.

27. Mapping the Physical Screen to the Touch Screen. The laptop or computer screen's pixel ratio is different from the projector screen's, and every pixel on the computer must be mapped onto the projector screen. To map the projector screen onto the virtual touch screen, we use simple machine-learning techniques. The system sends a scan line across the screen pixel by pixel while the camera observes the scan line frame by frame; the system then removes the background and processes the scan line using threshold algorithms, from which the X and Y coordinates are mapped. All pixels are mapped in this way.

Figure No 6. X and Y axis Scan line

Figure No 7. Mapping Result

28. Virtual Touch Panel (V Touch). A virtual touch panel can replace touch panels attached to traditional displays. It consists of a 3D camera, a board, and AI software. Simply placing a virtual-touch camera on top of a display allows touchless control of any type of display up to 65" at a distance of 0 m to 1.2 m. The virtual touch panel is a next-generation technology that makes the space in front of the display (3D) the control area, going beyond the existing touch panel, where only the touchable surface (2D) is the control area.

Figure No 8. Virtual Touch Panel (V Touch)

29. Projection Keyboard. A projection keyboard is a form of computer input device which projects the image of a virtual keyboard onto a surface; when a user touches the surface covered by the image of a key, the device records the corresponding keystroke.

Figure No 9. Projection Keyboard

30. Hologram Table. A hologram table is a holographic panel displaying a precise 3D model of buildings or campuses. Users can interact with the content, get immersed in the design, and focus on critical details: turn, zoom in or out, 'take off the roof' and look at floor and room layouts, see how future residents will park their cars in the parking lot, and enjoy a bird's-eye view of their entire district.

Figure No 10. Nettle Box Hologram Table

TECHNOLOGICAL ASPECTS

31. This section elaborates on a technology that allows a smartphone to generate a touch-sensitive region in three-dimensional space. It allows us to place a smartphone under a projector screen and transform the projector screen into an interactive virtual touch screen. When the projector screen is touched by an operator, the scene is observed by the phone's camera and then sent to an image-processing server to estimate the location of the touched points. These measurements are then enhanced by a regression-based machine-learning model. Knowing the touched coordinates, we can trigger the actual touch gestures at the operating-system level. The system is a collaboration of a smartphone, a personal computer, and a physical screen. Ultimately, this virtual touch screen aims to become a low-cost human interface device for computer systems that can be used to enhance teaching, presentation, and gaming experiences.

Military Uses and Advantages.

32. A virtual touch screen can be used in the military environment to simplify and speed up the handling of technical applications. Military technical applications are constantly exposed to dirty or dusty environments and must not suffer any damage as a result. Missions in desert regions in particular require a touch screen that must not get scratched by sand, so eliminating the physical screen is a better option; in addition, a 3D image of the war zone is easy to understand, which particularly helps armed forces on deployment. Virtual touch screens are not only extremely quick and easy to operate, but they also require no special care concerning their storage and operation.

CONCLUSION

33. A method was developed in which gestures are detected and interact with the screen with the help of IR sensors and computer systems. The accuracy of the virtual touch screen relied significantly on hardware components such as the camera sensor and on constant lighting conditions. Under ideal test conditions, we observed that accuracy is inversely proportional to screen size; even so, the system responded acceptably to touch gestures with an error margin of 15%-20%. The Nettle Box hologram table and the projection keyboard are identified as applications of touchless technology. This technology has many future prospects and can be used effectively in computers, cell phones, webcams, and laptops. Virtual touch screen technology is still developing.

34. A few years down the line, our bodies may be transformed into a virtual mouse or virtual keyboard, turning the body itself into an input device.
REMARKS BY ADS

REMARKS BY STUDENT DS

REMARKS BY DS

REMARKS BY HOD

REMARKS BY DEAN

REMARKS BY DY COMDT & CI

REMARKS BY COMDT
