Touchless Touchscreen Technology


Touchless Touchscreen Technology

MARIYA JAMES
B Batch
Roll No: 24
Touchscreen Technology
➢ The idea of the touchscreen interface was recorded in October 1965, when Eric Arthur Johnson, an engineer at the Royal Radar Establishment in Malvern, England, set out to develop a touchscreen to aid air traffic control.
➢ Traditional touchscreen technology requires direct physical contact with the screen. Users interact with the display by touching it with their fingers or a pen; the screen detects the touch and responds accordingly, allowing users to input commands, navigate menus, or interact with applications.


HISTORY OF TOUCHLESS TOUCHSCREEN
➢ Early touchless technology originated in the video game industry with sensor-based gesture
recognition interfaces.
➢ Microsoft's Surface, launched in 2007, advanced touchless technology with its gesture-
based multi-touch interface.
➢ Near Field Communication (NFC) technology transitioned from luggage scanning to
contactless payment systems.
➢ Google introduced a custom mobile payment system for Android devices using NFC
technology in 2011, followed by Apple Pay in 2014.
➢ Contactless payment solutions, like Apple Pay, allow users to make payments by holding
their device near a payment reader and verifying with a fingerprint or face scan.
Introduction

➢ The touchless touchscreen was made by TouchKo, White Electronic Designs and Group 3D.
➢ Touchless control is developed by Elliptic Labs.
➢ This technology lets you control gadgets such as computers, MP3 players or mobile phones without touching them.
TOUCHLESS TOUCHSCREEN

➢ Uses gesturing as a form of input.
➢ A unique and interesting invention.
➢ No need to touch the screen.
➢ Detects hand movements using sensors.
➢ High-end technology.
➢ Uses hand waves and hand flicks.


WORKING
➢ The system is capable of detecting movements in three dimensions without the user ever having to put a finger on the screen.
➢ Sensors are mounted around the screen being used; by interacting in the line of sight of these sensors, the motion is detected and interpreted into on-screen movements.
➢ The device is based on optical pattern recognition using a solid-state optical matrix sensor with a lens to detect hand motions.
➢ This sensor is connected to a digital image processor, which interprets the patterns of motion and outputs the results as signals (a minimal sketch of this interpretation step follows below).
➢ You just point at the screen (from as far as 5 feet away), and you can manipulate objects in 3D.
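The slides do not include any implementation, so the following is only a minimal Python sketch of the interpretation step, under the assumption that the sensor stage already delivers a stream of normalised (x, y) hand positions per frame. The function name classify_motion and the threshold value are illustrative, not taken from the source.

    def classify_motion(positions, flick_threshold=0.3):
        """Classify a short trace of (x, y) hand positions into a simple gesture label."""
        if len(positions) < 2:
            return "none"
        dx = positions[-1][0] - positions[0][0]
        dy = positions[-1][1] - positions[0][1]
        if abs(dx) >= abs(dy):
            if dx > flick_threshold:
                return "flick_right"
            if dx < -flick_threshold:
                return "flick_left"
        else:
            if dy > flick_threshold:
                return "flick_down"
            if dy < -flick_threshold:
                return "flick_up"
        return "hover"

    # Example: a hand sweeping from left to right across the sensor's field of view.
    trace = [(0.1, 0.5), (0.3, 0.5), (0.6, 0.52), (0.9, 0.5)]
    print(classify_motion(trace))  # -> "flick_right"

A real device would map such labels onto on-screen actions (scrolling, switching, rotating an object) rather than printing them.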
TECHNOLOGY BEHIND THIS:

➢ This is based on optical pattern recognition using a solid-state optical matrix sensor.
➢ This sensor is connected to a digital image processor, which interprets the patterns of motion and outputs the results as signals.
➢ Each of these sensors contains a matrix of pixels.
➢ Each pixel is coupled through a photodiode to a charge storage region; a toy illustration of reading motion out of such a pixel matrix follows below.
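As a toy illustration only (the 4x4 "frames" and the threshold are invented, and a real optical matrix sensor feeds full-resolution frames to the image processor), the Python sketch below shows how motion can be picked out of the pixel matrix by subtracting consecutive frames and noting where the brightness changed.

    def changed_pixels(prev_frame, curr_frame, threshold=30):
        """Return (row, col) positions whose brightness changed noticeably between frames."""
        changed = []
        for r, (prev_row, curr_row) in enumerate(zip(prev_frame, curr_frame)):
            for c, (p, q) in enumerate(zip(prev_row, curr_row)):
                if abs(q - p) > threshold:
                    changed.append((r, c))
        return changed

    frame_a = [[10, 10, 10, 10],
               [10, 10, 10, 10],
               [10, 10, 10, 10],
               [10, 10, 10, 10]]
    frame_b = [[10, 10, 10, 10],
               [10, 200, 200, 10],   # a bright hand entering the field of view
               [10, 200, 200, 10],
               [10, 10, 10, 10]]

    print(changed_pixels(frame_a, frame_b))  # the region where the hand appeared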


WORK FLOW
Example:
➢ In workplaces, touchless technology for attendance and time tracking includes RFID/NFC systems, where employees tap cards or wave devices near a reader, and biometric systems such as fingerprint or facial recognition. While touchless touchscreen technology is possible here, it is less common due to cost and complexity.
➢ Automatic gates rely on motion, infrared, or proximity sensors to detect vehicles or individuals, triggering the gate to open without physical contact (a minimal sketch of this trigger logic follows below). While this is not touchless touchscreen technology as such, some advanced systems may integrate touchless interfaces for configuration or monitoring purposes.
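The gate example is only described in prose, so the following Python sketch is an assumption-laden illustration: a proximity reading below a threshold is treated as "something is in front of the gate". The GateController class and the sensor readings are hypothetical stand-ins for real sensor and motor hardware.

    class GateController:
        def __init__(self):
            self.is_open = False

        def open(self):
            if not self.is_open:
                self.is_open = True
                print("Gate opening")

        def close(self):
            if self.is_open:
                self.is_open = False
                print("Gate closing")

    def update_gate(gate, distance_cm, trigger_distance_cm=150):
        """Open the gate when an object is detected closer than the trigger distance."""
        if distance_cm < trigger_distance_cm:
            gate.open()
        else:
            gate.close()

    gate = GateController()
    for reading in [400, 320, 120, 90, 110, 300, 450]:  # simulated proximity readings in cm
        update_gate(gate, reading)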


Technologies:

➢ Air Writing
➢ Eye Sight
➢ Elliptic Labs
➢ Mauz
➢ Leap Motion
➢ Microsoft Kinect
➢ Point Grab
➢ Myoelectric Arm Band
➢ AIRWRITING

Airwriting is a technology that allows you to write text messages or compose emails by writing in the air. Airwriting comes in the form of a glove which recognizes the path your hands and fingers trace as you write; the glove contains sensors that record the hand movements.
What happens is that when the user starts 'airwriting', the glove detects it and sends the movement data to the computer over a wireless connection. The computer captures the data and decodes the movements, along the lines of the sketch below.
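The decoding step is not specified in the slides; the real airwriting system relies on trained recognition models over the glove's motion-sensor signals. The Python sketch below is only a toy stand-in for that step: a recorded hand path is reduced to a string of stroke directions and matched against a tiny, invented template dictionary.

    def directions(points):
        """Turn a hand path [(x, y), ...] into a string of U/D/L/R stroke directions."""
        out = []
        for (x0, y0), (x1, y1) in zip(points, points[1:]):
            dx, dy = x1 - x0, y1 - y0
            step = ("R" if dx > 0 else "L") if abs(dx) >= abs(dy) else ("U" if dy > 0 else "D")
            if not out or out[-1] != step:   # collapse repeated directions
                out.append(step)
        return "".join(out)

    # Toy templates: "L" drawn as down-then-right, "T" as right-then-down.
    TEMPLATES = {"DR": "L", "RD": "T"}

    def decode(points):
        return TEMPLATES.get(directions(points), "?")

    # A hand path that goes down and then to the right, i.e. the letter "L".
    print(decode([(0, 2), (0, 1), (0, 0), (1, 0), (2, 0)]))  # -> "L"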
➢ EYESIGHT:

❖ eyeSight is a gesture technology which allows you to navigate through your devices by just pointing at them, much like how you use a remote to navigate your TV: you don't have to touch the screen. The basic requirement for eyeSight to work is a basic 2D webcam (even the built-in ones work) and the software; your screen does not even need to be a touch-enabled one. A rough sketch of the webcam-based idea appears below.
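eyeSight's own software is proprietary, so the following is only a rough OpenCV sketch, in Python, of the general idea it relies on: an ordinary 2D webcam plus software that locates a moving hand in the frame and uses its position as a pointer. All parameter values are illustrative.

    import cv2

    cap = cv2.VideoCapture(0)            # the built-in webcam is enough
    ok, prev = cap.read()
    prev_gray = cv2.cvtColor(prev, cv2.COLOR_BGR2GRAY)

    while True:
        ok, frame = cap.read()
        if not ok:
            break
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        diff = cv2.absdiff(gray, prev_gray)                  # what moved since the last frame
        _, mask = cv2.threshold(diff, 25, 255, cv2.THRESH_BINARY)
        contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
        if contours:
            hand = max(contours, key=cv2.contourArea)        # assume the largest moving blob is the hand
            m = cv2.moments(hand)
            if m["m00"] > 0:
                cx, cy = int(m["m10"] / m["m00"]), int(m["m01"] / m["m00"])
                print("pointer at", cx, cy)                  # a real system would drive the cursor here
        prev_gray = gray
        cv2.imshow("motion mask", mask)
        if cv2.waitKey(1) & 0xFF == ord("q"):
            break

    cap.release()
    cv2.destroyAllWindows()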

➢ Elliptic Labs:
Elliptic Labs allows you to operate your computer without touching it, using its Windows 8 Gesture Suite. It relies on ultrasound, so it works not with cameras but with your audio hardware. Ideally you need 6 speakers and 8 microphones, but the dedicated speakers on laptops and a normal microphone can work too.
The speakers emit ultrasound which bounces off your hand back to the microphones, letting the Elliptic Labs software track and interpret your hand movements; the sketch below shows the underlying distance calculation.
This technology is designed to work on the Windows 8 platform and is expected to work on tablets, smartphones and even cars. Elliptic Labs is not out for consumers to buy, as the company is focusing on marketing it to Original Equipment Manufacturers (OEMs).
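Elliptic Labs' Gesture Suite itself is proprietary; the Python sketch below only shows the physics it exploits: the speaker emits an ultrasound ping, a microphone picks up the echo from the hand, and the round-trip delay gives the hand's distance. The sample numbers are invented for illustration.

    SPEED_OF_SOUND = 343.0  # metres per second in air at room temperature

    def hand_distance(ping_time_s, echo_time_s):
        """Distance to the hand from one speaker/microphone pair (round trip halved)."""
        round_trip = echo_time_s - ping_time_s
        return SPEED_OF_SOUND * round_trip / 2.0

    # Echo arrives 2.9 ms after the ping: the hand is roughly half a metre away.
    print(f"{hand_distance(0.0, 0.0029):.2f} m")   # ~0.50 m

    # Successive distance estimates falling over time read as a hand moving
    # towards the screen, which the gesture software can map to an action.
    readings = [0.50, 0.42, 0.33, 0.25]
    if readings[-1] < readings[0]:
        print("hand approaching -> e.g. wake the display")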
➢ Mauz:
Mauz is a third-party device that turns your iPhone into a trackpad or mouse. Download the driver to your computer and the app to your iPhone, then connect the device to your iPhone via the charger port. Mauz communicates with the computer over Wi-Fi. You can then navigate your computer as you would with a regular mouse: left click, right click and scroll as normal.

Now comes the fun part: you can use gestures with Mauz too. With the iPhone camera on, move your hand to the left to go back a page in your browser, and move it to the right to go a page forward. If there is an incoming call or a text message, simply handle it and resume using Mauz right after. Unfortunately, Mauz is not out for consumers to buy just yet.
➢ Leap Motion:
Leap Motion is a motion-sensing device that recognizes the user's fingers with its infrared LEDs and cameras. Because it recognizes only your fingers, nothing registers when your hands pass over it while you type on the keyboard; but when you deliberately hover your fingers above it, you can navigate your desktop as you would a smartphone or tablet: flick to browse pages, pinch to zoom, and so on.

It is a small USB device that works the moment you connect it to your computer. You don't need to charge it, and it works even with non-touch-sensitive screens. Leap Motion works well with gaming and 3D-related software. You can pre-order Leap Motion for $79.99.
➢ Microsoft Kinect:

Speaking of gaming, Microsoft Kinect takes gaming with the Xbox to the next stage. It detects and recognizes a user's body movements and reproduces them within the video game being played. It also recognizes the user's face and voice. With the latest update, Kinect can read small gesture controls such as pinch to zoom and the opening and closing of hands, so you can use Microsoft Kinect for purposes other than gaming.
➢ PointGrab:
PointGrab is similar to eyeSight in that it enables users to navigate their computer just by pointing at it. PointGrab comes in the form of software and only needs a 2D webcam: the camera detects your hand movements, and with these you can control your computer. PointGrab works with computers running Windows 7 and 8, smartphones, tablets and televisions.
➢ Myoelectric Armband:
The myoelectric armband, or MYO armband, is a gadget that allows you to control your other Bluetooth-enabled devices using your fingers or hands. When put on, the armband detects activity in your forearm muscles and translates it into gestures that interact with your computer, roughly along the lines of the sketch below.

By moving your hand up or down, it scrolls the page you are browsing. By waving, it slides through pictures in a photo album or switches between applications running on your system. What would this be good for? At the very least, it will be very good for action games.
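The MYO armband's own recognition is proprietary; the Python toy below only sketches the shape of the problem: each muscle sensor in the band yields a signal, and the relative activity across sensors hints at which gesture was made. The channel layout, threshold and gesture names are invented for illustration.

    def channel_energy(samples):
        """Mean absolute amplitude of one muscle-sensor channel."""
        return sum(abs(s) for s in samples) / len(samples)

    def classify(window, active_threshold=0.3):
        """window: dict of channel name -> list of samples for a short time slice."""
        energy = {ch: channel_energy(vals) for ch, vals in window.items()}
        if all(e < active_threshold for e in energy.values()):
            return "rest"
        # Whichever side of the forearm is most active decides the gesture.
        return "wave_out" if energy["outer"] > energy["inner"] else "wave_in"

    window = {
        "outer": [0.8, 0.9, 0.7, 0.85],   # extensor side firing strongly
        "inner": [0.1, 0.2, 0.15, 0.1],
    }
    print(classify(window))  # -> "wave_out", which could map to "next photo"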
Applications:

➢ Touchless Monitor

➢ Glimpses of GBUI

➢ Touchless SDK

➢ Touchless UI
➢ Touchless Monitor:
❖ This monitor is made by TouchKo and White Electronic Designs.
❖ With a touchless touchscreen, your hand doesn't have to come in contact with the screen at all; it works by detecting your hand movements in front of it.
❖ Point your finger in the air towards the device and move it accordingly to control navigation on the device.
❖ Designed for applications where touch may be difficult, such as for doctors who might be wearing surgical gloves.
➢ Glimpses of GBUI (gesture-based graphical user interface):

❖ A gesture is a movement of part of the body, especially a hand or the head, to express an idea or meaning; a GBUI is a graphical user interface driven by such gestures.
➢ Touchless SDK:
❖ The Touchless SDK is an open-source SDK for .NET applications.
❖ It enables developers to create multi-touch-based applications using a webcam for input, in the spirit of the sketch below.
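The Touchless SDK itself targets .NET and exposes its own marker-tracking API; the snippet below is only a Python/OpenCV sketch of the idea it is built on: track a brightly coloured marker with a plain webcam and treat its position as a touch point. The HSV colour range is illustrative and depends on the marker used.

    import cv2
    import numpy as np

    cap = cv2.VideoCapture(0)
    lower, upper = np.array([40, 80, 80]), np.array([80, 255, 255])   # a green marker

    while True:
        ok, frame = cap.read()
        if not ok:
            break
        hsv = cv2.cvtColor(frame, cv2.COLOR_BGR2HSV)
        mask = cv2.inRange(hsv, lower, upper)                 # keep only marker-coloured pixels
        contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
        if contours:
            marker = max(contours, key=cv2.contourArea)
            x, y, w, h = cv2.boundingRect(marker)
            print("touch point:", x + w // 2, y + h // 2)     # centre of the tracked marker
        cv2.imshow("marker mask", mask)
        if cv2.waitKey(1) & 0xFF == ord("q"):
            break

    cap.release()
    cv2.destroyAllWindows()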
➢ Touchless UI:

❖ The basic idea described in the patent is that there would be sensors arrayed around the perimeter of the device, capable of sensing finger movements in 3D space.
Advantages:

❖ More durable and less prone to damage.
❖ Replacement is very easy.
❖ No drivers required.
❖ Satisfactory user experience.
❖ Gestures are richer than traditional input methods.


DISADVANTAGES

❖ Limited Sensitivity

❖ Complexity of Gestures

❖ Cost of Implementation

❖ Limited Compatibility
THANK YOU
