Touchless Touchscreen Technology
MARIYA JAMES
B Batch
Roll.no:24
Touchscreen Technology
➢ The idea of the touchscreen interface was first recorded in October 1965, when an engineer in the
United Kingdom described a touchscreen to aid air traffic control. His name was Eric Arthur Johnson.
➢ Traditional touchscreen technology requires direct physical contact with the screen. Users
interact with the display by touching it with their fingers or a pen. The screen detects the
touch and responds accordingly, allowing users to input commands, navigate menus, or
perform other actions.
➢ Touchless Touchscreen was developed by TouchKo, White Electronics Design and Group 3D.
➢ This technology lets you control your gadgets like computers, MP3 players or mobile phones without
touching them.
TOUCHLESS TOUCHSCREEN
➢ You just point at the screen (from as far as 5 feet away), and you can manipulate objects in 3D.
TECHNOLOGY BEHIND THIS:
➢ Contactless access-control systems, where employees tap cards or wave devices near a reader, and
biometric systems like fingerprint or facial recognition already offer touch-free interaction. While
touchless touchscreen technology is possible, it's less common due to cost and complexity.
➢ Automatic gates rely on sensors like motion, infrared, or proximity sensors to detect
vehicles or individuals, triggering the gate to open automatically without physical contact. While
not touchless touchscreen technology in themselves, some advanced systems may integrate touchless
interfaces as well (a rough sketch of this sensor-triggered logic is shown below).
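As a rough illustration of that sensor-triggered logic (not any particular vendor's implementation), the sketch below polls a proximity sensor and opens the gate when something comes within range. The `read_distance_cm` and `set_gate` functions are hypothetical stand-ins for real hardware drivers, and the thresholds are assumptions.

```python
import random
import time

OPEN_THRESHOLD_CM = 150   # open when an object is closer than this (assumed)
HOLD_OPEN_SECONDS = 10    # keep the gate open this long after the last detection

def read_distance_cm() -> float:
    """Hypothetical stand-in for a proximity/infrared sensor driver."""
    return random.uniform(50, 400)  # simulated reading in centimetres

def set_gate(open_gate: bool) -> None:
    """Hypothetical stand-in for the relay/motor that moves the gate."""
    print("gate opening" if open_gate else "gate closing")

def run_gate_controller(cycles: int = 100) -> None:
    last_detection = 0.0
    gate_open = False
    for _ in range(cycles):                       # bounded loop for the demo
        if read_distance_cm() < OPEN_THRESHOLD_CM:
            last_detection = time.time()
            if not gate_open:
                set_gate(True)                    # someone approached: open
                gate_open = True
        elif gate_open and time.time() - last_detection > HOLD_OPEN_SECONDS:
            set_gate(False)                       # nobody around any more: close
            gate_open = False
        time.sleep(0.05)                          # poll the sensor periodically

run_gate_controller()
```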
➢ Air Writing
➢ Eye Sight
➢ Elliptic Labs
➢ Mauz
➢ Leap Motion
➢ Microsoft Kinect
➢ Point Grab
➢ Myoelectric Arm Band
➢ AIRWRITING
Airwriting is a technology that allows you to write text messages or compose emails by writing
in the air. Airwriting comes in the form of a glove which recognizes the path your hands and
fingers trace as you write. The glove contains sensors that record the hand movements.
When the user starts 'airwriting', the glove detects the motion and sends it to the
computer over a wireless connection. The computer captures the data and decodes the movements.
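As a minimal sketch of that decoding step, and assuming purely for illustration that the glove's sensor data has already been converted into a 2D stroke of x/y points (the real glove's data format and recognizer are not specified here), the path can be resampled, normalized, and matched against stored letter templates:

```python
import numpy as np

def resample(points: np.ndarray, n: int = 32) -> np.ndarray:
    """Resample a drawn path to n evenly spaced points along its length."""
    d = np.cumsum(np.r_[0, np.linalg.norm(np.diff(points, axis=0), axis=1)])
    t = np.linspace(0, d[-1], n)
    return np.c_[np.interp(t, d, points[:, 0]), np.interp(t, d, points[:, 1])]

def normalize(points: np.ndarray) -> np.ndarray:
    """Remove translation and scale so templates and input are comparable."""
    p = points - points.mean(axis=0)
    return p / (np.abs(p).max() + 1e-9)

def recognize(stroke: np.ndarray, templates: dict) -> str:
    """Return the template letter whose normalized shape is closest to the stroke."""
    s = normalize(resample(stroke))
    return min(templates, key=lambda k: np.linalg.norm(s - normalize(resample(templates[k]))))

# Hypothetical templates: an 'L' and a 'V' drawn as coarse point lists.
templates = {
    "L": np.array([[0, 2], [0, 0], [1, 0]], float),
    "V": np.array([[0, 2], [0.5, 0], [1, 2]], float),
}
stroke = np.array([[0.1, 1.9], [0.1, 1.0], [0.1, 0.1], [0.9, 0.0]], float)  # roughly an 'L'
print(recognize(stroke, templates))  # -> 'L'
```

A real recognizer would work on much richer models over the raw inertial data, but this captures the "record the path, then decode it" flow described above.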
➢ EyeSight:
❖ eyeSight is a gesture technology which allows you to navigate through your devices by
just pointing at them. Much like using a remote to navigate your TV, you don't have
to touch the screen. The basic requirements for eyeSight to work are a basic 2D webcam
(even the built-in ones work) and the software. Your screen need not even be a
touch-enabled one.
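eyeSight's own algorithm is proprietary; as a hedged sketch of how a plain 2D webcam can be turned into a gesture input at all, the snippet below uses OpenCV frame differencing to find motion in front of the camera and reports left/right swipes from the motion centroid. The thresholds are assumptions for illustration only.

```python
import cv2

SWIPE_PIXELS = 120  # horizontal centroid travel that counts as a swipe (assumed)

cap = cv2.VideoCapture(0)            # a basic built-in 2D webcam is enough
ok, prev = cap.read()
if not ok:
    raise SystemExit("could not open webcam")
prev = cv2.cvtColor(prev, cv2.COLOR_BGR2GRAY)
start_x = None

while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    diff = cv2.absdiff(gray, prev)                        # what moved since the last frame
    _, mask = cv2.threshold(diff, 25, 255, cv2.THRESH_BINARY)
    prev = gray
    m = cv2.moments(mask)
    if m["m00"] > 50_000:                                 # enough motion to be a hand
        cx = m["m10"] / m["m00"]                          # x position of the motion centroid
        if start_x is None:
            start_x = cx
        elif cx - start_x > SWIPE_PIXELS:
            print("swipe right")
            start_x = None
        elif start_x - cx > SWIPE_PIXELS:
            print("swipe left")
            start_x = None
    else:
        start_x = None                                    # motion stopped: reset the gesture

cap.release()
```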
➢ Elliptic Labs:
Elliptic Labs allows you to operate your computer without touching it, through its Windows 8 Gesture
Suite. It uses ultrasound, so it works not with cameras but with your audio hardware. Ideally you need 6
speakers and 8 microphones, but the dedicated speakers on laptops and a normal microphone can work too.
The speakers emit ultrasound which bounces off your hand back to the microphones, allowing the
Elliptic Labs software to track and interpret the hand's movements.
This technology is designed to work on the Windows 8 platform and is expected to work on tablets,
smartphones and even cars. Elliptic Labs is not out for consumers to buy, as the company is focusing on
marketing it to Original Equipment Manufacturers (OEMs).
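Below is a simplified sketch of the underlying echo-ranging idea: the speaker emits an ultrasonic chirp, the microphone hears its reflection, and the delay gives the hand's distance. This is not Elliptic Labs' actual algorithm, and the "recording" here is simulated rather than captured from real audio hardware.

```python
import numpy as np

FS = 96_000           # sample rate (Hz); ultrasound needs high-rate audio hardware
SPEED_OF_SOUND = 343  # metres per second

def make_chirp(duration=0.002, f0=20_000, f1=40_000):
    """Short ultrasonic chirp that the speaker would emit."""
    t = np.arange(int(duration * FS)) / FS
    return np.sin(2 * np.pi * (f0 + (f1 - f0) * t / duration / 2) * t)

def echo_distance(recording: np.ndarray, chirp: np.ndarray) -> float:
    """Estimate hand distance from the delay between emission and echo."""
    corr = np.correlate(recording, chirp, mode="valid")   # where does the chirp reappear?
    delay_samples = int(np.argmax(corr))
    return delay_samples / FS * SPEED_OF_SOUND / 2        # /2: sound travels out and back

# Simulate a hand 0.5 m away: the echo arrives after the round-trip time.
chirp = make_chirp()
delay = int(2 * 0.5 / SPEED_OF_SOUND * FS)
recording = np.zeros(delay + len(chirp) + 1000)
recording[delay:delay + len(chirp)] += 0.3 * chirp        # attenuated echo
print(f"estimated distance: {echo_distance(recording, chirp):.2f} m")
```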
➢ Mauz:
Mauz is a third-party device that turns your iPhone into a trackpad or mouse. Download the driver
onto your computer and the app onto your iPhone, then connect the device to your iPhone via the
charger port. Mauz connects to the computer over Wi-Fi. You can then navigate your computer as you
would with a regular mouse: left click, right click and scroll as normal.
Now comes the fun part: you can use gestures with Mauz too. With the iPhone camera on, move
your hand to the left to go back a page in your browser, and move it to the right to go forward a
page. If there's an incoming call or a text message, simply attend to it and resume
using Mauz right after. Unfortunately, Mauz is not out for consumers to buy just yet.
➢Leap Motion:
Leap Motion is a motion-sensor device that recognizes the user's fingers with its infrared LEDs and cameras. As
it works by recognizing only your fingers, nothing registers when you simply reach over it to type on the
keyboard. But when you hover your fingers above it, you can navigate your desktop like you would your smartphone
or tablet: flick to browse pages, pinch to zoom, and so on.
It's a small USB device that works the moment you connect it to your computer. You don't need to charge it, and
it works even with non-touch-sensitive screens. Leap Motion works well with gaming and 3D-related software.
You can pre-order Leap Motion for $79.99.
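To show how a gesture like pinch-to-zoom can be derived once a sensor reports fingertip positions, here is a small sketch. The `Frame` structure and the thresholds are made-up stand-ins for illustration, not the actual Leap Motion SDK.

```python
from dataclasses import dataclass

@dataclass
class Frame:
    """Simplified stand-in for one tracking frame: fingertip positions in millimetres."""
    thumb_tip: tuple
    index_tip: tuple

PINCH_MM = 25.0  # thumb and index closer than this counts as a pinch (assumed threshold)

def separation(frame: Frame) -> float:
    """Distance between the thumb tip and the index fingertip."""
    return sum((a - b) ** 2 for a, b in zip(frame.thumb_tip, frame.index_tip)) ** 0.5

def is_pinching(frame: Frame) -> bool:
    return separation(frame) < PINCH_MM

def zoom_factor(prev: Frame, curr: Frame) -> float:
    """Ratio of fingertip separation between frames: >1 means fingers spreading (zoom in)."""
    return separation(curr) / max(separation(prev), 1e-9)

# Example: fingers spread from 30 mm to 60 mm apart -> zoom factor 2.0
before = Frame(thumb_tip=(0, 0, 0), index_tip=(30, 0, 0))
after = Frame(thumb_tip=(0, 0, 0), index_tip=(60, 0, 0))
print(is_pinching(before), zoom_factor(before, after))
```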
➢Microsoft Kinect:
Speaking of gaming, Microsoft Kinect takes gaming with the Xbox to the next level. It detects and recognizes a
user's body movements and reproduces them within the video game being played. It also recognizes the user's
face and voice. With the latest update, Kinect can read small gesture controls such as pinch to zoom and the
opening and closing of hands, so you can actually use Microsoft Kinect for purposes other than gaming.
➢PointGrab:
PointGrab is similar to eyeSight in that it enables users to navigate their computer just by
pointing at it. PointGrab comes in the form of software and only needs a 2D webcam. The camera
detects your hand movements, and with those you can control your computer. PointGrab works with
computers that run Windows 7 and 8, as well as smartphones, tablets and televisions.
➢Myoelectric Armband:
The myoelectric armband, or MYO armband, is a gadget that allows you to control your other
Bluetooth-enabled devices using your fingers or hands. When put on, the armband detects the
electrical activity in your forearm muscles as they move and translates it into gestures that
interact with your computer.
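A very rough sketch of the idea follows: muscle activity is measured on several channels and mapped to gestures. The signal below is simulated, and the RMS-threshold rule is only an illustrative assumption, not the armband's real classifier.

```python
import numpy as np

FIST_RMS = 0.5   # average activation level treated as "make a fist" (assumed threshold)

def rms(window: np.ndarray) -> float:
    """Root-mean-square amplitude of one EMG channel over a short window."""
    return float(np.sqrt(np.mean(window ** 2)))

def classify(emg_window: np.ndarray) -> str:
    """emg_window has shape (channels, samples). Very coarse two-gesture rule."""
    activation = np.array([rms(channel) for channel in emg_window])
    if activation.mean() > FIST_RMS:
        return "fist"            # all forearm muscles tense -> grab / click
    if activation.argmax() == 0 and activation[0] > FIST_RMS / 2:
        return "wave"            # only the first channel active -> wave gesture
    return "rest"

# Simulated 8-channel window of 200 samples: strong activity everywhere looks like a fist.
window = np.random.normal(0, 0.8, size=(8, 200))
print(classify(window))
```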
➢ Touchless Monitor
➢ Glimpses of GBUI
➢ Touchless SDK
➢ Touchless UI
➢Touchless Monitor:
❖ This monitor is made by TouchKo and White Electronics Design.
❖ With a touchless touchscreen, your hand doesn't have to come in contact with the screen at all; it works
by detecting your hand movements in front of it.
❖ Point your finger in the air towards the device and move it accordingly to control navigation on
the device.
❖ Designed for applications where touch may be difficult, such as for doctors who might be wearing
surgical gloves.
➢ Glimpses of GBUI (gesture-based graphical user interface)
Advantages:
❖ No drivers required.
❖ Satisfactory experience.
Disadvantages:
❖ Limited sensitivity.
❖ Complexity of gestures.
❖ Cost of implementation.
❖ Limited compatibility.
THANK YOU