DOI: 10.1145/1667146.1667160

SixthSense: a wearable gestural interface

Published: 16 December 2009

Abstract

In this note, we present SixthSense, a wearable gestural interface that augments the physical world around us with digital information and lets us use natural hand gestures to interact with that information. By coupling a tiny projector and a camera in a pendant-like mobile wearable device, SixthSense sees what the user sees and visually augments the surfaces, walls, or physical objects the user is interacting with, turning them into just-in-time information interfaces. SixthSense attempts to free information from its confines by seamlessly integrating it with the physical world.
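The note gives few implementation details, but an interface of this kind typically maps tracked fingertip positions from the camera to discrete commands. The following is a minimal, hypothetical sketch (not the authors' code) of that last step: classifying a pinch or a swipe from two 2D fingertip tracks, with the input format and thresholds assumed for illustration.

```python
import math

def classify_gesture(thumb, index, pinch_thresh=20.0, swipe_thresh=80.0):
    """Classify two fingertip tracks as 'pinch', a swipe direction, or 'none'.

    thumb, index: lists of (x, y) fingertip positions over time, e.g. from a
    colour-marker tracker (hypothetical input format; units are pixels).
    """
    # Pinch: the thumb-index gap closes from open to below a threshold.
    start_gap = math.dist(thumb[0], index[0])
    end_gap = math.dist(thumb[-1], index[-1])
    if start_gap > pinch_thresh and end_gap <= pinch_thresh:
        return "pinch"

    # Swipe: the index fingertip travels far along one dominant axis.
    dx = index[-1][0] - index[0][0]
    dy = index[-1][1] - index[0][1]
    if max(abs(dx), abs(dy)) >= swipe_thresh:
        if abs(dx) >= abs(dy):
            return "swipe-right" if dx > 0 else "swipe-left"
        return "swipe-down" if dy > 0 else "swipe-up"

    return "none"
```

A real pipeline would smooth the tracks and segment gestures over time before classifying; this sketch only shows the shape of the decision logic.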



Published In

SIGGRAPH ASIA '09: ACM SIGGRAPH ASIA 2009 Sketches
December 2009
45 pages
ISBN: 9781450379366
DOI: 10.1145/1667146
Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for components of this work owned by others than ACM must be honored. Abstracting with credit is permitted. To copy otherwise, or republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee. Request permissions from [email protected]

Publisher

Association for Computing Machinery

New York, NY, United States


Author Tags

  1. augmented reality
  2. gestural interaction
  3. multi-touch interaction
  4. object augmentation
  5. wearable computing

Qualifiers

  • Research-article

Conference

SA09: SIGGRAPH ASIA 2009
December 16 - 19, 2009
Yokohama, Japan

Acceptance Rates

Overall acceptance rate: 178 of 869 submissions (20%)

