DOI: 10.1145/2582051.2582090

Anywhere surface touch: utilizing any surface as an input area

Published: 07 March 2014

Abstract

The current trend towards smaller and smaller mobile devices may cause considerable difficulties in using them. In this paper, we propose an interface called Anywhere Surface Touch, which allows any flat or curved surface in a real environment to be used as an input area. The interface uses only a single small camera and a contact microphone to recognize several kinds of interaction between the fingers of the user and the surface. The system recognizes which fingers are interacting and in which direction the fingers are moving. Additionally, the fusion of vision and sound allows the system to distinguish the contact conditions between the fingers and the surface. Evaluation experiments showed that users became accustomed to our system quickly, soon being able to perform input operations on various surfaces.
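The abstract's central idea, fusing a camera's gesture estimate with a contact microphone's touch signal, can be illustrated with a minimal late-fusion sketch. All function names, labels, and the energy threshold below are hypothetical illustrations, not taken from the paper:

```python
# Minimal sketch of vision+sound late fusion, assuming the camera yields a
# finger/direction label and the contact microphone yields a friction-noise
# energy level. The 0.2 threshold is an illustrative placeholder.

def classify_contact(audio_energy: float, threshold: float = 0.2) -> str:
    """A contact microphone picks up friction noise only on real touches,
    so audio energy separates touching from hovering."""
    return "touch" if audio_energy >= threshold else "hover"

def fuse(vision_label: str, audio_energy: float) -> str:
    """Combine the vision-based gesture label with the audio-based
    contact decision into one interaction event."""
    contact = classify_contact(audio_energy)
    return f"{vision_label}/{contact}"

# Camera saw the index finger sliding right; the microphone decides
# whether the finger was actually on the surface.
print(fuse("index-right", 0.05))  # index-right/hover
print(fuse("index-right", 0.80))  # index-right/touch
```

This kind of late fusion lets each sensor compensate for the other's blind spot: the camera alone cannot reliably tell a touch from a near-hover, while the microphone alone cannot tell which finger moved or in which direction.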




Published In

AH '14: Proceedings of the 5th Augmented Human International Conference
March 2014
249 pages
ISBN:9781450327619
DOI:10.1145/2582051

Sponsors

  • MEET IN KOBE 21st Century

Publisher

Association for Computing Machinery

New York, NY, United States



Author Tags

  1. high-speed camera
  2. interaction techniques
  3. mobile devices
  4. touch interface
  5. vision-based UI

Qualifiers

  • Research-article

Conference

AH '14
Sponsor:
  • MEET IN KOBE 21st Century

Acceptance Rates

Overall Acceptance Rate 121 of 306 submissions, 40%


Article Metrics

  • Downloads (Last 12 months)60
  • Downloads (Last 6 weeks)8
Reflects downloads up to 16 Oct 2024


Cited By

View all
  • (2024) SoundScroll: Robust Finger Slide Detection Using Friction Sound and Wrist-Worn Microphones. In Proceedings of the 2024 ACM International Symposium on Wearable Computers, 63-70. DOI: 10.1145/3675095.3676614. Published 5 Oct 2024.
  • (2024) TriPad: Touch Input in AR on Ordinary Surfaces with Hand Tracking Only. In Proceedings of the 2024 CHI Conference on Human Factors in Computing Systems, 1-18. DOI: 10.1145/3613904.3642323. Published 11 May 2024.
  • (2024) MouseRing: Always-available Touchpad Interaction with IMU Rings. In Proceedings of the 2024 CHI Conference on Human Factors in Computing Systems, 1-19. DOI: 10.1145/3613904.3642225. Published 11 May 2024.
  • (2023) ShadowTouch: Enabling Free-Form Touch-Based Hand-to-Surface Interaction with Wrist-Mounted Illuminant by Shadow Projection. In Proceedings of the 36th Annual ACM Symposium on User Interface Software and Technology, 1-14. DOI: 10.1145/3586183.3606785. Published 29 Oct 2023.
  • (2022) PressureVision: Estimating Hand Pressure from a Single RGB Image. In Computer Vision – ECCV 2022, 328-345. DOI: 10.1007/978-3-031-20068-7_19. Published 11 Nov 2022.
  • (2021) inDepth: Force-based Interaction with Objects beyond a Physical Barrier. In Proceedings of the Fifteenth International Conference on Tangible, Embedded, and Embodied Interaction, 1-6. DOI: 10.1145/3430524.3442447. Published 14 Feb 2021.
  • (2020) QwertyRing. Proceedings of the ACM on Interactive, Mobile, Wearable and Ubiquitous Technologies 4(4), 1-29. DOI: 10.1145/3432204. Published 18 Dec 2020.
  • (2020) Ready, Steady, Touch! Proceedings of the ACM on Interactive, Mobile, Wearable and Ubiquitous Technologies 4(2), 1-25. DOI: 10.1145/3397309. Published 15 Jun 2020.
  • (2020) WristLens. In Proceedings of the Augmented Humans International Conference, 1-8. DOI: 10.1145/3384657.3384797. Published 16 Mar 2020.
  • (2020) FingerTouch: Touch Interaction Using a Fingernail-Mounted Sensor on a Head-Mounted Display for Augmented Reality. IEEE Access 8, 101192-101208. DOI: 10.1109/ACCESS.2020.2997972. Published 2020.
