Research article · Free access · DOI: 10.5555/2532129.2532165

Haptic target acquisition to enable spatial gestures in nonvisual displays

Published: 29 May 2013

Abstract

Nonvisual natural user interfaces can facilitate gesture-based interaction without relying on a physical display, which may significantly increase the available interaction space on mobile devices, where screen real estate is limited. Interacting with invisible objects is challenging, however, as such techniques provide no spatial feedback and rely entirely on users' visuospatial memory. This paper presents an interaction technique that appropriates a user's arm, using haptic feedback to point out the location of nonvisual objects and thereby allow for spatial interaction with them. User studies evaluate the effectiveness of two single-arm target-scanning strategies for selecting an object in 3D and two bimanual target-scanning strategies for selecting an object in 2D. Potentially useful applications of our techniques are outlined.
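The guidance idea the abstract describes (haptic cues that steer the user's arm toward an invisible target) can be sketched roughly as follows. This is a minimal illustrative simulation, not the paper's implementation: the function names, the linear intensity ramp, and the 30-degree feedback cone are all assumptions.

```python
def vibration_intensity(arm_angle, target_angle, cone=30.0):
    """Map the angular error between the arm's pointing direction and
    the target to a vibration motor intensity in [0, 1]. Intensity
    ramps up linearly as the arm nears the target and is zero outside
    the feedback cone (assumed 30 degrees here)."""
    # Shortest angular difference, wrapped into [0, 180].
    error = abs((arm_angle - target_angle + 180.0) % 360.0 - 180.0)
    if error > cone:
        return 0.0
    return 1.0 - error / cone


def scan_for_target(target_angle, step=5.0, threshold=0.9):
    """Sweep the arm through a full circle in fixed steps and stop at
    the first direction whose haptic cue exceeds the threshold,
    i.e. when the target is (nearly) dead ahead."""
    angle = 0.0
    while angle < 360.0:
        if vibration_intensity(angle, target_angle) >= threshold:
            return angle
        angle += step
    return None  # target never produced a strong enough cue
```

In a real system the sweep would be driven by live arm-tracking data and the intensity would feed a tactor on the user's arm; the simulated sweep above only shows the scan-until-peak control structure.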


Cited By

  • (2015) Tactile cues for improving target localization in subjects with tunnel vision. In Proceedings of the 27th Conference on l'Interaction Homme-Machine, pages 1-10. DOI: 10.1145/2820619.2820625. Online publication date: 27 Oct 2015.


    Published In

    GI '13: Proceedings of Graphics Interface 2013
    May 2013
    243 pages
    ISBN: 9781482216806

    Sponsors

    • The Canadian Human-Computer Communications Society / Société Canadienne du Dialogue Humain-Machine (CHCCS/SCDHM)

    Publisher

    Canadian Information Processing Society

    Canada


    Author Tags

    1. gestural input
    2. haptic devices
    3. natural user interfaces
    4. pointing

    Qualifiers

    • Research-article

    Acceptance Rates

    Overall Acceptance Rate 206 of 508 submissions, 41%
