DOI: 10.1145/1095034.1095041
Article

Distant freehand pointing and clicking on very large, high resolution displays

Published: 23 October 2005
    Abstract

    We explore the design space of freehand pointing and clicking interaction with very large high resolution displays from a distance. Three techniques for gestural pointing and two for clicking are developed and evaluated. In addition, we present subtle auditory and visual feedback techniques to compensate for the lack of kinesthetic feedback in freehand interaction, and to promote learning and use of appropriate postures.
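    The pointing techniques described here amount to mapping a tracked hand in space onto a cursor on a distant wall-sized display. As a rough illustration only, not the authors' implementation, the sketch below shows one common way to compute absolute ray-cast pointing: intersect a ray from the hand with the display plane and convert the hit point to pixel coordinates. The function and parameter names (raycast_cursor, px_per_m, and so on) are invented for this sketch.

    import numpy as np

    def raycast_cursor(hand_pos, hand_dir, display_origin, display_normal,
                       display_x_axis, display_y_axis, px_per_m):
        """Intersect a pointing ray with a planar display and return the
        cursor position in display pixels, or None if the ray misses.

        hand_pos, hand_dir: 3D position and unit direction of the tracked hand.
        display_origin:     3D position of the display's top-left corner.
        display_normal:     unit normal of the display plane.
        display_x_axis,
        display_y_axis:     unit vectors along the display's width and height.
        px_per_m:           pixels per metre of physical display surface.
        """
        denom = np.dot(hand_dir, display_normal)
        if abs(denom) < 1e-6:           # ray is parallel to the display plane
            return None
        t = np.dot(display_origin - hand_pos, display_normal) / denom
        if t <= 0:                      # display is behind the hand
            return None
        hit = hand_pos + t * hand_dir   # 3D intersection point on the plane
        offset = hit - display_origin
        x_px = np.dot(offset, display_x_axis) * px_per_m
        y_px = np.dot(offset, display_y_axis) * px_per_m
        return (x_px, y_px)

    # Example: hand 2 m in front of a wall display, pointing straight ahead.
    cursor = raycast_cursor(
        hand_pos=np.array([1.0, 1.5, 2.0]),
        hand_dir=np.array([0.0, 0.0, -1.0]),
        display_origin=np.array([0.0, 2.0, 0.0]),
        display_normal=np.array([0.0, 0.0, 1.0]),
        display_x_axis=np.array([1.0, 0.0, 0.0]),
        display_y_axis=np.array([0.0, -1.0, 0.0]),
        px_per_m=1000.0,
    )
    print(cursor)  # approximately (1000.0, 500.0)

    Relative (clutched) pointing and the paper's clicking gestures would add further state on top of a mapping like this; the sketch covers only the absolute ray-to-pixel step.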

    Published In

    UIST '05: Proceedings of the 18th annual ACM symposium on User interface software and technology
    October 2005
    270 pages
    ISBN:1595932712
    DOI:10.1145/1095034
    Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for components of this work owned by others than ACM must be honored. Abstracting with credit is permitted. To copy otherwise, or republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee. Request permissions from permissions@acm.org.

    Publisher

    Association for Computing Machinery

    New York, NY, United States

    Publication History

    Published: 23 October 2005

    Author Tags

    1. freehand gestures
    2. pointing
    3. very large displays
    4. whole hand interaction

    Qualifiers

    • Article

    Conference

    UIST05

    Acceptance Rates

    UIST '05 Paper Acceptance Rate: 31 of 159 submissions, 19%
    Overall Acceptance Rate: 842 of 3,967 submissions, 21%
