DOI: 10.1145/2674396.2674443

Target reverse crossing: a selection method for camera-based mouse-replacement systems

Published: 27 May 2014

Abstract

We propose a selection method, "target reverse crossing," for camera-based mouse-replacement systems used by people with motion impairments. We assessed the method by comparing it to "dwell-time clicking," the selection mechanism most widely used in such systems. Our results show that target reverse crossing is more efficient than dwell-time clicking, although its first-attempt selection accuracy is lower. We also found that target direction affects the accuracy of reverse crossing, and that increasing the target size significantly improves its performance, which has implications for future interface designs that adopt this selection method.
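The two mechanisms compared in the abstract can be sketched as simple state machines over timestamped pointer samples: dwell-time clicking fires when the pointer remains inside a target for a fixed duration, while reverse crossing fires when the pointer crosses a target's goal boundary and then reverses back over it. The function names, threshold value, and region predicates below are illustrative assumptions, not the authors' implementation.

```python
DWELL_THRESHOLD = 1.0  # seconds the pointer must remain inside a target (illustrative)


def dwell_select(samples, inside, threshold=DWELL_THRESHOLD):
    """Return the timestamp at which a dwell-time click fires, or None.

    samples: iterable of (timestamp, point) pointer samples
    inside:  predicate testing whether a point lies inside the target
    """
    enter_time = None
    for t, p in samples:
        if inside(p):
            if enter_time is None:
                enter_time = t                  # pointer just entered the target
            elif t - enter_time >= threshold:
                return t                        # stayed long enough: click fires
        else:
            enter_time = None                   # leaving the target resets the timer
    return None


def reverse_crossing_select(samples, crossed):
    """Return the timestamp at which a reverse crossing completes, or None.

    crossed: predicate that is True once the pointer has passed the target's
    goal boundary; selection fires when the pointer crosses the boundary and
    then reverses back over it.
    """
    armed = False
    for t, p in samples:
        if crossed(p) and not armed:
            armed = True                        # first crossing arms the selection
        elif armed and not crossed(p):
            return t                            # reversal confirms the selection
    return None
```

The sketch makes the efficiency finding plausible: reverse crossing completes as soon as the reversal occurs, whereas dwell-time clicking always imposes the full threshold wait, at the cost of accidental selections when an overshoot is corrected.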




Published In

PETRA '14: Proceedings of the 7th International Conference on PErvasive Technologies Related to Assistive Environments
May 2014
408 pages
ISBN:9781450327466
DOI:10.1145/2674396

Sponsors

  • iPerform Center: iPerform Center for Assistive Technologies to Enhance Human Performance
  • CSE@UTA: Department of Computer Science and Engineering, The University of Texas at Arlington
  • HERACLEIA: HERACLEIA Human-Centered Computing Laboratory at UTA
  • UTA: The University of Texas at Arlington
  • NCSR Demokritos: National Center for Scientific Research "Demokritos"
  • Fulbright Greece: Fulbright Foundation, Greece

Publisher

Association for Computing Machinery

New York, NY, United States



Author Tags

  1. assistive technology
  2. camera-based system interface
  3. interaction techniques
  4. mouse replacement system
  5. reverse crossing
  6. user interfaces

Qualifiers

  • Research-article

Funding Sources

  • National Science Foundation

Conference

PETRA '14


Article Metrics

  • Downloads (last 12 months): 19
  • Downloads (last 6 weeks): 3
Reflects downloads up to 9 Nov 2024.


Cited By

  • (2024) EyeShadows: Peripheral Virtual Copies for Rapid Gaze Selection and Interaction. 2024 IEEE Conference on Virtual Reality and 3D User Interfaces (VR), 681-689. DOI: 10.1109/VR58804.2024.00088
  • (2023) GlanceWriter: Writing Text by Glancing Over Letters with Gaze. Proceedings of the 2023 CHI Conference on Human Factors in Computing Systems, 1-13. DOI: 10.1145/3544548.3581269
  • (2023) Approximated Match Swiping: Exploring More Ergonomic Gaze-based Text Input for XR. 2023 IEEE International Symposium on Mixed and Augmented Reality Adjunct (ISMAR-Adjunct), 141-145. DOI: 10.1109/ISMAR-Adjunct60411.2023.00037
  • (2022) Design and Evaluation of a Silent Speech-Based Selection Method for Eye-Gaze Pointing. Proceedings of the ACM on Human-Computer Interaction, 6(ISS), 328-353. DOI: 10.1145/3567723
  • (2022) Performance Analysis of Saccades for Primary and Confirmatory Target Selection. Proceedings of the 28th ACM Symposium on Virtual Reality Software and Technology, 1-12. DOI: 10.1145/3562939.3565619
  • (2022) Methodological Standards in Accessibility Research on Motor Impairments: A Survey. ACM Computing Surveys, 55(7), 1-35. DOI: 10.1145/3543509
  • (2021) Computer vision applied to improve interaction and communication of people with motor disabilities: A systematic mapping. Technology and Disability, 1-18. DOI: 10.3233/TAD-200308
  • (2021) Pinch, Click, or Dwell: Comparing Different Selection Techniques for Eye-Gaze-Based Pointing in Virtual Reality. ACM Symposium on Eye Tracking Research and Applications, 1-7. DOI: 10.1145/3448018.3457998
  • (2021) GazeBar: Exploiting the Midas Touch in Gaze Interaction. Extended Abstracts of the 2021 CHI Conference on Human Factors in Computing Systems, 1-7. DOI: 10.1145/3411763.3451703
  • (2020) Necessary and Unnecessary Distractor Avoidance Movements Affect User Behaviors in Crossing Operations. ACM Transactions on Computer-Human Interaction, 27(6), 1-31. DOI: 10.1145/3418413
