DOI: 10.5555/2042118.2042130
Article

Estimating the perceived difficulty of pen gestures

Published: 05 September 2011

Abstract

Our empirical results show that users perceive the execution difficulty of single-stroke gestures consistently, and that execution difficulty is highly correlated with gesture production time. We use these results to design two simple rules for estimating execution difficulty: one establishes the relative ranking of difficulty among multiple gestures; the other classifies a single gesture into one of five levels of difficulty. We confirm that the CLC model does not accurately predict the magnitude of production time, and show instead that a reasonably accurate estimate can be calculated from only a few gesture execution samples from a few people. Using this estimated production time, our rules on average rank gesture difficulty with 90% accuracy and rate gesture difficulty with 75% accuracy. Designers can use our results to choose application gestures, and researchers can build on our analysis in other gesture domains and for modeling gesture performance.
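
The two rules described above can be illustrated with a short sketch. The Python fragment below is a minimal illustration under assumed details, not the authors' procedure: it averages a few execution samples per gesture to estimate production time, ranks gestures by that estimate, and bins the estimate into five ordinal difficulty levels. The function names, sample values, and band edges are hypothetical placeholders; the paper derives its thresholds from empirical data.

    # Minimal sketch (not the authors' exact method): rank gestures by
    # estimated production time and bin them into five difficulty levels.
    from statistics import mean

    def estimate_production_time(samples_ms):
        # Average a few execution samples (milliseconds) from a few people.
        return mean(samples_ms)

    def rank_by_difficulty(gesture_samples):
        # Order gestures from easiest to hardest by estimated production time.
        estimates = {g: estimate_production_time(s) for g, s in gesture_samples.items()}
        return sorted(estimates, key=estimates.get)

    def rate_difficulty(time_ms, band_edges=(800, 1200, 1600, 2000)):
        # Map an estimated production time onto five ordinal difficulty levels.
        # band_edges are illustrative thresholds, not the paper's values.
        level = 1
        for edge in band_edges:
            if time_ms > edge:
                level += 1
        return level  # 1 (easiest) .. 5 (hardest)

    # Example with three hypothetical gestures and a handful of samples each.
    samples = {
        "circle":   [640, 710, 690],
        "triangle": [980, 1040, 1010],
        "spiral":   [2100, 2240, 2180],
    }
    print(rank_by_difficulty(samples))  # ['circle', 'triangle', 'spiral']
    print({g: rate_difficulty(estimate_production_time(s)) for g, s in samples.items()})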

References

[1]
Appert, C., Zhai, S.: Using strokes as command shortcuts: cognitive benefits and toolkit support. In: Proceedings of CHI 2009, pp. 2289-2298. ACM Press, New York (2009).
[2]
Ashbrook, D., Starner, T.: MAGIC: a motion gesture design tool. In: Proceedings of CHI 2010, pp. 2159-2168. ACM Press, New York (2010).
[3]
Bau, O., Mackay, W.E.: OctoPocus: a dynamic guide for learning gesture-based command sets. In: Proceedings of UIST 2008, pp. 37-46. ACM Press, New York (2008).
[4]
Cao, X., Zhai, S.: Modeling human performance of pen stroke gestures. In: Proceedings of CHI 2007, pp. 1495-1504. ACM Press, New York (2007).
[5]
Castellucci, S.J., MacKenzie, I.S.: Graffiti vs. unistrokes: an empirical comparison. In: Proceedings of CHI 2008, pp. 305-308. ACM Press, New York (2008).
[6]
Chen, P., Popovich, P.: Correlation: Parametric and Nonparametric Measures. Sage, Thousand Oaks (2002).
[7]
Grange, S., Fong, T., Baur, C.: M/ORIS: a medical/operating room interaction system. In: Proceedings of ICMI 2004, pp. 159-166. ACM Press, New York (2004).
[8]
Isokoski, P.: Model for unistroke writing time. In: Proceedings of CHI 2001, pp. 357-364. ACM Press, New York (2001).
[9]
Kratz, S., Rohs, M.: A $3 gesture recognizer: simple gesture recognition for devices equipped with 3D acceleration sensors. In: Proceedings of IUI 2010, pp. 341-344. ACM Press, New York (2010).
[10]
Kristensson, P.-O., Zhai, S.: SHARK2: a large vocabulary shorthand writing system for pen-based computers. In: Proceedings of UIST 2004, pp. 43-52. ACM Press, New York (2004).
[11]
Kurtenbach, G., Moran, T.P., Buxton, W.A.S.: Contextual animation of gestural commands. Computer Graphics Forum 13(5), 305-314 (1994).
[12]
Li, Y.: Protractor: a fast and accurate gesture recognizer. In: Proceedings of CHI 2010, pp. 2169-2172. ACM Press, New York (2010).
[13]
Long Jr., A.C., Landay, J.A., Rowe, L.A.: Helping designers create recognition-enabled interfaces. In: Multimodal Interface for Human-Machine Communication, pp. 121-146 (2002).
[14]
Long Jr., A.C., Landay, J.A., Rowe, L.A.: Implications for a gesture design tool. In: Proceedings of CHI 1999, pp. 40-47. ACM Press, New York (1999).
[15]
Long Jr., A.C., Landay, J.A., Rowe, L.A., Michiels, J.: Visual similarity of pen gestures. In: Proceedings of CHI 2000, pp. 360-367. ACM Press, New York (2000).
[16]
Morris, M.R., Wobbrock, J.O., Wilson, A.D.: Understanding users' preferences for surface gestures. In: Proceedings of GI 2010, pp. 261-268. Canadian Information Processing Society (2010).
[17]
Nielsen, M., Störring, M., Moeslund, T.B., Granum, E.: A procedure for developing intuitive and ergonomic gesture interfaces for HCI. In: Camurri, A., Volpe, G. (eds.) GW 2003. LNCS (LNAI), vol. 2915, pp. 409-420. Springer, Heidelberg (2004).
[18]
Pratt, W.: Digital Image Processing, 3rd edn. John Wiley & Sons, Inc., Chichester (2001).
[19]
Rico, J., Brewster, S.: Usable gestures for mobile interfaces: evaluating social acceptability. In: Proceedings of CHI 2010, pp. 887-896. ACM Press, New York (2010).
[20]
Rubine, D.: Specifying gestures by example. SIGGRAPH Computer Graphics 25(4), 329-337 (1991).
[21]
Schomaker, L.: From handwriting analysis to pen-computer applications. IEEE Electronics and Communications Engineering Journal 10(3), 93-102 (1998).
[22]
Viviani, P., Flash, T.: Minimum-jerk, two-thirds power law, and isochrony: converging approaches to movement planning. Journal of Experimental Psychology: Human Perception and Performance 21(1), 32-53 (1995).
[23]
Viviani, P., Terzuolo, C.: Space-time invariance in learned motor skills. In: Tutorials in Motor Behavior. Advances in Psychology, vol. 1, pp. 525-533. North-Holland, Amsterdam (1980).
[24]
Webb, A.: Statistical Pattern Recognition. John Wiley & Sons, Inc., Chichester (2002).
[25]
Wobbrock, J.O., Morris, M.R., Wilson, A.D.: User-defined gestures for surface computing. In: Proceedings of CHI 2009, pp. 1083-1092. ACM Press, New York (2009).
[26]
Wobbrock, J.O., Wilson, A.D., Li, Y.: Gestures without libraries, toolkits or training: a $1 recognizer for user interface prototypes. In: Proceedings of UIST 2007, pp. 159-168. ACM Press, New York (2007).


Published In

cover image Guide Proceedings
INTERACT'11: Proceedings of the 13th IFIP TC 13 International Conference on Human-Computer Interaction - Volume Part II
September 2011
691 pages
ISBN: 978-3-642-23770-6

Sponsors

  • Fundação para a Ciência e Tecnologia
  • YDreams
  • AbERTA
  • Microsoft Research
  • INESC-ID Lisboa

Publisher

Springer-Verlag

Berlin, Heidelberg

Publication History

Published: 05 September 2011

Author Tags

  1. gesture descriptors
  2. gesture-based interfaces
  3. pen input

Qualifiers

  • Article


Article Metrics

  • Downloads (Last 12 months)0
  • Downloads (Last 6 weeks)0
Reflects downloads up to 09 Nov 2024


Cited By

  • (2022) Clarifying Agreement Calculations and Analysis for End-User Elicitation Studies. ACM Transactions on Computer-Human Interaction 29(1), 1-70. DOI: 10.1145/3476101. Online publication date: 7-Jan-2022.
  • (2020) Moving Toward an Ecologically Valid Data Collection Protocol for 2D Gestures in Video Games. Proceedings of the 2020 CHI Conference on Human Factors in Computing Systems, pp. 1-11. DOI: 10.1145/3313831.3376417. Online publication date: 21-Apr-2020.
  • (2019) GestMan. Proceedings of the ACM SIGCHI Symposium on Engineering Interactive Computing Systems, pp. 1-6. DOI: 10.1145/3319499.3328227. Online publication date: 18-Jun-2019.
  • (2019) Stroke-Gesture Input for People with Motor Impairments. Proceedings of the 2019 CHI Conference on Human Factors in Computing Systems, pp. 1-14. DOI: 10.1145/3290605.3300445. Online publication date: 2-May-2019.
  • (2018) Predicting stroke gesture input performance for users with motor impairments. Proceedings of the 20th International Conference on Human-Computer Interaction with Mobile Devices and Services Adjunct, pp. 23-30. DOI: 10.1145/3236112.3236116. Online publication date: 3-Sep-2018.
  • (2018) GATO. Proceedings of the 20th International Conference on Human-Computer Interaction with Mobile Devices and Services, pp. 1-11. DOI: 10.1145/3229434.3229478. Online publication date: 3-Sep-2018.
  • (2018) KeyTime. Proceedings of the 2018 CHI Conference on Human Factors in Computing Systems, pp. 1-12. DOI: 10.1145/3173574.3173813. Online publication date: 21-Apr-2018.
  • (2018) Designing, Engineering, and Evaluating Gesture User Interfaces. Extended Abstracts of the 2018 CHI Conference on Human Factors in Computing Systems, pp. 1-4. DOI: 10.1145/3170427.3170648. Online publication date: 20-Apr-2018.
  • (2017) Human-centered recognition of children's touchscreen gestures. Proceedings of the 19th ACM International Conference on Multimodal Interaction, pp. 638-642. DOI: 10.1145/3136755.3137033. Online publication date: 3-Nov-2017.
  • (2017) Designing a gaze gesture guiding system. Proceedings of the 19th International Conference on Human-Computer Interaction with Mobile Devices and Services, pp. 1-13. DOI: 10.1145/3098279.3098561. Online publication date: 4-Sep-2017.
