DOI: 10.1145/325737.325852

Expression constraints in multimodal human-computer interaction

Published: 09 January 2000

Abstract

Thanks to recent scientific advances, it is now possible to design multimodal interfaces allowing the use of speech and gestures on a touchscreen. However, present speech recognizers and natural language interpreters cannot yet process spontaneous speech accurately. These limitations make it necessary to impose constraints on users' speech inputs. Thus, ergonomic studies are needed to provide user interface designers with efficient guidelines for the definition of usable speech constraints.
We developed a method for designing oral and multimodal (speech + 2D gestures) command languages that can be interpreted reliably by present systems and are easy to learn through human-computer interaction (HCI). The empirical study presented here contributes to assessing the usability of such artificial languages in a realistic software environment. Analyses of the multimodal protocols collected indicate that all subjects were able to rapidly assimilate the given expression constraints, mainly while executing simple interactive tasks; in addition, these constraints, which had no noticeable effect on the subjects' activities, had a limited influence on their use of modalities.
These results contribute to the validation of the method we propose for the design of tractable and usable multimodal command languages.



Published In

IUI '00: Proceedings of the 5th international conference on Intelligent user interfaces
January 2000
288 pages
ISBN:1581131348
DOI:10.1145/325737
  • Chairmen:
  • Doug Riecken,
  • David Benyon,
  • Henry Lieberman

Publisher

Association for Computing Machinery

New York, NY, United States


Author Tags

  1. multimodal user interfaces
  2. speech constraints
  3. usability

Qualifiers

  • Article

Conference

IUI '00: International Conference on Intelligent User Interfaces
January 9-12, 2000
New Orleans, Louisiana, USA

Acceptance Rates

Overall Acceptance Rate 746 of 2,811 submissions, 27%



Article Metrics

  • Downloads (last 12 months): 30
  • Downloads (last 6 weeks): 7
Reflects downloads up to 17 Oct 2024.


Cited By

  • (2022) Gesture and Voice Commands to Interact With AR Windshield Display in Automated Vehicle: A Remote Elicitation Study. Proceedings of the 14th International Conference on Automotive User Interfaces and Interactive Vehicular Applications, 171-182. DOI: 10.1145/3543174.3545257. Online publication date: 17-Sep-2022.
  • (2020) Explore, Create, Annotate: Designing Digital Drawing Tools with Visually Impaired People. Proceedings of the 2020 CHI Conference on Human Factors in Computing Systems, 1-12. DOI: 10.1145/3313831.3376349. Online publication date: 21-Apr-2020.
  • (2018) Combining topic-based model and text categorisation approach for utterance understanding in human-machine dialogue. International Journal of Computational Science and Engineering, 17(1), 109-117. DOI: 10.1504/IJCSE.2018.094429. Online publication date: 1-Jan-2018.
  • (2012) Web on the wall. Proceedings of the 2012 ACM International Conference on Interactive Tabletops and Surfaces, 95-104. DOI: 10.1145/2396636.2396651. Online publication date: 11-Nov-2012.
  • (2009) User-defined gestures for surface computing. Proceedings of the SIGCHI Conference on Human Factors in Computing Systems, 1083-1092. DOI: 10.1145/1518701.1518866. Online publication date: 4-Apr-2009.
  • (2007) How really effective are multimodal hints in enhancing visual target spotting? Some evidence from a usability study. Journal on Multimodal User Interfaces, 1(1), 1-5. DOI: 10.1007/BF02884427. Online publication date: Mar-2007.
  • (2005) Do Oral Messages Help Visual Search? Advances in Natural Multimodal Dialogue Systems, 131-157. DOI: 10.1007/1-4020-3933-6_7. Online publication date: 2005.
  • (2004) On-the-Fly Training. Articulated Motion and Deformable Objects, 146-153. DOI: 10.1007/978-3-540-30074-8_15. Online publication date: 2004.
  • (2003) Towards the design of usable multimodal interaction languages. Universal Access in the Information Society, 2(2), 143-159. DOI: 10.1007/s10209-003-0051-0. Online publication date: 1-Jun-2003.
