DOI: 10.1145/1180995.1181057
Article

The benefits of multimodal information: a meta-analysis comparing visual and visual-tactile feedback

Published: 02 November 2006

Abstract

Information display systems have become increasingly complex and more difficult for human cognition to process effectively. According to Wickens's Multiple Resource Theory (MRT), information delivered through multiple modalities (e.g., visual and tactile) can be more effective than the same information communicated through a single modality. The purpose of this meta-analysis is to compare user effectiveness when using visual-tactile task feedback (a multimodal condition) to using only visual task feedback (a single modality). Results indicate that visual-tactile feedback enhances task effectiveness more than visual feedback alone (g = .38). Across criteria, visual-tactile feedback is particularly effective at reducing reaction time (g = .631) and increasing performance (g = .618). Follow-up moderator analyses indicate that visual-tactile feedback is more effective when workload is high (g = .844) and when multiple tasks are being performed (g = .767). Implications of these results are discussed in the paper.
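The effect sizes reported above are Hedges' g values: a standardized mean difference between two conditions, with a small-sample bias correction applied. As an illustration (not taken from the paper, which used dedicated meta-analysis software [4, 6]), a minimal sketch of how g is typically computed from the summary statistics of two groups:

```python
import math

def hedges_g(m1, s1, n1, m2, s2, n2):
    """Standardized mean difference (Cohen's d) with Hedges'
    small-sample bias correction applied.

    m1, s1, n1: mean, SD, and sample size of the treatment group
    m2, s2, n2: mean, SD, and sample size of the comparison group
    """
    df = n1 + n2 - 2
    # Pool the two group standard deviations, weighted by degrees of freedom.
    s_pooled = math.sqrt(((n1 - 1) * s1**2 + (n2 - 1) * s2**2) / df)
    d = (m1 - m2) / s_pooled        # Cohen's d
    j = 1 - 3 / (4 * df - 1)        # bias-correction factor J
    return j * d

# Hypothetical example: two groups of 20, means 10 vs. 9, common SD 2.
# Cohen's d would be 0.5; the correction shrinks it slightly, to about 0.49.
print(hedges_g(10, 2, 20, 9, 2, 20))
```

A meta-analytic g such as the .38 reported above is then a weighted average of per-study g values, typically weighted by inverse variance.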

References

[1]
Wickens, C., (2002). Multiple resources and performance prediction. Theoretical Issues in Ergonomics Science, 3, 2, 159--177.
[2]
Chiasson, J., McGrath, B., & Rupert, A. (2002). Enhanced situation awareness in sea, air, and land environments. In Proceedings of NATO RTO Human Factors & Medicine Panel Symposium on "Spatial disorientation in military vehicles: Causes, consequences and cures," La Coruña, Spain, No. TRO-MP-086, 1--10.
[3]
Van Erp, J. & Van Veen, H. (2004). Vibrotactile in-vehicle navigation system. Transportation Research Part F, 247--256.
[4]
Wilson, D. B. (2001). Effect size determination program. Software.
[5]
Hedges, L. V., & Olkin, I. O. (1985). Statistical methods for meta-analysis. San Diego, CA: Academic Press, Inc.
[6]
Borenstein, M., & Rothstein, H. (1999). Comprehensive meta-analysis: A computer program for research synthesis. Englewood, NJ: Biostat.
[7]
Akamatsu, M., & Sato, S. (1994). A multimodal mouse with tactile and force feedback. International Journal of Human-Computer Studies, 40(3), 443--453.
[8]
Cockburn, A., & Firth, A. (2003). Improving the acquisition of small targets. In Proceedings of HCI 2003, 181--196.
[9]
Diamond, D. D., Kass, S. J., Andrasik, F., Raj, A. K., & Rupert, A. H. (2002). Vibrotactile cueing as a master caution system for visual monitoring. Human Factors & Aerospace Safety, 2(4), 339--354.
[10]
Forster, B., Cavina-Pratesi, C., Aglioti, S. M., & Berlucchi, G. (2002). Redundant target effect and intersensory facilitation from visual-tactile interactions in simple reaction time. Experimental Brain Research, 143(4), 480--487.
[11]
He, F., & Agah, A. (2001). Multi-modal human interactions with an intelligent interface utilizing images, sounds, and force feedback. Journal of Intelligent & Robotic Systems, 32(2), 171--190.
[12]
Hopp, P. J., Smith, C. A. R., Clegg, B. A., & Heggestad, E. D. (2005). Interruption management: The use of attention-directing tactile cues. Human Factors, 47(1), 1--11.
[13]
Hwang, F., Keates, S., Langdon, P., & Clarkson, P. J. (2003). Multiple haptic targets for motion-impaired computer users. CHI '03: Proceedings of the SIGCHI Conference on Human Factors in Computing Systems, Ft. Lauderdale, Florida, USA, 41--48.
[14]
Lindeman, R. W., Sibert, J. L., Mendez-Mendez, E., Patil, S., & Phifer, D. (2005). Effectiveness of directional vibrotactile cuing on a building-clearing task. CHI '05: Proceedings of the SIGCHI Conference on Human Factors in Computing Systems, Portland, Oregon, USA, 271--280.
[15]
Lindeman, R. W., Yanagida, Y., Sibert, J. L., & Lavine, R. (2003). Effective vibrotactile cueing in a visual search task. Proceedings of the Ninth IFIP TC13 International Conference on Human-Computer Interaction (INTERACT 2003), Sept. 1-5, 2003, Zurich, Switzerland, pp. 89--96.
[16]
McGee, M. R. (1999). A haptically enhanced scrollbar: Force-Feedback as a means of reducing the problems associated with scrolling, First PHANTOM Users Research Symposium, May, Deutsches Krebsforschungszentrum, Heidelberg, Germany.
[17]
Moorhead, I. R., Holmes, S., & Furnell, A. (2004). Understanding multisensory integration for pilot spatial orientation. QINETIQ/KI/CHS/TR042277.
[18]
Oakley, I., McGee, M. R., Brewster, S., & Gray, P. (2000). Putting the feel in 'look and feel'. CHI '00: Proceedings of the SIGCHI Conference on Human Factors in Computing Systems, The Hague, The Netherlands, 415--422.
[19]
Oakley, I., & O'Modhrain, S. (2005). Tilt to scroll: Evaluating a motion based vibrotactile mobile interface. WHC '05: Proceedings of the 1st joint Eurohaptics Conference and Symposium on Haptic Interfaces for Virtual Environment and Teleoperator Systems, Pisa, Italy, 40--49.
[20]
Swindells, C., Unden, A., & Sang, T. (2003). Torque BAR: An ungrounded haptic feedback device. ICMI '03: Proceedings of the 5th International Conference on Multimodal Interfaces, Vancouver, British Columbia, Canada, 52--59.
[21]
Tang, H., Beebe, D. J., & Kramer, A. F. (1997). Comparison of tactile and visual feedback for a multi-state input mechanism. IEMBS '97: Proceedings of the 19th Annual International Conference of the IEEE Engineering in Medicine and Biology Society, Chicago, Illinois, USA, 4, 1697--1700.
[22]
Unger, B. J., Nicolaidis, A., Berkelman, P. J., Thompson, A., Lederman, S., & Klatzky, R. L. et al. (2002). Virtual peg-in-hole performance using a 6-DOF magnetic levitation haptic device: Comparison with real forces and with visual guidance alone. HAPTIC '02: Proceedings of the 10th Symposium on Haptic Interfaces for Virtual Environment and Teleoperator Systems, Orlando, Florida, USA, 263--270.
[23]
Van Erp, J. B. F., & Verschoor, M. H. (2004). Cross-modal visual and vibrotactile tracking. Applied Ergonomics, 35(2), 105--112.


Published In

ICMI '06: Proceedings of the 8th international conference on Multimodal interfaces
November 2006
404 pages
ISBN:159593541X
DOI:10.1145/1180995

Publisher

Association for Computing Machinery

New York, NY, United States


Author Tags

  1. meta-analysis
  2. multimodal
  3. visual feedback
  4. visual-tactile feedback

Acceptance Rates

Overall Acceptance Rate 453 of 1,080 submissions, 42%

Cited By

  • (2024)Sensing the Machine: Evaluating Multi-modal Interaction for Intelligent Dynamic GuidanceProceedings of the 29th International Conference on Intelligent User Interfaces10.1145/3640543.3645179(66-73)Online publication date: 18-Mar-2024
  • (2024)The Back Ring Light Element (BRLE) – Where User Experience Meets Safety13th International Munich Chassis Symposium 202210.1007/978-3-662-68163-3_1(1-14)Online publication date: 30-Apr-2024
  • (2023)The Effects of Body Location and Biosignal Feedback Modality on Performance and Workload Using Electromyography in Virtual RealityProceedings of the 2023 CHI Conference on Human Factors in Computing Systems10.1145/3544548.3580738(1-16)Online publication date: 19-Apr-2023
  • (2023)The Impact of Modality, Technology Suspicion, and NDRT Engagement on the Effectiveness of AV ExplanationsIEEE Access10.1109/ACCESS.2023.330226111(81981-81994)Online publication date: 2023
  • (2023)Evaluating Visual and Auditory Substitution of Tactile Feedback During Mixed Reality TeleoperationCognitive Computation and Systems10.1007/978-981-99-2789-0_28(331-345)Online publication date: 24-May-2023
  • (2022)Posture-based Golf Swing Instruction using Multi-modal FeedbackJournal of Information Processing10.2197/ipsjjip.30.10730(107-117)Online publication date: 2022
  • (2022)CoCAtt: A Cognitive-Conditioned Driver Attention Dataset2022 IEEE 25th International Conference on Intelligent Transportation Systems (ITSC)10.1109/ITSC55140.2022.9921777(32-39)Online publication date: 8-Oct-2022
  • (2022)Designing mobile spatial navigation systems from the user’s perspective: an interdisciplinary reviewSpatial Cognition & Computation10.1080/13875868.2022.205338222:1-2(1-29)Online publication date: 16-Mar-2022
  • (2022)Do multimodal search cues help or hinder teleoperated search and rescue missions?Ergonomics10.1080/00140139.2022.214464666:9(1255-1269)Online publication date: 15-Nov-2022
  • (2021)Endless Knob with Programmable Resistive Force FeedbackHuman-Computer Interaction – INTERACT 202110.1007/978-3-030-85610-6_32(580-589)Online publication date: 26-Aug-2021
