Performance effects of multi-sensory displays in virtual teleoperation environments

Published: 20 July 2013

Abstract

Multi-sensory displays provide information to users through multiple senses, not only through visuals. They can be designed to create a more natural interface for users or to reduce the cognitive load of a visual-only display. However, because multi-sensory displays are often application-specific, their general advantages over visual-only displays are not yet well understood. Moreover, the optimal amount of information that can be perceived through multi-sensory displays without making them more cognitively demanding than a visual-only display is also not yet clear. Lastly, the effects of using redundant feedback across senses in multi-sensory displays have not been fully explored. To shed light on these issues, this study evaluates the effects of increasing the amount of multi-sensory feedback in an interface, specifically in a virtual teleoperation context. While objective data showed that increasing the number of senses in the interface from two to three improved performance, subjective feedback indicated that multi-sensory interfaces with redundant feedback may impose an extra cognitive burden on users.



Published In
SUI '13: Proceedings of the 1st Symposium on Spatial User Interaction
July 2013
108 pages
ISBN:9781450321419
DOI:10.1145/2491367

Publisher

Association for Computing Machinery, New York, NY, United States


Author Tags

  1. audio and vibro-tactile feedback
  2. multi-sensory interfaces
  3. robot teleoperation
  4. urban search-and-rescue
  5. virtual environment
  6. visual

Qualifiers

  • Research-article

Conference

SUI '13
SUI '13: Symposium on Spatial User Interaction
July 20 - 21, 2013
Los Angeles, California, USA

Acceptance Rates

SUI '13 Paper Acceptance Rate: 12 of 31 submissions (39%)
Overall Acceptance Rate: 86 of 279 submissions (31%)

Article Metrics

  • Downloads (last 12 months): 28
  • Downloads (last 6 weeks): 2

Reflects downloads up to 01 Feb 2025

Cited By

  • (2024) The Role of Audio Feedback and Gamification Elements for Remote Boom Operation. Multimodal Technologies and Interaction, 8(8), 69. DOI: 10.3390/mti8080069. Online publication date: 1-Aug-2024.
  • (2024) How Different Training Types and Computer Anxiety Influence Performance and Experiences in Virtual Reality. Media and Communication, 12. DOI: 10.17645/mac.8730. Online publication date: 27-Nov-2024.
  • (2024) Affective Landscapes: Navigating the Emotional Impact of Multisensory Stimuli in Virtual Reality. IEEE Access, 12, 169955-169976. DOI: 10.1109/ACCESS.2024.3499858. Online publication date: 2024.
  • (2023) A CNN-Based Framework for Enhancing 360° VR Experiences With Multisensorial Effects. IEEE Transactions on Multimedia, 25, 3245-3258. DOI: 10.1109/TMM.2022.3157556. Online publication date: 2023.
  • (2023) Thermal and wind devices for multisensory human-computer interaction: an overview. Multimedia Tools and Applications, 82(22), 34485-34512. DOI: 10.1007/s11042-023-14672-y. Online publication date: 7-Mar-2023.
  • (2022) Do Multisensory Stimuli Benefit the Virtual Reality Experience? A Systematic Review. IEEE Transactions on Visualization and Computer Graphics, 28(2), 1428-1442. DOI: 10.1109/TVCG.2020.3010088. Online publication date: 1-Feb-2022.
  • (2022) A Systematic Review: The Role of Multisensory Feedback in Virtual Reality. 2022 IEEE 2nd International Conference on Intelligent Reality (ICIR), 39-42. DOI: 10.1109/ICIR55739.2022.00024. Online publication date: Dec-2022.
  • (2022) Authoring tools for virtual reality experiences: a systematic review. Multimedia Tools and Applications, 81(19), 28037-28060. DOI: 10.1007/s11042-022-12829-9. Online publication date: 29-Mar-2022.
  • (2022) Immersive multisensory virtual reality technologies for virtual tourism. Multimedia Systems, 28(3), 1027-1037. DOI: 10.1007/s00530-022-00898-7. Online publication date: 4-Feb-2022.
  • (2021) Delivering Critical Stimuli for Decision Making in VR Training: Evaluation Study of a Firefighter Training Scenario. IEEE Transactions on Human-Machine Systems, 51(2), 65-74. DOI: 10.1109/THMS.2020.3030746. Online publication date: Apr-2021.
