DOI: 10.1109/ROMAN.2017.8172387
Research article

Understanding human-robot interaction in virtual reality

Published: 28 August 2017

Abstract

Interactions with simulated robots are typically presented on screens. Virtual reality (VR) offers an attractive alternative as it provides visual cues that are more similar to the real world. In this paper, we explore how virtual reality mediates human-robot interactions through two user studies. The first study shows that in situations where perception of the robot is challenging, a VR display provides significantly improved performance on a collaborative task. The second study shows that this improved performance is primarily due to stereo cues. Together, the findings of these studies suggest that VR displays can offer users unique perceptual benefits in simulated robotics applications.

Cited By

  • (2024) Difficulties in Perceiving and Understanding Robot Reliability Changes in a Sequential Binary Task. Proceedings of the 2024 ACM Symposium on Spatial User Interaction, pp. 1-11. DOI: 10.1145/3677386.3682083. Online publication date: 7 October 2024.
  • (2022) A Taxonomy of Functional Augmented Reality for Human-Robot Interaction. Proceedings of the 2022 ACM/IEEE International Conference on Human-Robot Interaction, pp. 294-303. DOI: 10.5555/3523760.3523801. Online publication date: 7 March 2022.

Published In

2017 26th IEEE International Symposium on Robot and Human Interactive Communication (RO-MAN)
August 2017, 1512 pages
Publisher: IEEE Press