Abstract
Interactive stories in Virtual Reality need a way for viewers to make decisions that influence how the story unfolds. Making these decisions should be easy and should not disturb the feeling of presence in the virtual world. Since many virtual reality headsets come with an integrated eye tracker, gaze is a natural candidate for making the decisions. We created an interactive story and implemented three different gaze interaction methods, which we evaluated in a user study with 24 participants. All three interaction methods rely on a dwell-time mechanism: one uses a gaze button, one changes the texture of the object looked at, and one provides no feedback at all. The texture change was the users' favorite; however, the choice of interaction method may also depend on the intended dramaturgy and aesthetic aspects of the story.
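To make the dwell-time mechanism shared by all three methods concrete, the sketch below shows how such a selection loop could look. It is not the authors' implementation: the object ids, the one-second dwell threshold, and the feedback callback (standing in for the gaze-button and texture-change variants) are assumptions for illustration only.

```python
# Minimal dwell-time gaze selection sketch (illustrative, not the paper's implementation).
# Assumption: a hypothetical eye tracker reports, once per frame, the id of the object
# currently gazed at (or None). DWELL_TIME and the feedback hook are example values.

import time
from typing import Callable, Optional

DWELL_TIME = 1.0  # seconds the gaze must rest on an object to select it (assumed value)


class DwellSelector:
    """Fires a selection once the gaze has rested on one object for the dwell time."""

    def __init__(self, dwell_time: float = DWELL_TIME,
                 on_feedback: Optional[Callable[[str, float], None]] = None):
        self.dwell_time = dwell_time
        self.on_feedback = on_feedback  # e.g. texture change / gaze button fill; None = no feedback
        self._current: Optional[str] = None
        self._start: float = 0.0

    def update(self, gazed_object: Optional[str], now: Optional[float] = None) -> Optional[str]:
        """Call once per frame with the gazed object; returns the object id when selected."""
        now = time.monotonic() if now is None else now
        if gazed_object != self._current:
            # Gaze moved to another object (or away): restart the dwell timer.
            self._current = gazed_object
            self._start = now
            return None
        if gazed_object is None:
            return None
        elapsed = now - self._start
        if self.on_feedback is not None:
            # Feedback variant: report dwell progress, e.g. to blend the object's texture.
            self.on_feedback(gazed_object, min(elapsed / self.dwell_time, 1.0))
        if elapsed >= self.dwell_time:
            self._current = None  # require the gaze to leave before a re-selection
            return gazed_object
        return None


# Usage example: the "texture change" variant, with the blend progress printed instead of rendered.
if __name__ == "__main__":
    selector = DwellSelector(on_feedback=lambda obj, p: print(f"highlight {obj}: {p:.0%}"))
    for frame in range(5):
        chosen = selector.update("left_door", now=frame * 0.3)
        if chosen:
            print(f"selected: {chosen}")
```

The no-feedback variant corresponds to leaving on_feedback unset, so the dwell timer runs invisibly and only the final selection affects the story.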
Copyright information
© 2021 Springer Nature Switzerland AG
About this paper
Cite this paper
Drewes, H., Müller, E., Rothe, S., Hussmann, H. (2021). Gaze-Based Interaction for Interactive Storytelling in VR. In: De Paolis, L.T., Arpaia, P., Bourdot, P. (eds) Augmented Reality, Virtual Reality, and Computer Graphics. AVR 2021. Lecture Notes in Computer Science, vol 12980. Springer, Cham. https://doi.org/10.1007/978-3-030-87595-4_8
DOI: https://doi.org/10.1007/978-3-030-87595-4_8
Publisher Name: Springer, Cham
Print ISBN: 978-3-030-87594-7
Online ISBN: 978-3-030-87595-4
eBook Packages: Computer Science, Computer Science (R0)