
Rendering Tree Roots Outdoors: A Comparison Between Optical See Through Glasses and Smartphone Modules for Underground Augmented Reality Visualization

  • Conference paper
Virtual, Augmented and Mixed Reality (HCII 2021)

Abstract

In this paper we propose augmented reality (AR) modes for showing virtual tree roots in nature. The main question we want to answer is: “How can virtual 3D models of tree roots be convincingly represented as being located underground, in the forest soil, using AR technology?” We present different rendering and occlusion modes and describe how they were implemented on two different types of hardware. We report two user studies that we performed to compare the AR visualization on optical see-through glasses (HoloLens head-mounted display) and on a mobile device (smartphone). We specifically focus on depth perception and situated visualization (the merging of real and virtual environments and actions). After discussing the experiences collected during outdoor user tests and the results of a questionnaire, we give some directions for the future use of AR in nature and as part of an environmental educational setting. The central result of the study is that supporting depth perception with additional depth cues, specifically occlusion, is very beneficial in the mobile device setting, whereas this support does not change depth perception significantly in the stereo HMD setting.
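The abstract does not spell out how the occlusion modes were realized. As a rough illustration only, and not the authors' implementation, the sketch below assumes a Unity stencil-based "virtual hole" mask, one common way to make virtual geometry appear to lie below the real ground: a flat mask mesh placed on the (detected or mapped) ground writes a stencil value, and the root geometry is rendered only where that value is present, so the roots seem visible only through a hole in the soil. The shader names and the flat-color root pass are hypothetical.

    // Hypothetical sketch of stencil-based "virtual hole" occlusion in Unity ShaderLab.
    // Shader asset 1: assign to a flat mesh aligned with the real ground.
    // It writes no color and no depth, only stencil value 1, and draws before the roots.
    Shader "Custom/RootHoleMask"
    {
        SubShader
        {
            Tags { "Queue" = "Geometry-1" }   // render before regular geometry
            Pass
            {
                ZWrite Off
                ColorMask 0                    // invisible: stencil only
                Stencil
                {
                    Ref 1
                    Comp Always
                    Pass Replace               // mark covered pixels with 1
                }
            }
        }
    }

    // Shader asset 2: assign to the tree-root meshes.
    // Fragments are kept only where the mask wrote stencil value 1,
    // so the roots appear visible only "through" the hole in the ground.
    Shader "Custom/StencilMaskedRoots"
    {
        Properties { _Color ("Root Color", Color) = (0.45, 0.30, 0.20, 1) }
        SubShader
        {
            Tags { "Queue" = "Geometry" }
            Pass
            {
                Stencil
                {
                    Ref 1
                    Comp Equal                 // discard pixels outside the mask
                }
                CGPROGRAM
                #pragma vertex vert
                #pragma fragment frag
                #include "UnityCG.cginc"

                fixed4 _Color;
                struct appdata { float4 vertex : POSITION; };
                struct v2f { float4 pos : SV_POSITION; };

                v2f vert (appdata v)
                {
                    v2f o;
                    o.pos = UnityObjectToClipPos(v.vertex);
                    return o;
                }

                fixed4 frag (v2f i) : SV_Target
                {
                    return _Color;             // flat color; real assets would be textured and lit
                }
                ENDCG
            }
        }
    }

The drawing order is controlled by the render queue (the mask draws at Geometry-1, before the roots); in an outdoor AR scene the mask mesh would typically be anchored to a detected ground plane or to the spatial-mapping mesh.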

This work has been performed in project SAARTE (Spatially-Aware Augmented Reality in Teaching and Education). SAARTE is supported by the European Union (EU) in the ERDF program P1-SZ2-7 and by the German federal state Rhineland-Palatinate (Antr.-Nr. 84002945).


Notes

  1. ++ (fully agree), + (agree), = (neutral), − (reject) and −− (totally reject).


Author information


Corresponding author

Correspondence to Gergana Lilligreen.



Copyright information

© 2021 Springer Nature Switzerland AG

About this paper


Cite this paper

Lilligreen, G., Marsenger, P., Wiebel, A. (2021). Rendering Tree Roots Outdoors: A Comparison Between Optical See Through Glasses and Smartphone Modules for Underground Augmented Reality Visualization. In: Chen, J.Y.C., Fragomeni, G. (eds.) Virtual, Augmented and Mixed Reality. HCII 2021. Lecture Notes in Computer Science, vol 12770. Springer, Cham. https://doi.org/10.1007/978-3-030-77599-5_26


  • DOI: https://doi.org/10.1007/978-3-030-77599-5_26


  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-030-77598-8

  • Online ISBN: 978-3-030-77599-5

  • eBook Packages: Computer Science, Computer Science (R0)
