DOI: 10.1145/3411764.3445255
Research article, CHI Conference Proceedings

HoloBar: Rapid Command Execution for Head-Worn AR Exploiting Around the Field-of-View Interaction

Published: 07 May 2021

Abstract

Inefficient menu interfaces make system and application commands tedious to execute in immersive environments. HoloBar is a novel approach that eases interaction with multi-level menus in immersive environments: with HoloBar, the hierarchical menu is split between the field of view (FoV) of the head-mounted display (HMD) and the smartphone (SP). Commands are executed through around-the-FoV interaction with the SP and touch input on the SP display. HoloBar offers a unique combination of features, namely rapid mid-air activation, implicit selection of top-level items, and preview of second-level items on the SP, ensuring rapid access to commands. In a first study we validate its activation method, which consists of bringing the SP within an activation distance of the FoV. In a second study, we compare HoloBar to two alternatives, including the standard HoloLens menu. Results show that HoloBar shortens each step of a multi-level menu interaction (menu activation, top-level item selection, second-level item selection, and validation), with a high success rate. A follow-up study confirms that these results hold when compared with the two validation mechanisms of the HoloLens (Air-Tap and clicker).
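The activation method described in the abstract (bringing the SP within an activation distance of the FoV) can be sketched as a simple threshold test on the phone's angular position relative to the HMD view direction. This is an illustrative sketch only: the function names, the 17° half-angle (roughly half of a HoloLens 1 horizontal FoV of about 34°), and the 10° activation band are assumptions, not values taken from the paper.

```python
def angular_offset_deg(sp_azimuth_deg: float, fov_half_angle_deg: float) -> float:
    """Angular distance (degrees) between the smartphone and the nearest
    FoV edge; 0.0 when the phone is already inside the FoV."""
    return max(0.0, abs(sp_azimuth_deg) - fov_half_angle_deg)

def holobar_active(sp_azimuth_deg: float,
                   fov_half_angle_deg: float = 17.0,   # illustrative value
                   activation_deg: float = 10.0) -> bool:
    """Activate the menu when the phone enters the band around the FoV."""
    return angular_offset_deg(sp_azimuth_deg, fov_half_angle_deg) <= activation_deg
```

Under this sketch, a phone held just outside the FoV edge (e.g. 20° off-axis) would trigger the HoloBar, while a phone held at the user's side (e.g. 40° off-axis) would not.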

Supplementary Material

MP4 File (3411764.3445255_videofigure.mp4): Supplemental video
MP4 File (3411764.3445255_videopreview.mp4): Preview video



      Published In

      CHI '21: Proceedings of the 2021 CHI Conference on Human Factors in Computing Systems
      May 2021
      10862 pages
      ISBN:9781450380966
      DOI:10.1145/3411764

Publisher

Association for Computing Machinery, New York, NY, United States

      Author Tags

1. Augmented reality
2. Menu interaction
3. Smartphone-based interactions

      Qualifiers

      • Research-article
      • Research
      • Refereed limited

      Conference

      CHI '21

      Acceptance Rates

Overall Acceptance Rate: 6,199 of 26,314 submissions, 24%


Article Metrics

• Downloads (last 12 months): 91
• Downloads (last 6 weeks): 5

Reflects downloads up to 12 Jan 2025

Cited By
• (2024) Pro-Tact: Hierarchical Synthesis of Proprioception and Tactile Exploration for Eyes-Free Ray Pointing on Out-of-View VR Menus. Proceedings of the 37th Annual ACM Symposium on User Interface Software and Technology, 1-11. DOI: 10.1145/3654777.3676324. Online publication date: 13-Oct-2024.
• (2024) Exploiting Physical Referent Features as Input for Multidimensional Data Selection in Augmented Reality. ACM Transactions on Computer-Human Interaction 31(4), 1-40. DOI: 10.1145/3648613. Online publication date: 19-Sep-2024.
• (2024) Screen Augmentation Technique Using AR Glasses and Smartphone without External Sensors. Extended Abstracts of the CHI Conference on Human Factors in Computing Systems, 1-5. DOI: 10.1145/3613905.3648664. Online publication date: 11-May-2024.
• (2024) Dual-Thumb pointing and command selection techniques for tablets. International Journal of Human-Computer Studies 184. DOI: 10.1016/j.ijhcs.2023.103203. Online publication date: 17-Apr-2024.
• (2024) Pervasive Augmented Reality to support real-time data monitoring in industrial scenarios. Computers and Graphics 118, 11-22. DOI: 10.1016/j.cag.2023.10.025. Online publication date: 2-Jul-2024.
• (2023) Stereoscopic Viewing and Monoscopic Touching: Selecting Distant Objects in VR Through a Mobile Device. Proceedings of the 36th Annual ACM Symposium on User Interface Software and Technology, 1-7. DOI: 10.1145/3586183.3606809. Online publication date: 29-Oct-2023.
• (2023) Enhancing the Reading Experience on AR HMDs by Using Smartphones as Assistive Displays. 2023 IEEE Conference Virtual Reality and 3D User Interfaces (VR), 378-386. DOI: 10.1109/VR55154.2023.00053. Online publication date: Mar-2023.
• (2023) Understanding User Motion. Handbook of Human Computer Interaction, 1-29. DOI: 10.1007/978-3-319-27648-9_105-1. Online publication date: 23-Sep-2023.
• (2022) Visual Transitions around Tabletops in Mixed Reality: Study on a Visual Acquisition Task between Vertical Virtual Displays and Horizontal Tabletops. Proceedings of the ACM on Human-Computer Interaction 6(ISS), 660-679. DOI: 10.1145/3567738. Online publication date: 14-Nov-2022.
• (2022) Immersive Analytics Spaces and Surfaces. Companion Proceedings of the 2022 Conference on Interactive Surfaces and Spaces, 68-71. DOI: 10.1145/3532104.3571471. Online publication date: 20-Nov-2022.
