Novel Optical Configurations for Virtual Reality: Evaluating User Preference and Performance with Focus-tunable and Monovision Near-eye Displays

Published: 07 May 2016 · DOI: 10.1145/2858036.2858140

Abstract

Emerging virtual reality (VR) displays must overcome the prevalent issue of visual discomfort to provide high-quality and immersive user experiences. In particular, the mismatch between vergence and accommodation cues inherent to most stereoscopic displays has been a long-standing challenge. In this paper, we evaluate several adaptive display modes afforded by focus-tunable optics or actuated displays that promise to mitigate visual discomfort caused by the vergence-accommodation conflict and improve performance in VR environments. We also explore monovision as an unconventional mode that allows each eye of an observer to accommodate to a different distance. While this technique is common practice in ophthalmology, we are the first to report its effectiveness for VR applications with a custom-built setup. We demonstrate that monovision and other focus-tunable display modes can provide better user experiences and improve user performance in terms of reaction times and accuracy, particularly for nearby simulated distances in VR.
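
For readers unfamiliar with the underlying optics, the short sketch below (not from the paper, and not the authors' code) illustrates the dioptric bookkeeping behind the vergence-accommodation conflict and a focus-tunable display mode. It assumes a simple thin-lens model with the tunable lens treated as co-located with the eye, ignoring eye relief; all function and parameter names are illustrative.

    # Minimal sketch, assuming a thin-lens model; not taken from the paper.
    # Distances are in meters, optical powers in diopters (1 D = 1/m).

    def lens_power_for_simulated_distance(screen_distance_m: float,
                                          simulated_distance_m: float) -> float:
        """Power (diopters) a tunable lens must supply so that the virtual image
        of a screen at screen_distance_m appears at simulated_distance_m,
        letting accommodation match vergence (thin lens: P = 1/d_screen - 1/d_image)."""
        return 1.0 / screen_distance_m - 1.0 / simulated_distance_m

    def vergence_accommodation_conflict(fixed_focal_distance_m: float,
                                        simulated_distance_m: float) -> float:
        """Mismatch (diopters) between where the eyes converge and where they
        must focus on a conventional fixed-focus stereoscopic display."""
        return abs(1.0 / simulated_distance_m - 1.0 / fixed_focal_distance_m)

    if __name__ == "__main__":
        # Screen 5 cm from the lens, content simulated 0.5 m away: 20 D - 2 D = 18 D.
        print(lens_power_for_simulated_distance(0.05, 0.5))
        # Fixed-focus HMD focused at 1.5 m, object simulated at 0.4 m: ~1.83 D of conflict.
        print(vergence_accommodation_conflict(1.5, 0.4))
        # In a monovision mode, each eye would simply be assigned a different
        # fixed lens power, so each eye accommodates to a different distance.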

Supplementary Material

• pn0649-file4.zip (ZIP file)
• pn649.mp4 (supplemental video)




        Published In

        CHI '16: Proceedings of the 2016 CHI Conference on Human Factors in Computing Systems
        May 2016, 6108 pages
        ISBN: 9781450333627
        DOI: 10.1145/2858036
        Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for components of this work owned by others than the author(s) must be honored. Abstracting with credit is permitted. To copy otherwise, or republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee. Request permissions from permissions@acm.org.

        Publisher

        Association for Computing Machinery, New York, NY, United States


        Author Tags

        1. focus cues
        2. user comfort
        3. user performance
        4. virtual reality

        Qualifiers

        • Research-article

        Funding Sources

        • Intel Corporation
        • Meta
        • Google

        Conference

        CHI '16: CHI Conference on Human Factors in Computing Systems
        May 7-12, 2016
        San Jose, California, USA

        Acceptance Rates

        CHI '16 paper acceptance rate: 565 of 2,435 submissions (23%)
        Overall acceptance rate: 6,199 of 26,314 submissions (24%)



