
Foveated Depth-of-Field Filtering in Head-Mounted Displays

Published: 19 September 2018

    Abstract

    In recent years, a variety of methods have been introduced that exploit the decrease in visual acuity in peripheral vision, collectively known as foveated rendering. As shading becomes more computationally involved and display resolutions increase, maintaining low latency is challenging when rendering for virtual reality; foveated rendering is a promising approach for reducing the number of shaded samples. Reduced visual acuity, however, is not the periphery's only exploitable property: the eye is an optical system that filters radiance through a lens, creating depth-of-field (DoF) effects when accommodating to objects at varying distances. The central idea of this article is to exploit these effects as a filtering method to conceal rendering artifacts. To showcase the potential of such filters, we present a foveated rendering system tightly integrated with a gaze-contingent DoF filter. In addition to benchmarks of the DoF and rendering pipeline, we carried out a perceptual study showing that, with DoF applied in our foveated mode, rendering quality is rated almost on par with full rendering while the number of shaded samples is reduced by more than 69%.
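
    The abstract combines two mechanisms: shading fewer samples as retinal eccentricity grows, and a gaze-contingent DoF filter whose blur follows the depth at the tracked gaze point. As a rough illustration only, the Python sketch below pairs a textbook thin-lens circle-of-confusion model with a toy linear acuity falloff; the parameter values (eye focal length, pupil diameter, foveal radius, falloff slope) are assumptions for the example, not the authors' pipeline.

        def circle_of_confusion(depth, focus_depth, focal_len=0.017, pupil_diam=0.004):
            """Thin-lens circle-of-confusion diameter (meters) for a point at
            `depth` when the lens accommodates to `focus_depth`. A focal length
            of ~17 mm and a pupil of ~4 mm are rough human-eye values."""
            depth = max(depth, focal_len + 1e-6)  # guard degenerate distances
            return (pupil_diam * focal_len * abs(depth - focus_depth)
                    / (depth * (focus_depth - focal_len)))

        def relative_sample_density(ecc_deg, fovea_deg=5.0, floor=0.1):
            """Toy acuity model: full sampling inside the fovea, then a linear
            falloff with eccentricity down to a fixed floor. Real systems fit
            measured acuity curves; this is only for illustration."""
            if ecc_deg <= fovea_deg:
                return 1.0
            return max(floor, 1.0 - 0.05 * (ecc_deg - fovea_deg))

        if __name__ == "__main__":
            # Gaze fixates a point at 1.5 m; fragments off that depth receive a
            # growing blur kernel, which can conceal sparse peripheral shading.
            for depth in (1.5, 2.0, 5.0):
                coc_um = circle_of_confusion(depth, focus_depth=1.5) * 1e6
                print(f"depth {depth:4.1f} m -> CoC {coc_um:6.2f} um")
            for ecc in (0.0, 10.0, 30.0):
                print(f"eccentricity {ecc:4.1f} deg -> "
                      f"sample density {relative_sample_density(ecc):.2f}")

    A fragment near the fixation depth yields a near-zero circle of confusion and stays sharp, while fragments well off that depth are blurred, which is what lets the filter hide undersampling artifacts in the periphery.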

    Supplementary Material

    weier.mov (weier.zip)
    Supplemental movie, appendix, image, and software files for "Foveated Depth-of-Field Filtering in Head-Mounted Displays."




    Published In

    ACM Transactions on Applied Perception, Volume 15, Issue 4
    October 2018, 57 pages
    ISSN: 1544-3558
    EISSN: 1544-3965
    DOI: 10.1145/3280853

    Publisher

    Association for Computing Machinery

    New York, NY, United States

    Publication History

    Published: 19 September 2018
    Accepted: 01 June 2018
    Revised: 01 June 2018
    Received: 01 May 2018
    Published in TAP Volume 15, Issue 4


    Author Tags

    1. gaze-contingent depth-of-field
    2. eye-tracking
    3. foveated rendering
    4. ray tracing

    Qualifiers

    • Research-article
    • Research
    • Refereed


