
Towards foveated rendering for gaze-tracked virtual reality

Published: 05 December 2016

Abstract

Foveated rendering synthesizes images with progressively less detail outside the eye fixation region, potentially unlocking significant speedups for wide field-of-view displays such as head-mounted displays, where target framerates and resolutions are increasing faster than the performance of traditional real-time renderers.
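The payoff from foveation grows quickly with field of view because the high-acuity region covers only a small fraction of a wide display. As a rough illustration (our own back-of-the-envelope arithmetic, not a figure from the paper), the sketch below estimates the screen-area fraction that lies within a given eccentricity of a centered gaze point on a planar display:

    import math

    def foveal_area_fraction(fov_deg, fovea_deg):
        """Approximate the screen-area fraction within fovea_deg of a centered gaze
        on a square planar display spanning fov_deg horizontally (rough model)."""
        half_screen = math.tan(math.radians(fov_deg / 2.0))  # half screen width on the view plane
        fovea_radius = math.tan(math.radians(fovea_deg))     # projected radius of the foveal circle
        circle_area = math.pi * fovea_radius ** 2
        screen_area = (2.0 * half_screen) ** 2
        return min(circle_area / screen_area, 1.0)

    # Roughly 1% of a 110-degree-wide display falls within 10 degrees of a
    # centered gaze point, so most shading work lands in the periphery.
    print(f"{foveal_area_fraction(110, 10):.1%}")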
To study and improve the potential gains, we designed a foveated rendering user study to evaluate the perceptual abilities of human peripheral vision when viewing today's displays. We determined that filtering peripheral regions reduces contrast, inducing a sense of tunnel vision. When applying a postprocess contrast enhancement, subjects tolerated up to a 2× larger blur radius before detecting differences from a non-foveated ground truth. After verifying these insights on both desktop and head-mounted displays augmented with high-speed gaze tracking, we designed a perceptual target image to strive for when engineering a production foveated renderer.
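The study's core manipulation can be pictured as a gaze-contingent low-pass filter followed by a contrast boost. The sketch below is our own minimal illustration of that idea on a single-channel image, using a Gaussian blur as a stand-in for the foveation filter and an unsharp-mask-style enhancement; it is not the authors' stimulus-generation code, and all parameters are placeholders:

    import numpy as np
    from scipy.ndimage import gaussian_filter

    def foveate(image, gaze_xy, sigma_max=8.0, fovea_radius=64, contrast_gain=0.6):
        """image: HxW float array in [0, 1]; gaze_xy: (x, y) fixation in pixels."""
        h, w = image.shape
        ys, xs = np.mgrid[0:h, 0:w]
        ecc = np.hypot(xs - gaze_xy[0], ys - gaze_xy[1])  # eccentricity in pixels
        # Blend toward a strongly blurred copy as a stand-in for a blur radius
        # that grows continuously with eccentricity (assumed simplification).
        blurred = gaussian_filter(image, sigma_max)
        t = np.clip((ecc - fovea_radius) / (3.0 * fovea_radius), 0.0, 1.0)
        foveated = (1.0 - t) * image + t * blurred
        # Contrast enhancement: push the filtered image away from its local mean,
        # only where blur was applied, to counter the tunnel-vision contrast loss.
        local_mean = gaussian_filter(foveated, sigma_max)
        enhanced = foveated + contrast_gain * t * (foveated - local_mean)
        return np.clip(enhanced, 0.0, 1.0)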
Given our perceptual target, we designed a practical foveated rendering system that reduces the number of shading samples by up to 70% and allows coarsened shading up to 30° closer to the fovea than Guenter et al. [2012] without introducing perceivable aliasing or blur. We filter both pre- and post-shading to address aliasing from undersampling in the periphery, introduce a novel multiresolution- and saccade-aware temporal antialiasing algorithm, and use contrast enhancement to help recover peripheral details that are resolvable by our eye but degraded by filtering.
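One common way to realize coarsened peripheral shading is to pick a shading rate per screen tile from its angular distance to the gaze point. The sketch below illustrates that idea only; the eccentricity thresholds and rates are invented placeholders and do not reproduce the parameters of this paper or of Guenter et al. [2012]:

    import math

    def shading_rate(tile_center_px, gaze_px, pixels_per_degree):
        """Return shading samples per pixel axis (1.0 = full rate, 0.25 = 4x4 coarse)."""
        dx = tile_center_px[0] - gaze_px[0]
        dy = tile_center_px[1] - gaze_px[1]
        eccentricity_deg = math.hypot(dx, dy) / pixels_per_degree
        if eccentricity_deg < 5.0:    # foveal region: shade every pixel
            return 1.0
        elif eccentricity_deg < 15.0: # near periphery: 2x2 coarse shading
            return 0.5
        else:                         # far periphery: 4x4 coarse shading
            return 0.25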
We validate our system by performing another user study. Frequency analysis shows our system closely matches our perceptual target. Measurements of temporal stability show we obtain quality similar to temporally filtered non-foveated renderings.

Supplementary Material

ZIP File (a179-patney.zip)
Supplemental file.

References

[1]
Baker, D. 2016. Object space lighting - following film rendering 2 decades later in real time. Game Developers Conference Talk, March 2016.
[2]
Banks, M. S., Sekuler, A. B., and Anderson, S. J. 1991. Peripheral spatial vision: limits imposed by optics, photoreceptors, and receptor pooling. Journal of the Optical Society of America A 8, 11, 1775--1787.
[3]
Banks, M. S., Gepshtein, S., and Landy, M. S. 2004. Why is spatial stereoresolution so low? The Journal of Neuroscience 24, 9, 2077--2089.
[4]
Clarberg, P., Toth, R., Hasselgren, J., Nilsson, J., and Akenine-Möller, T. 2014. AMFS: Adaptive multi-frequency shading for future graphics processors. ACM Transactions on Graphics 33, 4, 141:1--141:12.
[5]
Cowey, A., and Rolls, E. T. 1974. Human cortical magnification factor and its relation to visual acuity. Experimental Brain Research 21, 5, 447--454.
[6]
Curcio, C. A., and Allen, K. A. 1990. Topography of ganglion cells in human retina. Journal of Comparative Neurology 300, 1, 5--25.
[7]
Curcio, C. A., Sloan, K. R., Kalina, R. E., and Hendrickson, A. E. 1990. Human photoreceptor topography. Journal of Comparative Neurology 292, 4, 497--523.
[8]
Ferree, C. E., Rand, G. G., and Hardy, C. C. 1931. Refraction for the peripheral field of vision. Archives of Ophthalmology 5, 5, 717--731.
[9]
Green, C. 2007. Improved alpha-tested magnification for vector textures and special effects. In ACM SIGGRAPH Courses, SIGGRAPH, 9--18.
[10]
Grundland, M., Vohra, R., Williams, G. P., and Dodgson, N. A. 2006. Cross Dissolve Without Cross Fade: Preserving Contrast, Color and Salience in Image Compositing. Computer Graphics Forum 25, 3, 577--586.
[11]
Guenter, B., Finch, M., Drucker, S., Tan, D., and Snyder, J. 2012. Foveated 3D graphics. ACM Transactions on Graphics 31, 6, 164:1--164:10.
[12]
Hansen, T., Pracejus, L., and Gegenfurtner, K. R. 2009. Color perception in the intermediate periphery of the visual field. Journal of Vision 9, 4, 26:1--26:12.
[13]
He, Y., Gu, Y., and Fatahalian, K. 2014. Extending the graphics pipeline with adaptive, multi-rate shading. ACM Transactions on Graphics 33, 4, 142:1--142:12.
[14]
Hill, S., McAuley, S., Burley, B., Chan, D., Fascione, L., Iwanicki, M., Hoffman, N., Jakob, W., Neubelt, D., Pesce, A., and Pettineo, M. 2015. Physically based shading in theory and practice. In ACM SIGGRAPH Courses, SIGGRAPH, 22:1--22:8.
[15]
Hillesland, K. E., and Yang, J. C. 2016. Texel Shading. In EG 2016 - Short Papers, The Eurographics Association, T. Bashford-Rogers and L. P. Santos, Eds.
[16]
Jimenez, J., Echevarria, J. I., Sousa, T., and Gutierrez, D. 2012. SMAA: Enhanced morphological antialiasing. Computer Graphics Forum (Proc. EUROGRAPHICS 2012) 31, 2.
[17]
Kaplanyan, A., Hill, S., Patney, A., and Lefohn, A. 2016. Filtering distributions of normals for shading antialiasing. In Proceedings of the Symposium on High-Performance Graphics.
[18]
Karis, B. 2014. High-quality temporal supersampling. In Advances in Real-Time Rendering in Games, SIGGRAPH Courses.
[19]
Kelly, D. H., and Savoie, R. E. 1973. A study of sine-wave contrast sensitivity by two psychophysical methods. Perception & Psychophysics 14, 2, 313--318.
[20]
Kelly, D. H. 1984. Retinal inhomogeneity. I. Spatiotemporal contrast sensitivity. Journal of the Optical Society of America A 1, 1, 107--113.
[21]
Kim, M. H., Ritschel, T., and Kautz, J. 2011. Edge-aware color appearance. ACM Transactions on Graphics 30, 2, 13:1--13:9.
[22]
Koenderink, J. J., Bouman, M. A., Bueno de Mesquita, A. E., and Slappendel, S. 1978. Perimetry of contrast detection thresholds of moving spatial sine wave patterns. II. The far peripheral visual field (eccentricity 0 degrees-50 degrees). Journal of the Optical Society of America 68, 6, 850--854.
[23]
Koenderink, J. J., Bouman, M. A., Bueno de Mesquita, A. E., and Slappendel, S. 1978. Perimetry of contrast detection thresholds of moving spatial sine wave patterns. I. The near peripheral visual field (eccentricity 0 degrees-8 degrees). Journal of the Optical Society of America 68, 6, 845--849.
[24]
Koenderink, J. J., Bouman, M. A., Bueno de Mesquita, A. E., and Slappendel, S. 1978. Perimetry of contrast detection thresholds of moving spatial sine wave patterns. III. The target extent as a sensitivity controlling parameter. Journal of the Optical Society of America 68, 6, 854--860.
[25]
Lauritzen, A., Salvi, M., and Lefohn, A. 2011. Sample distribution shadow maps. In Symposium on Interactive 3D Graphics and Games, 97--102.
[26]
Levi, D. M., Klein, S. A., and Aitsebaomo, P. 1985. Vernier acuity, crowding and cortical magnification. Vision Research 25, 7, 963--977.
[27]
Levitt, H. 1971. Transformed up-down methods in psychoacoustics. The Journal of the Acoustical society of America 49, 2B, 467--477.
[28]
McKee, S. P., and Nakayama, K. 1984. The detection of motion in the peripheral visual field. Vision Research 24, 1, 25--32.
[29]
Mäkelä, P., Näsänen, R., Rovamo, J., and Melmoth, D. 2001. Identification of facial images in peripheral vision. Vision Research 41, 5, 599--610.
[30]
Navarro, R., Artal, P., and Williams, D. R. 1993. Modulation transfer of the human eye as a function of retinal eccentricity. Journal of the Optical Society of America A 10, 2, 201--212.
[31]
Noorlander, C., Koenderink, J. J., Olden, R. J. D., and Edens, B. W. 1983. Sensitivity to spatiotemporal colour contrast in the peripheral visual field. Vision Research 23, 1, 1--11.
[32]
Olano, M., and Baker, D. 2010. LEAN mapping. In Symposium on Interactive 3D Graphics and Games, 181--188.
[33]
Öztireli, A. C., and Gross, M. 2015. Perceptually based downscaling of images. ACM Transactions on Graphics 34, 4, 77:1--77:10.
[34]
Patney, A., Kim, J., Salvi, M., Kaplanyan, A., Wyman, C., Benty, N., Lefohn, A., and Luebke, D. 2016. Perceptually-based foveated virtual reality. In ACM SIGGRAPH 2016 Emerging Technologies, ACM, New York, NY, USA, SIGGRAPH '16, 17:1--17:2.
[35]
Pharr, M., and Humphreys, G. 2010. Physically Based Rendering, Second Edition: From Theory to Implementation, 2nd ed. Morgan Kaufmann Publishers, Inc.
[36]
Rosén, R. 2013. Peripheral Vision: Adaptive Optics and Psychophysics. PhD thesis, Royal Institute of Technology, Stockholm, Sweden.
[37]
Rovamo, J., and Virsu, V. 1979. An estimation and application of the human cortical magnification factor. Experimental Brain Research 37, 3, 495--510.
[38]
Rovamo, J., Virsu, V., Laurinen, P., and Hyvärinen, L. 1982. Resolution of gratings oriented along and across meridians in peripheral vision. Investigative Ophthalmology & Visual Science 23, 5, 666--670.
[39]
Salvi, M., and Vaidyanathan, K. 2014. Multi-layer alpha blending. In Proceedings of the 18th meeting of the ACM SIGGRAPH Symposium on Interactive 3D Graphics and Games, 151--158.
[40]
Schütt, H. H., Harmeling, S., Macke, J. H., and Wichmann, F. A. 2016. Painfree and accurate Bayesian estimation of psychometric functions for (potentially) overdispersed data. Vision Research 122, 105--123.
[41]
Solomon, S. G., Lee, B. B., White, A. J., Rüttiger, L., and Martin, P. R. 2005. Chromatic organization of ganglion cell receptive fields in the peripheral retina. Journal of Neuroscience 25, 18, 4527--4539.
[42]
Strasburger, H., Rentschler, I., and Harvey, L. O. 1994. Cortical magnification theory fails to predict visual recognition. European Journal of Neuroscience 6, 10, 1583--1588.
[43]
Strasburger, H., Rentschler, I., and Jüttner, M. 2011. Peripheral vision and pattern recognition: A review. Journal of Vision 11, 5, 13:1--13:82.
[44]
Swafford, N. T., Iglesias-Guitian, J. A., Koniaris, C., Moon, B., Cosker, D., and Mitchell, K. 2016. User, metric, and computational evaluation of foveated rendering methods. In Proceedings of the ACM Symposium on Applied Perception, ACM, New York, NY, USA, SAP '16, 7--14.
[45]
Thibos, L. N., Cheney, F. E., and Walsh, D. J. 1987. Retinal limits to the detection and resolution of gratings. Journal of the Optical Society of America A 4, 8, 1524--1529.
[46]
Thibos, L., Walsh, D., and Cheney, F. 1987. Vision beyond the resolution limit: Aliasing in the periphery. Vision Research 27, 12, 2193--2197.
[47]
Thibos, L. N., Still, D. L., and Bradley, A. 1996. Characterization of spatial aliasing and contrast sensitivity in peripheral vision. Vision Research 36, 2, 249--258.
[48]
Thibos, L. N. 1987. Calculation of the influence of lateral chromatic aberration on image quality across the visual field. Journal of the Optical Society of America A 4, 8, 1673--1680.
[49]
Toth, R., Nilsson, J., and Akenine-Möller, T. 2016. Comparison of projection methods for rendering virtual reality. In Proceedings of the Symposium on High-Performance Graphics.
[50]
Vaidyanathan, K., Salvi, M., Toth, R., Foley, T., Akenine-Möller, T., Nilsson, J., Munkberg, J., Hasselgren, J., Sugihara, M., Clarberg, P., Janczak, T., and Lefohn, A. 2014. Coarse pixel shading. In Proceedings of the Symposium on High-Performance Graphics.
[51]
Wandell, B. A. 1995. Foundations of Vision. Sinauer Associates, Inc.
[52]
Wang, Y.-Z., Thibos, L. N., and Bradley, A. 1996. Undersampling produces non-veridical motion perception, but not necessarily motion reversal, in peripheral vision. Vision Research 36, 12, 1737--1744.
[53]
Wang, Y.-Z., Bradley, A., and Thibos, L. N. 1997. Aliased frequencies enable the discrimination of compound gratings in peripheral vision. Vision Research 37, 3, 283--290.
[54]
Wichmann, F. A., and Hill, N. J. 2001. The psychometric function: I. Fitting, sampling, and goodness of fit. Perception & Psychophysics 63, 8, 1293--1313.
[55]
Wichmann, F. A., and Hill, N. J. 2001. The psychometric function: II. Bootstrap-based confidence intervals and sampling. Perception & Psychophysics 63, 8, 1314--1329.
[56]
Williams, D. R., Artal, P., Navarro, R., McMahon, M. J., and Brainard, D. H. 1996. Off-axis optical quality and retinal sampling in the human eye. Vision Research 36, 8, 1103--1114.
[57]
Williams, L. 1983. Pyramidal parametrics. SIGGRAPH Comput. Graph. 17, 3, 1--11.
[58]
Yang, L., Nehab, D., Sander, P. V., Sitthi-amorn, P., Lawrence, J., and Hoppe, H. 2009. Amortized supersampling. In ACM SIGGRAPH Asia 2009 Papers, ACM, New York, NY, USA, SIGGRAPH Asia '09, 135:1--135:12.


      Published In

      ACM Transactions on Graphics, Volume 35, Issue 6
      November 2016
      1045 pages
      ISSN: 0730-0301
      EISSN: 1557-7368
      DOI: 10.1145/2980179
      Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for components of this work owned by others than the author(s) must be honored. Abstracting with credit is permitted. To copy otherwise, or republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee. Request permissions from [email protected].

      Publisher

      Association for Computing Machinery

      New York, NY, United States

      Publication History

      Published: 05 December 2016
      Published in TOG Volume 35, Issue 6


      Author Tags

      1. foveated rendering
      2. gaze-tracking
      3. perception
      4. virtual reality

      Qualifiers

      • Research-article
