DOI: 10.1145/2929464.2929472

Perceptually-based foveated virtual reality

Published: 24 July 2016

Abstract

Humans have two distinct vision systems: foveal and peripheral. Foveal vision is sharp and detailed, while peripheral vision lacks fidelity. This difference in characteristics enables recently popular foveated rendering systems, which seek to increase rendering performance by lowering image quality in the periphery.
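As a hedged illustration of why this works (not a model taken from this paper), the loss of peripheral acuity is often approximated by a minimum angle of resolution (MAR) that grows roughly linearly with eccentricity. The constants below are illustrative values in the spirit of the vision literature, not numbers from this work:

```python
# Illustrative linear acuity model: MAR(e) = w0 + m * e, where e is the
# angular distance (degrees) from the gaze point. Both constants are
# assumptions chosen for illustration only.

W0 = 1.0 / 60.0   # foveal MAR: roughly one arcminute
M = 0.022         # assumed falloff slope (deg of MAR per deg eccentricity)

def mar(eccentricity_deg: float) -> float:
    """Smallest resolvable angular detail (degrees) at a given eccentricity."""
    return W0 + M * eccentricity_deg

def relative_detail(eccentricity_deg: float) -> float:
    """Detail needed relative to the fovea (1.0 at the gaze point)."""
    return mar(0.0) / mar(eccentricity_deg)
```

Under this model, a point 30 degrees into the periphery needs only a few percent of foveal detail, which is the headroom foveated rendering exploits.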
We present a set of perceptually-based methods for improving foveated rendering, running on a prototype virtual reality headset with an integrated eye tracker. Foveated rendering has previously been demonstrated on conventional displays, but it has recently become an especially attractive prospect for virtual reality (VR) and augmented reality (AR) displays, which combine a large field of view (FOV) with high frame-rate requirements. Investigating prior work on foveated rendering, we find that some earlier quality-reduction techniques can create objectionable artifacts such as temporal instability and contrast loss. Our emerging-technologies installation demonstrates these techniques running live in a head-mounted display and compares them against our new perceptually-based foveated techniques, which enable potentially large reductions in rendering cost with no discernible difference in visual quality.
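To make the basic mechanism concrete, here is a minimal sketch of an eye-tracked, nested-layer foveated renderer's per-pixel decision: compute the pixel's angular distance from the tracked gaze point and pick a shading rate for it. This is not the authors' implementation; the layer boundaries (5 and 15 degrees) and rates are assumptions for illustration:

```python
import math

def eccentricity_deg(px: float, py: float, gx: float, gy: float,
                     pixels_per_degree: float) -> float:
    """Angular distance (degrees) between a pixel and the gaze point,
    using a small-angle approximation on the display plane."""
    dist_px = math.hypot(px - gx, py - gy)
    return dist_px / pixels_per_degree

def shading_rate(ecc_deg: float) -> float:
    """Fraction of full per-axis resolution to shade at this eccentricity.
    Three-layer scheme with assumed boundaries at 5 and 15 degrees."""
    if ecc_deg < 5.0:    # inner (foveal) layer: full quality
        return 1.0
    if ecc_deg < 15.0:   # middle layer: half resolution per axis
        return 0.5
    return 0.25          # outer layer: quarter resolution per axis
```

A hard three-zone falloff like this is exactly the kind of scheme that can produce the temporal instability and contrast loss the abstract mentions; the perceptually-based techniques aim at the same cost savings while avoiding those artifacts.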

Supplementary Material

ZIP File (a17-patney.zip)
Supplemental files.



Published In

SIGGRAPH '16: ACM SIGGRAPH 2016 Emerging Technologies
July 2016, 41 pages
ISBN: 9781450343725
DOI: 10.1145/2929464
Permission to make digital or hard copies of part or all of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for third-party components of this work must be honored. For all other uses, contact the Owner/Author.

Publisher

Association for Computing Machinery, New York, NY, United States

Author Tags

  1. augmented reality
  2. foveated rendering
  3. perceptually-based
  4. virtual reality

Qualifiers

  • Abstract

Conference

SIGGRAPH '16

Acceptance Rates

Overall acceptance rate: 1,822 of 8,601 submissions (21%)

Article Metrics

  • Downloads (last 12 months): 91
  • Downloads (last 6 weeks): 4
Reflects downloads up to 18 Aug 2024.

Cited By

  • (2024) Gaze-Swin: Enhancing Gaze Estimation with a Hybrid CNN-Transformer Network and Dropkey Mechanism. Electronics 13:2 (328). DOI: 10.3390/electronics13020328. Online publication date: 12-Jan-2024.
  • (2024) PredATW: Predicting the Asynchronous Time Warp Latency for VR Systems. ACM Transactions on Embedded Computing Systems 23:5 (1-37). DOI: 10.1145/3677329. Online publication date: 14-Aug-2024.
  • (2024) A Gaze Estimation Method Based on Binocular Cameras. International Journal of Pattern Recognition and Artificial Intelligence 38:01. DOI: 10.1142/S0218001423350013. Online publication date: 1-Feb-2024.
  • (2024) Fovea Prediction Model in VR. 2024 IEEE Conference on Virtual Reality and 3D User Interfaces Abstracts and Workshops (VRW) (867-868). DOI: 10.1109/VRW62533.2024.00230. Online publication date: 16-Mar-2024.
  • (2024) Retinotopic Foveated Rendering. 2024 IEEE Conference Virtual Reality and 3D User Interfaces (VR) (903-912). DOI: 10.1109/VR58804.2024.00109. Online publication date: 16-Mar-2024.
  • (2024) Automatic Gaze Analysis: A Survey of Deep Learning Based Approaches. IEEE Transactions on Pattern Analysis and Machine Intelligence 46:1 (61-84). DOI: 10.1109/TPAMI.2023.3321337. Online publication date: Jan-2024.
  • (2024) Gaze Estimation Based on the Improved Xception Network. IEEE Sensors Journal 24:6 (8450-8464). DOI: 10.1109/JSEN.2024.3359085. Online publication date: 15-Mar-2024.
  • (2024) BlissCam: Boosting Eye Tracking Efficiency with Learned In-Sensor Sparse Sampling. 2024 ACM/IEEE 51st Annual International Symposium on Computer Architecture (ISCA) (1262-1277). DOI: 10.1109/ISCA59077.2024.00094. Online publication date: 29-Jun-2024.
  • (2024) Appearance Debiased Gaze Estimation via Stochastic Subject-Wise Adversarial Learning. Pattern Recognition 152 (110441). DOI: 10.1016/j.patcog.2024.110441. Online publication date: Aug-2024.
  • (2023) Immersive Experiences and XR: A Game Engine or Multimedia Streaming Problem? SMPTE Motion Imaging Journal 132:5 (30-37). DOI: 10.5594/JMI.2023.3269752. Online publication date: Jun-2023.
