
Perceptual considerations for motion blur rendering

Published: 29 August 2011

Abstract

Motion blur is a frequent requirement for rendering high-quality animated images. However, it usually demands more computational resources than rendering images without temporal antialiasing. In this article we study the influence of high-level properties such as object material and speed, shutter time, and antialiasing level. Based on scenes containing variations of these parameters, we design psychophysical experiments to determine how influential each is in the perception of image quality.
This work offers insight into the effects of these parameters and exposes situations in which motion-blurred stimuli may be indistinguishable from a gold standard. As an immediate practical application, images of similar perceived quality can be produced at reduced computational cost.
Algorithmic efforts have traditionally focused on finding improved methods that alleviate sampling artifacts by steering computation toward the most important dimensions of the rendering equation. Concurrently, rendering algorithms can take advantage of certain perceptual limits to simplify and optimize computation. To our knowledge, none of these algorithms has identified or exploited such limits in the rendering of motion blur. This work can be considered a first step in that direction.
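To make the trade-off the abstract describes concrete, the following is a minimal sketch (not the authors' implementation) of stochastic temporal antialiasing: each pixel averages shading evaluations distributed uniformly over the shutter interval, so a longer shutter smears moving objects over more pixels, and more samples per pixel reduce noise at higher cost. The scene function `shade` and all parameter values are hypothetical stand-ins for a full renderer.

```python
import random

def shade(x, t):
    """Hypothetical scene: a 2-pixel-wide bright bar moving at constant speed.

    Returns 1.0 where the bar covers pixel position x at time t, else 0.0.
    Stands in for a full renderer's radiance evaluation.
    """
    speed = 4.0                   # pixels per unit time (assumed value)
    bar_left = 10.0 + speed * t   # bar slides right as time advances
    return 1.0 if bar_left <= x < bar_left + 2.0 else 0.0

def motion_blurred_pixel(x, shutter_time, samples, rng=random.random):
    """Estimate time-averaged radiance at pixel x by distributing `samples`
    shading evaluations uniformly over the open shutter interval."""
    total = 0.0
    for _ in range(samples):
        t = rng() * shutter_time  # random instant while the shutter is open
        total += shade(x, t)
    return total / samples

# A longer shutter smears the bar across more pixels; raising `samples`
# lowers the variance of each pixel's estimate at higher rendering cost.
row = [motion_blurred_pixel(x, shutter_time=1.0, samples=64) for x in range(24)]
```

The perceptual question the article studies is, in these terms, how small `samples` (and how coarse the temporal sampling generally) can be made before observers can tell the result apart from a converged gold standard.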

Supplementary Material

Navarro (navarro.zip)
Supplemental movie, appendix, image, and software files for Perceptual considerations for motion blur rendering



Published In

ACM Transactions on Applied Perception, Volume 8, Issue 3
August 2011
79 pages
ISSN:1544-3558
EISSN:1544-3965
DOI:10.1145/2010325

Publisher

Association for Computing Machinery

New York, NY, United States

Publication History

Published: 29 August 2011
Accepted: 01 July 2011
Received: 01 April 2011
Published in TAP Volume 8, Issue 3


Author Tags

  1. Motion blur
  2. perceptual rendering
  3. temporal antialiasing

Qualifiers

  • Research-article
  • Research
  • Refereed


Cited By

  • (2024) Subtle Visual Cues in Mixed Reality: Influencing User Perception and Facilitating Interaction. Proceedings of the 16th Conference on Creativity & Cognition, 556-560. DOI: 10.1145/3635636.3664249. Online publication date: 23-Jun-2024.
  • (2024) A Monte Carlo-based approach to reduce the cost of generating synthetic holograms for 3D computer graphics scenes. Multimedia Tools and Applications. DOI: 10.1007/s11042-024-19991-2. Online publication date: 13-Aug-2024.
  • (2023) Illustrative Motion Smoothing for Attention Guidance in Dynamic Visualizations. Computer Graphics Forum 42:3, 361-372. DOI: 10.1111/cgf.14836. Online publication date: 27-Jun-2023.
  • (2023) Partial Monte Carlo sampling for computer generated holograms. Engineering Reports 6:1. DOI: 10.1002/eng2.12673. Online publication date: 11-May-2023.
  • (2022) A Study on Image Restoration and Analysis. Advance Concepts of Image Processing and Pattern Recognition, 35-61. DOI: 10.1007/978-981-16-9324-3_3. Online publication date: 21-Feb-2022.
  • (2021) Perceptual model for adaptive local shading and refresh rate. ACM Transactions on Graphics 40:6, 1-18. DOI: 10.1145/3478513.3480514. Online publication date: 10-Dec-2021.
  • (2020) A perceptual model of motion quality for rendering with adaptive refresh-rate and resolution. ACM Transactions on Graphics 39:4, 133:1-133:17. DOI: 10.1145/3386569.3392411. Online publication date: 12-Aug-2020.
  • (2020) Controllable Motion-Blur Effects in Still Images. IEEE Transactions on Visualization and Computer Graphics 26:7, 2362-2372. DOI: 10.1109/TVCG.2018.2889485. Online publication date: 1-Jul-2020.
  • (2019) The Effect of Motion on the Perception of Material Appearance. ACM Symposium on Applied Perception 2019, 1-9. DOI: 10.1145/3343036.3343122. Online publication date: 19-Sep-2019.
  • (2016) Emulating displays with continuously varying frame rates. ACM Transactions on Graphics 35:4, 1-11. DOI: 10.1145/2897824.2925879. Online publication date: 11-Jul-2016.
