
Expressive chromatic accumulation buffering for defocus blur

  • Original Article
  • The Visual Computer

Abstract

This article presents a novel parametric model for including expressive chromatic aberrations in defocus blur rendering, together with an effective implementation based on accumulation buffering. Our model modifies the thin-lens model to incorporate axial and lateral chromatic aberrations, which allows us to extend them easily with nonlinear and artistic appearances beyond physical limits. To render continuous dispersion, we employ a novel unified 3D sampling scheme that covers both the lens and the spectrum. We further propose a spectral equalizer to emphasize particular dispersion ranges. As a consequence, our approach enables more intuitive and explicit control of chromatic aberrations than previous physically-based rendering methods.
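As a concrete illustration of the sampling scheme described above, the following C++ sketch draws unified 3D samples over the lens aperture and the spectrum with a Halton sequence and derives, for each accumulation pass, a lens offset, a wavelength-dependent focus distance (axial aberration), a wavelength-dependent magnification (lateral aberration), and an equalized spectral-to-RGB weight. The linear wavelength dependence, the crude spectrum-to-RGB ramp, the equalizer gain, and all parameter names are illustrative assumptions rather than the paper's exact parametric model; a real renderer would use each pass to jitter, refocus, and scale the camera before adding the weighted frame into the accumulation buffer.

```cpp
// Minimal sketch: unified 3D (lens x spectrum) sampling for chromatic
// defocus blur with accumulation buffering. Parameter names and the
// linear wavelength dependence are illustrative assumptions.
#include <algorithm>
#include <cmath>
#include <cstdio>
#include <vector>

static const double kTwoPi = 6.283185307179586;

// Radical-inverse (van der Corput) sequence in a prime base, used to
// build a low-discrepancy 3D Halton point set over (lens u, lens v, spectrum).
static double radicalInverse(int i, int base) {
    double f = 1.0, r = 0.0;
    while (i > 0) {
        f /= base;
        r += f * (i % base);
        i /= base;
    }
    return r;
}

struct AberrationParams {
    double aperture     = 0.05;  // lens radius (scene units)
    double focusDist    = 5.0;   // in-focus distance at the reference wavelength
    double axialShift   = 0.4;   // focus-distance spread across the spectrum (axial CA)
    double lateralShift = 0.02;  // magnification spread across the spectrum (lateral CA)
};

// Per-pass parameters consumed by the accumulation loop.
struct PassParams {
    double lensU, lensV;   // sample offset on the aperture disk
    double focus;          // wavelength-dependent focus distance
    double magnification;  // wavelength-dependent image scale
    double rgb[3];         // equalized spectral-to-RGB weight
};

// Crude wavelength-to-RGB ramp scaled by an "equalizer" gain; a proper
// implementation would integrate CIE color-matching functions instead.
static void spectrumToRGB(double t, double gain, double rgb[3]) {
    rgb[0] = gain * std::max(0.0, 1.0 - 2.0 * t);                   // long wavelengths
    rgb[1] = gain * std::max(0.0, 1.0 - 2.0 * std::fabs(t - 0.5));  // middle wavelengths
    rgb[2] = gain * std::max(0.0, 2.0 * t - 1.0);                   // short wavelengths
}

static std::vector<PassParams> makePasses(int n, const AberrationParams& p) {
    std::vector<PassParams> passes;
    for (int i = 1; i <= n; ++i) {
        // Unified 3D sample: two dimensions for the lens, one for the spectrum.
        double u = radicalInverse(i, 2);
        double v = radicalInverse(i, 3);
        double t = radicalInverse(i, 5);  // normalized spectral coordinate in [0,1)

        // Map (u, v) onto the aperture disk with a simple polar mapping.
        double r = p.aperture * std::sqrt(u), phi = kTwoPi * v;

        PassParams pass;
        pass.lensU = r * std::cos(phi);
        pass.lensV = r * std::sin(phi);
        pass.focus = p.focusDist + p.axialShift * (t - 0.5);    // axial aberration
        pass.magnification = 1.0 + p.lateralShift * (t - 0.5);  // lateral aberration
        // Spectral equalizer: here it emphasizes the ends of the visible range.
        double gain = 0.5 + std::fabs(t - 0.5);
        spectrumToRGB(t, gain, pass.rgb);
        passes.push_back(pass);
    }
    return passes;
}

int main() {
    AberrationParams params;
    for (const PassParams& pass : makePasses(8, params)) {
        // A real renderer would jitter the camera by (lensU, lensV), refocus at
        // pass.focus, scale the image by pass.magnification, and add the frame
        // into the accumulation buffer weighted by pass.rgb.
        std::printf("lens(%+.3f, %+.3f) focus %.3f mag %.4f rgb(%.2f, %.2f, %.2f)\n",
                    pass.lensU, pass.lensV, pass.focus, pass.magnification,
                    pass.rgb[0], pass.rgb[1], pass.rgb[2]);
    }
    return 0;
}
```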





Acknowledgments

The Penguins, Fading-man, Tree (Downy Oak), (Golden) Bird, and Pegasus models are courtesy of http://www.domawe.net, Riley Lewand, Xfrog Inc., http://www.cadnav.com, and the AIM@Shape Repository, respectively. Correspondence concerning this article can be addressed to Sungkil Lee.

Author information

Corresponding author

Correspondence to Sungkil Lee.

Additional information

This work was supported by the Mid-career and Global Frontier (on Human-centered Interaction for Coexistence) R&D programs through the NRF grants funded by the Korea Government (MSIP) (Nos. 2015R1A2A2A01003783, 2012M3A6A3055695).

Electronic supplementary material

Below is the link to the electronic supplementary material.

Supplementary material 1 (mp4 21899 KB)

About this article

Cite this article

Jeong, Y., Lee, S., Kwon, S. et al. Expressive chromatic accumulation buffering for defocus blur. Vis Comput 32, 1025–1034 (2016). https://doi.org/10.1007/s00371-016-1244-x
