
Projection defocus analysis for scene capture and image display

Published: 01 July 2006

Abstract

In order to produce bright images, projectors have large apertures and hence narrow depths of field. In this paper, we present methods for robust scene capture and enhanced image display based on projection defocus analysis. We model a projector's defocus using a linear system. This model is used to develop a novel temporal defocus analysis method to recover depth at each camera pixel by estimating the parameters of its projection defocus kernel in the frequency domain. Compared to most depth recovery methods, our approach is more accurate near depth discontinuities. Furthermore, by using a coaxial projector-camera system, we ensure that depth is computed at all camera pixels, without any missing parts. We show that the recovered scene geometry can be used for refocus synthesis and for depth-based image composition. Using the same projector defocus model and estimation technique, we also propose a defocus compensation method that filters a projection image in a spatially-varying, depth-dependent manner to minimize its defocus blur after it is projected onto the scene. This method effectively increases the depth of field of a projector without modifying its optics. Finally, we present an algorithm that exploits projector defocus to reduce the strong pixelation artifacts produced by digital projectors, while preserving the quality of the projected image. We have experimentally verified each of our methods using real scenes.
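The linear defocus model and the compensation idea in the abstract can be sketched in a few lines. The following is an illustrative Python sketch, not the authors' implementation: it assumes a single Gaussian point spread function as the defocus kernel (the paper estimates per-pixel kernels in the frequency domain), estimates the kernel width by matching a blurred observation against candidate widths, and pre-filters a target image with a Wiener inverse so that the projected result approximates the target after defocus. The function names and the noise-to-signal constant `nsr` are invented for this example.

```python
import numpy as np

def gaussian_psf(size, sigma):
    # Symmetric 2-D Gaussian point spread function, normalized to unit sum.
    ax = np.arange(size) - size // 2
    xx, yy = np.meshgrid(ax, ax)
    psf = np.exp(-(xx**2 + yy**2) / (2.0 * sigma**2))
    return psf / psf.sum()

def blur(image, psf):
    # Circular convolution via the FFT: models what defocus does optically.
    return np.real(np.fft.ifft2(np.fft.fft2(image) *
                                np.fft.fft2(np.fft.ifftshift(psf))))

def estimate_sigma(sharp, blurred, sigmas):
    # Brute-force fit: pick the Gaussian kernel width whose predicted blur
    # best matches the observation (the paper does this per pixel over time;
    # a single whole-image fit is used here for brevity).
    errs = [np.mean((blur(sharp, gaussian_psf(sharp.shape[0], s)) - blurred)**2)
            for s in sigmas]
    return sigmas[int(np.argmin(errs))]

def precompensate(target, psf, nsr=1e-2):
    # Wiener pre-filter: project this image so that, after defocus blur,
    # the image formed on the scene approximates `target`.
    H = np.fft.fft2(np.fft.ifftshift(psf))
    W = np.conj(H) / (np.abs(H)**2 + nsr)
    return np.real(np.fft.ifft2(np.fft.fft2(target) * W))
```

Because the inverse filter can only boost, not restore, frequencies the blur has attenuated below the noise floor, the compensation recovers detail up to the kernel's effective cutoff; the paper's spatially-varying, depth-dependent filtering manages this trade-off per pixel rather than globally.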

Supplementary Material

JPG File (p907-zhang-high.jpg)
JPG File (p907-zhang-low.jpg)
MOV File (p907-zhang-high.mov) - High Resolution
MOV File (p907-zhang-low.mov) - Low Resolution




Published In

ACM Transactions on Graphics  Volume 25, Issue 3
July 2006
742 pages
ISSN:0730-0301
EISSN:1557-7368
DOI:10.1145/1141911
Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for components of this work owned by others than ACM must be honored. Abstracting with credit is permitted. To copy otherwise, or republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee. Request permissions from [email protected]

Publisher

Association for Computing Machinery

New York, NY, United States



Author Tags

  1. depth recovery
  2. image composition
  3. multi-focal projection
  4. projector defocus
  5. projector depixelation
  6. refocus synthesis
  7. temporal defocus analysis

Qualifiers

  • Article


Article Metrics

  • Downloads (Last 12 months)11
  • Downloads (Last 6 weeks)1
Reflects downloads up to 29 Jan 2025


Cited By

  • (2024) Accurate 3D Measurement of Complex Texture Objects by Height Compensation Using a Dual-Projector Structure. IEEE Transactions on Image Processing 33, 3021-3030. DOI: 10.1109/TIP.2024.3389609. Online publication date: 22-Apr-2024.
  • (2024) Single-pixel imaging-based PSF compensation for large depth-of-field fringe projection profilometry. Measurement 235, 114954. DOI: 10.1016/j.measurement.2024.114954. Online publication date: Aug-2024.
  • (2023) Depth of Field Extension for Microscopic Structured Light Vision System Based on Image Fusion. 2023 International Conference on Manipulation, Automation and Robotics at Small Scales (MARSS), 1-6. DOI: 10.1109/MARSS58567.2023.10294162. Online publication date: 9-Oct-2023.
  • (2023) Focal surface projection: Extending projector depth of field using a phase-only spatial light modulator. Journal of the Society for Information Display 31, 11, 651-656. DOI: 10.1002/jsid.1261. Online publication date: 3-Oct-2023.
  • (2022) Perceptually-Based Optimization for Radiometric Projector Compensation. 2022 IEEE Conference on Virtual Reality and 3D User Interfaces Abstracts and Workshops (VRW), 750-751. DOI: 10.1109/VRW55335.2022.00226. Online publication date: Mar-2022.
  • (2022) End-to-End Full Projector Compensation. IEEE Transactions on Pattern Analysis and Machine Intelligence 44, 6, 2953-2967. DOI: 10.1109/TPAMI.2021.3050124. Online publication date: 1-Jun-2022.
  • (2022) Multi-Depth-of-Field 3-D Profilometry for a Microscopic System With Telecentric Lens. IEEE Transactions on Instrumentation and Measurement 71, 1-9. DOI: 10.1109/TIM.2021.3137156. Online publication date: 2022.
  • (2022) An Improved Time Defocus Analysis Method for Measuring 3D. 2022 IEEE 6th Advanced Information Technology, Electronic and Automation Control Conference (IAEAC), 1937-1940. DOI: 10.1109/IAEAC54830.2022.9929976. Online publication date: 3-Oct-2022.
  • (2022) Automatic objects' depth estimation based on integral imaging. Multimedia Tools and Applications 81, 30, 43531-43549. DOI: 10.1007/s11042-022-13221-3. Online publication date: 1-Dec-2022.
  • (2021) Paraxial 3D shape measurement using parallel single-pixel imaging. Optics Express 29, 19, 30543. DOI: 10.1364/OE.435470. Online publication date: 7-Sep-2021.
