
Handling occlusions for robust augmented reality systems

Published: 01 January 2010

Abstract

In Augmented Reality applications, human perception is enhanced with computer-generated graphics. These graphics must be exactly registered to real objects in the scene, which requires an effective Augmented Reality system to track the user's viewpoint. In this paper, a robust tracking algorithm based on coded fiducials is presented. Square targets are identified, and pose parameters are computed using a hybrid approach that combines a direct method with a Kalman filter. An important factor in providing a robust Augmented Reality system is the correct handling of target occlusions by real scene elements. To overcome tracking failure due to occlusions, we extend our method with an optical flow approach that tracks visible points and maintains the virtual graphics overlay when targets are not identified. Our proposed real-time algorithm is tested from different camera viewpoints under various image conditions and proves to be accurate and robust.
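
The occlusion-handling strategy described above lends itself to a simple detect-or-track loop. The sketch below is a minimal illustration of that idea using OpenCV's pyramidal Lucas-Kanade optical flow and a generic PnP pose solver; it is not the authors' implementation, and the fiducial detector stub, marker size, and calibration parameters (detect_marker_corners, MARKER_SIZE, camera_matrix, dist_coeffs) are assumptions introduced for illustration only.

# Minimal sketch (not the paper's implementation) of falling back to
# Lucas-Kanade optical flow when the coded fiducial cannot be identified,
# so the pose, and hence the virtual overlay, can still be updated.
import numpy as np
import cv2

MARKER_SIZE = 0.08  # assumed marker side length in metres (illustrative value)

# 3D corners of a square marker centred at the origin, in marker coordinates.
OBJECT_POINTS = np.array([
    [-MARKER_SIZE / 2,  MARKER_SIZE / 2, 0.0],
    [ MARKER_SIZE / 2,  MARKER_SIZE / 2, 0.0],
    [ MARKER_SIZE / 2, -MARKER_SIZE / 2, 0.0],
    [-MARKER_SIZE / 2, -MARKER_SIZE / 2, 0.0],
], dtype=np.float32)


def detect_marker_corners(gray):
    """Hypothetical stand-in for the paper's coded-fiducial detector.
    Returns a (4, 2) float32 array of image corners, or None when the
    target is occluded or not identified."""
    return None  # placeholder: plug in an actual square-target detector here


def estimate_pose(corners, camera_matrix, dist_coeffs):
    """Generic pose from 2D-3D correspondences (the paper uses a hybrid
    direct/Kalman scheme; solvePnP is only a substitute here)."""
    ok, rvec, tvec = cv2.solvePnP(OBJECT_POINTS, corners.astype(np.float32),
                                  camera_matrix, dist_coeffs)
    return (rvec, tvec) if ok else None


def track_frame(prev_gray, gray, prev_corners, camera_matrix, dist_coeffs):
    """Detect the fiducial if possible; otherwise propagate the previously
    visible corners with Lucas-Kanade optical flow to keep the overlay alive."""
    corners = detect_marker_corners(gray)
    if corners is None and prev_corners is not None:
        # Marker occluded: track the last known corners with pyramidal LK flow.
        flowed, status, _err = cv2.calcOpticalFlowPyrLK(
            prev_gray, gray, prev_corners.astype(np.float32).reshape(-1, 1, 2),
            None, winSize=(21, 21), maxLevel=3)
        if flowed is not None and int(status.sum()) == 4:
            corners = flowed.reshape(-1, 2)  # all four corners tracked
    if corners is None:
        return None, None  # target lost: drop the overlay for this frame
    pose = estimate_pose(corners, camera_matrix, dist_coeffs)
    return pose, corners

The point of the sketch is simply that pose estimation is decoupled from marker identification: as long as enough corner points survive the flow step, the overlay can be re-rendered from the pose estimate even while the coded pattern itself is occluded.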



Published In

Journal on Image and Video Processing, Volume 2010: Special issue on fast and robust methods for multiple-view vision, January 2010, 55 pages.
ISSN: 1687-5176
EISSN: 1687-5281

Publisher

Hindawi Limited, London, United Kingdom

Publication History

Received: 31 July 2009
Revised: 11 January 2010
Accepted: 31 March 2010
Published: 01 January 2010

