DOI: 10.1145/2342896.2343041

Omnistereo images from ground based lidar

Published: 05 August 2012

Abstract

Street-level panoramic images provide users with an immersive means to remotely explore street-level features. Stereoscopic panoramic images, also referred to as omnistereo panoramas, provide an even more compelling view. As 3D display technology becomes less expensive and more ubiquitous, viewing such images becomes far easier. In addition to providing a wow factor, stereo 3D enhances the sense of scale and distance, both useful when planning a visit to a new location. Omnistereo panoramas cannot be photographed by a pair of full 360° panoramic cameras, as each would capture the other when the panoramic image is taken. They are typically captured instead by a pair of cameras mounted at either end of a bracket that is then rotated [Peleg et al. 2001]. Each image is broken into a set of successive strips, and these strips are mosaiced together to form the left and right panoramic images. This, however, requires that the camera remain stationary while the pictures are taken and, unless the rotation rate and shutter speeds are quite high, that the scene remain static as well. These two restrictions prevent live capture of panoramic data from a vehicle driving down a busy street. The technique described here uses very dense lidar and calibrated panoramic images, captured by a ground mobile collection vehicle equipped with a lidar unit and a single panoramic camera and driven at posted speed limits, to automatically create street-level omnistereo panoramas. The resulting stereo images are of higher quality, and therefore more realistic, than those produced by any other mobile-collection method.
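The abstract describes the synthesis only at a high level, so the following is a minimal sketch of the underlying geometry: given a single calibrated equirectangular panorama and a per-pixel depth map (e.g. rendered from the dense lidar), each scene point is re-projected into left- and right-eye panoramas whose viewpoints lie on a viewing circle of radius IPD/2, following the omnistereo model of Peleg et al. [2001]. Everything here (function names, the forward-splatting strategy, the default IPD) is an illustrative assumption, not taken from the poster; a production pipeline would also fill occlusion holes and blend splats.

```python
import numpy as np

def omnistereo_from_depth(panorama, depth, ipd=0.065):
    """Hypothetical sketch: splat one calibrated panorama plus depth into
    left/right omnistereo views (viewing-circle model, Peleg et al. 2001).

    panorama : (H, W, 3) equirectangular colour image
    depth    : (H, W) range in metres to the scene point seen by each pixel
    ipd      : assumed interpupillary distance; circle radius is ipd / 2
    """
    H, W, _ = panorama.shape
    r = ipd / 2.0

    # Per-pixel ray angles of the source panorama (azimuth, elevation).
    theta = (np.arange(W) + 0.5) / W * 2.0 * np.pi - np.pi
    phi = np.pi / 2.0 - (np.arange(H) + 0.5) / H * np.pi
    theta, phi = np.meshgrid(theta, phi)

    # 3-D scene points as seen from the single camera centre (z up).
    x = depth * np.cos(phi) * np.cos(theta)
    y = depth * np.cos(phi) * np.sin(theta)
    z = depth * np.sin(phi)
    rho = np.hypot(x, y)          # horizontal range from the rig centre
    alpha = np.arctan2(y, x)      # horizontal bearing

    views = {}
    for eye, sign in (("left", -1.0), ("right", +1.0)):
        # A point at horizontal range rho is seen from the viewing circle
        # along a ray tangent to the circle, at azimuth alpha +/- asin(r/rho);
        # the sign selects which tangent, i.e. which eye.
        valid = rho > r
        theta_e = alpha + sign * np.arcsin(np.clip(r / np.maximum(rho, r), 0.0, 1.0))
        horiz = np.sqrt(np.maximum(rho**2 - r**2, 0.0))  # eye-to-point horizontal distance
        phi_e = np.arctan2(z, horiz)
        dist = np.hypot(horiz, z)

        # Map the per-eye angles back to equirectangular pixel coordinates.
        col = np.mod(((theta_e + np.pi) / (2.0 * np.pi) * W).astype(int), W)
        row = np.clip(((np.pi / 2.0 - phi_e) / np.pi * H).astype(int), 0, H - 1)

        # Crude z-buffer: write far points first so nearer points overwrite
        # them (relies on NumPy's last-write-wins for duplicate indices).
        out = np.zeros_like(panorama)
        order = np.argsort(dist[valid])[::-1]
        rr, cc = row[valid][order], col[valid][order]
        out[rr, cc] = panorama[valid][order]
        views[eye] = out
    return views
```

The closed form theta = alpha ± asin(r/rho) follows from requiring each viewing ray to be tangent to the viewing circle, which is what makes per-column stereo possible from a single rotation centre; the far-to-near write order is a minimal stand-in for proper z-buffered splatting when several points land in the same output pixel.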

Supplementary Material

ZIP File (a126-barnes.zip)
Supplemental files.

References

[1] Peleg, S., Ben-Ezra, M., and Pritch, Y. 2001. Omnistereo: panoramic stereo imaging. IEEE Transactions on Pattern Analysis and Machine Intelligence 23, 3, 279--290.
[2] Yang, R., and Wang, L. 2006. View-dependent textured splatting. The Visual Computer 22, 7, 456--467.

Published In

SIGGRAPH '12: ACM SIGGRAPH 2012 Posters
August 2012
131 pages
ISBN:9781450316828
DOI:10.1145/2342896


Publisher

Association for Computing Machinery

New York, NY, United States


Qualifiers

  • Research-article

Conference

SIGGRAPH '12

Acceptance Rates

Overall Acceptance Rate 1,822 of 8,601 submissions, 21%
