
Information theoretical optimization for optical range sensors

Appl Opt. 2003 Sep 20;42(27):5418-26. doi: 10.1364/ao.42.005418.

Abstract

Most known optical range sensors require a large amount of two-dimensional raw data from which the three-dimensional (3D) data are decoded and are therefore associated with considerable cost. This cost arises from expensive hardware as well as from the time needed to acquire the images. We address, in terms of information theory, the question of how maximum shape information can be acquired with a minimum amount of raw image data. It is shown that the amount of raw data needed can be greatly reduced by proper optical redundancy reduction. Based on these considerations, a 3D sensor is introduced that needs only a single color (red-green-blue) raw image and still delivers data with a longitudinal measurement uncertainty of only approximately 2 µm.
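To make the information-theoretic comparison concrete, the sketch below estimates the Shannon entropy of a stack of raw 2D images versus a decoded depth map. This is only an illustrative example, not the authors' method: the arrays, image count, and bin choices are hypothetical placeholders, and real fringe or depth data would replace the random arrays.

```python
import numpy as np

def shannon_entropy_bits(values, bins=256):
    """Estimate Shannon entropy in bits per sample of a quantized signal."""
    hist, _ = np.histogram(values, bins=bins)
    p = hist / hist.sum()
    p = p[p > 0]                      # drop empty bins to avoid log(0)
    return -np.sum(p * np.log2(p))

# Hypothetical data: several raw 2D images vs. the single decoded depth map.
rng = np.random.default_rng(0)
raw_images = rng.integers(0, 256, size=(8, 512, 512))   # e.g. 8 raw camera frames
depth_map  = rng.normal(0.0, 1.0, size=(512, 512))      # decoded 3D shape (placeholder)

raw_bits   = raw_images.size * shannon_entropy_bits(raw_images)
depth_bits = depth_map.size  * shannon_entropy_bits(depth_map, bins=1024)

print(f"raw image data : ~{raw_bits / 8 / 1e6:.1f} MB of entropy")
print(f"shape data     : ~{depth_bits / 8 / 1e6:.1f} MB of entropy")
```

The gap between the two estimates is what motivates reducing redundancy optically, before acquisition, rather than discarding it in software after many frames have already been captured.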