
Vol. 139 No. SIG 139(CVIM 2)  IPSJ Transactions on Computer Vision and Image Media  Feb. 2098

Regular Paper

Homography from Conic Intersection:


Camera Calibration based on Arbitrary Circular Patterns

Haiyuan Wu,† Qian Chen† and Toshikazu Wada†

For camera calibration, point correspondence data between 3D and 2D points spreading widely over the image plane is necessary to obtain a precise result. However, for a wide observation area, it is impractical to use a huge calibration object that covers the camera view. This paper presents a refined calibration method using circular patterns. We have already proposed a circular-pattern-based camera calibration method using a four-wheel mobile robot: by keeping low speed and a constant steering angle, the robot can draw a big circular pattern on a plane. Our previous method, however, has the limitation that the roll angle around the optical axis is assumed to be zero. This paper presents an improved version of that camera calibration method. Our method estimates the extrinsic parameters of a fixed camera from a single circular trajectory when the focal length of the camera is known. When the focal length is not given, two co-planar un-concentric circular trajectories are enough for calibrating the extrinsic parameters and the focal length. In both cases, neither the center and radius of the circle(s) nor the speed of the robot is required. Extensive experiments over simulated images as well as real images demonstrate the robustness and the accuracy of our method.

1. Introduction

Camera calibration plays an important role in Computer Vision tasks, e.g., stereo vision, volume intersection, generating an orthogonal view of a planar scene from a perspective view, and so on.

For calibrating the extrinsic camera parameters, point correspondence data between known 3D points and their projections is necessary. In many cases, specially designed calibration objects are employed for the ease of this point correspondence. In order to get precise camera parameters, point correspondence data between 3D and 2D points spreading over the image plane is necessary. This implies that the calibration object should be big enough to cover the camera view. This requirement can be satisfied in indoor scenes. However, for a wide observing area, such as a baseball stadium or a football playground, it is impractical to construct a huge calibration object that covers the camera view. Furthermore, using calibration objects is not suitable for automatic calibration, because points on the objects are manually corresponded in many cases. The manual correspondence sometimes produces erroneous data.

This paper presents a method for solving the above problems, i.e., an automatic camera calibration method that can be used for estimating the extrinsic parameters between a fixed camera and a far planar object, such as floors, roads, playgrounds, and so on.

Basic Idea: In this research, we employ a four-wheel radio controlled car (hereafter "RC car") moving on a plane as a calibration object. Although the RC car cannot move along a straight line without a navigation system, it is easy to drive it along a circular trajectory by keeping low speed and a constant steering angle. The circular trajectory is projected as an ellipse onto the image plane (Figure 1). Based on this geometric property, we developed a refined method for camera calibration using circular patterns.

† Wakayama University

Fig. 1 The addressed problem: calibrating extrinsic camera parameters using a four-wheel mobile robot

Our method estimates the extrinsic parameters of the fixed camera from a single circular trajectory when the focal length of the camera is known. In the case that the focal length is not given, two co-planar un-concentric circular trajectories are enough for calibrating the extrinsic parameters and the focal

length. In both cases, our method does not require point correspondence data, the center position, or the radius of the circular trajectory. In addition, it does not require the speed of the robot to be known, nor the complete circular trajectory to be seen. Through computer simulations and experiments with real images, we confirmed that our method is robust and accurate. We also confirmed that our method gives higher accuracy than the homography based algorithm.

Related Work: Several camera calibration methods using circular patterns 2)-5), or conics 6)-10), have been proposed so far. Most of them assume multi-viewpoint images. Meng et al. 2) proposed a method that uses a planar circle and lines passing through its center. It requires at least three different views. Kim et al. 4) proposed a method that makes use of planar con-centric circles. It requires two views. Yang et al. 5) proposed a method similar to Kim's, except that con-centric ellipses are used instead of con-centric circles. Avidan et al. 7) proposed an approach to reconstructing the 3D coordinates of a point moving along a conic section viewed by a monocular moving camera. It requires nine views. If the point is known to move along a circle, it needs seven views.

We have proposed a camera calibration method using a single circular pattern 14). However, it has the limitation that the roll angle around the optical axis is assumed to be zero.

Homography can be used as a camera calibration method using a planar pattern. It requires at least four point correspondences to estimate the matrix that converts points on an image plane to the corresponding points on another image plane. By decomposing the homography matrix, we can estimate the surface normal vector of the plane, the translation, and the rotation between the two cameras. But the decomposition is not unique; it yields alternative solutions 12)13).

The problem addressed in this paper is to estimate the extrinsic camera parameters from one image containing circular pattern(s) on a plane. The extrinsic parameters consist of 1) the translation vector T from the center of a circle to the viewpoint, and 2) the rotation matrix R that transforms the viewpoint to an orthogonal viewpoint where the perspective projection of the plane can be represented by a scaling, as shown in Figure 2. In the case that the focal length of the camera is unknown, its estimation is also our target.

Fig. 2 Observing a circle from a generic viewpoint and from an orthogonal viewpoint; R and T are the rotation matrix and the translation vector of the camera

2. Calibration from circular patterns

From a generic viewpoint, a circle is observed as an ellipse on the image plane by perspective projection. As shown in figure 3, the rays passing through the observed ellipse and the viewpoint form an oblique elliptical cone. In addition, the rays passing through the base circle and the viewpoint form an oblique circular cone. It is clear that the oblique elliptical cone and the oblique circular cone denote the same cone. The difference between them is the base planes, where the ellipse base and the circle base reside.

Fig. 3 A circle and its projection

The R represents the rotation between the image plane and planes parallel to the base plane where the circular pattern(s) reside. So, the R can be estimated by finding the rotation that converts the oblique elliptical cone defined by the viewpoint and the observed ellipse to an oblique circular cone. Then, our problem ends up estimating this rotation transform. Once the R is estimated, the T can easily be estimated except for the scale factor. That is, the direction of T can be determined.

In the case that the focal length of the camera is unknown, two un-concentric circular patterns on the same plane are required for calibration. By assuming a focal length f, we can suppose two oblique elliptical cones, and estimate two Rs. If the difference between the estimated Rs is minimized, the assumed focal length should be regarded as a valid estimation.
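In what follows, each observed circle is assumed to be available as a fitted ellipse in the image. As a minimal, generic sketch (a plain least-squares conic fit, not the detection method used in this paper), the conic coefficients can be recovered from sampled boundary or trajectory points:

```python
import numpy as np

def fit_conic(x, y):
    """Fit A x^2 + 2B xy + C y^2 + 2D x + 2E y + F = 0 to points (x, y)
    in the least-squares sense (unit-norm coefficient vector via SVD),
    and report whether the fitted conic is an ellipse (B^2 - AC < 0)."""
    M = np.column_stack([x * x, 2 * x * y, y * y, 2 * x, 2 * y, np.ones_like(x)])
    _, _, vt = np.linalg.svd(M, full_matrices=False)
    A, B, C, D, E, F = vt[-1]   # right singular vector of the smallest singular value
    return (A, B, C, D, E, F), (B * B - A * C) < 0
```

For a closed robot trajectory, such a fit yields both the coefficients needed below to build the conic matrix and an ellipse test for deciding whether the trajectory is usable.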

2.1 Oblique elliptical cones and oblique circular cones

Here we describe a method to compute the rotation matrix that transforms an oblique elliptical cone to an oblique circular cone. The method consists of two steps. First, we compute a rotation matrix that transforms an oblique elliptical cone to an (un-oblique) elliptical cone. Then we compute another rotation matrix that transforms the (un-oblique) elliptical cone to an oblique circular cone.

2.1.1 Oblique elliptic cones

Here we explore the relation between an ellipse on the z = z0 plane and the oblique elliptical cone defined by the ellipse and an apex at the origin (Figure 4).

Fig. 4 An oblique elliptical cone formed from an ellipse

An ellipse on the z = z0 plane can be represented by

    A x^2 + 2B xy + C y^2 + 2D x + 2E y + F = 0.    (1)

Eq.(1) can be rewritten in quadratic form as

    X^T Q X = 0,    (2)

where

    Q = [ A B D; B C E; D E F ],  X = [ x y 1 ]^T.    (3)

Let Xe = [x y z0]^T be a point on the 3D ellipse given by Eq.(2) on the z = z0 plane; then the following equation holds:

    X = K Xe,    (4)

where K = diag(1, 1, 1/z0). By substituting Eq.(4) for X in Eq.(2), we obtain

    X^T Q X = Xe^T Qe Xe = 0,    (5)

where

    Qe = K Q K.    (6)

That is, Qe characterizes the 3D ellipse on the z = z0 plane.

Let P be a point on the line segment connecting the origin [0 0 0]^T and Xe on the 3D ellipse. P can be expressed by

    P = h Xe,    (7)

where h is a height parameter, i.e., h = 0 at the origin, and h = 1 at z = z0. By substituting P for Xe in Eq.(5), we obtain the following equation:

    P^T Qe P = h^2 (Xe^T Qe Xe) = 0.    (8)

Eq.(8) shows that the oblique elliptical cone and the 3D ellipse are represented by the same quadratic form X^T Qe X.

Circles and oblique circular cones are special cases of ellipses and oblique elliptical cones. Suppose a 3D circle centered at (x0, y0, z0) on the z = z0 plane with radius r. The quadratic form of the following matrix represents the oblique circular cone defined by the 3D circle and an apex at the origin:

    Qc = [ 1  0  -x0/z0;  0  1  -y0/z0;  -x0/z0  -y0/z0  (x0^2 + y0^2 - r^2)/z0^2 ].    (9)
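The construction of Eqs. (1)-(9) is mechanical; as an illustrative numpy sketch (the helper names are ours, not from the paper):

```python
import numpy as np

def conic_matrix(A, B, C, D, E, F):
    """Symmetric matrix Q of Eq. (3) for the conic of Eq. (1)."""
    return np.array([[A, B, D],
                     [B, C, E],
                     [D, E, F]], dtype=float)

def oblique_cone(Q, z0):
    """Qe = K Q K with K = diag(1, 1, 1/z0) (Eqs. (4)-(6)): the cone through
    the ellipse lifted onto the z = z0 plane, with its apex at the origin."""
    K = np.diag([1.0, 1.0, 1.0 / z0])
    return K @ Q @ K
```

For example, the unit circle centered at (0.5, 0.2) has (A, B, C, D, E, F) = (1, 0, 1, -0.5, -0.2, -0.71); any lifted point Xe = (x, y, z0) on it satisfies Xe^T Qe Xe = 0 as in Eq. (5), and det(Qe), the eigenvalue product, is negative.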
2.1.2 Converting an oblique elliptical cone to an elliptical cone

In order to compute the rotation matrix that transforms an oblique elliptical cone to an oblique circular cone, we first compute the matrix that transforms the oblique elliptical cone to an (un-oblique) elliptical cone (Figure 5).

Fig. 5 The oblique elliptical cone to an elliptical cone

An un-oblique elliptical cone in standard form is given by

    x^2/a^2 + y^2/b^2 = z^2.    (10)

Eq.(8) shows that an oblique elliptical cone is represented by the quadratic form of Qe. Thus Qe has 3 real eigenvalues λ1, λ2, and λ3 satisfying

    λ1 λ2 λ3 < 0.    (11)

Without loss of generality, we assume that

    λ1 λ2 > 0,  |λ1| ≥ |λ2|.    (12)

Let v1, v2 and v3 = v1 × v2 be the normalized eigenvectors corresponding to λ1, λ2 and λ3, respectively. Using the eigenvalues and eigenvectors, Qe can be decomposed as

    Qe = V Λ V^{-1} = V Λ V^T,    (13)

where

    Λ = diag(λ1, λ2, λ3),  V = [ v1 v2 v3 ].

By rotating the oblique elliptical cone with V^T, we obtain

    P^T Qe P = (V P')^T (V Λ V^{-1}) (V P') = P'^T Λ P' = 0,    (14)

where P' = V^T P. Thus, the oblique elliptical cone can be expressed in the normalized form

    λ1 x^2 + λ2 y^2 + λ3 z^2 = 0.    (15)

By comparing Eq.(10) with Eq.(15), we can estimate the values a and b as follows:

    a = sqrt(-λ3/λ1),  b = sqrt(-λ3/λ2).    (16)

From Eq.(12) we obtain 0 < a ≤ b.

2.1.3 Converting an elliptical cone to an oblique circular cone

Our problem is to compute the rotation matrix U that transforms an elliptical cone expressed by Eq.(10) to an oblique circular cone expressed by Eq.(9) (see figure 6).

Fig. 6 Converting an elliptical cone to an oblique circular cone

This problem can be formalized by the following equation:

    U^T diag(1/a^2, 1/b^2, -1) U = k Qc.    (17)

Since U is a rotation matrix, it can be represented by three orthogonal unit vectors: U = [u1 u2 u3], where ui = [uix uiy uiz]^T (i = 1, 2, 3). We also have

    U^T U = I.    (18)

From Eq.(17) we obtain

    ((a^2+1)/a^2) u1x^2 + ((b^2+1)/b^2) u1y^2 - ((a^2+1)/a^2) u2x^2 - ((b^2+1)/b^2) u2y^2 = 0,
    ((a^2+1)/a^2) u1x u2x + ((b^2+1)/b^2) u1y u2y = 0.    (19)

By simplifying Eqs.(18) and (19), we obtain

    u1 = [ δ cos α,  sin α,  (-1)^(l-m) sqrt(1-δ^2) cos α ]^T,
    u2 = [ (-1)^m δ sin α,  -(-1)^m cos α,  (-1)^l sqrt(1-δ^2) sin α ]^T,    (20)
    u3 = [ (-1)^l sqrt(1-δ^2),  0,  -(-1)^m δ ]^T,

where

    δ = sqrt((1 + 1/b^2) / (1 + 1/a^2)),    (21)

α is a free variable, and l and m are arbitrary integers. u1, u2 and u3 are the unit vectors of the X, Y and Z axes of the coordinate system that describes the oblique circular cone. Thus u3 is the normal vector of the base plane of the oblique circular cone. From Eq.(20) we notice that there are four possible solutions of u3. Since u3 and -u3 describe the two normal vectors of the same plane, the number of meaningfully different solutions is two.

By substituting Eq.(20) for U in Eq.(17), we can compute the values of x0/z0, y0/z0 and r/z0. Then we transform (x0/z0, y0/z0, 1) back to the coordinate system that describes the elliptical cone with U^T, to compute the center C0 and the radius r of the circular base as follows:

    C0 = [ -(-1)^l b sqrt((b^2 - a^2)/(1 + a^2)) z0,  0,  -(-1)^m (b/a) sqrt((1 + b^2)/(1 + a^2)) z0 ]^T,
    r = (b^2/a) |z0|,    (22)

where z0 remains as a scale factor that describes the distance between the origin and the plane where the circular base resides.

In consequence, the rotation matrix R that transforms an oblique elliptical cone to an oblique circular cone is obtained by R = UV.
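The chain from Eq. (13) to Eq. (22) can be sanity-checked numerically. The sketch below (our naming, numpy assumed) extracts the eigen-shape (a, b) of a cone matrix and the radius ratio r/|z0| = b^2/a of Eq. (22):

```python
import numpy as np

def cone_shape(Qe):
    """Eigenvalues of Qe ordered as in Eqs. (11)-(12) (lam3 < 0 < lam2 <= lam1),
    returning a, b of Eq. (16) and the radius ratio r/|z0| of Eq. (22)."""
    lam3, lam2, lam1 = np.sort(np.linalg.eigvalsh(Qe))
    a = np.sqrt(-lam3 / lam1)
    b = np.sqrt(-lam3 / lam2)
    return a, b, b * b / a
```

Feeding it the Qc of Eq. (9) for a known circle, e.g., center (0.3, -0.4), radius 0.25, z0 = 1, recovers the ratio 0.25, which makes a convenient unit test for an implementation of this section.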
2.2 Calibration with known focal length

Here, we describe a method to estimate the extrinsic parameters of a fixed camera by using one circular pattern.

In this paper, we assume a pinhole camera model and use the two coordinate systems shown in Figure 7: 1) the world coordinate system (O-XYZ), and 2) the camera coordinate system (O'-X'Y'Z'). The origins of both coordinate systems are set at the projection center of the camera. The Z-axis of O-XYZ is set to be parallel to the normal vector of the ground. The Z'-axis of O'-X'Y'Z' is set to be parallel to the optical axis of the camera. The Y-axis of O-XYZ is set to be perpendicular to the Z-axis and the Z'-axis.

The scene configuration is characterized by the following parameters: 1) the distance zg between the base plane (ground) and the viewpoint (projection center); 2) the radius r of the circle; 3) the focal

Fig. 7 The world coordinate system and the camera coordinate system

length f, i.e., the distance between the projection center and the image plane.

The parameters to be estimated are 1) the tilt angle θ between the X-axis and the Z'-axis; 2) the roll angle β between the Y-axis and the X'-axis; 3) the translation vector T that describes the center C0 of the circle in the O'-X'Y'Z' coordinate system.

By observing a circle on a plane, such as the ground, we obtain an image in which the circle appears as an ellipse. Let Q be the ellipse detected from the image. We define an oblique elliptical cone whose apex is at the projection center and whose base is the ellipse on the image plane. From Eq.(5), we obtain a quadratic form matrix Qe(f) that describes the oblique elliptical cone as follows:

    Qe(f) = K(f)^T Q K(f) = K(f) Q K(f),    (23)

where K(f) = diag(1, 1, -1/f). Because K(f) is a function of f, Qe(f) is also a function of f.

With the method described in 2.1.2, we can compute the rotation matrix V(f) that transforms the oblique elliptical cone Qe to an elliptical cone expressed by Eq.(10), by computing the eigenvalues and eigenvectors of Qe:

    V(f) = [ v1(f)  v2(f)  v3(f) ],    (24)
    a(f) = sqrt(-λ3(f)/λ1(f)),  b(f) = sqrt(-λ3(f)/λ2(f)),    (25)

where V(f), a(f) and b(f) are also functions of f.

With the method described in 2.1.3, we estimate the unit vector of the Z-axis, which is the normal vector of the ground where the circle resides, by computing the rotation matrix U(f) that transforms the elliptical cone to an oblique circular cone. By substituting Eq.(21) for δ in Eq.(20), we obtain:

    u3(f) = [ (-1)^l sqrt((1/a(f)^2 - 1/b(f)^2) / (1 + 1/a(f)^2)),  0,  -(-1)^m sqrt((1 + 1/b(f)^2) / (1 + 1/a(f)^2)) ]^T.    (26)

The representation N(f) of the unit vector of the Z-axis under O'-X'Y'Z' can be obtained by applying the rotation transform V(f) to u3(f) as follows:

    N(f) = V(f) u3(f).    (27)

Because there are two possible solutions of N(f), as described in 2.1.3, we need some method to select the true one. If we know that the circle is on the ground and the camera is not set upside-down, then we can remove the inconsistent answer of N(f).

The tilt angle θ and the roll angle β can be computed as follows:

    θ = π/2 - cos^{-1} Nz(f),  β = tan^{-1} (Nx(f)/Ny(f)),    (28)

where Nx(f), Ny(f) and Nz(f) are the X, Y and Z elements of N(f), respectively.

The translation vector T(f) can be obtained by applying the rotation transform V(f) to the C0(f) estimated with Eq.(22), as follows:

    T(f) = V(f) C0(f).    (29)

If an ellipse on the image plane has been detected and fitted, an oblique elliptical cone can be defined uniquely when the focal length f is known. Then the extrinsic parameters can be computed from the oblique elliptical cone with the method described above.

In the case that f is unknown, the oblique elliptical cone whose apex is at the projection center and whose base is the ellipse cannot be defined uniquely, but it can be expressed as a function of f. Thus the extrinsic parameters can be represented by functions of f.
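Eqs. (23)-(28) chain together into a compact procedure. The following sketch (our helper name; the sign disambiguation between the two candidate normals is left to the caller, as discussed above) computes the candidate (θ, β) pairs for a known f:

```python
import numpy as np

def tilt_roll_from_ellipse(Q, f):
    """Candidate (theta, beta) pairs from one image-ellipse conic Q and a
    known focal length f, following Eqs. (23)-(28)."""
    K = np.diag([1.0, 1.0, -1.0 / f])
    Qe = K @ Q @ K                                   # Eq. (23)
    lam, vec = np.linalg.eigh(Qe)
    i3, i2, i1 = np.argsort(lam)                     # lam3 < 0 < lam2 <= lam1
    v1, v2 = vec[:, i1], vec[:, i2]
    V = np.column_stack([v1, v2, np.cross(v1, v2)])  # Eq. (24)
    a2 = -lam[i3] / lam[i1]                          # a^2, b^2 of Eq. (25)
    b2 = -lam[i3] / lam[i2]
    delta = np.sqrt((1 + 1 / b2) / (1 + 1 / a2))     # Eq. (21)
    eps = np.sqrt(max(1.0 - delta ** 2, 0.0))
    angles = []
    for u3 in (np.array([eps, 0.0, -delta]),         # Eq. (26): two planes
               np.array([-eps, 0.0, -delta])):
        N = V @ u3                                   # Eq. (27)
        theta = np.pi / 2 - np.arccos(np.clip(N[2], -1.0, 1.0))  # Eq. (28)
        beta = np.arctan2(N[0], N[1])
        angles.append((theta, beta))
    return angles
```

A quick self-check: if Q is built from the cone of a circle whose plane is perpendicular to the Z'-axis, one candidate tilt comes out at ±π/2.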
pute the rotational matrix V(f ) that transforms 2.3 Calibration with unknown focal length
the oblique ellipse cone Q e to an ellipse cone ex- Here we describe an approach to calibrate the ex-
pressed by Eq.(10) by computing the eigenvalues trinsic camera parameters when the f is unknown.
and the eigenvectors
 of Q e :  In order to obtain unique solution of the extrinsic
V(f ) = v1 (f ) v2 (f ) v3 (f ) , (24) parameters, we let the camera to observe two un-
a(f ) = −λ3 (f )/λ1 (f ) concentric co-planner circles. Let Q 1 and Q2 are
, (25) the two ellipses detected from the image. Assuming
b(f ) = −λ3 (f )/λ2 (f )
a focal length f , we can suppose two oblique ellipti-
where V(f ), a(f ) and b(f ) are also functions of f . cal cones. From them we can compute the unit vec-
With the method described in 2.1.3, we estimate tor of the ground. Let N ia (f ) and Nib (f ) be the
the unit vector of Z-axis that is the normal vector two solutions estimated from Q i ; (i = 1, 2). Be-
of the ground where the circle resides on by com- cause the two circles are co-planar, the two vectors
puting the rotation matrix U(f ) that transforms the in one of the four pairs: (N 1a , N2a ), (N1a , N2b ),
elliptic cone to an oblique circular cone. By substi- (N1b , N2a ) and (N1b , N2b ) should be the same.
tuting Eq.(21) for δ in Eq.(20), we obtain: However, if we assumed a wrong f , all the esti-
   mated normal vectors from Q i ; (i = 1, 2) will be
)2 −1/b(f )2
(−1)l 1/a(f
 1+1/a(f )2  different. Thus we can estimate the f by minimum
u3 (f ) =  0  .(26)
 E(f ) given by
−(−1)m 1+1/b(f )2 E(f ) = min(x1 , x2 , x3 , x4 ), (30)
1+1/a(f )2
where x1 = |N1a (f ) × N2a (f )|2 |, x2 = N1a (f ) ×

N2b (f )|2 , x3 = |N1b (f ) × N2a (f )|2 , x4 = [degree] respectively.


|N1b (f ) × N2b (f )|2 .
Once f is determined, the extrinsic camera parameters, including the rotation parameters and the translation parameters, can be computed uniquely.

3. Experimental Results

3.1 Simulation Results

In the simulations, we use the world coordinate system and the camera coordinate system shown in figure 7. We assume that the length of the image sensor is 1/2 inch, and that the image resolution is 640 × 480 pixels. Therefore, the length of one pixel is 0.15875 [mm].

Figure 8 shows ellipses synthesized with the parameter settings (called "case-1") of f = 200 [pixel], θ = 40 [degree], β = -10 [degree], Zg = 3.0 [m], and r = 1.0 [m]. Other ellipses were synthesized with the settings (called "case-2") of f = 300 [pixel], θ = 50 [degree], β = 30 [degree], Zg = 3.0 [m], and r = 1.0 [m].

Fig. 8 An example of ellipses synthesized by CG

3.1.1 One circle

When f is given, the extrinsic parameters can be estimated from a single circle. We tested our method using the ellipses in Figure 8 (case-1). 48 ellipses were selected as input data from the bottom up to the sixth line; the resulting parameters are denoted by β1 and θ1. 52 ellipses were selected from the top down to the third line; the resulting parameters are denoted by β2 and θ2. Also, the calibration result using 37 ellipses in case-2 is denoted by β3 and θ3.

The calibration results described above are summarized in Table 1. From these results, we can confirm that our method provides precise calibration results: in the case of a tiny reference ellipse (worst case), the RMS errors are less than 0.52 and 0.13 [degree] for β and θ respectively. In the better cases, it provides very accurate results, with RMS errors for β and θ of less than 0.09 and 0.08 [degree] respectively.

Table 1 Results in the case that f is known
                RMS error   Standard deviation
  β1 (degree)   0.16        0.20
  θ1 (degree)   0.12        0.13
  β2 (degree)   0.52        0.55
  θ2 (degree)   0.13        0.15
  β3 (degree)   0.09        0.11
  θ3 (degree)   0.08        0.06

3.1.2 Two circles

In the case of unknown f, our method calculates the extrinsic camera parameters and f using a pair of projected ellipses. We tested our method using 32 ellipse pairs randomly selected from case-1 and 17 pairs from case-2, respectively. The experimental results are summarized in Table 2 with suffixes 1 and 2. These results show that our method can estimate an accurate f as well as the other parameters.

Table 2 Results in the case that f is unknown
                RMS error   Standard deviation
  f1 (pixel)    5.52        9.21
  β1 (degree)   0.36        0.47
  θ1 (degree)   0.57        0.97
  f2 (pixel)    7.19        11.89
  β2 (degree)   0.11        0.15
  θ2 (degree)   0.51        0.85

3.1.3 Comparison

In the case of f = 200 [pixel], θ = 38 [degree] and β = -10 [degree], we synthesized a perspective projection image including a square whose side length is 2.0 [m] and its inscribed circle (Figure 9), as well as an orthographic projection image of the same scene. Using these images, we compared our method with the homography based approach.

Fig. 9 An image for comparing our method with the homography method

Assuming that the correspondence between the four vertexes of the square on the two images has been established, we calculated the homography matrix, and then decomposed it into the rotation matrix, the translation, and the surface normal vector

of the plane. The results are (θ1 = 36.62 [degree], β1 = 11.28 [degree]) and (θ2 = 38.11 [degree], β2 = 9.71 [degree]). We determined that the latter result is valid, because it has smaller errors.

On the other hand, we applied our method to the same image. The result is θ = 37.94 [degree] and β = 10.22 [degree].

The errors of the four-point-correspondence method are 0.11 [degree] and 0.29 [degree] for θ and β, respectively. Our method achieves smaller errors: 0.06 [degree] and 0.22 [degree] for θ and β, respectively.

We notice that the conditions of these calibrations are slightly different. That is, square based calibration requires four-point correspondence data, while our method requires f and a single image of a single circle but no point correspondence data. If we regard the above conditions as balanced in some sense, we can claim that our method provides more precise results than four-point correspondence calibration.

3.2 Experiments with real data

3.2.1 Acquisition of an Environmental Map

Our calibration method has been implemented as an initialization procedure of a robot control system. Our system consists of a four-wheel RC car with red and green LEDs, a fixed external camera (SONY DFW-VL500), and a PC with a radio controller, as shown in Figure 10 (a). In the system, the motion trajectory of the RC car is observed by detecting the LEDs from the images at video rate with the color target detector 15).

Fig. 10 Experiment scenery: (a) System. (b) Test environment

When the system is started, it performs the camera calibration automatically. In this calibration, the RC car tries to move at low speed while keeping a constant steering angle. When the observed trajectory is closed, the system examines whether the trajectory is an ellipse or not. If the trajectory is an ellipse, the system performs the camera calibration. After this calibration, the robot trajectory in the image plane can be transformed into the orthogonal view, i.e., we can monitor the real RC car movement in the scene. Figure 10 (b) shows a test environment.

Figure 11 (a) shows the robot trajectory when performing environment map generation using the robot body. That is, if we detect that the robot movement is blocked irrespective of the action command, then we know that there must be an obstacle; if the LEDs disappear from the image, then we know that there must be an occluding object hiding the robot from the camera view. In this way, we can generate the environment map. The resulting map is shown in figure 11 (b). Both of them are viewed from the virtual camera, and the real environment is shown in Figure 10 (b).

Fig. 11 (a) Some robot trajectories. (b) An obtained environmental map

3.2.2 Accuracy Evaluation with other scenes

Our method is not limited to camera calibration using a mobile robot. It can be applied to other scenes (see figure 12), e.g., manholes on flat roads, CDs on a table, widening rings on the water surface, and so on.

4. Conclusion

This paper presented a new camera calibration method using circular pattern(s) drawn by a four-wheel robot. This approach can solve the problem of estimating the extrinsic parameters between the camera and far flat objects.

Estimating the homography matrix from point correspondence data suffers from the alternative decomposition problem. For solving this problem,

two planar patterns or three cameras are used. Our method also produces dual solutions when referring to a single circle. However, by referring to two or more circles on the same plane, our method produces a unique solution. This means that we can perform the camera calibration without adding planes or cameras.

Fig. 12 Other scenes: (a) Manholes on flat roads. (b) CDs on a table. (c) Widening rings on the water surface

In addition, we can realize an automatic camera calibration system based on our method. We will extend our method to estimate the image center and the radial distortion parameters in future work.

Acknowledgments: This research was partially supported by the Ministry of Education, Science, Sports and Culture, Grant-in-Aid for Scientific Research (A), 12308016, 2000. The authors are grateful to associate professor T. Nakamura (Wakayama University) for helping with the real robot experiments.

References
1) O. Faugeras, "Three-Dimensional Computer Vision: A Geometric Viewpoint", MIT Press, 1993.
2) X. Meng and Z. Hu, "A New Easy Camera Calibration Technique Based on Circular Points", Pattern Recognition, Vol. 36, pp. 1155-1164, 2003.
3) G. Wang, F. Wu and Z. Hu, "Novel Approach to Circular Points Based Camera Calibration",
4) J. S. Kim, H. W. Kim and I. S. Kweon, "A Camera Calibration Method using Concentric Circles for Vision Applications", ACCV, pp. 23-25, 2002.
5) C. Yang, F. Sun and Z. Hu, "Planar Conic Based Camera Calibration", ICPR, 2000.
6) Q. Long, "Conic Reconstruction and Correspondence From Two Views", PAMI, Vol. 18, No. 2, pp. 151-160, 1996.
7) S. Avidan and A. Shashua, "Trajectory Triangulation: 3D Reconstruction of Moving Points from a Monocular Image Sequence", PAMI, Vol. 22, No. 4, pp. 348-357, 2000.
8) J. Heikkila and O. Silven, "A Four-step Camera Calibration Procedure with Implicit Image Correction".
9) P. Sturm, "A Case Against Kruppa's Equations for Camera Self-Calibration", PAMI, Vol. 22, No. 10, pp. 348-357, 2000.
10) P. Gurdjos, A. Grouzil and R. Payrissat, "Another Way of Looking at Plane-Based Calibration: the Center Circle Constraint", ECCV, 2002.
11) R. Sukthankar, R. Stockton and M. Mullin, "Smarter Presentations: Exploiting Homography in Camera-Projector Systems", ICCV, pp. 247-253, 2001.
12) R. J. Holt and A. N. Netravali, "Camera Calibration Problem: Some New Results", CVIU, No. 54, Vol. 3, pp. 368-383, 1991.
13) P. Sturm and S. Maybank, "On Plane-Based Camera Calibration: A General Algorithm, Singularities, Applications", CVPR, pp. 432-437, 1999.
14) H. Wu, T. Wada and Q. Chen, "Robot Body Guided Camera Calibration", IPSJ SIG Notes, 2003-CVIM-136, pp. 67-74, 2003.
15) T. Wada, "Color-Target Detection Based on Nearest Neighbor Classifier: Example Based Classification and its Applications", IPSJ SIG Notes, 2002-CVIM-134, pp. 17-24, 2002.
