Homography from Conic Intersection Camera Calibration based on Arbitrary Circular Patterns
Regular Paper
For camera calibration, point correspondence data between 3D and 2D points spreading widely over the image plane are necessary to obtain a precise result. However, when the observed area is wide, it is impractical to use a huge calibration object that covers the camera view. This paper presents a refined calibration method using circular patterns. We previously proposed a circular-pattern-based camera calibration method using a four-wheel mobile robot: by keeping a low speed and a constant steering angle, the robot can draw a big circular pattern on a plane. Our previous method, however, has the limitation that the roll angle around the optical axis is assumed to be zero. This paper presents an improved version of the camera calibration method. Our method estimates the extrinsic parameters of a fixed camera from a single circular trajectory when the focal length of the camera is known. In the case that the focal length is not given, two coplanar, non-concentric circular trajectories are enough for calibrating the extrinsic parameters and the focal length. In both cases, neither the center and radius of the circle(s) nor the speed of the robot is required. Extensive experiments on simulated as well as real images demonstrate the robustness and accuracy of our method.
Λ = diag(λ1, λ2, λ3),
V = [v1 v2 v3].

By rotating the oblique elliptical cone with V^T, we obtain

P^T Qe P = (V P̄)^T (V Λ V^{-1}) (V P̄) = P̄^T Λ P̄ = 0,   (14)

where P̄ = V^T P. Thus, the oblique elliptical cone can be expressed in the normalized form as follows:

λ1 x² + λ2 y² + λ3 z² = 0.   (15)

By comparing Eq.(10) with Eq.(15), we can estimate the values a and b as follows:

a = √(−λ3/λ1),
b = √(−λ3/λ2).   (16)

From Eq.(12) we obtain 0 < a ≤ b.

2.1.3 Converting an elliptical cone to an oblique circular cone
Our problem is to compute the rotation matrix U that transforms an elliptical cone expressed by Eq.(10) to an oblique circular cone expressed by Eq.(9) (see Figure 6). This problem can be formalized by the following equation:

U^T diag(1/a², 1/b², −1) U = k Qc.   (17)

Since U is a rotation matrix, it can be represented by three orthogonal unit vectors: U = [u1 u2 u3], where ui = [uix uiy uiz]^T (i = 1, 2, 3). We also have

U^T U = I.   (18)

From Eq.(17) we obtain

((a²+1)/a²) u1x² + ((b²+1)/b²) u1y² − ((a²+1)/a²) u2x² − ((b²+1)/b²) u2y² = 0,
((a²+1)/a²) u1x u2x + ((b²+1)/b²) u1y u2y = 0.   (19)

By simplifying Eqs.(18) and (19), we obtain

u1 = [ δ cos α,  sin α,  (−1)^(l−m) √(1−δ²) cos α ]^T,
u2 = [ (−1)^m δ sin α,  −(−1)^m cos α,  (−1)^l √(1−δ²) sin α ]^T,   (20)
u3 = [ (−1)^l √(1−δ²),  0,  −(−1)^m δ ]^T,

where

δ = √( (1 + 1/b²) / (1 + 1/a²) ),   (21)

α is a free variable, and l and m are arbitrary integers. u1, u2 and u3 are the unit vectors of the X, Y and Z axes of the coordinate system that describes the oblique circular cone. Thus u3 is the normal vector of the base plane of the oblique circular cone.

From Eq.(20) we notice that there are four possible solutions of u3. Since u3 and −u3 describe the two normal vectors of the same plane, the number of meaningfully different solutions is two.

Fig. 6 Convert an elliptical cone to an oblique circular cone (labels: elliptical cone, ellipse base; oblique circular cone, circle base)

By substituting Eq.(20) for U in Eq.(17), we can compute the values of x0/z0, y0/z0 and r/z0. Then we transform (x0/z0, y0/z0, 1) back to the coordinate system that describes the elliptical cone with U^T to compute the center C0 and the radius r of the circular base as follows:

C0 = [ −(−1)^l (b √(b²−a²) / √(1+a²)) z0,  0,  −(−1)^m ((b/a) √((1+b²)/(1+a²))) z0 ]^T,
r = (b²/a) |z0|,   (22)

where z0 remains as a scale factor that describes the distance between the origin and the plane where the circular base resides.

In consequence, the rotation matrix R that transforms an oblique elliptical cone to an oblique circular cone is obtained by R = UV.

2.2 Calibration with known focal length
Here, we describe a method to estimate the extrinsic parameters of a fixed camera by using one circular pattern.

In this paper, we assume a pinhole camera model and use two coordinate systems, as shown in Figure 7: 1) the world coordinate system (O-XYZ), and 2) the camera coordinate system (O′-X′Y′Z′). The origins of both coordinate systems are set at the projection center of the camera. The Z-axis of O-XYZ is set to be parallel to the normal vector of the ground. The Z′-axis of O′-X′Y′Z′ is set to be parallel to the optical axis of the camera. The Y-axis of O-XYZ is set to be perpendicular to the Z-axis and the Z′-axis.

The scene configuration is characterized by the following parameters: 1) the distance between the base plane (ground) and the viewpoint (projection center) zg; 2) the radius of the circle r; 3) the focal
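The cone-normalization steps of Eqs.(14)-(22) can be sketched numerically. The following is an illustrative sketch, not the authors' implementation: the function names are ours, the cone quadric is built directly from a synthetic 3D circle (rather than from an image ellipse and the focal length), and we fix the free angle α = 0 while enumerating the sign choices (−1)^l, (−1)^m that give the two meaningful solutions.

```python
import numpy as np

def cone_from_circle(center, normal, radius, n=12):
    """Quadric of the cone with apex at the origin over a 3D circle,
    recovered as the null vector of the point constraints P^T Q P = 0."""
    normal = normal / np.linalg.norm(normal)
    e1 = np.cross(normal, [1.0, 0.0, 0.0])
    if np.linalg.norm(e1) < 1e-6:                 # normal ~ x-axis: pick another seed
        e1 = np.cross(normal, [0.0, 1.0, 0.0])
    e1 /= np.linalg.norm(e1)
    e2 = np.cross(normal, e1)
    t = np.linspace(0.0, 2.0 * np.pi, n, endpoint=False)
    pts = center + radius * (np.outer(np.cos(t), e1) + np.outer(np.sin(t), e2))
    # each point gives one linear constraint on the 6 homogeneous coefficients
    A = np.array([[x*x, y*y, z*z, 2*x*y, 2*x*z, 2*y*z] for x, y, z in pts])
    q = np.linalg.svd(A)[2][-1]                   # smallest right singular vector
    return np.array([[q[0], q[3], q[4]],
                     [q[3], q[1], q[5]],
                     [q[4], q[5], q[2]]])

def circle_pose_from_cone(Q):
    """Eqs.(14)-(22): base-plane normal, center/z0 and r/|z0| of the
    circular sections of an oblique elliptical cone (two solutions)."""
    w, V = np.linalg.eigh(Q)
    if np.sum(w > 0) == 1:                        # fix overall sign: signature (+,+,-)
        w, V = np.linalg.eigh(-Q)
    lam1, lam2, lam3 = w[2], w[1], w[0]           # lam1 >= lam2 > 0 > lam3
    V = V[:, [2, 1, 0]]
    a = np.sqrt(-lam3 / lam1)                     # Eq.(16), so 0 < a <= b
    b = np.sqrt(-lam3 / lam2)
    delta = np.sqrt((1 + 1/b**2) / (1 + 1/a**2))  # Eq.(21)
    s = np.sqrt(max(1.0 - delta**2, 0.0))
    sols = []
    for sl, sm in [(1, 1), (-1, 1)]:              # the two distinct plane normals
        u3 = np.array([sl * s, 0.0, -sm * delta])             # Eq.(20), alpha = 0
        c0 = np.array([-sl * b * np.sqrt(b**2 - a**2) / np.sqrt(1 + a**2),
                       0.0,
                       -sm * (b / a) * np.sqrt((1 + b**2) / (1 + a**2))])  # Eq.(22), z0 = 1
        sols.append({"normal": V @ u3,            # base-plane normal, original frame
                     "center_over_z0": V @ c0,    # circle center divided by z0
                     "r_over_z0": b**2 / a})      # circle radius divided by |z0|
    return sols
```

Picking the candidate whose normal matches prior knowledge (e.g., the ground-plane direction) resolves the two-fold ambiguity discussed above; the scale z0 is then fixed by any known absolute distance, such as the camera height zg.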
Vol. 139 No. SIG 139(CVIM 2) Homography from Conic Intersection: Camera Calibration based on Arbitrary Circular Patterns
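The experiments below take image ellipses as input, and in the real-robot setting the system must decide whether an observed closed trajectory is an ellipse. As one illustrative possibility (not the paper's detector), a conic can be fitted to extracted trajectory points by linear least squares and classified by its discriminant:

```python
import numpy as np

def fit_conic(xs, ys):
    """Fit A x^2 + B xy + C y^2 + D x + E y + F = 0 to 2D points by
    minimizing |M c| subject to |c| = 1 (smallest right singular vector)."""
    M = np.column_stack([xs*xs, xs*ys, ys*ys, xs, ys, np.ones_like(xs)])
    return np.linalg.svd(M)[2][-1]        # coefficients (A, B, C, D, E, F)

def is_ellipse(c):
    """A real non-degenerate conic is an ellipse iff B^2 - 4AC < 0."""
    A, B, C = c[0], c[1], c[2]
    return B*B - 4*A*C < 0
```

On noisy trajectories one would add a residual threshold and a bias-corrected fit; this plain sketch only illustrates the ellipse test.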
Fig. 8 An example of ellipses synthesized by CG

3.1.1 One circle
When f is given, the extrinsic parameters can be estimated from a single circle. We have tested our method using the ellipses in Figure 8 (case-1). 48 ellipses are selected as input data from the bottom to the sixth line; the resulting parameters are denoted by β1 and θ1. 52 ellipses are selected from the top to the third line; the resulting parameters are denoted by β2 and θ2. Also, the calibration result using 37 ellipses in case-2 is denoted by β3 and θ3.
The calibration results described above are summarized in Table 1. From these results, we can confirm that our method provides precise calibration results: in the case of tiny reference ellipses (worst case), the RMS errors are less than 0.52 and 0.13 [degree] for β and θ, respectively. In the better case, it provides very accurate results, of which the RMS errors for β and θ are less than 0.09 and 0.08 [degree].

3.1.3 Comparison
In the case of f = 200 [pixel], θ = 38 [degree] and β = −10 [degree], we synthesized a perspective projection image including a square whose side length is 2.0 [m] and its inscribed circle (Figure 9), as well as an orthographic projection image of the same scene. Using these images, we compared our method with the homography based approach.

Fig. 9 An image for comparison of our method and the homography method

Assuming that the correspondence between the four vertices of the square on the two images has been established, we calculated the homography matrix, and then decomposed it into the rotation matrix, the translation, and the surface normal vector
of the plane. The results are (θ1 = 36.62 [degree], β1 = 11.28 [degree]) and (θ2 = 38.11 [degree], β2 = 9.71 [degree]). We determined that the latter result is valid, because it has smaller errors.
On the other hand, we applied our method to the same image. The result is θ = 37.94 [degree] and β = 10.22 [degree].
The errors of the four-point-correspondence method are 0.11 [degree] and 0.29 [degree] for θ and β, respectively. Our method achieves smaller errors: 0.06 [degree] and 0.22 [degree] for θ and β, respectively.
We notice that the conditions of these calibrations are slightly different. That is, square-based calibration requires four-point correspondence data, while our method requires f and a single image of a single circle, but no point correspondence data.
If we regard the above conditions as balanced in some sense, we can claim that our method provides more precise results than four-point correspondence calibration.

3.2 Experiments with real data
3.2.1 Acquisition of an Environmental Map
Our calibration method has been implemented as an initialization procedure of a robot control system. Our system consists of a four-wheel RC car with red and green LEDs, a fixed external camera (SONY DFW-VL500), and a PC with a radio controller, as shown in Figure 10 (a). In the system, the motion trajectory of the RC car is observed by detecting the LEDs from the images at video rate by the color target detector 15).

Fig. 10 Experiment scenery: (a) System (labels: DFW-VL500 camera with pan-tilt unit, RD-6000 radio controller, dual-CPU PC with PIC and PIO, RC car with LEDs). (b) Test environment

When the system is started, it performs the camera calibration automatically. In this calibration, the RC car moves at low speed while keeping a constant steering angle. When the observed trajectory is closed, the system examines whether the trajectory is an ellipse or not. If the trajectory is an ellipse, the system performs the camera calibration. After this calibration, the robot trajectory in the image plane can be transformed into the orthogonal view, i.e., we can monitor the real RC car movement in the scene. Figure 10 (b) shows a test environment.
Figure 11 (a) shows the robot trajectory when performing environment map generation using the robot body. That is, if we detect that the robot movement is blocked regardless of the action command, then we know that there must be an obstacle; if the LEDs disappear from the image, then we know that there must be an occluding object hiding the robot from the camera view. In this way, we can generate the environment map. The resulting map is shown in Figure 11 (b). Both are viewed from the virtual camera, and the real environment is shown in Figure 10 (b).

Fig. 11 (a) Some robot motion loci. (b) An obtained environmental map

3.2.2 Accuracy Evaluation with other scenes
Our method is not limited to camera calibration using a mobile robot. It can be applied to other scenes (see Figure 12), e.g., manholes on flat roads, CD discs on a table, widening rings on the water surface, and so on.

IPSJ Transactions on Computer Vision and Image Media Feb. 2098

4. Conclusion
This paper presented a new camera calibration method using circular pattern(s) drawn by a four-wheel robot. This approach can solve the problem of estimating the extrinsic parameters between the camera and far flat objects.
Estimating a homography matrix from point correspondence data suffers from the alternative decomposition problem. For solving this problem,
two planar patterns or three cameras are used. Our method also produces dual solutions when referring to a single circle. However, by referring to two or more circles on the same plane, our method produces a unique solution. This means that we can perform the camera calibration without adding planes or cameras.
In addition, we can realize an automatic camera calibration system based on our method. We will extend our method to estimate the image center and the radial distortion parameters in future work.

Acknowledgments: This research was partially supported by the Ministry of Education, Science, Sports and Culture, Grant-in-Aid for Scientific Research (A), 12308016, 2000. The authors are grateful to Associate Professor T. Nakamura (Wakayama University) for helping with the real robot experiments.

Fig. 12 Other scenes: (a) Manholes on flat roads. (b) CD discs on a table. (c) Widening rings on the water surface.

References
1) O. Faugeras, "Three-Dimensional Computer Vision: A Geometric Viewpoint", MIT Press, 1993.
2) X. Meng and Z. Hu, "A New Easy Camera Calibration Technique Based on Circular Points", Pattern Recognition, Vol. 36, pp. 1155-1164, 2003.
3) G. Wang, F. Wu and Z. Hu, "Novel Approach to Circular Points Based Camera Calibration".
4) J.S. Kim, H.W. Kim and I.S. Kweon, "A Camera Calibration Method using Concentric Circles for Vision Applications", ACCV, pp. 23-25, 2002.
5) C. Yang, F. Sun and Z. Hu, "Planar Conic Based Camera Calibration", ICPR, 2000.
6) L. Quan, "Conic Reconstruction and Correspondence From Two Views", PAMI, Vol. 18, No. 2, pp. 151-160, 1996.
7) S. Avidan and A. Shashua, "Trajectory Triangulation: 3D Reconstruction of Moving Points from a Monocular Image Sequence", PAMI, Vol. 22, No. 4, pp. 348-357, 2000.
8) J. Heikkila and O. Silven, "A Four-step Camera Calibration Procedure with Implicit Image Correction".
9) P. Sturm, "A Case Against Kruppa's Equations for Camera Self-Calibration", PAMI, Vol. 22, No. 10, pp. 348-357, 2000.
10) P. Gurdjos, A. Crouzil and R. Payrissat, "Another Way of Looking at Plane-Based Calibration: the Center Circle Constraint", ECCV, 2002.
11) R. Sukthankar, R. Stockton and M. Mullin, "Smarter Presentations: Exploiting Homography in Camera-Projector Systems", ICCV, pp. 247-253, 2001.
12) R.J. Holt and A.N. Netravali, "Camera Calibration Problem: Some New Results", CVIU, Vol. 54, No. 3, pp. 368-383, 1991.
13) P. Sturm and S. Maybank, "On Plane-Based Camera Calibration: A General Algorithm, Singularities, Applications", CVPR, pp. 432-437, 1999.
14) H. Wu, T. Wada and Q. Chen, "Robot Body Guided Camera Calibration", IPSJ SIG Notes, 2003-CVIM-136, pp. 67-74, 2003.
15) T. Wada, "Color-Target Detection Based on Nearest Neighbor Classifier: Example Based Classification and its Applications", IPSJ SIG Notes, 2002-CVIM-134, pp. 17-24, 2002.