Calibration method for the relative orientation between the rotation axis and a camera using constrained global optimization
Measurement Science and Technology Meas. Sci. Technol. 28 (2017) 055001 (9pp)
doi:10.1088/1361-6501/aa5fd4
Zhenqi Niu, Kuo Liu, Yuemin Wang, Shujun Huang, Xiaoting Deng and Zonghua Zhang

School of Mechanical Engineering, Hebei University of Technology, Tianjin 300130, People's Republic of China

E-mail: [email protected] and [email protected]

Received 25 October 2016, revised 8 February 2017
Accepted for publication 10 February 2017
Published 3 March 2017

Abstract
This paper proposes a novel method for calibrating the relative orientation of a camera fixed on a rotation axis using a constrained global optimization algorithm. Because the camera cannot directly 'see' the rotation axis, the calibration procedure uses two checkerboards and another camera with a large viewing angle. One small checkerboard is rotated with the camera to determine the rotation axis; the other, larger checkerboard calibrates the relative orientation of the two cameras. The determined rotation axis and the optical axis of the calibrated camera can be represented in the same camera coordinate system, so that we can calculate their relative orientation. In our experimental results, the mean squared error of the angle between the rotation axis and the optical axis of the calibrated camera was 0.027°. Under the given conditions, the two checkerboards and the camera can be placed freely. Therefore, the proposed calibration method is flexible, effective, and accurate.

Keywords: rotation axis calibration, camera calibration, PCA (principal component analysis), constrained global optimization

(Some figures may appear in colour only in the online journal)
1. Introduction
Digital and depth cameras have been extensively used to obtain texture and 3D scene information in various fields [1, 2]. However, such a camera can only obtain data from the field of view directed by its optical axis at any one time. To obtain scene information from a larger field of view, the camera must be translated or rotated to capture a series of images [3]. In particular, the camera may be rotated around a rotation axis. It is generally difficult to ensure that the optical center of the camera stays on the rotation axis, and no compensation can be made to reduce the resulting error when the relative orientation between the rotation axis and the camera is unknown. As a result, the subsequent point cloud registration and integration based on the rotated angle cannot be implemented. Therefore, it is important to calibrate the relationship between the rotation axis and the camera.

Turntable calibration for implementing rotation scanning has been extensively studied. Chen et al [4] fixed a checkerboard plate on a turntable and used a camera to take photos to retrieve 3D corner points from a 360° viewpoint. They then constructed the circular trajectory plane equation of the retrieved corner points and obtained the normal vector along the rotation axis direction using the least squares method. However, this method has two disadvantages that cannot be ignored. Firstly, to acquire image sequences of the checkerboard lying flat on the turntable, the camera must look down at the checkerboard and turntable so that the
© 2017 IOP Publishing Ltd Printed in the UK
Figure 1. Schematic diagram of the relationship between camera and rotation axis: (a) separate camera and rotation axis; and (b) camera fixed to the rotation axis.
angle between the optical axis of the camera and the rotation axis is close to 17° (see Chen et al [4]). However, this type of structure is limiting in real industrial applications, because important information, especially from the bottom of the measured object, is easily occluded when the object shape is complex. Generally, the camera of a 3D measurement system [5, 6] cannot accurately acquire image sequences of a checkerboard from a 360° viewpoint when the angle between the optical axis of the camera and the rotation axis is close to 90°, as shown in figure 1(a), which can increase the calibration error. Secondly, the normal vector was calculated from the trajectory plane alone, ignoring the impact of the circular trajectory. Consequently, the accuracy of the rotation axis estimate cannot be guaranteed. Moreover, the calibrated rotation axis and the camera are separate.

Li et al [7] mounted a criterion sphere with known diameter on a turntable to calibrate the rotation axis. They used several rotational angles of the turntable and two different sphere heights to determine the rotational axis of the turntable. However, this method requires the positions of several centers of the criterion sphere at different angles and heights, and then fits the centers of each trajectory circle to determine the orientation of the rotation axis. The procedure is therefore time-consuming, because of the processes required to set and fit the sphere, and relies on a highly accurate device.

Park et al [8] fixed a checkerboard calibration plate on a turntable and used a camera to take photos to retrieve 3D corner points from a certain angle. The rotation axis was obtained using the coordinate transformation of the checkerboard. However, this method does not take into account that the rotational angle of the calibration plate is limited, which increases the calibration error when the binocular camera has a small effective field of view. Moreover, the camera and the rotation axis are separate.

Pang et al [9] proposed an automatic method for calibrating the turntable axis without any calibration tools. Given a scan sequence of the input object, the initial rotation axis was first recovered from an automatic registration step, and an iterative procedure was then applied to obtain the optimized turntable axis. However, this method depends on the precision of the iterative and registration procedures. Moreover, the camera and the rotation axis are separate.

Therefore, all existing rotation axis calibration methods are only suitable in situations where the camera and rotation axis are separate, as shown in figure 1(a). However, in some cases, the camera is fixed on the rotation axis, which means it cannot 'see' the rotation axis, as illustrated in figure 1(b).
To accurately merge multiple images, we must determine the relative orientation between the camera and the rotation axis. To the best of our knowledge, there are no published methods for calibrating this orientation. This paper proposes a novel method for calibrating the orientation between a camera (called Camera 1 in the following text) and a rotation axis, using two calibration plates and another large viewing-angle camera (called Camera 2) together with the constrained global optimization algorithm [4]. The two calibration plates are differently sized checkerboards: one is small (Checkerboard 1) and used to determine the rotation axis in the Camera 2 coordinate system; the other is large (Checkerboard 2) and used to calibrate the relative orientation of the two cameras. Checkerboard 1 is rotated with Camera 1 around the rotation axis, and Camera 2 captures an image of Checkerboard 1 at each rotated position. The process of calculating the normal vector takes the impact of the circular trajectory into account. In general, the rotation axis passes through all the circular trajectory planes at their circle centers. Therefore, each trajectory circle and its center are determined by fitting a circle using the global least squares method. Using the coordinates of the circle centers, the rotation axis is fitted using PCA (principal component analysis) in the Camera 2 coordinate system. The relationship between Camera 1 and the rotation axis is thereby obtained in the same camera coordinate system. Our experimental results confirm that the proposed method is accurate, flexible, and feasible.

In section 2, we introduce the principles of the calibration method and its implementation. We describe experiments that validate the high precision and flexibility of the proposed method in section 3. Section 4 contains our conclusions and future research directions.

2. Principle

Camera 1 cannot 'see' the rotation axis, so an additional auxiliary facility is needed to determine their relationship.
The procedure for calibrating the rotation axis using Camera 2 and two checkerboards is shown in the flowchart in figure 2. First, the internal parameters of the two cameras are calibrated using multiple views of the two checkerboards. Second, by capturing images of Checkerboard 2, the relative position of the two cameras is determined in the plate coordinate system. Third, while Checkerboard 1 rotates with Camera 1 around the rotation axis, a series of Checkerboard 1 images is captured by Camera 2. At each rotation position, we determine the corner points of Checkerboard 1. The extracted points for each checkerboard corner are used to construct a circle centered on the rotation axis, which is calculated using the constrained global least squares method [10]. Fourth, based on the calculated coordinates of all the center points, the rotation axis is fitted using PCA. Finally, we obtain the direction vector of the rotation axis and an arbitrary point on the rotation axis in the calibrated camera coordinate system. The following subsections elaborate on the details of each step.

Figure 2. Procedure for calibrating the rotation axis.

2.1. Internal parameters of the camera

The internal parameters of a camera include two focal lengths (fu and fv), two principal point coordinates (u0 and v0), and four image symmetric radial and decentering distortion coefficients (k1, k2, p1, and p2) [11-15]. Two highly accurate checkerboards were manufactured to calibrate the internal parameters of the camera and its orientation. These internal parameters were calculated from the captured checkerboard image series using the general camera calibration procedure [16]. Cameras 1 and 2 were calibrated using Checkerboard 2. Generally, a real camera imaging model can be divided into two parts: the pinhole camera model and the lens distortion model. On the basis of pinhole camera imaging and perspective geometry theory, the ideal camera model is shown in figure 3.

Figure 3. Ideal pinhole camera model.

A 3D point is denoted by $P_w = [X, Y, Z]^T$. A 2D point is denoted by $P_u = [u, v]^T$, which is the ideal imaging point of $P_w$. We use $\tilde{x}$ to denote the augmented vector formed by adding 1 as the last element: $\tilde{P}_u = [u, v, 1]^T$ and $\tilde{P}_w = [X, Y, Z, 1]^T$. The relationship between $P_w$ and $P_u$ is given by

$$\lambda \tilde{P}_u = A [R \; t] \tilde{P}_w, \quad \lambda \neq 0 \qquad (1)$$

where $R$ is a 3 × 3 orthogonal rotation matrix; $t$ is a 3 × 1 translation vector; $P_u$ represents the image coordinates of the object in the computer coordinate system (in pixels); $P_w$ represents the 3D coordinates of the object in the 3D world coordinate system; $\lambda$ is an arbitrary scale factor such that $\lambda \neq 0$; and $A$ is the internal parameter matrix, defined as

$$A = \begin{bmatrix} f_u & 0 & u_0 \\ 0 & f_v & v_0 \\ 0 & 0 & 1 \end{bmatrix}$$

where fu and fv are the effective focal lengths (in pixels) of the camera in the x and y directions, respectively, and u0 and v0 are the principal point coordinates of the camera. Lens distortions are corrected using

$$\begin{cases} x_d = x_u + x_u (k_1 r^2 + k_2 r^4) + 2 p_1 x_u y_u + p_2 (r^2 + 2 x_u^2) \\ y_d = y_u + y_u (k_1 r^2 + k_2 r^4) + p_1 (r^2 + 2 y_u^2) + 2 p_2 x_u y_u \end{cases} \qquad (2)$$

where $r^2 = x_u^2 + y_u^2$, and the parameters k1, k2, p1, and p2 are the image symmetric radial and decentering distortion coefficients. The camera's internal parameters can be calibrated using multiple views of the checkerboard pattern by applying the plane-based calibration algorithm. The non-linear nature of equation (2) means that we must use an iterative algorithm to estimate the parameters by minimizing the error between the model and the observations. Typically, this procedure is performed using a least squares fit that minimizes the sum of the squared errors. The objective function is expressed as

$$J = \sum_{i=1}^{N} \left( (x_{mi} - x_{di})^2 + (y_{mi} - y_{di})^2 \right) \qquad (3)$$

where $(x_{di}, y_{di})$ are the real image coordinates and $(x_{mi}, y_{mi})$ are the estimated image coordinates, computed from the 3D world coordinates using the camera projection model.

2.2. Relative orientation of two cameras
The purpose of this step is to determine the relative orientation of the two cameras. The relationship between the two cameras can be calibrated using a checkerboard (called Checkerboard 2), as shown in figure 4. In principle, Checkerboard 2 can be fixed anywhere in the common field of view of the two cameras. To accurately calculate the external parameters, the surface normal of Checkerboard 2 is placed approximately along the angular bisector of the imaging axes of the two cameras.

Figure 4. Calibrating the orientation of two cameras.

Assuming that an arbitrary corner point coordinate of Checkerboard 2 is P1, the corresponding point coordinate in the Camera 1 coordinate system is Pc1, and the corresponding point coordinate in the Camera 2 coordinate system is Pc2, the transformations from Checkerboard 2 coordinates to Camera 1 coordinates and Camera 2 coordinates can be obtained using

$$R_{c1} P_1 + T_{c1} = P_{c1} \qquad (4)$$

and

$$R_{c2} P_1 + T_{c2} = P_{c2} \qquad (5)$$

where Rc1 and Rc2 are 3 × 3 orthogonal rotation matrices, and Tc1 and Tc2 are 3 × 1 translation vectors. Equations (4) and (5) relate the world coordinate system to the Camera 1 and Camera 2 coordinate systems, respectively. Therefore, the relative relationship between Cameras 1 and 2 can be represented as

$$R_{c2} R_{c1}^{-1} P_{c1} - R_{c2} R_{c1}^{-1} T_{c1} + T_{c2} = P_{c2} \qquad (6)$$

2.3. Rotation axis calculation

Figure 5 shows the structural diagram of the calibration system. It contains two checkerboards (Checkerboard 1 and Checkerboard 2), the calibrated camera (Camera 1), which is fixed on the rotation axis, and another large viewing-angle camera (Camera 2). Before rotating the rotation axis, Checkerboard 2 was located in the views of Cameras 1 and 2 to determine the orientation between the two cameras. During the 360° rotation process, Checkerboard 1 was always in the field of view of Camera 2. An arbitrary point rotated around the rotation axis creates a closed circular trajectory on a plane [17], and all the circle centers are located on the rotation axis. Therefore, a small checkerboard (Checkerboard 1) can be fixed on Camera 1 to calculate the rotation axis in the Camera 2 coordinate system, as shown in figure 5. When Checkerboard 1 rotates around the rotation axis, Camera 2 captures a series of images. At each position, we extract the corner points of the captured Checkerboard 1. Each circle center can be obtained from the extracted coordinates of the same corner point using the constrained global least squares method [18, 19]. Although Camera 2 is a high resolution camera (used to capture the images of Checkerboard 1), the extracted corner points are not located exactly on a plane because of noise. They must be projected onto the circular trajectory plane to calculate the center of each circle.

Figure 5. Structural diagram of the calibration system. Camera 1 and Checkerboard 1 are fixed on the rotation axis.

Assuming that the 3D corner points of Checkerboard 1 are (Xi, Yi, Zi) in the Camera 2 coordinate system, the corresponding projection points (xi, yi, zi) on the circular trajectory plane are the intersections of the plane with the lines through the 3D corners (Xi, Yi, Zi) along the normal vector of the plane:

$$\begin{cases} x_i = X_i - \dfrac{a (a X_i + b Y_i + c Z_i + d)}{a^2 + b^2 + c^2} \\[4pt] y_i = Y_i - \dfrac{b (a X_i + b Y_i + c Z_i + d)}{a^2 + b^2 + c^2} \\[4pt] z_i = Z_i - \dfrac{c (a X_i + b Y_i + c Z_i + d)}{a^2 + b^2 + c^2} \end{cases} \qquad (7)$$

where a, b, c, and d are the coefficients of the fitted plane equation, estimated using the least squares method. If the center coordinates and radius of the circular trajectory are (A, B, C) and R, respectively, they can be solved using the constrained global least squares method. That is,

$$\min \sum_i \delta_i^2 = \min \sum_i \left( \sqrt{(x_i - A)^2 + (y_i - B)^2 + (z_i - C)^2} - R \right)^2, \quad \text{subject to} \quad aA + bB + cC + d = 0 \qquad (8)$$

where a, b, c, and d have the same meanings as in equation (7); the constraint forces the fitted center to lie on the trajectory plane. When the plane of Checkerboard 1 is not perpendicular to the rotation axis, the circular trajectories of the corner points do not lie in the same plane, and all the centers of the circular trajectories lie along the rotation axis. Using the coordinates of all the center points, the rotation axis is fitted using PCA in the Camera 2 coordinate system. A vector $\vec{n}$ denotes the direction of the rotation axis in the Camera 2 coordinate system.

2.4. Relationship between rotation axis and camera

The rotation axis is calculated in the Camera 2 coordinate system. The rotation axis parameters are generally defined in two parts: a direction vector $\vec{n}$ and an arbitrary point q0 on the axis [4, 13, 14]. To determine the relative orientation between the rotation axis and Camera 1, the rotation axis must be transformed into the Camera 1 coordinate system. However, Camera 1 rotates around the rotation axis, so the transformation between Camera 1 and Camera 2 changes with the rotation angle. To obtain an accurate relative orientation, the transformation from Camera 1 at each rotational angle to Camera 2 must therefore be calibrated. Before rotating Camera 1, Checkerboard 2 is randomly placed in the common field of view of the two cameras to determine their relative orientation, as demonstrated in figure 5. Assuming that an arbitrary corner point coordinate of Checkerboard 2 is P2, the corresponding point coordinates in the Camera 2 and Camera 1 coordinate systems are Psv2 and Px2, respectively. Similarly, assuming that an arbitrary corner point coordinate of Checkerboard 1 is P1, the corresponding point coordinates in the Camera 2 and Camera 1 coordinate systems are Psv1 and Px1, respectively. The transformations from Checkerboard 2 to Camera 2 and Camera 1 can be calculated using

$$R_{sc2} P_2 + T_{sc2} = P_{sv2} \qquad (9)$$

and

$$R_{xc2} P_2 + T_{xc2} = P_{x2} \qquad (10)$$

where Rsc2 and Rxc2 are 3 × 3 orthogonal rotation matrices, and Tsc2 and Txc2 are 3 × 1 translation vectors. The transformation from Checkerboard 1 to Camera 2 is

$$R_{sc1} P_1 + T_{sc1} = P_{sv1} \qquad (11)$$

where Rsc1 is a 3 × 3 orthogonal rotation matrix and Tsc1 is a 3 × 1 translation vector. According to equations (9)-(11), the transformation from Camera 1 to Checkerboard 1 can be represented by

$$R P_{x1} + T = P_1 \qquad (12)$$

with

$$\begin{cases} R = R_{sc1}^{-1} R_{sc2} R_{xc2}^{-1} \\ T = R_{sc1}^{-1} T_{sc2} - R_{sc1}^{-1} T_{sc1} - R_{sc1}^{-1} R_{sc2} R_{xc2}^{-1} T_{xc2} \end{cases} \qquad (13)$$

where R is a 3 × 3 orthogonal rotation matrix and T is a 3 × 1 translation vector. Based on equations (11)-(13), the transformation from Camera 1 at each rotational angle to Camera 2 is

$$R_{sc1} R P_{x1} + R_{sc1} T + T_{sc1} = P_{sv1} \qquad (14)$$

The direction vector of the rotation axis and an arbitrary point on the axis in the Camera 1 coordinate system are then denoted as

$$\begin{cases} \vec{n}_i = (R_{sc1} R)^{-1} (\vec{n} - R_{sc1} T - T_{sc1}) \\ q_i = (R_{sc1} R)^{-1} (q_0 - R_{sc1} T - T_{sc1}) \end{cases} \qquad (15)$$

The relative orientation between an arbitrary point on the rotation axis and the rotation axis remains unchanged while the axis rotates. To verify the accuracy of the estimated rotation axis, we calculated the angle between the rotation axis and the optical axis of Camera 1. The vector of the optical axis of Camera 1 is denoted as

$$\vec{n}_x = [0, 0, m]^T, \quad m > 0 \qquad (16)$$

The angle between the optical axis of Camera 1 and the rotation axis at each rotational angle can then be represented as

$$\cos \theta = \frac{\vec{n}_x \cdot \vec{n}_i}{\|\vec{n}_x\| \, \|\vec{n}_i\|} \qquad (17)$$

3. Experiments and results

3.1. Experimental system

To test the proposed method, we set up an experimental system comprising the camera to be calibrated (Camera 1), a large viewing-angle camera (Camera 2), two checkerboards, and a rotation axis, as shown in figure 6. Checkerboard 1 has 8 × 11 white-and-black checkers of size 20 mm × 20 mm; Checkerboard 2 has 8 × 11 white-and-black checkers of size 45 mm × 45 mm. Before rotating the rotation axis, Checkerboard 2 was located in the views of Cameras 1 and 2 to determine the orientation between the two cameras. During the 360° rotation process, Checkerboard 1 was always in the field of view of Camera 2. The RGB camera of a Kinect V1 [20] was used as Camera 1, which was fixed on the rotation axis. Camera 2 was an SVS-Vistek camera from Germany, model ceo655CVGE. It had a CS-mount lens interface, a pixel size of 3.45 µm × 3.45 µm, a resolution of 2448 × 2050, a frame rate of 10 fps at full resolution, and a GigE interface. To obtain a large measurement field, we used a lens with a fixed focal length of 8 mm and a C-mount interface. Because of the CS-mount of the chosen camera, we added a 5 mm tube between the lens and Camera 2.

3.2. Data capturing and processing

Forty-five images of Checkerboard 2 under different orientations were captured to calibrate the internal parameters of the two cameras. We detected 88 corner points of the checkerboard, taken as the intersections of straight lines fitted to each square. We used Zhang's calibration method [12, 16, 21] to obtain the internal parameters of the two cameras, which are shown in table 1. The reprojection error is defined as the difference between the detected sub-pixel corners and the reprojected grid corners based on the intrinsic and extrinsic parameters. The reprojection error of Camera 1 from camera calibration is shown in the
form of color crosses in figure 7(a). The mean reprojection error is 0.181 and 0.163 pixels in the horizontal and vertical directions, respectively. The reprojection error of Camera 2 from camera calibration is shown in the form of color crosses in figure 7(b). The mean reprojection error is 0.091 and 0.084 pixels in the horizontal and vertical directions, respectively.

Figure 6. Experimental system. Camera 1 and Checkerboard 1 are fixed on the rotation axis.

Table 1. Internal parameters of the two calibrated cameras.

Parameter | Camera 1 | Camera 2
Focal length (fu, fv) (pixel) | (1078.149, 1077.528) | (2416.132, 2416.858)
Principal point (u0, v0) (pixel) | (636.381, 501.413) | (1232.240, 994.304)
Distortion coefficients (k1, k2, p1, p2) | (0.0483, −0.1395, −0.0014, −0.0009) | (−0.1052, 0.2004, −0.0016, 0.0005)

To test the proposed calibration method, we rotated Checkerboard 1 by 360° around the rotation axis. Camera 2 captured 24 images of Checkerboard 1 spaced 15° apart. Eight images spaced 45° apart were used to calibrate the parameters of the rotation axis, and the other 16 images were used to verify the accuracy of the calibrated results (as described in the following subsection). We extracted the corner points of Checkerboard 1 (the order of the corner points at one position is shown in figure 8) from the eight captured images to fit the circular trajectory of each of the 88 corner points using equation (8). The order of the corner points must remain unchanged in all captured images. Using the center points of all the circles, we estimated the rotation axis direction vector $\vec{n} = [n_x \; n_y \; n_z]^T$ using PCA and calculated an arbitrary point q0 on the axis in the Camera 2 coordinate system. The results are listed in table 2.

Before rotating Camera 1, Checkerboard 2 was randomly placed in the common field of view of the two cameras to determine their relative orientation. We estimated the angle between the rotation axis and the optical axis of Camera 1 using equation (17), as shown in table 3. The mean value of the angle was 90.883°, and the mean squared error was 0.0161°.

3.3. Analysis and evaluation

The estimated rotation axis parameters are affected by the precision of the spatial circle fit and the line fit. To verify the accuracy of the fitting algorithm, we calculated the mean squared errors of the 88 fitted circles, as shown in figure 9. The corner number in figure 9 corresponds to the order of the corner points in figure 8. The maximum error was below 0.708 mm. To simultaneously obtain images of Checkerboard 1 and Checkerboard 2, the rotation axis was located near the center of the image, and the angle between the optical axis of Camera 2 and the rotation axis was close to 58.9°. There are therefore two reasons why the errors in figure 9 show a slight increasing trend. The first reason is that the accuracy of the corner points of Checkerboard 1 was influenced to different degrees by the distortion of Camera 2. The second reason is that the extraction accuracy of the corner points of Checkerboard 1 was influenced by the different sizes of the checkerboard in the captured images. We used the calculated circle centers and radii to determine each fitted circle. The mean squared errors of the 88 fitted circles are then defined as
Figure 9. Mean squared error of the 88 fitted circles.

Table 2. Direction vector of the rotation axis and an arbitrary point on the rotation axis.

Parameter | Estimated value
nx | −0.00340
ny | 0.856
nz | 0.517
q0 (mm) | [−55.482, 1097.443, 1307.016]

Table 3. Calibration results for the angle between the rotation axis and the optical axis of Camera 1.

Rotational angle (°) | Estimated value (°)
0 | 90.868
45 | 90.893
90 | 90.922
135 | 90.856
180 | 90.863
225 | 90.912
270 | 90.887
315 | 90.864
Figure 7. Error obtained from camera calibration. (a) Reprojection error of each calibration image of Camera 1. (b) Reprojection error of each calibration image of Camera 2.
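The reprojection error discussed above can be reproduced directly from equations (1) and (2). Below is a hedged NumPy sketch of the projection and error computation (the authors used the calibration toolbox of [16]; function names here are ours, and any parameter values are illustrative):

```python
import numpy as np

def project_points(Pw, R, t, K, dist):
    """Pinhole projection (equation (1)) followed by the radial/decentering
    distortion model (equation (2)). dist = (k1, k2, p1, p2)."""
    k1, k2, p1, p2 = dist
    Pc = Pw @ R.T + t                                   # world -> camera frame
    xu, yu = Pc[:, 0] / Pc[:, 2], Pc[:, 1] / Pc[:, 2]   # normalized coordinates
    r2 = xu**2 + yu**2
    radial = k1 * r2 + k2 * r2**2
    xd = xu + xu * radial + 2 * p1 * xu * yu + p2 * (r2 + 2 * xu**2)
    yd = yu + yu * radial + p1 * (r2 + 2 * yu**2) + 2 * p2 * xu * yu
    fu, fv, u0, v0 = K[0, 0], K[1, 1], K[0, 2], K[1, 2]
    return np.stack([fu * xd + u0, fv * yd + v0], axis=1)

def mean_reprojection_error(Pw, detected, R, t, K, dist):
    """Per-axis mean absolute difference (pixels) between reprojected grid
    corners and detected sub-pixel corners."""
    return np.mean(np.abs(project_points(Pw, R, t, K, dist) - detected), axis=0)
```

Minimizing the sum of squared residuals of `project_points` over the intrinsic and extrinsic parameters corresponds to the objective function of equation (3).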
Figure 10. Distance between the 88 circle centers and the fitted line.
$$\text{err} = \frac{1}{m} \sum_{i=0}^{m} \left( \sqrt{(x_i - A)^2 + (y_i - B)^2 + (z_i - C)^2} - R \right)^2 \qquad (18)$$

where (A, B, C) and R are the center coordinates and radius of the circular trajectory, (xi, yi, zi) is calculated using equation (7), and m is the number of images captured to calibrate the rotation axis.
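Equations (7), (8), and (18) can be prototyped as follows. This sketch replaces the authors' constrained global least squares solver with a simple algebraic (Kasa) circle fit expressed in an in-plane basis, which satisfies the constraint aA + bB + cC + d = 0 by construction; it is an illustration under that simplification, not the published solver:

```python
import numpy as np

def project_to_plane(P, plane):
    """Orthogonally project 3D points onto the plane a x + b y + c z + d = 0
    (equation (7))."""
    a, b, c, d = plane
    n = np.array([a, b, c], dtype=float)
    dist = (P @ n + d) / (n @ n)
    return P - np.outer(dist, n)

def fit_circle_on_plane(P, plane):
    """Fit a circle to 3D points constrained to the given plane."""
    a, b, c, d = plane
    n = np.array([a, b, c], dtype=float)
    n = n / np.linalg.norm(n)
    Q = project_to_plane(P, plane)            # points now satisfy the plane equation
    origin = Q.mean(axis=0)
    u = np.cross(n, [1.0, 0.0, 0.0])          # orthonormal in-plane basis (u, v)
    if np.linalg.norm(u) < 1e-8:
        u = np.cross(n, [0.0, 1.0, 0.0])
    u = u / np.linalg.norm(u)
    v = np.cross(n, u)
    xy = np.stack([(Q - origin) @ u, (Q - origin) @ v], axis=1)
    # Kasa algebraic fit: x^2 + y^2 = 2*cx*x + 2*cy*y + cc, with R^2 = cc + cx^2 + cy^2
    M = np.column_stack([2.0 * xy, np.ones(len(xy))])
    rhs = (xy ** 2).sum(axis=1)
    (cx, cy, cc), *_ = np.linalg.lstsq(M, rhs, rcond=None)
    R = np.sqrt(cc + cx**2 + cy**2)
    center = origin + cx * u + cy * v         # lifted back: lies on the plane
    return center, R

def circle_mse(P, center, R):
    """Mean squared radial residual of equation (18)."""
    d = np.linalg.norm(P - center, axis=1)
    return np.mean((d - R) ** 2)
```

Working in the in-plane basis turns the constrained 3D problem into an unconstrained 2D one; the residual returned by `circle_mse` is the quantity plotted in figure 9.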
Figure 8. Sequence of corner points of Checkerboard 1.
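The verification step rests on equation (17). A small helper for the angle computation (illustrative; the axis direction passed in would come from the transform of equation (15)):

```python
import numpy as np

def axis_camera_angle(n_axis, n_optical=(0.0, 0.0, 1.0)):
    """Angle (degrees) between the rotation axis direction and the camera
    optical axis, via cos(theta) = (n_x . n_i) / (|n_x| |n_i|) (equation (17))."""
    a = np.asarray(n_axis, dtype=float)
    b = np.asarray(n_optical, dtype=float)
    cos_t = np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b))
    return np.degrees(np.arccos(np.clip(cos_t, -1.0, 1.0)))
```

Clipping the cosine guards against values marginally outside [−1, 1] caused by floating-point rounding when the two directions are nearly parallel.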
Figure 11. Error of the angle between the rotation axis and the optical axis of Camera 1.

We then estimated the parameters of the rotation axis using the PCA algorithm, which reduces the sensitivity to noise and yields a globally optimal solution. Figure 10 shows the 88 distances between the circle centers and the fitted line; all were less than 5.4 × 10−4 mm. The corner number in figure 10 corresponds to the order of the corner points in figure 8. To further verify the accuracy of the proposed method, we calculated the angle between the rotation axis and the optical axis of Camera 1 using the other 16 images of Checkerboard 1 and equation (17). The mean squared error was 0.027° and the error of the angle was below 0.057°, as shown in figure 11. These experimental results confirm that the proposed method is very accurate.

4. Conclusion

By fixing a checkerboard on a rotation axis, this paper proposed a novel method for calibrating the relative orientation between the rotation axis and a camera using the constrained global optimization algorithm. Because the camera cannot directly 'see' the rotation axis, two checkerboards and another camera with a large viewing angle were used for the calibration procedure. We used the corner points of the checkerboard and the constraints between them to reduce the global mean error. To reduce the global mean error of the rotation axis, we calculated the coordinates of the circle centers using the constrained global least squares method. Using these center points, we fitted the rotation axis using PCA. The mean squared error of the angle between the rotation axis and the optical axis of the camera was less than 0.027°. Our experimental results demonstrate that the proposed method estimates the parameters of the rotation axis with respect to the camera effectively and accurately.

The proposed calibration method has the following advantages. (1) Flexibility: the two checkerboards and one camera can be placed freely under the given conditions. (2) High accuracy: the mean squared error of the calibrated angle was below 0.027° in our experiment. (3) Easy operation: an operator with minimal skill can implement the calibration procedure on site.

An accurate estimate of the angle between the optical axis of the camera and the rotation axis plays an important role, for example, in the fields of 3D scene reconstruction, virtual reality, and augmented reality.

Acknowledgment

The authors would like to thank the National Natural Science Foundation of China (61171048, 51675160), the Key Basic Research Project of Applied Basic Research Programs Supported by Hebei Province (grant 15961701D), the Research Project for High-level Talents in Hebei University (grant GCC2014049), and the Talents Project Training Funds in Hebei Province (No. A201500503). This project was also funded by European Horizon 2020 through the Marie Sklodowska-Curie Individual Fellowship Scheme (grant 767466-3DRM).

References

[1] Barone S, Paoli A and Razionale A V 2013 Multiple alignments of range maps by active stereo imaging and global marker framing Opt. Lasers Eng. 51 116–27
[2] Yue H, Chen W, Wu X and Liu J 2014 Fast 3D modeling in complex environments using a single Kinect sensor Opt. Lasers Eng. 53 104–11
[3] Zhao H and Shibasaki R 2003 Reconstructing textured CAD model of urban environment using vehicle-borne laser range scanners and line cameras Mach. Vis. Appl. 14 35–41
[4] Chen P, Dai M, Chen K and Zhang Z 2014 Rotation axis calibration of a turntable using constrained global optimization Optik 125 4831–6
[5] Herráez J, Martínez-Llario J, Coll E, Rodriguez J and Martin M T 2013 Design and calibration of a 3D modeling system by videogrammetry Meas. Sci. Technol. 24 390–2
[6] Li X and Wee W G 2005 Data fusion method for 3D object reconstruction from range images Opt. Eng. 44 107006
[7] Li J, Chen M, Jin X, Chen Y, Dai Z, Ou Z and Tang Q 2011 Calibration of a multiple axes 3D laser scanning system consisting of robot, portable scanner and turntable Optik 122 324–9
[8] Park S Y and Subbarao M 2005 A multi-view 3D modeling system based on stereo vision techniques Mach. Vis. Appl. 16 148–56
[9] Pang X, Lau R W H, Song Z, Li Y and He S 2014 A tool-free calibration method for turntable-based 3D scanning systems IEEE Comput. Graph. Appl. 36 52–61
[10] Patrick W, Joerg M and Lutz G 2015 Constrained ellipse fitting with center on a line J. Math. Imaging Vis. 53 364–82
[11] Zhou F and Zhang G 2005 Complete calibration of a structured light stripe vision sensor through planar target of unknown orientations Image Vis. Comput. 23 59–67
[12] Zhang Z 2000 A flexible new technique for camera calibration IEEE Trans. Pattern Anal. Mach. Intell. 22 1330–4
[13] Dewitt B A and Wolf P R 2000 Elements of Photogrammetry with Applications in GIS (New York: McGraw-Hill)
[14] Mikhail E M, Bethel J and McGlone J C 2001 Introduction to Modern Photogrammetry (New York: Wiley)
[15] Clarke T A and Fryer J G 1998 The development of camera calibration methods and models Photogramm. Rec. 16 51–66
[16] Bouguet J Y 2015 Camera calibration toolbox for Matlab (www.vision.caltech.edu/bouguetj/calib_doc/htmls/example.html)
[17] Dai M, Chen L, Yang F and He X 2013 Calibration of revolution axis for 360 degree surface measurement Appl. Opt. 52 5440–8
[18] Schaffrin B 2006 A note on constrained total least-squares estimation Linear Algebr. Appl. 417 245–58
[19] Wu Z, Hu X, Wu M and Cao J 2013 Constrained total least-squares calibration of three-axis magnetometer for vehicular applications Meas. Sci. Technol. 24 32–5
[20] Newcombe R A, Izadi S, Hilliges O, Molyneaux D, Kim D, Davison A J, Kohli P, Shotton J, Hodges S and Fitzgibbon A 2011 KinectFusion: real-time dense surface mapping and tracking IEEE Int. Symp. on Mixed and Augmented Reality pp 127–36
[21] Gu F, Zhao H, Ma Y and Bu P 2015 Camera calibration based on the back projection process Meas. Sci. Technol. 26 125004