Robot Visual Servo through Trajectory Estimation of a Moving Object using Kalman Filter

Min-Soo Kim¹, Ji-Hoon Koh¹, Ho Quoc Phuong Nguyen¹, Hee-Jun Kang²

¹ Graduate School of Electrical Engineering, University of Ulsan, 680-749, Ulsan, South Korea
² School of Electrical Engineering, University of Ulsan, 680-749, Ulsan, South Korea
[email protected]
Abstract: In this paper, a robot visual servo control algorithm is proposed by combining a conventional image-based robot visual servoing algorithm with a Kalman-filter-based trajectory estimation algorithm for a moving object. The erroneous image information of the moving object caused by imprecise camera characteristics is compensated by applying the Kalman filter to a process model of the object's motion. The algorithm is simulated and implemented with a Samsung FARA AT-2 robot and an MV50 camera, and its effectiveness is discussed for the cases with and without the trajectory estimation.
Keywords: Robot Visual Servo Control, Image based Visual Servo Control, Moving Object Model, Kalman Filter
1 Introduction

In order to give a robot more flexibility and precision for various tasks, robot visual servo control (or robot visual servoing) has been investigated by many researchers [1-5]. Robot visual servo control is generally divided into two categories, position-based control and image-based control, according to how the error signal is defined. Position-based control uses the error measured in 3-dimensional Cartesian space, while image-based control uses the error in the 2-dimensional image plane [2]. In position-based control, features are extracted from the image and used with the geometric model of the target to determine the kinematic relationship between the target and the robot. This method requires precise models of the robot and camera system, and therefore a complicated robot and camera calibration process. In image-based control, the error is measured in the image plane, so the pose estimation step is omitted. This makes the method less sensitive to the preciseness of the robot and camera models; it may therefore reduce computational delay and eliminate errors in modeling and camera calibration.

Even though image-based control has the advantages over position-based control described above, it still faces many difficulties. Due to camera characteristics such as image signal quantization error, lens distortion, and the charge-coupled device sensor, various noises inevitably occur and lead to pixel-level errors. This noisy signal yields an imprecise estimate of the error. To relieve this problem, Chang [6] suggested a 3-dimensional motion parameter estimation method for tracking a maneuvering target with a Kalman filter, and Broida and Chellappa [7] suggested a dynamic model of a rigid object with a Kalman filter. These approaches were developed for position-based control.

In this paper, a robot visual servo control algorithm is proposed by combining the conventional image-based robot visual servoing algorithm [3] with the moving object model of LaValle [9] through the extended Kalman filter. Computer simulation and real implementation have been performed with a FARA AT-2 robot and an MV50 camera to show the effectiveness of this algorithm. The results are discussed in terms of the pixel error radii in the image plane with and without the Kalman filter.

² Corresponding author
2. Image based Visual Servo Control

In this study, the RMRC (Resolved Motion Rate Control) structure [3] is used with the image Jacobian, which relates the joint angle differentials to the image space differentials as shown in Fig. 1. In Fig. 1, $^{t_6}J_q$ and $^{t_6}J_c$ are the Jacobians that relate the differentials of the robot joint motion and of the robot-mounted camera pose, respectively, to the differentials of the robot tip, and $^{I}J_c$ is the Jacobian that gives the differential relationship between the image space and the camera pose space.
Fig. 1 The Structure of RMRC(Resolved Motion Rate Control)
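As an illustration of this structure, one RMRC step can be sketched in Python as follows. This is a minimal sketch, not the authors' implementation: the function name, the proportional gain, and the use of a least-squares pseudoinverse to resolve the image-plane velocity into joint rates are assumptions.

```python
import numpy as np

def rmrc_step(J_q, J_c_tip, J_img, feature_error, gain=0.5):
    """One Resolved Motion Rate Control step (illustrative sketch).

    J_q           : robot Jacobian t6_Jq, joint rates -> tip differentials (6 x n)
    J_c_tip       : Jacobian t6_Jc, camera pose -> tip differentials (6 x 6)
    J_img         : image Jacobian I_Jc, camera pose -> image differentials (2m x 6)
    feature_error : feature error measured in the image plane (2m,)
    gain          : hypothetical proportional gain on the feature error
    """
    # Composite Jacobian from joint space to image space:
    # image_dot = J_img * inv(J_c_tip) * J_q * q_dot
    J = J_img @ np.linalg.inv(J_c_tip) @ J_q
    # Resolve the desired image-plane velocity into joint rates (least squares)
    return np.linalg.pinv(J) @ (gain * feature_error)
```

With identity tip and camera Jacobians and an image Jacobian selecting the first two pose components, the joint rates reduce to the scaled feature error in those components.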
3. Visual Servo Control with the Extended Kalman Filter

3.1 A Moving Object Model in Image Plane

When the velocity and acceleration of a moving object are given at the current instant, its position at the next instant in Cartesian space, $(P_x, P_y)$, can be estimated as follows [8]:

$$\hat{P}_{x,t+\delta t} = \hat{P}_x + \hat{V}_x \delta t + \tfrac{1}{2} \hat{A}_x \delta t^2 \quad (1)$$
$$\hat{P}_{y,t+\delta t} = \hat{P}_y + \hat{V}_y \delta t + \tfrac{1}{2} \hat{A}_y \delta t^2 \quad (2)$$

where $\delta t$ is the sampling period, and $(\hat{P}_x, \hat{P}_y)$, $(\hat{V}_x, \hat{V}_y)$, $(\hat{A}_x, \hat{A}_y)$ are the estimated position, velocity, and acceleration of the moving object, respectively.

The motion of an object displayed in the image plane can be predicted by discrete time-varying equations [9, 11], decomposed into a linear velocity $v_k$ and an angular velocity $\omega_k$:

$$\delta x_{k+\delta t,k} = v_k \delta t \cos\!\left(\theta_k + \tfrac{1}{2}\omega_k \delta t\right) \approx v_k \cos(\theta_k)\,\delta t - \tfrac{1}{2}\omega_k v_k \sin(\theta_k)\,\delta t^2 \quad (3)$$
$$\delta y_{k+\delta t,k} = v_k \delta t \sin\!\left(\theta_k + \tfrac{1}{2}\omega_k \delta t\right) \approx v_k \sin(\theta_k)\,\delta t + \tfrac{1}{2}\omega_k v_k \cos(\theta_k)\,\delta t^2 \quad (4)$$
$$\delta\theta_{k+\delta t,k} = \omega_k \delta t \quad (5)$$

The noise and uncertainty in measuring the linear and angular velocities can be modeled as

$$\delta v_{k+\delta t,k} = \xi_v \quad (6)$$
$$\delta\omega_{k+\delta t,k} = \xi_\omega \quad (7)$$

where $\xi_v$ and $\xi_\omega$ are zero-mean Gaussian random variables. In order to apply the moving-object model to the Kalman filter, Eqs. (3)-(7) can be expressed in the form of discrete-time state transition equations with an observation model:

$$x_k = \Phi_{k,k-1} x_{k-1} + w_{k-1} \quad (8)$$
$$z_k = H_k x_k + v_k \quad (9)$$

where $z_k$, $\Phi_{k,k-1}$, and $H_k$ are the measurement vector, the state transition matrix, and the observation matrix, respectively, and $w_{k-1}$ and $v_k$ are the state transition error and the measurement noise. Applying Eqs. (3)-(7) to the form of Eqs. (8)-(9) gives
$$\begin{bmatrix} x_k \\ y_k \\ \theta_k \\ v_k \\ \omega_k \end{bmatrix} =
\begin{bmatrix}
1 & 0 & 0 & \delta t \cos(\theta_{k-1}) & -\tfrac{1}{2} v_{k-1} \delta t^2 \sin(\theta_{k-1}) \\
0 & 1 & 0 & \delta t \sin(\theta_{k-1}) & \tfrac{1}{2} v_{k-1} \delta t^2 \cos(\theta_{k-1}) \\
0 & 0 & 1 & 0 & \delta t \\
0 & 0 & 0 & 1 & 0 \\
0 & 0 & 0 & 0 & 1
\end{bmatrix}
\begin{bmatrix} x_{k-1} \\ y_{k-1} \\ \theta_{k-1} \\ v_{k-1} \\ \omega_{k-1} \end{bmatrix} +
\begin{bmatrix} 0 \\ 0 \\ 0 \\ \xi_v \\ \xi_\omega \end{bmatrix} \quad (10)$$

$$\begin{bmatrix} x_k \\ y_k \end{bmatrix} =
\begin{bmatrix} 1 & 0 & 0 & 0 & 0 \\ 0 & 1 & 0 & 0 & 0 \end{bmatrix}
\begin{bmatrix} x_k \\ y_k \\ \theta_k \\ v_k \\ \omega_k \end{bmatrix} +
\begin{bmatrix} \gamma_x \\ \gamma_y \end{bmatrix} \quad (11)$$
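As a concrete illustration, the state transition matrix of Eq. (10) and the observation matrix of Eq. (11) can be written out directly. The following is a sketch (numpy assumed), with the state ordered as $[x, y, \theta, v, \omega]$:

```python
import numpy as np

def transition_matrix(theta, v, dt):
    """State transition matrix Phi of Eq. (10), evaluated at (theta, v)."""
    return np.array([
        [1, 0, 0, dt * np.cos(theta), -0.5 * v * dt**2 * np.sin(theta)],
        [0, 1, 0, dt * np.sin(theta),  0.5 * v * dt**2 * np.cos(theta)],
        [0, 0, 1, 0,                   dt],
        [0, 0, 0, 1,                   0],
        [0, 0, 0, 0,                   1],
    ])

# Observation matrix H of Eq. (11): only the x-y position is measured
H = np.array([[1., 0., 0., 0., 0.],
              [0., 1., 0., 0., 0.]])
```

Since the matrix depends on the current $\theta_{k-1}$ and $v_{k-1}$, it must be re-evaluated at each step, which is why the extended Kalman filter is used in the next section.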
3.2 State Estimation of a Moving Object by Kalman Filter

In this state estimation problem, the measurement vector is the position of the object in the x-y plane, obtained from the vision image. Based on this measurement, the state variables of the object (its x-y position, its direction, and its linear/angular velocities in Eqs. (10)-(11)) are estimated using the Kalman filter recursion shown in Fig. 2:

1. Enter the prior estimate $\hat{x}_0^-$ and its error covariance $P_0^-$.
2. Compute the Kalman gain: $K_k = P_k^- H_k^T \left( H_k P_k^- H_k^T + R_k \right)^{-1}$
3. Update the estimate with the measurement $z_k$: $\hat{x}_k = \hat{x}_k^- + K_k \left( z_k - H_k \hat{x}_k^- \right)$
4. Compute the error covariance of the updated estimate: $P_k = (I - K_k H_k) P_k^-$
5. Project ahead: $\hat{x}_{k+1}^- = \Phi_k \hat{x}_k$, $P_{k+1}^- = \Phi_k P_k \Phi_k^T + Q_k$, then return to step 2 for the next measurement.

Fig. 2 Kalman Filter Block Diagram
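One cycle of the recursion in Fig. 2 can be sketched as follows. This is a minimal sketch with generic matrix arguments, not the authors' implementation; in the extended-Kalman-filter case of this paper, `Phi` would be the transition matrix of Eq. (10) evaluated at the current estimate.

```python
import numpy as np

def kalman_step(x_prior, P_prior, z, Phi, H, Q, R):
    """One cycle of the recursion in Fig. 2: gain, update, covariance, project."""
    # Compute the Kalman gain
    K = P_prior @ H.T @ np.linalg.inv(H @ P_prior @ H.T + R)
    # Update the estimate with the measurement z
    x = x_prior + K @ (z - H @ x_prior)
    # Error covariance of the updated estimate
    P = (np.eye(len(x)) - K @ H) @ P_prior
    # Project ahead to obtain the next prior estimate and covariance
    x_next_prior = Phi @ x
    P_next_prior = Phi @ P @ Phi.T + Q
    return x, x_next_prior, P_next_prior
```

Iterating this function over the measurement sequence $z_0, z_1, \ldots$ produces the estimate sequence $\hat{x}_0, \hat{x}_1, \ldots$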
4. Computer Simulation for Robot Visual Servo Control

In order to show the effectiveness of the state estimation with the Kalman filter, a computer simulation of the visual servo control of a 6-DOF manipulator has been performed. The algorithm combines the image-based robot visual servo control of Sec. 2 with the state estimation of a moving object of Sec. 3. Fig. 3 shows the simulation setup. Gaussian random noise is intentionally added to the linear velocity $v_k$ and the angular velocity $\omega_k$ of the moving-object trajectory, with covariances of 1 pixel/sec and 1 deg/sec, respectively. The other state transition noise covariances are assumed to be zero, and the measurement noise covariance is 1 pixel. The resulting state transition noise covariance matrix $Q_k$ and measurement noise covariance $R_k$ are given in Eq. (12) as

$$Q_k = \begin{bmatrix}
0 & 0 & 0 & 0 & 0 \\
0 & 0 & 0 & 0 & 0 \\
0 & 0 & 0 & 0 & 0 \\
0 & 0 & 0 & 1 & 0 \\
0 & 0 & 0 & 0 & \pi/180
\end{bmatrix}, \quad
R_k = \begin{bmatrix} 1 & 0 \\ 0 & 1 \end{bmatrix} \quad (12)$$
The simulated true trajectory and the resulting estimated trajectory are shown in Fig. 4, and the differences of their states ($x_k$, $y_k$, $\theta_k$, $v_k$, and $\omega_k$), i.e. the estimation errors, are shown in Fig. 5. Since the observation model deals with only $x_k$ and $y_k$,
Fig. 3 A Simulation Setup for a Robot Visual Servo Control
Fig. 4. The trajectory of a moving Object in Image Plane of a Fixed Camera
Fig. 5 The Resulting Estimation Error of Each State

their estimation errors, lying between -2 and 2 pixels, are relatively smaller than those of the other states. In order to show the effectiveness of this estimation method, the robot visual servo control simulation has been performed with and without the estimation algorithm; the results are shown in Fig. 6 and Fig. 7, respectively. The maximum pixel errors in the image plane are 6.6 pixels without the estimation (Fig. 6) and 3.3 pixels with it (Fig. 7). This shows that the suggested estimation effectively reduces the tracking error under noisy conditions.
Fig. 6 Pixel Error of Robot Visual Tracking without the Estimation
Fig. 7 Pixel Error of Robot Visual Tracking with the Estimation
In order to check the robustness to the uncertainties of the system, the relationship between the maximum error radius and an arbitrarily assigned noise covariance scale has been investigated. The noise covariance matrices of the state transition model, $Q_k$, and of the measurement model, $R_k$, in Eq. (12) are multiplied by a weight $\alpha$, and the robot visual servo control simulation is performed as $\alpha$ varies from 0 to 10 in steps of 0.1. The result is shown in Fig. 8, where the slopes of the maximum error radius in the two cases can be compared. The comparison shows that the moving-object-model-based estimation algorithm becomes more effective for robot visual servo control as more measurement noise exists.
5. Real Implementation for Robot Visual Servo Control

The proposed robot visual servo control algorithm has been implemented in the same manner as the computer simulation. The total system, shown in Fig. 9, consists of a FARA AT-2 robot, an MV50 CCD camera with 320x240 resolution, a Meteor II frame grabber, and an MMC (Multi Motion Controller) based PC controller. In order to reduce lighting effects on the camera image, a black background and a white object were used, and 8-bit gray levels were used to reduce the image processing time. In the real implementation, the image processing speed is 5-8 frames/sec, and the resulting visual control frequencies are 4-7 Hz without the estimation and 2-5 Hz with it. The maximum error radii for both cases, as in the computer simulation, are shown in Fig. 10 and Fig. 11; the results are 97.2 pixels and 85.2 pixels, respectively. The maximum pixel error is greatly increased compared with the computer simulation, which suggests that the real-time control problem is more serious than the measurement noise. The algorithm without the estimation could
Fig. 8 Maximum Error Radius as the change of the weight α
Fig. 9 Real Implementation of the proposed Robot Visual Servo
Fig. 10 Pixel Error of Robot Visual Tracking without the Estimation
Fig. 11 Pixel Error of Robot Visual Tracking with the Estimation
more or less relieve the additional computational effort. However, the algorithm with the estimation is still more effective, reducing the maximum error radius by about 12 pixels in the real implementation.
6. Conclusion

In this paper, a robot visual servo control algorithm has been proposed by combining the conventional image-based robot visual servoing algorithm with a Kalman-filter-based trajectory estimation algorithm for a moving object. The erroneous image information of the moving object due to imprecise camera characteristics is compensated by applying the Kalman filter to the process model of the moving object. In the simulation of the robot visual servo control algorithm, the pixel error due to measurement noise was considerably reduced by the Kalman filter based on the moving object model, and the proposed algorithm was shown to become more effective as the measurement noise grows. The real implementation results show that the algorithm overcomes the additional computational effort and retains its error-reducing capability.
Acknowledgement The authors gratefully acknowledge the financial support of NARC (Network-based Automation Research Center) at the University of Ulsan.
References

1. H. Sutanto, R. Sharma, and V. Varma, "Image based Autodocking without Calibration," Proceedings of the 1997 IEEE International Conference on Robotics and Automation, pp. 974-979, 1997.
2. P. I. Corke, "Visual control of robot manipulators - a review," in K. Hashimoto, editor, Visual Servoing, pp. 1-32, World Scientific, 1993.
3. C. S. George Lee, John T. Feddema, and O. Robert Mitchell, "Feature-Based Visual Servoing of Robotic Systems," Visual Servoing, World Scientific Series in Robotics and Automated Systems, pp. 105-138, 1993.
4. L. Weiss, A. Sanderson, and C. Neuman, "Dynamic Sensor-based Control of Robots with Visual Feedback," IEEE Trans. on Robotics and Automation, Vol. 3, No. 5, pp. 404-417, 1987.
5. B. Thuilot, P. Martinet, L. Cordesses, and J. Gallice, "Position based Visual Servoing: Keep the Object in the Field of Vision," IEEE Int. Conf. on Robotics and Automation, pp. 1624-1629, 2002.
6. K. C. Chang, H. J. Lee, and C. G. Chung, "Adaptive Estimation of Motion Parameters for Tracking a Maneuvering Target Using Images," Journal of Robotics & Automation, Vol. 3, pp. 43-48, 1989.
7. T. J. Broida and R. Chellappa, "Estimating the Kinematics and Structure of a Rigid Object from a Sequence of Monocular Images," IEEE Trans. Pattern Analysis and Machine Intelligence, Vol. 13, No. 6, pp. 497-513, 1991.
8. R. F. Berg, "Estimation and Prediction for Maneuvering Target Trajectories," IEEE Transactions on Automatic Control, Vol. AC-28, pp. 294-304, 1983.
9. S. M. LaValle and R. Sharma, "On Motion Planning in Changing, Partially Predictable Environments," The International Journal of Robotics Research, Vol. 16, No. 6, pp. 775-805, 1997.
10. J. T. Feddema and C. S. G. Lee, "Adaptive Image Feature Prediction and Control for Visual Tracking with a Hand-Eye Coordinated Camera," IEEE Trans. Syst. Man Cybernet., Vol. 20, pp. 1172-1183, 1990.
11. Jae-Hwei Park, Jae-Mu Yun, and Jang-Myung Lee, "Trajectory Estimation of a Moving Object using Kalman filter and Kohonen networks," Robotica, Vol. 25, pp. 567-574, 2007.