Int J Adv Manuf Technol DOI 10.1007/s00170-015-7031-4

ORIGINAL ARTICLE

Robot base frame calibration with a 2D vision system for mobile robotic drilling

Biao Mei · Weidong Zhu · Kangzheng Yuan · Yinglin Ke

Received: 17 September 2014 / Accepted: 13 March 2015
© Springer-Verlag London 2015

Abstract Mobile robotic drilling for flight control surface assembly at multiple stations demands high positioning accuracy of the equipped industrial robot. However, disturbances arising from movement from station to station significantly deteriorate the positioning accuracy. This paper proposes a novel in-process robot base frame calibration method with a 2D vision system. A proposed layout of flight control surface assembly, the developed mobile robotic drilling system, and the 2D vision system are first introduced. Then an iterative measurement scheme is proposed to eliminate measurement errors induced by non-perpendicularity and erroneous object distance during 2D vision-based positioning of preset reference holes. To obtain the third-dimension Cartesian coordinates of the reference holes, depth control with autofocus is achieved by using a new sharpness function. Based on the acquired Cartesian coordinates of the reference holes in the robot base frame, and their counterparts measured in advance in the world coordinate system, the description of the robot base frame with respect to the world coordinate system is obtained with least-squares fitting. Numerical experiments show that depth control with autofocus contributes to enhanced accuracy of robot base frame calibration. Experiments performed on the developed mobile drilling system indicate that the maximum positioning error is about 0.6 mm with the proposed calibration method. Since position errors of drilled fastener holes will be further measured and compensated for based on the 2D vision system, the proposed in-process robot base frame calibration method is effective.

Keywords Mobile robotic drilling · Robot base frame calibration · 2D vision system · Iterative measurement · Depth control with autofocus · Least-squares fitting

* Weidong Zhu [email protected]

The State Key Lab of Fluid Power Transmission and Control, Department of Mechanical Engineering, Zhejiang University, Hangzhou 310027, China

1 Introduction

Due to high flexibility, low investment, and satisfactory precision, robotic drilling systems have been developed and applied in aircraft assembly [1–3]. However, the working volume of a robot is limited. To cover a large volume or work at multiple stations, relocation of one robot or the use of several robots is generally needed; the former is time-consuming, while the latter requires a large investment. To solve this problem, mobile automated robotic systems with high coverage are technically feasible and have been studied and developed by several aerospace manufacturers. Electroimpact Inc. developed a mobile automated robotic drilling, inspection, and fastening system, in which an accurate robot is coupled with a mobile platform with a vertical axis. It has been applied to the assembly of aerospace structures such as fuselage sections, wings, and flaps [4]. COMAU Inc. developed a mobile automatic robotic system (MARS), which is a self-contained automated island that can move the robot from one station to another to work at multiple stations [5]. Recently, a flexible mobile robotic drilling system for flight control surface assembly at multiple stations has been developed at Zhejiang University, as shown in Fig. 1. Through active movement of the mobile robotic drilling system, the working volume of the robot is enlarged significantly, and drilling at multiple stations can be realized without additional investment or equipment-maintenance burden. However, assurance of the position


Fig. 1 Mobile robotic drilling system developed at Zhejiang University

accuracy of drilled fastener holes is a challenge. Position errors of fastener holes are required to be kept within ±0.2 mm in order to fulfill the high quality standards of the aircraft industry [6], which in turn places a stringent requirement on the positioning accuracy of the mobile robotic drilling system. Although satisfactory accuracy of the robot working at a fixed station can be achieved through accuracy enhancement with a 2D vision system [7], disturbances arising from the movement and repositioning of the mobile robotic drilling system from station to station deteriorate the positioning accuracy of the robot. Therefore, in-process robot base frame calibration is urgently needed to eliminate such negative effects on the positioning accuracy of the robot during drilling at multiple stations. Robot base frame calibration, which determines the position and orientation of a robot base relative to a world coordinate system, is a fundamental problem for robotic systems and has been extensively studied [8–17]. The description of a robot base frame in a world coordinate system can be measured directly with a laser tracker [8]; however, an elaborate setup and high cost are needed, and this method may not be suitable for an industrial application environment with confined space. In [9], fast robot calibration is achieved with an external CCD camera by capturing the positions of laser spots in a world coordinate system, which come from a set of projected laser beams fixed on a calibration device mounted on the robot end-effector; however, the calibration result is relatively coarse. In [10], a robot/workcell calibration system for industrial robots is proposed, which depends on precise positioning of a laser beam from a single laser pointer mounted on the robot end-effector onto the centers of multiple position-sensitive detectors (PSDs) deployed in a custom-built high-precision fixture.
Owing to their cost-effectiveness and non-contact nature, vision systems are widely used in various industrial scenarios, such as robotic drilling [7, 18], robotic welding [19, 20], robotic assembly [21–23], burr detection [24], and automated micromanipulation [25]. For a robotic system with a vision system deployed on its end-effector, the robot base frame and the

hand-eye relationship are often calibrated simultaneously [11–17, 26]. In [11], the robot base frame and the sensor/end-effector relationship are estimated by solving a homogeneous transformation system AX=YB; its drawbacks are the accumulation of estimation errors and the need for complete robot pose measurements. In [12], the homogeneous transformation system of [11] is used again, but two new solutions, a closed-form method and a non-linear optimization method, are provided. In [13], a reliable and accurate closed-form solution of the robot-world and hand-eye calibration problem AX=YB is constructed using the Kronecker product and singular value decomposition (SVD). Calibration of hand-eye and robot-world relationships with global polynomial optimization is presented in [14]. A procedure for simultaneous calibration of a robot and a camera is developed in [15], in which error propagation is eliminated owing to the simultaneous calibration procedure; however, the absolute distance information of the scene cannot be obtained. For the noisy real world, a closed-form solution is not adequate; thus, an iterative approach for calibrating base-world and hand-eye relationships is developed in [16]. To achieve higher reconfigurability of an assembly system, an automated system based on the direct linear transformation method is constructed to calibrate the quick plug-in and plug-out of robots by using two CCD cameras [17]. With the automated calibration system, the relationship between two robots can be estimated. Despite the abundant literature on robot base calibration, there is still no general, cost-effective, easy-to-operate method for all robotic systems. In this paper, for increased efficiency and lower investment, an in-process robot base calibration method using a 2D camera is proposed, in which no complex computation, elaborate setup, or auxiliary devices are needed.
In the proposed calibration method, preset reference holes with known positions on assembly jigs are used; these reference holes are originally intended for constructing the assembly coordinate systems of the assembly stations. With the Cartesian coordinates of the reference holes in the robot base frame obtained with the 2D vision system and their counterparts in the world coordinate system, the robot base frame can be calibrated with least-squares fitting. During the process of obtaining the Cartesian coordinates of the reference holes, the 2D information of the preset reference holes is obtained with an iterative vision-based measurement scheme, in which measurement errors induced by non-perpendicularity and erroneous object distance are eliminated. The third-dimension information of the reference holes is estimated through depth control with autofocus by using a new, robust sharpness function with good reproducibility. The remainder of this paper is organized as follows: In Section 2, the layout of the flight control surface assembly stations and the developed mobile robotic drilling system are presented; the architecture of the 2D vision system and the iterative measurement scheme are discussed as well. Depth


control with autofocus based on the new sharpness function, and a coarse-to-fine autofocus algorithm to obtain the depth information, are provided in Section 3. Section 4 discusses the algorithm of the proposed robot base frame calibration method. In Section 5, numerical and real experiments are provided to study the effect of depth control with autofocus on the accuracy of robot base frame calibration and to verify the effectiveness of the robot base frame calibration method, respectively. Conclusions are drawn in Section 6.


2 The mobile robotic drilling system

The flight control surface assembly layout integrated with the developed mobile robotic drilling system is shown in Fig. 2. The mobile robotic drilling system can accurately drill fastener holes in a large working volume spanning multiple assembly stations. The assembly stations of the main outboard flap, rear outboard flap, and aileron are distributed from left to right. Flaps and ailerons are first loaded into the assembly jigs. Then the mobile robotic drilling system moves from one station to another to drill the flaps and ailerons. To achieve high positioning accuracy, the mobile robotic drilling platform is first positioned with positioning holes distributed on the floor of each station, and the accuracy of this rough positioning is 4 mm. After that, the robot base frame is calibrated with a low-cost 2D vision system by using the reference holes preset on the jigs at each assembly station. As shown in Fig. 3, the developed mobile robotic drilling system consists of an air cushion unit, an industrial robot, a drilling end-effector, an automatic tool changer, a robot control box, etc. The air cushion unit provides the mobile drilling system with suspension support, movement, and steering. The drilling end-effector, mounted on the industrial robot, comprises a flange connection, a pressure foot, linear guideways, linear scales, a drilling tool, a spindle, normality sensors, a 2D vision system, etc. (refer to Fig. 4). The flange connection provides the end-effector with an interface to connect with the robot. The pressure foot is applied to clamp a workpiece to stabilize the drilling process and eliminate gaps between stacks, and the precise motion of the pressure foot and the spindle is used to control the dimple depth of fastener


Fig. 3 Components of the developed mobile robotic drilling system

holes. The normality sensors are used to adjust the normality of the drilling tool relative to the workpiece surface. The 2D vision system, initially designed for measuring the relative errors between the drilling tool and the workpiece [7], is used for calibrating the robot base frame in this research. The overall architecture of the 2D vision system is shown in Fig. 5. The hardware component of the vision system comprises an industrial camera, a ring light, a calibration board, a computer, etc. The camera is installed with its axis approximately coincident with the axis of the ring light, and the camera optical axis is adjusted to be parallel to the axis of the spindle. The software component consists of calibration and measurement modules. Functions implemented in the calibration module include system configuration, light and camera control, autofocus, vision system calibration, etc. The measurement module includes functions such as camera and light control, autofocus, feature search, and feature measurement. The system configuration is used to manage the various coordinate systems in the mobile robotic drilling system. The light control is applied to adjust the light intensity to


Fig. 2 The flight control surface assembly proposal layout

Fig. 4 Drilling end-effector mounted on the industrial robot


achieve appropriate illumination of a photographed object. The camera control provides adjustment of exposure and gain to capture high-quality images. The autofocus function automatically adjusts the object distance, according to the sharpness of a sequence of images, so that the photographed object is in focus; it is also used to control the distance between the camera and a workpiece. The camera intrinsic parameters and the hand-eye relationship are calibrated with the vision system calibration function. The feature measurement includes two modes: non-iterative measurement and iterative measurement. In the non-iterative measurement, the position of a reference hole is measured from one image of the reference hole, whereas in the iterative measurement, the bias between the reference hole and the camera TCP is used to move the robot to iteratively eliminate the bias until the center of the reference hole is coincident with the camera TCP. According to the following formulae deduced in [7], the measurement errors Δs induced by non-perpendicularity and erroneous object distance are removed by the iterative measurement because s approaches zero:

Δs = z s cos θ / (z + s sin θ) − s    (1)

Δs = z s / (z + Δz) − s    (2)

where Δs is the measurement error of the 2D vision system, z is the in-focus object distance, s is the distance between the object point and the intersection of the optical axis with the workpiece surface, θ is the degree of non-perpendicularity, and Δz is the variation of object distance. Hence, in robot base frame calibration, the iterative measurement is used to measure the positions of the reference holes. The detailed procedure and an instance of the iterative measurement are shown in Figs. 6 and 7, respectively. With a small number of iterations, the distance s between a reference hole and the camera TCP becomes less than 0.01 mm.

3 Depth control with autofocus

To realize robot base frame calibration, the Cartesian positions of the reference holes in the robot base frame are needed. However, only 2D position information of the reference holes can be directly obtained with the 2D vision system. Therefore, we propose to apply depth control with autofocus to acquire the third-dimension position information of the reference holes.

Fig. 5 Overall architecture of the 2D vision system

3.1 Sharpness function

Since the camera uses a lens with a fixed focal length, autofocus is achieved by adjusting the object distance rather than the focal length. Meanwhile, to maintain the self-contained characteristic of the 2D vision system, the passive autofocus technique [27] based on image sharpness measurement is adopted, which avoids additional measurement devices such as infrared or ultrasonic sensors. In depth control with autofocus (shown in Fig. 8), the object distance between a workpiece and the camera is adjusted by moving the robot, based on the computation and comparison of the sharpness of a sequence of images, until the workpiece is in focus. The performance of the sharpness function used in autofocus largely determines the performance of autofocus and depth control. For good depth control performance, we propose a two-dimensional entropy sharpness (TDES) function, which is based on the two-dimensional spatial entropy used for object extraction in [28]:

F = −Σ_{x=0}^{255} Σ_{y=0}^{255} P(x, y) log P(x, y)    (3)

where (x, y) is the joint gray-value vector of a pixel and its right-adjacent pixel in an image, the range of the vector being {(x, y) | x ∈ [0, 255], y ∈ [0, 255]}, and P(x, y) denotes the two-dimensional probability distribution of the vector, which is physically the frequency of the vector in the image. Since spatial relationships of pixels are considered, the information of the image can be accurately estimated with the TDES function. Hence, accurate depth control can be expected.

Fig. 6 Detailed procedure of the iterative measurement (input the reference hole information (ID, type, nominal hole frame, etc.); move the robot to drive the camera to the nominal hole frame and execute feature search and autofocus; capture an image and compute the deviation of the reference hole from the camera TCP; if the termination criterion is not met, move the robot to reduce the deviation and repeat; otherwise, compute the coordinates of the reference hole with respect to the robot base frame)

Fig. 7 An instance of the iterative measurement process

Fig. 8 The flowchart of depth control with autofocus
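A minimal sketch of Eq. (3) with NumPy (the system itself was implemented in C++ with OpenCV; the function and variable names here are ours):

```python
import numpy as np

def tdes(img):
    """Two-dimensional entropy sharpness (TDES), Eq. (3):
    F = -sum P(x, y) * log P(x, y), where P(x, y) is the joint frequency
    of a pixel gray value x and its right neighbor's gray value y."""
    img = np.asarray(img, dtype=np.uint8)
    left = img[:, :-1].ravel()
    right = img[:, 1:].ravel()
    # 256x256 joint histogram of (pixel, right neighbor) gray-value pairs
    hist = np.zeros((256, 256), dtype=np.float64)
    np.add.at(hist, (left, right), 1.0)
    p = hist / hist.sum()
    nz = p > 0          # 0 * log 0 is taken as 0
    return float(-(p[nz] * np.log(p[nz])).sum())

rng = np.random.default_rng(0)
sharp = rng.integers(0, 256, size=(64, 64), dtype=np.uint8)   # high-detail image
flat = np.full((64, 64), 128, dtype=np.uint8)                 # defocused extreme
print(tdes(sharp) > tdes(flat))   # richer spatial detail -> higher 2D entropy
```

A perfectly flat image has a single joint gray-value pair, so its entropy is zero, while a detailed image spreads probability mass over many pairs.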

3.2 Coarse-to-fine autofocus procedure

In the mobile robotic drilling system, the camera is mounted on the end-effector connected to the industrial robot. Thus, depth control along the Z-axis of the camera TCP is achieved by moving the robot, based on the sharpness values of the acquired sequence of images. The sharpness estimation of the image sequence and the robot movement for adjusting the object distance are synchronized, which significantly reduces the time cost of depth control. For improved robustness of depth control with autofocus, a coarse-to-fine autofocus procedure (shown in Fig. 9) on the basis of global search [29] is proposed as follows:

Step 1 Pre-move before autofocus. Move the robot so that the camera is at the far end of the coarse autofocus interval with respect to a workpiece.
Step 2 Coarse autofocus. Drive the camera by moving the robot from the far end to the near end of the coarse autofocus interval. Meanwhile, continuously shoot the workpiece and estimate the sharpness values of the acquired images. A rough in-focus position is obtained by a global maximum search of the sharpness values.
Step 3 Move the robot so that the camera is at the rough in-focus position.
Step 4 Pre-move before fine autofocus. Move the robot so that the camera is at the far end of the fine autofocus interval.
Step 5 Fine autofocus. Drive the camera by moving the robot from the far end to the near end of the fine autofocus interval. The refined in-focus position is obtained by a global maximum search of the sharpness values of the acquired sequence of images. A lower robot velocity is used in this step.

Fig. 9 Sketch of the coarse-to-fine autofocus procedure

Step 6 Move the robot so that the camera is at the fine in-focus position, making the workpiece in focus.
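The six steps above amount to a global-maximum search at two granularities. In the sketch below, the simulated sharpness curve and all numeric values are illustrative stand-ins for shooting and scoring real images:

```python
import numpy as np

def autofocus(sharpness, z_far, z_near, coarse_step=2.0, fine_step=0.1,
              fine_half_range=3.0):
    """Coarse-to-fine global-maximum search along the camera Z-axis.
    sharpness(z) stands in for shooting an image at object distance z
    and scoring it (e.g., with the TDES function); names are ours."""
    # Steps 1-2: sweep the coarse interval from the far end to the near end
    zs = np.arange(z_far, z_near, -coarse_step)
    z_rough = zs[np.argmax([sharpness(z) for z in zs])]
    # Steps 3-5: sweep a small interval around the rough position, slowly
    zs = np.arange(z_rough + fine_half_range, z_rough - fine_half_range,
                   -fine_step)
    return zs[np.argmax([sharpness(z) for z in zs])]   # Step 6 target

# Synthetic bell-shaped sharpness curve peaking at z = 187.4 mm (made-up value)
curve = lambda z: np.exp(-((z - 187.4) / 5.0) ** 2)
z_focus = autofocus(curve, z_far=220.0, z_near=160.0)
print(abs(z_focus - 187.4) < 0.1)
```

The coarse pass only needs to land within half the fine interval of the true peak; the fine pass then resolves it to the fine step size.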

4 Robot base frame calibration with the 2D vision system

Given the Cartesian coordinates, in the robot base frame, of the reference holes predefined on the jig of a control surface assembly station, and their counterparts known in the world coordinate system, robot base frame calibration can be achieved with least-squares fitting. The Cartesian coordinates of the reference holes in the robot base frame are estimated by the seamless integration of the 2D vision system with depth control. To remove Abbe errors in the 2D vision measurement, the camera TCP is defined by translating the traditional camera coordinate system along the optical axis by the in-focus object distance [7]. The camera TCP touches a workpiece when the image of the workpiece is in focus. Thus, the estimation of the Cartesian coordinates of the reference holes in the robot base frame is achieved by the following operations: depth control with autofocus is first applied to ensure the reference hole being photographed is in focus; then the iterative measurement is executed until the center of the reference hole is coincident with the image center; after that, the Cartesian coordinates of the reference hole are coincident with the camera TCP in the robot base frame and can be read out from the robot control system. The operations are repeated until the Cartesian coordinates of all reference holes are obtained. For high reproducibility in depth control, the proposed TDES function is applied for autofocus. Owing to the use of the iterative measurement, measurement errors induced by non-perpendicularity and erroneous object distance are removed. The flowchart of the proposed method of robot base frame calibration is shown in Fig. 10. According to the correspondence between the Cartesian coordinates of the reference holes in the world coordinate system and in the robot base frame, we have

p_A^i = R p_R^i + P    (4)

where p_A^i = [x_A^i, y_A^i, z_A^i]^T, i = 1, 2, ..., n and p_R^i = [x_R^i, y_R^i, z_R^i]^T, i = 1, 2, ..., n are the Cartesian coordinates of the reference holes in the world coordinate system and the robot base frame, respectively, and R and P are the rotational matrix and translational vector of the robot base frame with respect to the world coordinate system, respectively. However, due to manufacturing errors and measurement uncertainty, the pose parameters R and P are not consistent for all reference holes. Thus, an algorithm involving the SVD [30] is applied to find the least-squares solution of the pose parameters that minimizes the objective function J, which reflects the combined influence of manufacturing errors and measurement uncertainty:

J = Σ_{i=1}^{n} ||p_A^i − (R p_R^i + P)||²    (5)

The centroids of the sets of Cartesian coordinates of the reference holes in the world coordinate system and the robot base frame are denoted as

p̄_A = (1/n) Σ_{i=1}^{n} p_A^i,  p̄_R = (1/n) Σ_{i=1}^{n} p_R^i    (6)

Then let

q_A^i = p_A^i − p̄_A;  q_R^i = p_R^i − p̄_R    (7)

If the obtained pose parameters R and P are in the least-squares sense, the Cartesian coordinates of the reference holes in the world coordinate system p_A^i and the values of (R p_R^i + P) in 3D space have the identical centroid, i.e.,

p̄_A = R p̄_R + P    (8)

Thus, the objective function J in Eq. (5) can be simplified as

J = Σ_{i=1}^{n} ||q_A^i − R q_R^i||²    (9)

Fig. 10 Flowchart of the proposed method of robot base frame calibration (define the camera TCP as the offset of the camera coordinate system along the optical axis by the in-focus object distance; apply depth control with autofocus to bring the camera TCP onto the photographed surface of the workpiece; execute the iterative measurement to align the center of the photographed reference hole with the image center; read out the Cartesian coordinates of the reference hole, equivalent to the origin of the camera TCP in the robot base frame, from the robot control system; repeat until the coordinates of all reference holes are obtained; with the input Cartesian coordinates of the reference holes in the world coordinate system, find the least-squares solution T by the singular value decomposition (SVD); convert T to the pose parameter vector (X0, Y0, Z0, A0, B0, C0); using this initial value, compute the solution of the robot base frame (X, Y, Z, A, B, C) with the Levenberg-Marquardt algorithm)
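The calibration computation of this section (centroid removal in Eqs. (6)-(7), the SVD solution for R and P derived below in Eqs. (10)-(14), and the conversion to Euler angles in Eq. (16)) can be condensed into a NumPy sketch; the reflection guard and the synthetic test pose are our additions:

```python
import numpy as np

def fit_pose(pA, pR):
    """Least-squares R, P with p_A^i = R p_R^i + P (Eqs. (4)-(14))."""
    cA, cR = pA.mean(axis=0), pR.mean(axis=0)          # Eq. (6)
    qA, qR = pA - cA, pR - cR                          # Eq. (7)
    Q = qR.T @ qA                                      # Q = sum q_R^i (q_A^i)^T
    U, _, Vt = np.linalg.svd(Q)                        # Eq. (12)
    R = Vt.T @ U.T                                     # Eq. (13), R = V U^T
    if np.linalg.det(R) < 0:                           # standard reflection guard,
        Vt[-1] *= -1                                   # not spelled out in the paper
        R = Vt.T @ U.T
    return R, cA - R @ cR                              # Eq. (14)

def euler_abc(R):
    """Euler angles A, B, C of R = Rz(C) Ry(B) Rx(A), cf. Eq. (16)."""
    C = np.arctan2(R[1, 0], R[0, 0])
    B = np.arctan2(-R[2, 0], R[0, 0] * np.cos(C) + R[1, 0] * np.sin(C))
    A = np.arctan2(R[0, 2] * np.sin(C) - R[1, 2] * np.cos(C),
                   R[1, 1] * np.cos(C) - R[0, 1] * np.sin(C))
    return A, B, C

# Synthetic check: recover a known base-frame pose from four reference holes
rng = np.random.default_rng(1)
A0, B0, C0 = 0.02, -0.03, 0.8                          # radians, made-up pose
ca, sa = np.cos(A0), np.sin(A0)
cb, sb = np.cos(B0), np.sin(B0)
cc, sc = np.cos(C0), np.sin(C0)
Rz = np.array([[cc, -sc, 0], [sc, cc, 0], [0, 0, 1]])
Ry = np.array([[cb, 0, sb], [0, 1, 0], [-sb, 0, cb]])
Rx = np.array([[1, 0, 0], [0, ca, -sa], [0, sa, ca]])
R_true, P_true = Rz @ Ry @ Rx, np.array([7000.0, -1500.0, -300.0])
pR = rng.uniform(-500, 500, size=(4, 3))               # holes in robot base frame
pA = pR @ R_true.T + P_true                            # counterparts in world frame
R_est, P_est = fit_pose(pA, pR)
print(np.allclose(R_est, R_true), np.allclose(P_est, P_true))
```

With noise-free synthetic correspondences the pose is recovered exactly; with measured data, the residual of the fit reflects the combined measurement uncertainty.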

Expanding Eq. (9), we have

J = Σ_{i=1}^{n} (q_A^i − R q_R^i)^T (q_A^i − R q_R^i) = Σ_{i=1}^{n} [(q_A^i)^T q_A^i + (q_R^i)^T q_R^i − 2 (q_A^i)^T R q_R^i]    (10)

Thus, minimizing J is equivalent to maximizing the function

Σ_{i=1}^{n} (q_A^i)^T R q_R^i = tr(RQ)    (11)

where Q = Σ_{i=1}^{n} q_R^i (q_A^i)^T.

Fig. 11 Instance of the autofocus curve for various sharpness functions


Let the SVD of Q be

Q = U D V^T    (12)

where D is a diagonal matrix and U, V are orthonormal matrices. Using Eq. (12), R can be calculated as

R = V U^T    (13)

Then P is obtained from Eq. (8) as

P = p̄_A − R p̄_R    (14)

In aircraft manufacturing, the pose parameters of a coordinate system are often formatted as a 6-vector S = [X, Y, Z, A, B, C], where X, Y, Z are the coordinates of the origin of the coordinate system and A, B, C are the Euler angles of its axes with respect to the world coordinate system. The homogeneous transformation matrix T equivalent to the 6-vector S = [X, Y, Z, A, B, C] can be represented as

T = [R P; 0 0 0 1]    (15)

where

R = [cos B cos C,  sin A sin B cos C − cos A sin C,  cos A sin B cos C + sin A sin C;
     cos B sin C,  sin A sin B sin C + cos A cos C,  cos A sin B sin C − sin A cos C;
     −sin B,       sin A cos B,                      cos A cos B],   P = [X, Y, Z]^T

With R, the Euler angles A, B, C in the 6-vector can be obtained by the following equations:

C = arctan(r_21/r_11),  B = arctan(−r_31/(r_11 cos C + r_21 sin C)),
A = arctan((r_13 sin C − r_23 cos C)/(r_22 cos C − r_12 sin C))    (16)

where r_ij is the entry of the rotational matrix at the ith row and jth column. Given P, the position parameters X, Y, Z in the 6-vector can be obtained as

[X, Y, Z]^T = P    (17)

Fig. 12 Reproducibility values of various sharpness functions

Fig. 13 Time costs for sharpness estimation of various sharpness functions

Considering that the importance of the reference holes often differs in real applications, different weights will be


imposed on the errors of different reference holes in the objective function J. Thus, Eq. (5) can be rewritten as

J = Σ_{i=1}^{n} ω_i ||p_A^i − f(p_R^i, S)||²    (18)

where ω_i is the weight of the ith reference hole and f(p_R^i, S) is the model with the parameter vector S. The 6-vector S0 = [X0, Y0, Z0, A0, B0, C0], solved by the non-iterative algorithm involving the SVD, is used as the initial value of a non-linear least-squares procedure involving the Levenberg-Marquardt (L-M) algorithm [31] to find the optimized parameters S = [X, Y, Z, A, B, C] that minimize J. The L-M algorithm interpolates between the gradient descent method and the Gauss-Newton (G-N) algorithm and is more robust than the G-N algorithm. In the L-M method, a modified Hessian is used:

H(S, λ) = 2 J^T J + λ I    (19)

where J is the Jacobian matrix, λ is a damping factor that is adjusted at every iteration, and I is the identity matrix. If λ is small, H approximates the G-N Hessian. Otherwise, H is near

Fig. 14 Numerical results of robot base frame calibration without depth control


to be the identity matrix, and a gradient descent step is taken. The procedure of the L-M algorithm is briefly given as follows:

Step 1 Let λ be 0.001.
Step 2 Compute δS = −H(S, λ)^{−1} g, where δS is the increment of the estimated parameter vector S. In the G-N algorithm, g = −2 J^T J δS, while in the gradient descent method, g = −λ δS.
Step 3 If f(S_n + δS) > f(S_n), set λ = 10λ and go to step 2.
Step 4 Otherwise, set λ = 0.1λ and S_{n+1} = S_n + δS, then go to step 2.
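The λ-update schedule of steps 1-4 can be sketched on a toy problem. The exponential-fit example, the forward-difference Jacobian, and the tolerances below are our own illustration, not the paper's pose model:

```python
import numpy as np

def levenberg_marquardt(residual, S0, max_iter=100, tol=1e-10):
    """Sketch of the damping schedule of Section 4: start with lambda = 0.001,
    multiply by 10 on a rejected step, by 0.1 on an accepted step."""
    S, lam = np.asarray(S0, dtype=float), 1e-3           # Step 1
    cost = lambda S: float(np.sum(residual(S) ** 2))
    for _ in range(max_iter):
        r = residual(S)
        # forward-difference Jacobian of the residual vector
        eps = 1e-7
        J = np.column_stack([(residual(S + eps * np.eye(len(S))[j]) - r) / eps
                             for j in range(len(S))])
        H = 2 * J.T @ J + lam * np.eye(len(S))           # Eq. (19)
        dS = np.linalg.solve(H, -2 * J.T @ r)            # Step 2
        if cost(S + dS) > cost(S):                       # Step 3: reject, damp more
            lam *= 10
        else:                                            # Step 4: accept, damp less
            S, lam = S + dS, lam * 0.1
            if np.linalg.norm(dS) < tol:
                break
    return S

# Illustrative problem (ours, not the paper's): fit y = a * exp(b * t)
t = np.linspace(0, 1, 20)
y = 2.0 * np.exp(-1.5 * t)
S_hat = levenberg_marquardt(lambda S: S[0] * np.exp(S[1] * t) - y, [1.0, 0.0])
print(np.allclose(S_hat, [2.0, -1.5], atol=1e-5))
```

When λ shrinks after successive accepted steps, the iteration behaves like Gauss-Newton and converges quickly near the minimum; when the cost increases, the growing λ turns the step into a short gradient descent move.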

5 Experimental results

5.1 Experiments of depth control

To verify the performance of the TDES function for depth control with autofocus, and considering variety and usage frequency, the following sharpness functions were selected for comparison: the sharpness function based on the DCT (DCTS) [32], the sharpness function based on Prewitt gradient edge detection (PS) [33], the Laplacian sharpness function (LS) and the variance sharpness function (VS) in [34], and the one-dimensional entropy sharpness function (ODES) based on the monkey model entropy (MME) [35]. For evaluating the

Fig. 15 Numerical results of robot base frame calibration with the aid of depth control

Fig. 16 Experimental platform for robot base frame calibration with the 2D vision system

sharpness functions, performance indices from [33, 35], such as focusing range and sensitivity to environmental parameters, are referred to. Moreover, reproducibility is treated as a key performance index of autofocus for depth control, because the third component of a reference hole's Cartesian coordinates in the robot base frame is estimated with the aid of depth control with autofocus, and the accuracy of the estimation is assured by accurate depth control. Different from the index "accuracy" in [35], the reproducibility is defined as twice the standard deviation (the radius of a 95 % confidence interval) of the series of measured in-focus depths in multiple autofocus processes. The lower the reproducibility value, the better the depth control that can be achieved. The time cost for computation of a sharpness function is another necessary index for in-process applications. To systematically and accurately evaluate the sharpness functions, depth control trials with autofocus were performed on a Coord3 coordinate measuring machine (CMM); the depth was changed by moving the CMM. All sharpness functions were implemented with C++ and OpenCV [36] and executed on a DELL OPTIPLEX 380 (2.93 GHz Intel Core CPU and 2 GB RAM). To cover real scenarios in the robotic drilling application, lighting intensity, illumination uniformity, etc. were used as variable factors in the experiments of depth control with autofocus. One instance of the autofocus curve for the various sharpness functions is shown in Fig. 11, and the reproducibility values and time costs of the various sharpness functions are presented in Figs. 12 and 13, respectively. From Figs. 11, 12, and 13, it can be observed that the VS function loses the fundamental bell-shape property of a sharpness function, and its reproducibility is 8.908 mm. The LS function has a very narrow focusing range. The time costs of the DCTS and PS functions are significantly high. Overall, the

TDES function has the smallest reproducibility value, a reasonable time cost, and a wide focusing range. Therefore, the TDES function is selected for depth control with autofocus in robot base frame calibration. To analyze the effect of depth control with autofocus on the accuracy of robot base frame calibration, 10,000 numerical experiments of robot base frame calibration without and with the aid of depth control were performed; the results are presented in Figs. 14 and 15, respectively. ΔX, ΔY, ΔZ, ΔA, ΔB, ΔC are the deviations of the parameters X, Y, Z, A, B, C in robot base frame calibration. The results suggest that depth control with autofocus contributes to enhanced accuracy of robot base frame calibration. In the numerical experiments, the uncertainty intervals of the vision-based positioning in the XY-plane of the camera TCP were set to Δx = [−0.1, 0.1] mm and Δy = [−0.1, 0.1] mm, because the 2D vision system approximately achieves a measurement accuracy of 0.1 mm [3]. The uncertainty interval of the object distance along the Z-axis of the camera TCP was set to Δz = [−2, 2] mm without depth control, and to Δz = [−0.275, 0.275] mm with the aid of depth control.

Fig. 17 Sketch of the layout of the reference holes for calibration and verification (R1~R4: reference holes on the jig for calibration; P1~P8: reference holes on the workpiece for verification)

Fig. 18 Software interface of the 2D vision system integrated in the mobile robotic drilling system
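The numerical experiments can be imitated in miniature. In the sketch below, the hole layout, trial count, and uniform noise model are our assumptions; only the Δx, Δy, Δz intervals are taken from the text (the SVD fit of Section 4 is redefined here to keep the sketch self-contained):

```python
import numpy as np

def fit_pose(pA, pR):
    """Least-squares R, P for p_A = R p_R + P (SVD method of Section 4)."""
    cA, cR = pA.mean(axis=0), pR.mean(axis=0)
    Q = (pR - cR).T @ (pA - cA)
    U, _, Vt = np.linalg.svd(Q)
    R = Vt.T @ U.T
    if np.linalg.det(R) < 0:
        Vt[-1] *= -1
        R = Vt.T @ U.T
    return R, cA - R @ cR

rng = np.random.default_rng(3)
holes = rng.uniform(-1000, 1000, size=(4, 3))     # nominal reference holes (mm)

def z_error_spread(dz_half, trials=500):
    """Max |deltaZ| of the calibrated origin under uniform measurement noise:
    +/-0.1 mm in x, y and +/-dz_half mm in z (the intervals in the text)."""
    worst = 0.0
    for _ in range(trials):
        noise = rng.uniform([-0.1, -0.1, -dz_half], [0.1, 0.1, dz_half],
                            size=(4, 3))
        _, P = fit_pose(holes + noise, holes)     # true pose: R = I, P = 0
        worst = max(worst, abs(P[2]))
    return worst

w_wide = z_error_spread(2.0)       # without depth control
w_tight = z_error_spread(0.275)    # with depth control
print(w_wide > w_tight)            # depth control tightens the Z deviation
```

This miniature run reproduces the qualitative conclusion only; the paper's 10,000-trial experiments quantify the effect on all six pose parameters.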

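For intuition about why tightening the Δz interval helps, the 3-D positioning error of a single reference hole can be bounded and sampled from the intervals just stated. The uniform error distribution below is an assumption made for illustration only; the paper does not specify the distribution used in its numerical experiments:

```python
import math
import random

def max_position_error(dz_half, dxy_half=0.1):
    # Worst-case 3-D positioning error (mm) for one reference hole, given
    # the half-widths of the uncertainty intervals in x, y and z
    return math.sqrt(2 * dxy_half ** 2 + dz_half ** 2)

def mean_position_error(dz_half, dxy_half=0.1, n=50_000, seed=1):
    # Monte Carlo mean error, assuming independent uniform errors on the
    # stated intervals (an illustrative assumption)
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n):
        dx = rng.uniform(-dxy_half, dxy_half)
        dy = rng.uniform(-dxy_half, dxy_half)
        dz = rng.uniform(-dz_half, dz_half)
        total += math.sqrt(dx * dx + dy * dy + dz * dz)
    return total / n

print(round(max_position_error(2.0), 3))    # without depth control → 2.005
print(round(max_position_error(0.275), 3))  # with depth control → 0.309
```

The per-hole error bound shrinks from about 2 mm to about 0.3 mm with depth control, which is consistent with the improved calibration accuracy observed in Figs. 14 and 15.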
Table 1 Cartesian coordinates of the reference holes for verification measured by the laser tracker and the 2D vision system, with the deviations Δx, Δy, Δz and the Euclidean distance (all values in mm)

Hole ID  Measuring equipment   x         y          z         Δx      Δy      Δz      Distance
P1       Laser tracker         7247.560  −1164.633  −536.407
         2D vision system      7247.903  −1165.100  −536.211   0.343  −0.467   0.196  0.611
P2       Laser tracker         8042.603  −1885.959  −504.619
         2D vision system      8042.355  −1885.911  −504.620  −0.248   0.048  −0.001  0.252
P3       Laser tracker         8034.369  −1879.627   −88.960
         2D vision system      8033.948  −1879.392   −88.859  −0.421   0.235   0.101  0.492
P4       Laser tracker         7231.964  −1151.619  −122.696
         2D vision system      7232.002  −1151.761  −122.644   0.038  −0.142   0.052  0.155
P5       Laser tracker         7322.578  −1231.839  −425.386
         2D vision system      7322.693  −1231.965  −425.009   0.115  −0.126   0.377  0.413
P6       Laser tracker         7962.880  −1813.111  −400.910
         2D vision system      7962.578  −1813.029  −400.904  −0.302   0.082   0.006  0.312
P7       Laser tracker         7958.617  −1809.933  −197.995
         2D vision system      7958.237  −1809.392  −197.869  −0.380   0.541   0.126  0.673
P8       Laser tracker         7315.031  −1225.603  −224.023
         2D vision system      7314.942  −1225.684  −223.746  −0.089  −0.081   0.277  0.302

5.2 Experiments of calibration and verification

To verify the validity of the proposed method for robot base frame calibration in mobile robotic drilling for flight control surface assembly, experiments of robot base frame calibration and reference hole measurement were conducted on the developed mobile robotic drilling system. The equipment of the experimental platform is shown in Fig. 16; it includes an air cushion unit, an industrial robot, a Leica AT901-B laser tracker (measurement uncertainty: ±15 μm + 6 μm/m), etc. In the experiments, the Cartesian coordinates of two types of reference holes in the world coordinate system were measured with the Leica AT901-B and used for robot base frame calibration as well as for verification of the calibrated results. The layout of the two types of reference holes is given in Fig. 17. The detailed experimental procedure is as follows.

Step 1 Store the Cartesian coordinates of the reference holes for robot base frame calibration in the world coordinate system in an MS EXCEL file.
Step 2 Import the known camera TCP by using the control system of the mobile robotic drilling system.
Step 3 Import the Cartesian coordinates of the reference holes stored in the MS EXCEL file in step 1 by using the robot base frame calibration module of the control system.
Step 4 Select one of the reference holes used for robot base frame calibration, and then bring the reference hole into the field of view by moving the camera with the robot through the feature search function of the developed 2D vision system shown in Fig. 18.
Step 5 Position the camera with the robot so that the photographed reference hole is in focus, with the aid of depth control with autofocus.
Step 6 Align the center of the reference hole with the camera TCP by executing the iterative measurement function of the 2D vision system.
Step 7 Record the pose of the camera TCP in the robot base frame. Owing to accurate depth control with autofocus and the iterative measurement, the origin of the camera TCP coincides with the center of the reference hole; hence, the position of the camera TCP is treated as the Cartesian position of the reference hole in the robot base frame.
Step 8 Repeat step 4 to step 7 until all reference holes are measured.
Step 9 Compute the robot base frame with respect to the world coordinate system from the measured coordinates of the reference holes. The calibrated robot base frame is [X, Y, Z, A, B, C] = [−672.805, 7500.075, 1299.625, −51.123, 0.841, 0.676].
Step 10 Measure the Cartesian coordinates of the reference holes for verification in the world coordinate system by using the 2D vision system and the robot base frame calibrated in step 9.

Step 11 Compute the distances between the corresponding Cartesian coordinates of the reference holes for verification in the world coordinate system, as measured by the 2D vision system and by the laser tracker. The results, shown in Table 1, indicate the validity of the proposed robot base frame calibration method. Considering that position errors of drilled fastener holes can be further compensated for with the 2D vision system [7, 37], the proposed in-process method for robot base frame calibration is effective. Overall, it provides a cost-effective and satisfactory solution for robot base frame calibration in the mobile drilling of flight control surfaces at multiple stations.
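The distances reported in Table 1 are the Euclidean norms of the per-axis deviations (2D vision system minus laser tracker). As a quick check, the P1 row can be reproduced from the measured coordinates:

```python
import math

def deviation(laser, vision):
    # Per-axis deviation (vision minus laser tracker) and Euclidean distance
    d = tuple(v - l for v, l in zip(vision, laser))
    return d, math.hypot(*d)

# P1 coordinates (mm) from Table 1
laser_p1 = (7247.560, -1164.633, -536.407)
vision_p1 = (7247.903, -1165.100, -536.211)
(dx, dy, dz), dist = deviation(laser_p1, vision_p1)
print(round(dx, 3), round(dy, 3), round(dz, 3))  # → 0.343 -0.467 0.196
print(round(dist, 2))  # ≈ 0.61, matching the tabulated 0.611 within rounding
```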

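Step 9 determines the robot base frame by least-squares fitting of the reference-hole coordinates measured in both frames (the 3-D point-set fitting of Arun et al., cited in the references, uses an SVD). As a self-contained illustration of the same least-squares principle, a planar (2D) version has a closed form; this is a sketch, not the authors' implementation:

```python
import math

def fit_rigid_2d(world_pts, base_pts):
    # Least-squares rigid transform (rotation angle theta and translation
    # tx, ty) mapping base-frame points onto world-frame points -- a planar
    # analogue of the 3-D point-set fitting used in step 9.
    n = len(world_pts)
    wx = sum(p[0] for p in world_pts) / n
    wy = sum(p[1] for p in world_pts) / n
    bx = sum(p[0] for p in base_pts) / n
    by = sum(p[1] for p in base_pts) / n
    s = c = 0.0
    for (x1, y1), (x2, y2) in zip(world_pts, base_pts):
        u1, v1 = x1 - wx, y1 - wy   # centered world point
        u2, v2 = x2 - bx, y2 - by   # centered base point
        c += u2 * u1 + v2 * v1
        s += u2 * v1 - v2 * u1
    theta = math.atan2(s, c)
    tx = wx - (bx * math.cos(theta) - by * math.sin(theta))
    ty = wy - (bx * math.sin(theta) + by * math.cos(theta))
    return theta, tx, ty

# Synthetic check: world points generated from base points with a known pose
base = [(0.0, 0.0), (1.0, 0.0), (0.0, 2.0), (1.5, 1.5)]
th0, tx0, ty0 = 0.3, 5.0, -2.0
world = [(x * math.cos(th0) - y * math.sin(th0) + tx0,
          x * math.sin(th0) + y * math.cos(th0) + ty0) for x, y in base]
theta, tx, ty = fit_rigid_2d(world, base)
print(round(theta, 6), round(tx, 6), round(ty, 6))  # → 0.3 5.0 -2.0
```

With noise-free synthetic data the known pose is recovered exactly; with real measurement noise the fit minimizes the sum of squared residuals, which is why more than the minimum number of reference holes improves robustness.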
6 Conclusions

In this paper, an in-process robot base frame calibration method with a 2D vision system is proposed for the mobile robotic drilling system for flight control surface assembly. The layout of flight control surface assembly, the developed mobile robotic drilling system, and the 2D vision system are introduced. An iterative measurement scheme is used for 2D vision-based positioning of the preset reference holes without introducing measurement errors induced by non-perpendicularity and erroneous object distance. Depth control with autofocus, using a new sharpness function and moving the robot, is applied to obtain the third-dimension Cartesian coordinates of the reference holes. An algorithm is provided that computes the description of the robot base frame in the world coordinate system from the acquired coordinates of the reference holes in the robot base frame and in the world coordinate system.

In machine vision applications, 2D vision systems and depth control with autofocus are traditional research topics. However, a 2D vision system is rarely used to obtain the Cartesian coordinates of a target with a single shot, and autofocus is usually employed to improve image sharpness or for 3D reconstruction. In this paper, we propose to integrate the 2D vision system with depth control using autofocus to measure geometrical features in 3D space without extra sensors. Based on this seamless integration, a cost-effective and satisfactory robot base frame calibration method is provided for mobile robotic drilling applications in aircraft assembly. Experiments of depth control, numerical experiments on the influence of depth control on the accuracy of robot base frame calibration, and experiments of calibration and verification were performed. The good performance of the TDES function for autofocus is verified through the experiments.
Results of the numerical experiments reveal that depth control with autofocus contributes to enhanced accuracy of robot base frame calibration. By using the developed robot base frame calibration method, the


maximum positioning error was about 0.6 mm, which meets the accuracy requirement of robot base frame calibration in the mobile robotic drilling application, because position errors of drilled fastener holes can be measured and further compensated for by using the 2D vision system.

Acknowledgments This research was supported by the Science Fund for Creative Research Groups of the National Natural Science Foundation of China (Project No. 51221004) and the National Natural Science Foundation of China (Project No. 51205352).

Compliance with ethical standards

Funding This study was funded by the Science Fund for Creative Research Groups of the National Natural Science Foundation of China (Project No. 51221004) and the National Natural Science Foundation of China (Project No. 51205352).

Conflict of interest The authors declare that they have no conflict of interest.

Statement of human and animal rights This article does not contain any studies with human participants or animals performed by any of the authors.

Informed consent Individual participants were not studied and included in this article.

References

1. Olsson T, Haage M, Kihlman H, Johansson R, Nilsson K, Robertsson A, Björkman M, Isaksson R, Ossbahr G, Brogårdh T (2010) Cost-efficient drilling using industrial robots with high-bandwidth force feedback. Robot Comput Integr Manuf 26:24–38
2. DeVlieg R, Sitton K, Feikert E, Inman J (2002) ONCE (ONe-sided Cell End effector) robotic drilling system. SAE Technical Paper 2002-01-2626. doi:10.4271/2002-01-2626
3. Bi S, Liang J (2011) Robotic drilling system for titanium structures. Int J Adv Manuf Technol 54:767–774
4. Gray T, Orf D, Adams G (2013) Mobile automated robotic drilling, inspection, and fastening. SAE Technical Paper 2013-01-2338. doi:10.4271/2013-01-2338
5. COMAU AEROSPACE Mobile Automatic Robotic System (MARS). http://www.comau.com/eng/offering_competence/aerospace/technologies/drilling/Pages/robot_platform.aspx. Accessed 1 Jan 2014
6. Summers M (2005) Robot capability test and development of industrial robot positioning system for the aerospace industry. SAE Technical Paper 2005-01-3336. doi:10.4271/2005-01-3336
7. Zhu W, Mei B, Yan G, Ke Y (2014) Measurement error analysis and accuracy enhancement of 2D vision system for robotic drilling. Robot Comput Integr Manuf 30:160–171
8. Nubiola A, Bonev IA (2013) Absolute calibration of an ABB IRB 1600 robot using a laser tracker. Robot Comput Integr Manuf 29:236–245
9. Sun L, Liu J, Sun W, Wu S, Huang X (2004) Geometry-based robot calibration method. In: Proceedings of the 2004 IEEE International Conference on Robotics and Automation, pp 1907–1912
10. Liu Y, Shen Y, Xi N, Yang R, Li X, Zhang G, Fuhlbrigge TA (2008) Rapid robot/workcell calibration using line-based approach. In: Proceedings of the 4th IEEE International Conference on Automation Science and Engineering, pp 510–515
11. Zhuang H, Roth ZS, Sudhakar R (1994) Simultaneous robot/world and tool/flange calibration by solving homogeneous transformation equations of the form AX = YB. IEEE Trans Robot Autom 10:549–554
12. Dornaika F, Horaud R (1998) Simultaneous robot-world and hand-eye calibration. IEEE Trans Robot Autom 14:617–622
13. Shah M (2013) Solving the robot-world/hand-eye calibration problem using the Kronecker product. J Mech Robot 5:031007
14. Heller J, Henrion D, Pajdla T (2014) Hand-eye and robot-world calibration by global polynomial optimization. arXiv preprint arXiv:1402.3261
15. Zhuang H, Wang K, Roth ZS (1995) Simultaneous calibration of a robot and a hand-mounted camera. IEEE Trans Robot Autom 11:649–660
16. Hirsh RL, DeSouza GN, Kak AC (2001) An iterative approach to the hand-eye and base-world calibration problem. In: Proceedings of the 2001 IEEE International Conference on Robotics and Automation, pp 2171–2176
17. Arai T, Maeda Y, Kikuchi H, Sugi M (2002) Automated calibration of robot coordinates for reconfigurable assembly systems. CIRP Ann Manuf Technol 51:5–8
18. Liang J, Bi S (2010) Design and experimental study of an end effector for robotic drilling. Int J Adv Manuf Technol 50:399–407
19. Ma H, Wei S, Sheng Z, Lin T, Chen S (2010) Robot welding seam tracking method based on passive vision for thin plate closed-gap butt welding. Int J Adv Manuf Technol 48:945–953
20. Xu Y, Fang G, Chen S, Zou JJ, Ye Z (2014) Real-time image processing for vision-based weld seam tracking in robotic GMAW. Int J Adv Manuf Technol 73:1413–1425
21. Bone GM, Capson D (2003) Vision-guided fixtureless assembly of automotive components. Robot Comput Integr Manuf 19:79–87
22. Jayaweera N, Webb P (2007) Adaptive robotic assembly of compliant aero-structure components. Robot Comput Integr Manuf 23:180–194
23. Huang S-J, Tsai J-P (2005) Robotic automatic assembly system for random operating condition. Int J Adv Manuf Technol 27:334–344
24. Lee K-C, Huang H-P, Lu S-S (1993) Burr detection by using vision image. Int J Adv Manuf Technol 8:275–284
25. Bilen H, Hocaoglu MA, Unel M, Sabanovic A (2012) Developing robust vision modules for microsystems applications. Mach Vis Appl 23:25–42
26. Gan Y, Dai X (2011) Base frame calibration for coordinated industrial robots. Robot Auton Syst 59:563–570
27. Chen C-Y, Hwang R-C, Chen Y-J (2010) A passive auto-focus camera control system. Appl Soft Comput 10:296–303
28. Pal NR, Pal SK (1991) Entropy: a new definition and its applications. IEEE Trans Syst Man Cybern 21:1260–1270
29. Kehtarnavaz N, Oh H-J (2003) Development and real-time implementation of a rule-based auto-focus algorithm. Real Time Imaging 9:197–203
30. Arun KS, Huang TS, Blostein SD (1987) Least-squares fitting of two 3-D point sets. IEEE Trans Pattern Anal Mach Intell PAMI-9:698–700
31. Ma C, Jiang L (2007) Some research on Levenberg-Marquardt method for the nonlinear equations. Appl Math Comput 184:1032–1040
32. Zhu R, Zhang Y, Liu B, Liu C (eds) (2010) Information computing and applications. In: Barbosa S, Chen P, Cuzzocrea A, Du X, Filipe J, Kara O, Kotenko I, Sivalingam KM, Slezak D, Washio T, Yang X (eds) Communications in Computer and Information Science. Springer, Berlin Heidelberg, pp 217–225
33. Shih L (2007) Autofocus survey: a comparison of algorithms. In: Proceedings of SPIE-IS&T Electronic Imaging, pp 65020B-1–65020B-11
34. Mateos-Pérez JM, Redondo R, Nava R, Valdiviezo JC, Cristóbal G, Escalante-Ramírez B, Ruiz-Serrano MJ, Pascau J, Desco M (2012) Comparative evaluation of autofocus algorithms for a real-time system for automatic detection of Mycobacterium tuberculosis. Cytometry A 81:213–221
35. Firestone L, Cook K, Culp K, Talsania N, Preston K (1991) Comparison of autofocus methods for automated microscopy. Cytometry 12:195–206
36. Bradski G, Kaehler A (2008) Learning OpenCV: computer vision with the OpenCV library. O'Reilly Media, Sebastopol
37. Zhu W, Qu W, Cao L, Yang D, Ke Y (2014) An off-line programming system for robotic drilling in aerospace manufacturing. Int J Adv Manuf Technol 68:2535–2545