Available online at www.sciencedirect.com

ScienceDirect Procedia Computer Science 105 (2017) 20 – 26

2016 IEEE International Symposium on Robotics and Intelligent Sensors, IRIS 2016, 17-20 December 2016, Tokyo, Japan

Vision Based Navigation for Omni-directional Mobile Industrial Robot

Shuai Guo a, Qizhuo Diao a, Fengfeng Xi b,*

a Shanghai University, No. 99 Shangda Road, Baoshan District, Shanghai 200444, China
b Ryerson University, 350 Victoria Street, Toronto, Ontario M5B 2K3, Canada

* Corresponding author. Tel.: +001-416-979-5000-7091; fax: +001-416-979-5265. E-mail address: [email protected]

Abstract

In this paper, an Omni-directional mobile industrial robot drilling system for aerospace manufacturing is introduced. Mecanum wheels give the robot its manoeuvrability in a congested workspace. An industrial robot is applied to complete the drilling work for a rocket shell, and a vision system is applied to enhance the precision of mobile drilling. Additional sensor systems, such as a laser measurement system and a displacement measurement system, are installed for autonomous navigation and collision avoidance. To increase the flexibility and working volume of the mobile industrial robot, an autonomous mobile drilling scheme is presented. To fulfil the drilling-precision requirement of the aerospace industry, a vision-based deviation rectification solution is developed. Experiments are carried out to compare the influence of different calibration targets on the robot system. Numerical tests show that the rectification system satisfies the positioning accuracy required for the autonomous drilling work.

© 2017 The Authors. Published by Elsevier B.V. This is an open access article under the CC BY-NC-ND license (http://creativecommons.org/licenses/by-nc-nd/4.0/). Peer-review under responsibility of the organizing committee of the 2016 IEEE International Symposium on Robotics and Intelligent Sensors (IRIS 2016).

Keywords: Omni-directional mobile industrial robot; Aerospace manufacturing; Autonomous mobile drilling; Mecanum wheel; Accurate positioning; Vision-based rectification; Calibration target; Vision-based measurement

1. Introduction

Mobile robots have been receiving increasing research attention across an enormous range of applications. In the past decade, mobile robotics has drawn attention from both academia and industry [1-3]. A review of existing mobile robots shows that good performance has already been achieved in their respective application scenarios. Nevertheless, challenges remain in applying mobile robots in industrial environments such as aerospace manufacturing, which require high accuracy and high manoeuvrability at the same time.

Drilling is an important process in aircraft assembly, occupying a large proportion of the total labour in aerospace manufacturing. Moreover, drilling quality and accuracy have a critical influence on the performance of the aircraft. While stationary industrial robots can achieve high precision, mobile industrial robots are not able to achieve precise positioning because of the coarse localization of the mobile platform. KUKA Robotics has built the "moiros" mobile industrial robot system for machining extremely large components such as ships, aircraft and wind power plants. The "moiros" is able to manoeuvre safely to the desired position in confined spaces and to achieve a positioning accuracy of ±5 mm.




Since 2009, the Southwest Research Institute (SwRI) has been working on a mobile manipulator demonstration system called MR ROAM for commercial and military aircraft painting. That system had a positioning accuracy of 12.7 mm [4]. The positioning accuracy of the robots referred to above is far from the ±0.2 mm [5] required for drilling aerospace products. In this paper, a vision-based rectification is applied to compensate for the deviation of the mobile platform.

In the production of aerospace components, large parts are placed in a stationary production cell. Over a period of days, multiple shifts of workers complete the assembly and inspection tasks. In such a production environment, stationary robotic systems are not economical. The employment of a mobile industrial robot is a solution for achieving low-cost manufacturing by allowing one robot to be used for similar tasks at multiple stations. Mobile industrial robot systems have great advantages over traditional industrial robots in both efficiency and flexibility.

In this paper, an Omni-directional mobile industrial robot is built to manoeuvre in the compact working sites of aerospace manufacturing factories. An Omni-directional mobile industrial robot is one that can move independently and simultaneously in all three degrees of freedom of planar motion: longitudinal (forward/reverse), lateral (right/left), and rotary. Omni-directional wheels can therefore significantly improve the manoeuvrability. Typical Omni-directional wheels include the Omni wheel [6-8], the orthogonal wheel [9], the off-centred wheel [1], the tracked wheel [10], and the Mecanum wheel [11-16]. Considering the performance of these wheels, we deployed the Mecanum wheel on our mobile industrial robot.

2. Description and Implementation of the Mobile Drilling System

The Omni-directional mobile industrial robot in this study consists of a Mecanum wheeled Omni-directional mobile platform, an industrial robot, controllers, sensors and a software system. In this project, the robot should complete accurate drilling work for a stationary rocket shell automatically [17-20] in the workspace illustrated in Fig. 2. The work includes autonomous path planning, autonomous deviation correction and automatic drilling. A solution for a mobile drilling robot on a guiding rail has been proposed, using a laser tracker and a hand-eye vision system to determine the exact position between the robot base frame and the drilling point [5]. In this paper, a new mobile drilling system is developed, characterized by its Omni-directional mobile manipulator, positioning laser sensors and a single calibrated camera on the mobile platform.

2.1. Mecanum wheeled Omni-directional mobile manipulator

The Omni-directional mobile manipulator, as shown in Fig. 1, is built by mounting a manipulator on the Omni-directional mobile vehicle. The project aims to carry out drilling tasks on large surfaces at all assembly stations in the aerospace factory in an adaptive and flexible way.

Fig. 1. Overview of the Omni-directional mobile industrial robot
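The three planar degrees of freedom discussed in Section 1 map onto the four Mecanum wheel speeds through the standard inverse kinematics of an X-configured Mecanum platform. The following is a minimal sketch of that mapping under one common sign convention; the wheel radius, half-track and half-wheelbase values are placeholders, not the parameters of the actual vehicle described in this paper.

```python
import numpy as np

# Placeholder geometry; the real platform's dimensions are not given in the paper.
WHEEL_RADIUS = 0.1      # wheel radius r (m), assumed
HALF_TRACK = 0.35       # half distance between left and right wheels, lx (m), assumed
HALF_WHEELBASE = 0.45   # half distance between front and rear wheels, ly (m), assumed


def mecanum_wheel_speeds(vx, vy, wz):
    """Standard inverse kinematics of a 4-wheel Mecanum platform (X roller layout).

    vx: longitudinal velocity (m/s), vy: lateral velocity (m/s),
    wz: yaw rate (rad/s). Returns wheel angular velocities
    [front-left, front-right, rear-left, rear-right] in rad/s.
    """
    k = HALF_TRACK + HALF_WHEELBASE
    return np.array([
        vx - vy - k * wz,   # front-left
        vx + vy + k * wz,   # front-right
        vx + vy - k * wz,   # rear-left
        vx - vy + k * wz,   # rear-right
    ]) / WHEEL_RADIUS


# Example: a pure lateral command, which a differential-drive base cannot execute.
print(mecanum_wheel_speeds(vx=0.0, vy=0.2, wz=0.0))
```

A pure lateral or rotary command produces a valid set of wheel speeds in this model, which is what allows the platform to slide sideways into a congested drilling station.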

2.2. Sensors, industrial robot and control units

An Omni-directional mobile industrial robot should be designed to exhibit autonomous behaviours such as collision avoidance and self-localization while moving in an arbitrary direction. Thus, sensors such as industrial cameras, laser trackers and displacement sensors are deployed on the Omni-directional mobile industrial robot, collaborating with the mobile manipulator under the command of the control system. As illustrated in Fig. 2, the robotic operation can be divided into three configurations. In the first configuration, the robot uses its laser trackers to find its location (position 1) in the workspace and navigates to a suitable working position (position 2). In the second configuration, the robot uses its industrial camera to calculate its accurate pose with respect to the rocket shell by 3D vision measurement [21]. In the third configuration, the robot performs the mobile drilling operation [22] according to the calculated pose and finishes the work at position 3.


Fig. 2. Implementation schematic for mobile drilling system

Fig. 3 depicts the autonomous drilling strategy for the mobile industrial robot. During the motion of the mobile industrial robot, the laser localization system acts as positioning equipment, telling the robot 'where am I' through its environment-mapping capability. However, with mainstream laser sensors there still remains a localization error of roughly 50 mm for the mobile platform, which is not sufficient for industrial operations, let alone for the subsequent positioning in the drilling work. Hence, the 3D vision deviation rectification system is developed for refined positioning of the end-effector. The displacement sensor provides the final anti-collision protection for the end-effector.

Fig. 3. Mobile autonomous drilling strategy
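The strategy of Fig. 3 is essentially a coarse-to-fine pipeline: laser-based navigation to within a few centimetres, vision-based rectification to sub-millimetre level, and a displacement-sensor guard during drilling. The sketch below illustrates that control flow only; the object interfaces, function names and tolerances are hypothetical placeholders, not the authors' software.

```python
COARSE_TOLERANCE_MM = 50.0   # roughly what the laser localization can deliver (from the text)
FINE_TOLERANCE_MM = 0.2      # drilling requirement cited in the paper


def run_drilling_cycle(laser, camera, arm, displacement_sensor, station_pose):
    """Illustrative coarse-to-fine drilling cycle; all objects are hypothetical interfaces."""
    # 1. Coarse navigation: drive the Mecanum platform using laser localization.
    while laser.distance_to(station_pose) > COARSE_TOLERANCE_MM:
        current_pose = laser.localize()               # 'where am I' from environment mapping
        arm.base.move_towards(station_pose, current_pose)

    # 2. Fine rectification: measure the residual pose error with the calibrated camera.
    deviation = camera.measure_pose_error()           # 3D vision measurement, Section 3
    arm.compensate(deviation)

    # 3. Drill, aborting if the displacement sensor reports insufficient clearance.
    if displacement_sensor.clearance_ok():
        arm.drill()
    else:
        arm.retract()
```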

3. Vision-Based Deviation Rectification

Typical methods of 3D measurement include using multiple cameras (e.g. binocular stereo vision), a single camera with a calibration target, or a single camera with additional hardware such as structured light or a laser. A variety of robot vision systems exist, differing according to the specific application. Vision navigation systems have been developed with single or binocular cameras for different purposes [23]. Typical vision measuring systems include 3D pose recognition using a calibrated single camera [24, 25] or binocular stereo cameras [26]. It is worth mentioning that a binocular vision system for recognition and guiding of the initial welding position has been built with a positioning error of around 3 mm [27]; as can be seen, this method cannot meet the requirement of high precision.

In general, 3D measurements can be made if two or more images of the same object are taken at the same time with multiple cameras at different spatial positions. Nevertheless, during industrial measurement there is hardly sufficient time for the stereo matching process with multiple cameras. The task can also be solved by projecting an optical ray (structured light) onto the measuring plane, but this is costly and vulnerable to the environment. Instead, it is possible to obtain the world coordinates of an object lying in a specified plane with a single camera and a high-accuracy calibration target [28, 29]. The process has two essential prerequisites: first, the single camera must be calibrated; second, a calibration target must be placed on the measuring plane. The calibration process obtains an optimized model of the camera, and a successful calibration requires a calibration target with exactly known metric properties.

3.1. Camera calibration

Before proceeding with the 3D measurement, a sufficient number of images of the calibration target, whose metric properties are given in Table 1, are taken in different poses.


Table 1. Metric properties of the calibration target

Outline dimension (mm)   Dot diameter (mm)   Dot array   Dot spacing (mm)
200 × 200                12.5                7 × 7       1
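As a concrete illustration of this calibration step, the sketch below estimates the camera parameters from several views of a symmetric dot (circle) grid such as the one in Table 1. It is written against OpenCV as one possible toolchain; the image folder, grid size and dot spacing are assumptions or taken from Table 1, and the paper does not state which software the authors actually used.

```python
import glob
import cv2
import numpy as np

GRID_COLS, GRID_ROWS = 7, 7      # 7 x 7 dot array (Table 1)
DOT_SPACING_MM = 1.0             # centre-to-centre spacing as listed in Table 1

# 3D coordinates of the dot centres on the planar target (Z = 0).
object_grid = np.zeros((GRID_ROWS * GRID_COLS, 3), np.float32)
object_grid[:, :2] = (np.mgrid[0:GRID_COLS, 0:GRID_ROWS].T.reshape(-1, 2)
                      * DOT_SPACING_MM)

object_points, image_points = [], []
image_size = None
for path in glob.glob("calib_images/*.png"):     # hypothetical image folder
    gray = cv2.imread(path, cv2.IMREAD_GRAYSCALE)
    found, centers = cv2.findCirclesGrid(
        gray, (GRID_COLS, GRID_ROWS), flags=cv2.CALIB_CB_SYMMETRIC_GRID)
    if found:
        object_points.append(object_grid)
        image_points.append(centers)
        image_size = gray.shape[::-1]

# Returns the reprojection RMS error plus intrinsic and extrinsic parameters;
# the extrinsics give the pose of the target relative to the camera for each view.
rms, camera_matrix, dist_coeffs, rvecs, tvecs = cv2.calibrateCamera(
    object_points, image_points, image_size, None, None)
print("calibration RMS reprojection error:", rms)
```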

The calibration process calculates all camera parameters, both internal and external (the pose of the calibration target relative to the camera), to describe the characteristics of the industrial camera. The root mean square error (RMSE) of the calibration is obtained to reflect the accuracy of the calibration and to indicate whether the calibration is successful. In our final test, the RMSE was 1.39057e-6 m.

3.2. Industrial robot control

As illustrated in Fig. 4, B, C, W, O and T respectively represent the robot base frame, the camera frame, the world frame, the work-point frame and the end-effector frame.

Fig. 4. The coordinate transformation in robot control module

Transformation H_bc (the relative pose between the camera and the robot base) and transformation H_wo (the relative pose between the work point and the world frame) can be measured. The pose of the end-effector with respect to the work point, H_ot, directly decides the quality of the drilling work, and obtaining it is the purpose of the mobile robot's control and planning. Transformation H_wc (the relative pose between the world frame and the camera) is obtained by our 3D vision measurement; Table 2 lists its inverse, H_cw. Ultimately, the pose of the end-effector in the robot base frame, H_bt, is computed by chaining the transformations along B-C-W-O-T, Eq. (1). The results are shown in Table 2.

H_bt = H_bc · H_cw · H_wo · H_ot    (1)

Table 2. The result of the coordinate transformations in the drilling process (translations in millimetres; rotations α, β, γ in degrees)

        t_x          t_y          t_z        α          β       γ
H_bc    500          0            450        0          0       0
H_wo    0            1000         1000       0          0       0
H_ot    0            0            0          0          0       0
H_cw    -20.07872    37.1342      715.27     357.852    1.93    8.156
H_bt    479.92128    1027.1342    2165.27    357.852    1.93    8.156
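To make Eq. (1) concrete, the sketch below composes homogeneous transforms built from the rows of Table 2. The Euler-angle convention used here (Z-Y-X, i.e. R = Rz(γ)·Ry(β)·Rx(α)) is an assumption, since the paper does not state its convention, so the printed position is illustrative rather than a reproduction of the H_bt row of Table 2.

```python
import numpy as np


def transform(tx, ty, tz, alpha, beta, gamma):
    """Homogeneous transform from a translation (mm) and Z-Y-X Euler angles (deg).

    The Z-Y-X order is an assumed convention; change the multiplication order
    below if a different convention is required.
    """
    a, b, g = np.radians([alpha, beta, gamma])
    rx = np.array([[1, 0, 0], [0, np.cos(a), -np.sin(a)], [0, np.sin(a), np.cos(a)]])
    ry = np.array([[np.cos(b), 0, np.sin(b)], [0, 1, 0], [-np.sin(b), 0, np.cos(b)]])
    rz = np.array([[np.cos(g), -np.sin(g), 0], [np.sin(g), np.cos(g), 0], [0, 0, 1]])
    h = np.eye(4)
    h[:3, :3] = rz @ ry @ rx
    h[:3, 3] = [tx, ty, tz]
    return h


# Values from Table 2.
H_bc = transform(500, 0, 450, 0, 0, 0)               # camera pose in the robot base frame
H_cw = transform(-20.07872, 37.1342, 715.27,
                 357.852, 1.93, 8.156)                # from the 3D vision measurement
H_wo = transform(0, 1000, 1000, 0, 0, 0)              # work point in the world frame
H_ot = transform(0, 0, 0, 0, 0, 0)                    # desired end-effector pose at the work point

# Eq. (1): chain the transforms along B - C - W - O - T.
H_bt = H_bc @ H_cw @ H_wo @ H_ot
print(np.round(H_bt[:3, 3], 3))                       # end-effector position in the base frame
```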

4. Tests and Conclusions

4.1. The pose calibration test

In the vision system, calibration targets with different specifications strongly influence the performance of the robot. Thus, a series of comparison tests with 3 × 3, 5 × 5, 7 × 7 and 9 × 9 dot calibration targets was carried out to quantify this influence. Note that all the dots have the same diameter of 12.5 mm. Fig. 5 illustrates the influence on the pose calibration.


Fig. 5. (a) Influence of different calibration targets on Trans-X; (b) Influence of different calibration targets on Trans-Y; (c) Influence of different calibration targets on Trans-Z; (d) Influence of different calibration targets on Rot-X; (e) Influence of different calibration targets on Rot-Y; (f) Influence of different calibration targets on Rot-Z.

As shown in Fig. 5, the translation precision is kept within 0.01 mm and the rotation precision within 0.2° (except for the 3 × 3 calibration target). This has little impact on the system precision. To obtain an optimized performance, the 7 × 7 calibration target is used in the system.

4.2. The drilling points distribution test

To measure the final drilling precision of the Omni-directional mobile industrial robot drilling system, a dedicated test is performed using vision-based measurement, as illustrated in Fig. 6.

Fig. 6. Precision test of the Omni-directional mobile industrial robot drilling system

When the test procedure starts, the drilling manipulator draws a rectangle centred on the drilling point on a white board, which is fixed to the surface of the rocket shell. Then, the stationary measuring camera grabs an image of the white board and works out the image coordinates of the centre of the rectangle. As depicted in Fig. 7, the drilling points determined by the camera are distributed over a range of 50 pixels. Thus, the positioning error is 0.1875 mm = 50 pixels × 0.00375 mm (the dimension of one pixel). The standard deviation is 0.0628 mm.

Fig. 7. Distribution graph of the drilling points
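The conversion from the pixel scatter of Fig. 7 to the millimetre figures quoted above is a simple scaling. The sketch below shows that computation on hypothetical detected centre coordinates; only the 0.00375 mm/pixel scale is taken from the text, the coordinate values are placeholders since the measured data are not published.

```python
import numpy as np

MM_PER_PIXEL = 0.00375   # dimension of one pixel, as quoted in the text

# Hypothetical detected centre coordinates (pixels), for illustration only.
centres_px = np.array([[1012.4, 988.1],
                       [1031.7, 1003.5],
                       [1005.9, 1019.2],
                       [1044.8, 995.6]])

# Peak-to-peak spread of the detected centres, converted to millimetres
# (a 50 px spread corresponds to 50 * 0.00375 = 0.1875 mm, as in the paper).
spread_mm = (centres_px.max(axis=0) - centres_px.min(axis=0)) * MM_PER_PIXEL

# Standard deviation of the radial error about the mean centre, in millimetres
# (the paper reports 0.0628 mm for its own data).
radial_mm = np.linalg.norm(centres_px - centres_px.mean(axis=0), axis=1) * MM_PER_PIXEL
print(spread_mm, radial_mm.std())
```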


A further experiment has been performed to evaluate the influence of different calibration targets on the distribution of the final drilling points. In this experiment, calibration targets with 3 × 3, 5 × 5, 7 × 7 and 9 × 9 dots are tested. Note that all the dots have the same diameter of 12.5 mm. The results are shown in Fig. 8 and Table 3.

Fig. 8. Distribution of the drilling points using different calibration targets

Table 3. Drilling point distribution range using different calibration targets

Calibration target specification   Distribution range (in pixels)   Distribution range (in mm)
3 × 3                              500                              1.875
5 × 5                              200                              0.75
7 × 7                              50                               0.1875
9 × 9                              40                               0.15

As the tests show, more dots on the calibration target enhance both accuracy and repeatability, and calibration targets with 7 × 7 or more dots satisfy the drilling standard. However, due to measurement and mounting errors, the accuracy can hardly be improved beyond around 0.15 mm.

4.3. Conclusions

In this paper, an Omni-directional mobile industrial robot is introduced and a navigation scheme for the robot is demonstrated. A vision-based deviation rectification method is developed for accurate positioning. Based on this rectification, the final positioning error of the drilling work is kept within ±0.2 mm. Furthermore, experiments are carried out to compare the influence of different calibration targets on the robot system. The results show that the mobile drilling system satisfies the high-quality standard of the aerospace manufacturing industry.

Acknowledgements

This work was supported by the Shanghai Municipal Science and Technology Commission under Grants No. 14111104502 and 15550721900.

References

1. Campion, G., G. Bastin, and B. D'Andrea-Novel, Structural properties and classification of kinematic and dynamic models of wheeled mobile robots. IEEE Transactions on Robotics and Automation, 1996. 12(1): p. 47.
2. Er, P., et al., Mobility assistance design of the intelligent robotic wheelchair. International Journal of Advanced Robotic Systems, 2012: p. 1.
3. Gopalakrishnan, B., S. Tirunellayi, and R. Todkar, Design and development of an autonomous mobile smart vehicle: a mechatronics application. Mechatronics, 2004. 14(5): p. 491-514.
4. Feng, H., X. Qin, and R. Wang, Developing trend of industrial robot in aerospace manufacturing industry. Aeronautical Manufacturing Technology, 2013(19): p. 32-37.
5. Mei, B., et al., Robot base frame calibration with a 2D vision system for mobile robotic drilling. The International Journal of Advanced Manufacturing Technology, 2015. 80(9-12): p. 1903-1917.
6. Huang, L., et al., Design and analysis of a four-wheel omnidirectional mobile robot. In: 2nd International Conference on Autonomous Robots and Agents, 2004.
7. Song, J.-B. and K.-S. Byun, Design and control of a four-wheeled omnidirectional mobile robot with steerable omnidirectional wheels. Journal of Robotic Systems, 2004. 21(4): p. 193-208.
8. Asama, H., et al., Development of an omni-directional mobile robot with 3 DOF decoupling drive mechanism. In: Proceedings of the 1995 IEEE International Conference on Robotics and Automation, 1995.
9. Mourioux, G., et al., Omni-directional robot with spherical orthogonal wheels: concepts and analyses. In: Proceedings of the 2006 IEEE International Conference on Robotics and Automation (ICRA 2006), 2006.
10. Ben-Tzvi, P., A.A. Goldenberg, and J.W. Zu, Design, simulations and optimization of a tracked mobile robot manipulator with hybrid locomotion and manipulation capabilities. 2008.
11. Bräunl, T., Omni-directional robots. In: Embedded Robotics: Mobile Robot Design and Applications with Embedded Systems, 2006: p. 113-121.
12. Dickerson, S.L. and B.D. Lapin, Control of an omni-directional robotic vehicle with Mecanum wheels. In: Proceedings of the National Telesystems Conference (NTC '91), Vol. 1, 1991.
13. Jefri Efendi, M.S., Y. Sazali, and M.J. Mohamed Rizon, Omni-directional mobile robot with Mecanum wheel. 2005.
14. Kumra, S., R. Saxena, and S. Mehta, Navigation system for omni-directional automatic guided vehicle with Mecanum wheel. IOSR Journal of Electrical and Electronics Engineering (IOSR-JEEE), ISSN 2278-1676, 2012.


15. Muir, P.F. and C.P. Neuman, Kinematic modeling for feedback control of an omnidirectional wheeled mobile robot. In: Autonomous Robot Vehicles, 1990, Springer: p. 25-31.
16. Viboonchaicheep, P., A. Shimada, and Y. Kosaka, Position rectification control for Mecanum wheeled omni-directional vehicles. In: Proceedings of the 29th Annual Conference of the IEEE Industrial Electronics Society (IECON '03), 2003.
17. Atkinson, J., et al., Robotic drilling system for 737 aileron. In: SAE 2007 AeroTech Congress & Exhibition, Los Angeles, CA, USA. SAE Technical Papers, 2007.
18. Bi, S. and J. Liang, Robotic drilling system for titanium structures. The International Journal of Advanced Manufacturing Technology, 2011. 54(5-8): p. 767-774.
19. DeVlieg, R., et al., ONCE (one-sided cell end effector) robotic drilling system. SAE Technical Paper 2002-01-2626, 2002.
20. Olsson, T., et al., Cost-efficient drilling using industrial robots with high-bandwidth force feedback. Robotics and Computer-Integrated Manufacturing, 2010. 26(1): p. 24-38.
21. Zhu, W., et al., Measurement error analysis and accuracy enhancement of 2D vision system for robotic drilling. Robotics and Computer-Integrated Manufacturing, 2014. 30(2): p. 160-171.
22. Zhu, W., et al., An off-line programming system for robotic drilling in aerospace manufacturing. The International Journal of Advanced Manufacturing Technology, 2013. 68(9-12): p. 2535-2545.
23. Winters, N., et al., Omni-directional vision for robot navigation. In: Proceedings of the IEEE Workshop on Omnidirectional Vision, 2000.
24. Collet, A., et al., Object recognition and full pose registration from a single image for robotic manipulation. In: Proceedings of the 2009 IEEE International Conference on Robotics and Automation (ICRA '09), 2009.
25. Motta, J.M.S.T., G.C. de Carvalho, and R.S. McMaster, Robot calibration using a 3D vision-based measurement system with a single camera. Robotics and Computer-Integrated Manufacturing, 2001. 17(6): p. 487-497.
26. Ye, J.-h., H.-b. Wu, and T.-y. Chen, Research on mobile robot binocular stereo vision measuring system. Measurement & Control Technology, 2008. 27(9): p. 15-17.
27. Chen, Xizhang, et al., Vision-based recognition and guiding of initial welding position for arc-welding robot. Chinese Journal of Mechanical Engineering (English Edition), 2005. 18(3): p. 382-384.
28. Zhang, Z., O. Faugeras, and R. Deriche, An effective technique for calibrating a binocular stereo through projective reconstruction using both a calibration object and the environment. Videre: Journal of Computer Vision Research, 1997. 1(1): p. 58-68.
29. Zhang, Z., A flexible new technique for camera calibration. IEEE Transactions on Pattern Analysis and Machine Intelligence, 2000. 22(11): p. 1330-1334.
