Virtual triangulation sensor development, behavior simulation and CAR integration applied to robotic arc-welding

PER CEDERBERG, MAGNUS OLSSON, GUNNAR BOLMSJÖ
Div. of Robotics, Lund University, P.O. Box 118, S-221 00 Lund, SWEDEN
[email protected]
www.robotics.lu.se

Abstract

Background: An increasing number of industrial robots are being programmed using CAR (Computer Aided Robotics). Sensor guidance offers a means of coping with frequent product changes in manufacturing systems. However, sensors increase the uncertainty, and to preserve system robustness a tool is needed that makes it possible to understand a sensor guided robot system before and during its actual operation in real life.

Scope: A virtual sensor is developed and integrated in a CAR hosted environment. The real sensor is of a type commonly used in the arc-welding industry and uses a triangulation method for depth measurements. The sensor is validated both statically and dynamically, by matching it with a real sensor through measurements in setups and by comparing a welding application performed in a real and in a virtual work-cell created with a CAR application. The experimental results successfully validate its performance. In this context, a virtual sensor is a software model of a physical sensor with similar characteristics, using geometrical and/or process specific data from a computerized model of a real work-cell.

Key words: robotics, industrial robotics, sensors, simulation, CAR, autonomous, arc welding, virtual sensors, simulated sensors

1 Introduction

Real-time information from sensors plays an important role for the applicability of industrial robots in an uncertain environment or process. Considering the number of robots deployed in industrial automation, the use of advanced sensors such as vision, laser scanners or force/tactile sensors is still small. This is for a good reason — under what conditions will the sensors guide the robot a certain distance away from the nominal trajectory? We claim that the key elements for preserving system robustness are adequate virtual sensors, i.e. computer models that produce a result comparable to industrially deployed sensors, and environments suitable for hosting the virtual sensors. Important work has been carried out

in the teleoperation area, mostly in space robotics [7, 8, 6, 9], telemedicine [3, 18] and in supervision and monitoring [17, 19]. In the area of industrial robotics, several examples exist where sensors are created to illustrate the need for dynamic interaction of sensors in a virtual environment [11, 13, 12]. Research also includes, for example, sensor interfaces for real-time feedback [14] and sensors for local calibration and autonomy [16, 5, 10]. In industrial robotics, arc welding is a process where sensors are likely to appear more often in the future. This is due to fast product changes and to new materials, which decrease the overall dimensions and increase the need for keeping tight tolerances. Previous research with respect to arc welding includes control with fuzzy logic, task-level programming and master-slave teleoperation with human interaction [20, 4, 15]. The use of robots generally requires an integrated approach where product data defined within a CAD environment is taken as input and applied within CAR (Computer Aided Robotics) software that enables modeling, simulation and programming of robot operations. A simulation environment that makes use of sensors provides an attractive way of testing a robot work-cell before and during its actual operation in real life, since sensors with a real-time connection to trajectory generation may later lead to malfunction in the real robot cell. The problems that may occur are numerous, including exceeded joint limits, collisions with objects in the workspace, moving into singularities with resulting robot configuration changes, etc. However, sensor simulation is currently not supported by CAR applications unless they are extended with user-provided code that includes both a virtual sensor model of the real sensor operated and a framework to handle parallel activities during simulation. In this paper a virtual triangulation sensor is developed and integrated in a CAR hosted environment. The sensor is validated both statically and dynamically: statically by matching it with a real sensor through measurements in setups, and dynamically by comparing a welding application performed in a real and in a virtual work-cell created with a CAR application. The work-cell includes a robot guided by a laser sensor commonly used in industry. During simulation and operation, it is controlled by a second CAR application containing a similar work-cell with the workpiece in a nominal pose. The latter work-cell is augmented with objects reflecting the workpiece's actual pose. These objects are updated in real time based on information provided by the virtual/real sensor. The paper is organized as follows. Section 2 gives an overview of robot programming in CAR environments. Sections 3 and 4 describe the physical and the virtual laser scanner, respectively. Section 5 describes the statically and dynamically performed experiments, after which results, discussion and conclusions follow.

2 Conceptual Ideas

The normal process in off-line robot programming using a CAR application is to build the work-cell, create trajectories and finally generate a program for the specific target robot. If the work-cell is created with sufficient accuracy and fixtures prevent changes in geometry during the manufacturing process, the program will run correctly on the target system. In reality, however, such a procedure is not adequate for robots using sensors, specifically if they are used for trajectory guidance in real time. Moreover, the feedback loop used in most robot systems is implemented as a local loop, only considering the specific instructions used to define the robot task as a set of motions. Another approach is to make full use of the sensor information and integrate it into


the world model and the application process models. Real-time sensor feedback to the world model means that the information from sensors actually updates the world model, including updating object positions in real time as required or creating objects not included beforehand. Through this mechanism, the use of sensors can be validated in a CAR environment in the same way as similar tests would be made in a real, physical set-up.
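As an illustration of this mechanism, the following minimal Python sketch shows how sensor measurements could drive such world-model updates. The WorldModel and Pose types and the on_sensor_measurement() callback are hypothetical, not the API of any particular CAR product.

```python
from dataclasses import dataclass, field


@dataclass
class Pose:
    xyz: tuple[float, float, float]   # position in mm
    rpy: tuple[float, float, float]   # orientation as roll, pitch, yaw in degrees


@dataclass
class WorldModel:
    objects: dict = field(default_factory=dict)

    def set_object_pose(self, name: str, pose: Pose) -> None:
        """Create the object if it is unknown, otherwise update its pose."""
        self.objects[name] = pose


def on_sensor_measurement(world: WorldModel, name: str, measured: Pose) -> None:
    # Each measurement overwrites the nominal pose with the observed one, so
    # simulation and program generation work against the actual geometry.
    world.set_object_pose(name, measured)
```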

3 Experimental Setup

The M-Spot sensor is a triangulation 3D scanner with a CSR-4000 control unit from ServoRobot Inc. The CAR system is Envision, a simulation software from Delmia. The M-Spot laser scanner is mounted on an ABB IRB-2000 robot, approximately 50 mm ahead of the welding gun in the welding direction. The scanner analyzes the workpiece geometry at a frequency of 20 Hz and returns five key points describing the joint and the track-point, see Figure 1.

Figure 1: System running during experiments. The real system (left) is captured from a video recording [1]. The virtual system (right) is from a Quicktime movie [2].

3.1 Laser scanner

The laser beam projection is called the optical plane, and the camera can only image objects that intersect the optical plane within the effective depth of field of the camera. The coordinate system of the camera is defined in the optical plane. A 3D model as represented in the CAR system is shown in Figure 2. Depending on the camera head, the scanner has a practical resolution ranging from less than 0.015 mm at close working distances (100 mm) to 1.5 mm at far working distances (1000 mm).

3.2 Control unit

The controller maintains the camera head at its optimal operating point. It adjusts the power and/or sensitivity to cope with varying surface finishes, digitizes the video signal of the camera and performs low-level vision processing.


Figure 2: A 3D model representation in the CAR system of a welding torch with attached seam tracker.

3.2.1 Processing algorithms

Several image-processing algorithms are provided, dedicated to the following standard joints: fillet, corner, lap, butt and v-groove, as shown in Figure 3. Several techniques are used to ensure system robustness. A scratch on the surface of the plates or shiny surface conditions may cause outliers; an outlier is a point that lies far away from most of the others. Outliers are detected and eliminated at the initial stage of image processing. The algorithm for each joint includes validation features taking into account the obstacles that can be seen in the field of view of the camera.

3.2.2 Image processing region and break points

Each profile contains 256 or 512 points. The boundary size defines the image-processing region, which may be specified for two reasons: to restrict the vision-processing region in order to avoid unnecessary features that may confuse the vision analysis, and to reduce the vision processing time by eliminating unnecessary profile points. Break points are feature points extracted from the profile. In each image-processing algorithm, the break points are extracted from the profiles based on the joint features. They are numbered from 0 to 7 and can be used to define the tracking point position or to extract further information. The number of break points depends on the number of joint features, but it cannot exceed 8. The break points are labeled B0, ..., B7.

3.2.3 Weld joint recognition

To be able to extract features from the sensor data, image processing must take place. A basic function is to filter the data and create line segments that match the criteria of specific weld joints and their tolerances. The details of this process are beyond the scope of this paper but include in principle (1) elimination of outliers, (2) creation of line segments,


Figure 3: Standard joints: fillet and corner in left column. Lap, butt and v-groove in right column.

(3) merging of line segments with similar geometrical characteristics and (4) validation of joint parameters, for instance angle and gap, see Figure 4. Features from the profile are extracted and matched against predefined templates. The weld joint templates relate to standard joint types, and parameters within each type specify the tolerance for the respective weld joint. The parameters that define a weld joint correspond to what is considered the normal geometrical features of the joint. For fillet joints we are mainly interested in the joint angle (minimum and maximum) and the gap (minimum and maximum). There are some general observations to be made concerning fillet joints, outlined in Figure 4. If the two line segments have been extracted, the gap can be calculated as the perpendicular distance between one segment and the facing endpoint of the other (e.g. between L34 and the endpoint B1 of L01 in Figure 4). This value is in the general case somewhat smaller than the true gap between the plates, depending on the joint angle, but this is of little practical importance. In practice, the joint preparation may also affect the result of detecting the edge of a plate. Thus, when moving the sensor along the joint, measured features may fluctuate and the filtering techniques must be adapted to the specifics of the joint.
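As an illustration of this gap estimate, the following Python sketch computes the perpendicular distance from a segment endpoint to the infinite line carrying the other segment. The coordinates in the usage line are made up for the example.

```python
import math


def perpendicular_distance(point, line_start, line_end):
    """Distance from a profile point to the infinite line through two other points."""
    (py, pz), (ay, az), (by, bz) = point, line_start, line_end
    dy, dz = by - ay, bz - az
    length = math.hypot(dy, dz)
    if length == 0.0:
        return math.hypot(py - ay, pz - az)
    # |cross product| / base length gives the point-to-line distance.
    return abs(dy * (pz - az) - dz * (py - ay)) / length


# Example (made-up coordinates): endpoint of the upright-plate segment against
# the line carrying the base-plate segment.
gap = perpendicular_distance((10.0, 0.3), (10.2, 0.0), (25.0, 0.1))
```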

4 Virtual sensor parts

The virtual system consists of a package which logically may be divided into four units: the virtual laser scanner, the segmentation process, the fillet joint template matcher and the graphical user interface (GUI). The laser scanner is the only one of these units that utilizes information from the world model and is thus dependent on the CAR system. In our implementation, where a perfect world model is used, elimination of outliers has been omitted. The virtual scanner communicates through two interfaces defined by the CAR manufacturer.


Figure 4: Principle of the joint filtering process: (1) elimination of outliers, (2) creation of line segments (split when d_max > t_split and s >= s_min), (3) merging of line segments with similar geometrical characteristics (merge when |α| < α_merge and g < g_max) and (4) validation of joint parameters, creating the breakpoints B0..B4 and the process parameters α and gap (accept if α_min < α < α_max and gap_min < gap < gap_max).


The first interface allows streams of binary command information to be exchanged with a particular device (robot, sensor, etc.) in the simulation environment. Several such connections can be defined, one stream per device. During a running simulation, each connection is checked several times per simulation update. The maximum update rate depends on the workstation hardware, but can be as high as 30 Hz. The second interface is an API (Application Programmer Interface) through which software can be integrated with the CAR application's kernel. It allows kernel-related code to be accessed or replaced. The library of functions provides a set of routines that may be used to build an application based on resources in the CAR program.

4.1 Virtual laser scanner

The virtual scanner is represented by a single function, scan_profile(), which emits virtual rays over a certain angle. For each ray, the CAR API function AxsEntityRayCast() is called, which yields the distance in millimeters and the point of intersection with the closest part (plates, weld joint). The distance is compared to a maximum hit distance that limits the measurable area; measurements beyond this distance are not taken into account. Finally, a complete sweep is returned as a raylist containing a vector with the measured points of intersection in the coordinate system of the virtual camera and the total number of successful measurements.
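A minimal sketch of such a scan function is given below, assuming the profile is swept as a fan of rays in the optical plane. The ray_cast callable stands in for the AxsEntityRayCast() call mentioned above; the fan angle, ray count and data layout of the returned ray list are assumptions made for illustration.

```python
import math
from dataclasses import dataclass


@dataclass
class RayList:
    points: list      # intersections (y, z) in the virtual camera frame
    num_hits: int     # number of successful measurements


def scan_profile(ray_cast, fan_angle_deg=40.0, num_rays=256, max_hit_dist=1000.0):
    """Sweep rays over the fan angle and keep hits closer than max_hit_dist (mm)."""
    points = []
    for i in range(num_rays):
        angle = math.radians(-fan_angle_deg / 2 + fan_angle_deg * i / (num_rays - 1))
        direction = (math.sin(angle), math.cos(angle))  # ray direction in the optical plane
        distance, point = ray_cast(direction)           # stands in for AxsEntityRayCast()
        if point is not None and distance <= max_hit_dist:
            points.append(point)
    return RayList(points=points, num_hits=len(points))
```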

4.2 Virtual control unit

The segmentation process and the fillet joint template matcher represent the control unit of the physical system.

4.2.1 Segmentation process

The segmentation process consists of calling the split procedure followed by the merge procedure, see Figure 4. The points of intersection form a profile of straight lines. Initially, the profile is considered to consist of only one segment. The split algorithm divides the initial profile into several segments according to t_split, the split interval and s_min. Only segments with a length between the end points larger than the split threshold, t_split, will be split. To increase the calculation speed, the split interval parameter may be set to an integer value greater than 1; a value of 2 means that every other ray is omitted from the calculations. The segments are stored in segmentlist along with num_segments, the number of segments. Split is implemented as a recursive algorithm which divides the segment until either d_max is less than t_split or the region is smaller than the predefined minimum size s_min. Here d_max is the maximum distance from the line between the end points of the segment, measured along its normal, to a point included in the segment. Next, the merge algorithm is called. Two segments that are close enough to each other and fulfill the angular requirement, i.e. less than the maximum merge gap, g_max, and the maximum merge angle, α_merge, respectively, are merged into one. A merge gap may occur after outliers are eliminated. The merge angle is the angle between two segments. The iteration continues through all segments in the profile.
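The following Python sketch outlines the recursive split and the iterative merge under the assumption that the profile is a list of (y, z) points in camera coordinates. The parameter names follow Table 1 and the text above; the split interval (skipping rays to speed up the calculation) is omitted for brevity, and the data structures are illustrative rather than the original implementation.

```python
import math


def _max_deviation(points):
    """Index and distance of the point farthest from the chord between the end points."""
    (ay, az), (by, bz) = points[0], points[-1]
    dy, dz = by - ay, bz - az
    length = math.hypot(dy, dz) or 1.0
    best_i, best_d = 0, 0.0
    for i, (py, pz) in enumerate(points):
        d = abs(dy * (pz - az) - dz * (py - ay)) / length
        if d > best_d:
            best_i, best_d = i, d
    return best_i, best_d


def split(points, lo, hi, t_split=0.4, s_min=8):
    """Recursively split the index range [lo, hi] while d_max exceeds t_split."""
    if hi - lo + 1 <= s_min:
        return [(lo, hi)]
    i, d_max = _max_deviation(points[lo:hi + 1])
    if d_max <= t_split:
        return [(lo, hi)]
    mid = lo + i  # the farthest point becomes the new segment boundary
    return split(points, lo, mid, t_split, s_min) + split(points, mid, hi, t_split, s_min)


def merge(points, segments, g_max=10, a_merge=15.0):
    """Merge consecutive segments that are nearly collinear and separated by few points."""
    def angle(lo, hi):
        (ay, az), (by, bz) = points[lo], points[hi]
        return math.degrees(math.atan2(bz - az, by - ay))

    merged = [segments[0]]
    for lo, hi in segments[1:]:
        plo, phi = merged[-1]
        if abs(angle(plo, phi) - angle(lo, hi)) < a_merge and lo - phi < g_max:
            merged[-1] = (plo, hi)  # extend the previous segment
        else:
            merged.append((lo, hi))
    return merged


# Usage: segments = merge(profile, split(profile, 0, len(profile) - 1))
```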


Table 1: Parameters used by the segmentation algorithm

  Parameter             Value  Unit
  Split threshold       0.4    mm
  Split interval        2      point
  Minimum segment size  8      point
  Maximum merge gap     10     point
  Maximum merge angle   15     degree

Table 2: Distances and gaps used in experiments

  Label  Distance [mm]  Gap [mm]
  A      63             0
  B      103            0
  C      156            0
  D      55             2
  E      98             2
  F      143            2

4.2.2 Fillet joint template matching

The result of the segmentation process consists of a number of segments that fulfill the stated terms of linearity. The simple template-matching module implemented uses these segments to check whether they correspond to pre-defined angular and gap restrictions and to create break points.
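A minimal sketch of such a matcher is shown below. The segment representation, the angle and gap limits, and the way breakpoints B0..B4 are derived from the segment end points are assumptions based on Figure 4, not the exact implementation.

```python
import math


def match_fillet(seg_a, seg_b, a_min=80.0, a_max=100.0, gap_min=0.0, gap_max=3.0):
    """seg_a, seg_b: ((y, z), (y, z)) end points of the two candidate plate segments."""
    def direction(seg):
        (ay, az), (by, bz) = seg
        return math.atan2(bz - az, by - ay)

    included = abs(math.degrees(direction(seg_a) - direction(seg_b)))
    included = min(included, 360.0 - included)   # joint angle between the plates
    gap = math.dist(seg_a[1], seg_b[0])          # distance between facing end points

    if not (a_min < included < a_max and gap_min <= gap < gap_max):
        return None  # the profile does not match the fillet template

    # B0, B1 delimit the first plate segment, B3, B4 the second; B2 is taken
    # here as the midpoint between the facing end points (nominal track point).
    b2 = tuple((p + q) / 2 for p, q in zip(seg_a[1], seg_b[0]))
    return {"B0": seg_a[0], "B1": seg_a[1], "B2": b2,
            "B3": seg_b[0], "B4": seg_b[1], "angle": included, "gap": gap}
```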

5 Experimental Work

5.1 Static Experiments

A set of static measurements was performed to validate the implementation of the segmentation algorithms. The general idea was to verify that breakpoints from the physical system agreed with those calculated by the algorithms in the virtual world. The parameters in Table 1, which are necessary for the segmentation algorithm, were used. In addition, the fillet joint template matching algorithm used angle limits of … degrees. Measurements were done both with and without a gap and with different distances between the weld torch and the welding point, all with a 45 degree angle to both plates (90 degree fillet weld joints), as seen in Figure 5 and Table 2. The sensor program was set up on a PC connected via RS232 to the control unit, showing the breakpoints calculated as a result of processing the laser scanner view as seen from the M-Spot camera. All distances were measured with an approximate accuracy of 1 mm. Both the virtual and the physical camera used right-handed orthogonal coordinate systems that were placed differently, with the z-axes in opposite directions and apart from each other, see Figure 5. The virtual coordinate system is the one with its origin in the camera, while the "physical" camera coordinate system used by the sensor is the one below the fillet weld joint. The breakpoints refer to those defined in Figure 4, and y and z are the values of the respective coordinates.


The setup was then modeled in the CAR system, using the same distances. A GUI showing the calculated breakpoints after processing by the segmentation and fillet joint template matching algorithms can be seen in Figure 7, where breakpoints and gaps were compared between the physical and virtual system.


Figure 5: Experimental setup with different values of a and of the gap between the plates. Note that the virtual coordinate system (upper right) has a different position and orientation than the physical one (lower left).

5.2 Dynamic Experiments

The task was to weld two sides of a workpiece, essentially a square-shaped object on a bottom plate, using a robot, see Figures 6, 7, 8 and [2]. The operation was performed with a slave robot in a real and in a virtual work-cell. Acting as master, a robot in the virtual world model guided the sensor-equipped slave robot in both work-cells. The sensor interface handled sensor communication between the master and the slave and hid the slave from the master; thus, the master was unable to tell whether a real or a virtual work-cell was used as slave. In the virtual model of the real work-cell, the behavior of the real sensor was modeled as a virtual sensor. During the welding operation, the orientation and position of the seam object were continuously changed to reflect the actual position of the workpiece, whose geometry was obtained from the slave work-cell through the sensor interface. Between the two welds, the workpiece on the workstation was re-calibrated automatically to reflect the position of the first seam object. The second seam object was therefore closer to the actual position of the real workpiece than the first seam object originally was, which yielded a faster search for the second start point. The initial pose of the real workpiece was changed compared to the nominal pose shown by the virtual work-cell in the CAR system, see Figure 6. The experimental sensor interface implementation allowed the master to set it in an autonomous mode.
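The following sketch illustrates how such a sensor interface can hide whether the slave is real or virtual behind a common abstraction. All class and method names are hypothetical; the actual implementation used the CAR vendor's stream interface and the RS232 link to the control unit.

```python
from abc import ABC, abstractmethod


class SeamSensor(ABC):
    """Common view of the seam-tracking sensor, whether real or virtual."""

    @abstractmethod
    def seam_search(self) -> tuple[float, float, float]:
        """Return the start point of the seam in work-cell coordinates."""

    @abstractmethod
    def breakpoints(self) -> list[tuple[float, float]]:
        """Return the latest five breakpoints describing the fillet joint."""


class VirtualSeamSensor(SeamSensor):
    """Backed by the virtual scanner pipeline running in the CAR model."""

    def __init__(self, virtual_scanner):
        self.scanner = virtual_scanner

    def seam_search(self):
        return self.scanner.find_start_point()

    def breakpoints(self):
        return self.scanner.latest_breakpoints()


class RealSeamSensor(SeamSensor):
    """Backed by the physical scanner, e.g. over a serial link to the control unit."""

    def __init__(self, link):
        self.link = link

    def seam_search(self):
        return self.link.request("SEAM_SEARCH")

    def breakpoints(self):
        return self.link.request("BREAKPOINTS")


def guide_slave(master, sensor: SeamSensor):
    # The master only ever sees the SeamSensor interface, so it cannot tell
    # whether a real or a virtual work-cell is acting as the slave.
    master.move_to(sensor.seam_search())
    while not master.seam_finished():
        master.correct_trajectory(sensor.breakpoints())
```

Programming the master against this interface is what made it possible to run the same code against both the real and the virtual work-cell.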

Figure 6: Virtual session with the seam object attached to the workpiece. The master is shown in the left picture and the slave in the right. The seam object reflects the actual position of the workpiece in the slave work-cell. Since the system is still searching for the start point, no deviation between the nominal and actual pose of the workpiece has yet been detected.

Figure 7: When the start point is found the system begins to weld. After one additional point is received, the seam object reflects the “real” position of the workpiece and the master directs the slave accordingly in joint coordinate space. The seam object is then continuously updated during the weld.

Figure 8: After the weld is completed, the workpiece is snapped to the seam object to reflect its actual position. This yields a better start point for the second weld.


In this mode, when requested to do a seam search, the sensor interface would (see the sketch at the end of this subsection):

1. Light the laser scanner and confirm that the laser was on. When the fillet joint was identified and the start point eventually found, respond with the start point coordinates.

2. Start measuring the distance between the laser and the fillet joint.

3. Obtain five positions (breakpoints) defining the fillet joint geometry and the gap between the two plates, i.e. enough information for the master to determine position, orientation and whether any change in the welding process (e.g. weaving) had to take place. The information was acquired at a frequency of 20 Hz but was only sent to the master when explicitly requested (approximately every other second).

4. Wait for the master to issue a "look for end of seam" command. When the end point was found, alert the master.

5. Turn off the laser and idle until the next request was received.

The virtual work-cell had an equivalent real work-cell, as shown in Figure 1. The data flow between the different units was unchanged regardless of whether the real or the virtual system was used; the master process was virtually unaware of which slave was responding. In the test system, only the base point and orientation of the scanner coordinate system were calibrated differently and had to be altered between test runs. Actual welding was not performed due to unsuitable environmental conditions in the lab.
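A minimal sketch of this sequence, written as a simple loop-based routine, is shown below. The method names on the scanner and master objects are hypothetical placeholders for the vendor's command protocol.

```python
import time


def autonomous_seam_search(scanner, master, rate_hz=20.0, report_period_s=2.0):
    # 1. Light the laser, confirm it is on, and search for the start point.
    scanner.laser_on()
    if not scanner.laser_is_on():
        raise RuntimeError("laser did not switch on")
    master.report_start_point(scanner.find_fillet_start())

    # 2-3. Measure at 20 Hz; forward the five breakpoints only when the master
    # asks (approximated here by a fixed report period of about two seconds).
    last_report = time.monotonic()
    while not master.end_of_seam_requested():
        breakpoints = scanner.measure_breakpoints()  # five points plus gap
        if time.monotonic() - last_report >= report_period_s:
            master.report_breakpoints(breakpoints)
            last_report = time.monotonic()
        time.sleep(1.0 / rate_hz)

    # 4. Look for the end of the seam and alert the master when it is found.
    while not scanner.end_of_seam_found():
        time.sleep(1.0 / rate_hz)
    master.report_end_of_seam()

    # 5. Turn off the laser and idle until the next request.
    scanner.laser_off()
```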

6 Results

6.1 Static results

The static results are summarized in Table 3. As seen in the table, all measured deviations lie within ±1 mm, which is about the accuracy with which distances were measured during the physical experiments, except for five values: (B0, z) in measurement B, (B0, y) in measurement C, (B1, z) in measurement A, and (B4, y) and (B4, z) in measurement C.

Table 3: Static comparison between the physical and the virtual sensor's measured data. Measurements A-C are without gap and measurements D-F with a gap of 2.0 mm between the plates. B0 ... B4 are the breakpoints and (y, z) the respective coordinate values. Entries are deviations from average values [mm].

        B0           B1           B2           B3           B4
        y     z      y     z      y     z      y     z      y     z
  A    1.0   0.9    0.0  -1.1   -0.5  -0.1   -0.5   0.4    0.0  -0.1
  B    0.7   2.0   -0.3   0.0    0.2  -0.5   -0.3  -1.0   -0.3  -0.5
  C    4.2   0.5   -0.3   0.5   -0.3   0.5   -0.3   0.0   -2.8  -1.5
  D    0.0  -0.2    0.0   0.3    0.0   0.3    0.0   0.3    0.0   0.0
  E    0.2  -0.1    0.2   0.4    0.2  -0.1    0.2   1.0   -0.3  -0.6
  F    1.0   0.2   -0.5  -0.8   -0.5  -0.3    0.5   0.7   -0.5   0.2


6.2 Dynamic results

No apparent differences between the virtual and the real experiments could be seen. Calibration, which was very time consuming in the real system, was easy to manage in the virtual system.

1. The overall performance, robustness and algorithms for information feedback could be studied and analyzed at an early stage.

2. The implementation of the sensor on the real, physical robot was a seamless procedure and no major changes were made, except for issues related to calibration.

3. The platform developed has the advantage of handling both simulation and actual operation; the same code was used in both modes.

7 Discussion

The results from the static experiment show that the virtual sensor created is a good model of its real counterpart. A few values, (B0, y) and (B4, y) in measurement C, are significantly larger and have different signs. These deviant values are presumably caused by a small angular error, i.e. not keeping a strict 45 degree angle between the laser scanner and the two plates during the measurements, see Figure 5. The methodology described in this paper makes it possible to produce and validate sensor guided robot programs. Robustness is increased in the sense that the robot operation has been verified in a simulated environment. With defined tolerances in the world model, a nominal robot program can be produced that will most likely succeed in real-time operation. The research responds to the demands for quality, safety and a reliable view of the evolving process during development. Robot programs that make real-time decisions require a completely different simulation environment. The traditional way of off-line programming and downloading of ready-to-go programs will not work, as greater autonomous behavior with sensor feedback is needed. Today's methods are only sufficient for large-scale manufacturing such as in the automotive industry. In one-off and small batch production, sensors give the robot system more flexibility and speed up product changeover. The sensor model closely mimics the behavior of the real sensor by using similar characteristics and aspects related to sensor and robot control. By using this approach, task-level control strategies can be validated through controlled experiments in a simulated environment. A further step would be to introduce sensor models from sensor vendors in a similar way as robot vendors now offer software for CAR systems for more realistic robot simulation (RRS). Future enhancements of the developments described in this paper will consist of extending the sensor capability to handle joint types other than the fillet joint, for instance corner, lap, butt and v-groove, in a similar way as the real sensor operates. This would create a foundation for simulating more complicated welding scenarios than the test case described. Also, a filtering function may be added to simulate outliers from rough material surfaces.


8 Conclusions

The method described for modeling and implementing sensor functionality in a generally available CAR system opens new possibilities for simulation and programming of robot systems in realistic industrial applications. This is important for more advanced manufacturing systems, and specifically for rapid and virtual development of products where the time to develop the systems that produce the product is critical. Thus, the simulation environment must be able to represent real-world processes as they appear in the context of industrial automation. In the tested cases, the virtual sensor developed acts similarly to its real counterpart and has been shown to be easy to manage in a well-established simulation environment.

Acknowledgment

The paper is a result of a recent project, "Robotics in Autonomous Systems", sponsored by the Swedish National Board for Industrial and Technical Development (NUTEK) within the program "Complex Technical Systems". Participating departments in this cooperative project were: the Division of Robotics at the Department of Mechanical Engineering, the Department of Automatic Control, the Department of Computer Science and the Department of Industrial Electrical Engineering and Automation, all at Lund University.


References

[1] The thinking machine, part 3 (video recording). In Autonomous Industrial Robotics. Demonstration performed by Magnus Olsson. Also published and narrated in English on http://www.robotics.lu.se. NUTEK, 1999.

[2] Quicktime movie of virtual system. Can be downloaded from our website http://www.robotics.lu.se. Division of Robotics, Lund University, 2001.

[3] F. Arai et al. Multimedia telesurgery using high speed optical fiber network and its application to intravascular neurosurgery. In Proceedings ICRA'96, Minneapolis, MN, pages 878–883. ICRA, August 1994.

[4] T. Arai et al. A task-level visual robot teaching system. In International Symposium on Assembly and Task Planning, ISATP 97, pages 31–35. IEEE, August 1997.

[5] A. Behrens and E. Roos. A method to adapt off-line programmed manufacturing tasks to the real environment using high resolution sensor devices. In 28th International Symposium on Automotive Technology and Automation, pages 121–129. ISATA, 1995.

[6] A. K. Bejczy. Toward advanced teleoperation in space. In Progress in Astronautics and Aeronautics, volume 161, ch. 5. AIAA, August 1994.

[7] B. Brunner et al. Multisensory shared autonomy and tele-sensor-programming – key issues in the space robot technology experiment ROTEX. In Proceedings of the 1993 IEEE/RSJ International Conference on Intelligent Robots and Systems, pages 2123–2139. IEEE/RSJ, July 1993.

[8] B. Brunner et al. Task directed programming of sensor based robots. In International Conference on Intelligent Robots and Systems, pages 1080–1087. IEEE/RSJ, September 1994.

[9] B. Brunner et al. A universal task-level ground control and programming system for space robot applications. In 5th International Symposium on Artificial Intelligence, Robotics and Automation in Space. SAIRAS, June 1999.

[10] D. Kugelmann. Autonomous robotic handling applying sensor systems and 3D simulation. In Proceedings of the International Conference on Intelligent Autonomous Systems, pages 196–201. IEEE, 1994.

[11] Y. F. Li and J. G. Wang. Incorporating dynamic sensing in virtual environment for robotic tasks. In IEEE Instrumentation and Measurement Technology Conference, pages 123–127. IEEE, May 1998.

[12] Y. F. Li and J. G. Wang. Incorporating contact sensing in virtual environment for robotic applications. In IEEE Transactions on Instrumentation and Measurement, volume 48, pages 102–107. IEEE, February 1999.

[13] Y. López de Meneses and O. Michel. Vision sensors on the webots simulator. In Virtual Worlds 98, pages 264–273. LNAI, July 1998.

[14] O. Madsen and H. Holm. A real-time sensor interface for 3D-tracking in welding. In Proceedings EURISCON '94, pages 1712–1722. EURISCON, August 1994.

[15] H. Ming et al. On teleoperation of an arc welding robotic system. In Conference on Robotics and Automation, Minneapolis, MN, USA, volume 2, pages 1275–1280. IEEE, April 1996.

[16] M. Olsson, P. Cederberg, and G. Bolmsjö. Integrated system for simulation and real-time execution of industrial robot tasks. In Scandinavian Symposium on Robotics 99, October 14-15 1999.

[17] J. Romann. Virtual reality as a control and supervision tool for autonomous systems. In Proceedings of the 4th International Conference on Intelligent Autonomous Systems, Karlsruhe, Germany, pages 344–351. IOS Press, March 1995.

[18] A. Rovetta et al. A new telerobotic application: Remote laparoscopic surgery using satellites and optical fiber networks for data exchange. International Journal of Robotics Research, 15:267–279, June 1996.

[19] Y. Wakita et al. Intelligent monitoring system for limited communication path: Telerobotic task execution over internet. In Proceedings of the IEEE/RSJ IROS'95, Pittsburgh, PA, volume 2, pages 104–109. IEEE, August 1995.

[20] G. Xiangdong et al. Application of fuzzy logic controller in the seam tracking of arc-welding robot. In 23rd International Conference on Industrial Electronics, Control and Instrumentation, 1997, IECON 97, volume 3, pages 1367–1372. IEEE, November 1997.
