Interactive vision-based robot path planning
Budapest University of Technology and Economics
Narvik University College
Gabor Sziebig
Supervisors:
Peter Korondi, Ph.D., Hungary
Bjorn Solvang, Ph.D., Norway
Faculty of Electrical Engineering and Informatics
2007
Acknowledgements
To my supervisor and friend Peter Korondi, Ph.D.: his professional knowledge and continuous support helped me start and advance in my academic career. This support is priceless and I will never forget it. For the warm welcome above the Arctic Circle and the supervision of my work at Narvik University College, I would like to thank Bjorn Solvang, Ph.D. To a woman who followed me to the world of reindeer and polar bears, who created a warm home in the arctic cold and gave up everything to be with me. Yes Suzie, this woman is you; I hope you will marry me. Last, but not least, to my mom, father and brother: without their support and devoted care I would be a different man, unable to reach his dreams. I hope I made all of you proud.
Disclaimer
I declare that this thesis, submitted in fulfillment of the requirements for the award of Master of Science in the Faculty of Electrical Engineering and Informatics, Budapest University of Technology and Economics, is my own work unless otherwise referenced or acknowledged. This thesis has not been submitted for qualifications at any other academic institution.
__________________ Gabor Sziebig 10/05/2007
Contents
Acknowledgements
Disclaimer
Contents
List of Figures
List of Tables
Publications
Abstract
Kivonat
Sammendrag
1. Introduction
   1.1. Thesis objective
   1.2. Thesis outline
   1.3. Notations
2. Literature Overview
3. Concept of Supervised and Adaptive Programming of Industrial Robots (SAPIR)
   3.1. ABB IRB 2000 industrial robot and S3 M91 robot controller
   3.2. Digital cameras
   3.3. Personal Computers
   3.4. Workpieces
4. Software Development
   4.1. 2D track detector
      4.1.1. Input image types
      4.1.2. Image processing
         4.1.2.1. Mathematical background
         4.1.2.2. Used filters
         4.1.2.3. Edge detectors
   4.2. IGRIP GSL program
   4.3. Robot Controller
      4.3.1. ADLP-10 message structure
      4.3.2. ADLP-10 instruction numbers
5. Validation
   5.1. Accuracy of the software solution and the distortion of cameras
      5.1.1. Test case 1: straightness of lines
      5.1.2. Test case 2: punctuality of line beginning and ending
      5.1.3. Test case 3: punctuality of points
      5.1.4. Results
   5.2. Accuracy of robot control system
      5.2.1. Results
   5.3. Overall accuracy of system
      5.3.1. Test case 1: point accuracy
      5.3.2. Test case 2: curve following
      5.3.3. Test case 3: curve following with surface drop
      5.3.4. Results
6. Future Plans
7. Conclusions
8. Appendix
   8.1. IGRIP
      8.1.1. Menu
      8.1.2. GSL
      8.1.3. CLI
      8.1.4. Shared Library
   8.2. GSL Language
9. References
List of Figures
Fig. 1-1 Replacement of worker in unhealthy working environment
Fig. 3-1 Industrial use of SAPIR
Fig. 3-2 Movement structure of ABB IRB 2000
Fig. 3-3 Coordinate systems used at programming and programmed running
Fig. 3-4 Euler ZYX (Roll, Pitch, Yaw) Sequence
Fig. 3-5 Path following principle
Fig. 3-6 Experimental setup of ABB IRB 2000 robot and S3 M91 robot controller
Fig. 3-7 Experimental setup of digital cameras
Fig. 3-8 Manufactured CAD models
Fig. 4-1 System setup
Fig. 4-2 2D track detector use-case diagram
Fig. 4-3 Initial screen
Fig. 4-4 Sample screen of main window
Fig. 4-5 Sample screen of view window
Fig. 4-6 See-through window
Fig. 4-7 Sample screen of generate path window
Fig. 4-8 Examples for image filters
Fig. 4-9 Demonstration of matrix convolution
Fig. 4-10 Image processing sequence
Fig. 4-11 Used materials properties
Fig. 4-12 Sobel edge detector result
Fig. 4-13 Canny edge detector result
Fig. 4-14 Laplace edge detector result
Fig. 4-15 Simulation work cell
Fig. 4-16 Models in simulation work cell
Fig. 4-17 2D – 3D transformation steps
Fig. 4-18 Calculation of robot arm’s tool center point rotation
Fig. 4-19 Usage of robot arm’s tool center point rotation
Fig. 4-20 Robot Controller main window
Fig. 4-21 Message capsule
Fig. 5-1 Distortion errors of experimental cameras
Fig. 5-2 Test case 1 images
Fig. 5-3 Test case 2 images
Fig. 5-4 Test case 3 images
Fig. 5-5 Test case 1 result image
Fig. 5-6 Test case 2 result image
Fig. 5-7 Test case 3 result image
List of Tables
Tab. 3-1 Position input format
Tab. 3-2 Difference among zone types
Tab. 3-3 Operator’s PC description
Tab. 3-4 Controller’s PC description
Tab. 4-1 Coordinate structure in robot instructions
Tab. 4-2 RS232C setup
Tab. 4-3 Communication primitives
Tab. 4-4 Byte order of message text
Tab. 4-5 Instruction bytes
Tab. 5-1 Software solution error rate
Publications
1. G. Sziebig, A. Gaudia, P. Korondi, and N. Ando, “Video image processing system for RT-middleware,” in Proc. 7th International Symposium of Hungarian Researchers on Computational Intelligence (HUCI’06), Budapest, Hungary, Nov. 2006, pp. 461–472.
2. G. Sziebig, B. Takarics, V. Fedak, and P. Korondi, “Virtual master device,” in Proc. 5th Slovakian-Hungarian Joint Symposium on Applied Machine Intelligence (SAMI’07), Poprad, Slovakia, Jan. 2007, pp. 29–40.
3. G. Sziebig, A. Gaudia, P. Korondi, N. Ando, and B. Solvang, “Robot Vision for RT-Middleware Framework,” in Proc. IEEE Instrumentation and Measurement Technology Conference (IMTC’07), Warsaw, Poland, May 2007.
4. B. Solvang, P. Korondi, G. Sziebig, and N. Ando, “SAPIR: Supervised and Adaptive Programming of Industrial Robots,” in Proc. 11th IEEE International Conference on Intelligent Engineering Systems, Budapest, Hungary, Jun.–Jul. 2007.
Abstract
Cast parts have inconsistent geometry, and grinding and deburring operations have to be carried out based on individual observation of every single workpiece. Normally, these operations are carried out manually by humans. However, due to the health risk associated with the grinding process, there is a strong incentive to explore new automated solutions. Industrial robots are viewed as strong components for this job. Programming industrial robots is traditionally done by the Teach or Offline programming methodologies. Both methods encounter problems in grinding/deburring operations. In traditional Offline programming the robot path is generated from a CAD model of the workpiece. This CAD model holds no information on the irregularities (burrs), and thus the necessary path cannot be created. This thesis presents a new approach to supervised robot programming, which opens new fields of application for industrial robots. In the near future, automation of manufacturing processes with industrial robots will become common in small and medium sized enterprises. Instead of a costly, fully automated solution, which works only from the CAD model of a workpiece and does not provide a 100% satisfactory result, an operator is involved in the robot programming. The result is a 90% automated solution that incorporates the expertise of the worker. This interactive vision-based robot programming adds the required information to the Offline programming environment. Thus, the location and shape of any irregularities can be identified and the necessary deburring path can be created.
Kivonat
As a result of the manufacturing process, the surface of a workpiece may contain defects. These defects are usually removed by grinding, by hand, individually on every single manufactured workpiece. This work carries a high health risk (inhalation of harmful substances, eye damage, etc.), so automating this production stage with industrial robots would be an obvious solution. The two most widespread methods for programming industrial robots are Teach and Offline programming. However, these methods cannot be applied directly to grinding applications. In Offline robot programming, the grinding path is determined from the CAD model, which contains no information whatsoever about the state of the outside world, the real workpiece surface or its defects; generating the path is therefore not possible. Teach programming is far too time-consuming, since the robot has to be navigated to the desired positions to plan the path. This thesis presents a new robot programming method: supervised robot programming. By applying this method, new markets open up for the use of industrial robots, so in the near future industrial robots may become common at small and medium sized enterprises. By employing robots, a higher degree of automation, higher profit and better competitiveness can be achieved at these enterprises. In contrast to a fully automated solution, the new method requires the assistance of an operator. The resulting system is ninety percent automated, and is more accurate and cheaper than a fully automated one. Supervised robot programming is based on a camera system that provides sufficient information to the Offline robot programming environment; the locations and shapes of the defects can thus be identified, and the necessary grinding paths can be generated.
Sammendrag
The geometry within a set of “identical” cast parts can vary, and grinding/deburring operations must be planned and carried out based on individual observation of each workpiece. Normally these operations are performed manually (by humans), but due to the health risk associated with the grinding process itself, there are strong incentives to explore new automated solutions. Industrial robots are seen as strong candidates for this job. Programming of industrial robots is traditionally done using “Teach” or “Offline/synthetic” programming. Both methods present challenges. Teach programming is very time-consuming. In traditional Offline programming, the robot path is generated from a CAD model of the workpiece; this CAD model contains no information about the irregularities on the cast parts, and the necessary path can therefore not be generated automatically. This thesis presents a new programming approach called “supervised robot programming”. Instead of costly, fully automated solutions that work only from the CAD model (and thus do not give a 100% satisfactory result), our new method involves an operator in the programming process; the result is a 90% automated solution that includes the worker’s expertise. This interactive camera-based robot programming methodology identifies the shape and location of irregularities so that they can then be removed automatically. The method opens up new fields of application for industrial robots.
1. Introduction

Many manufacturing processes leave irregularities (burrs) on the surface of the workpiece. Burrs are often triangular in shape and are found after casting, forging, welding, and shearing of sheet metals [1]. Burrs are often identified and removed by human operators using a chip removal process known as manual grinding or deburring. Manual grinding often means hard and monotonous work, and the workers need to protect themselves by wearing goggles, gloves, earmuffs, etc. [2]. Grinding is often carried out at a final stage of the manufacturing process, where the workpiece is expected to reach its end-geometry. Any manufacturing error at this stage can be very expensive and can even lead to the rejection of the entire workpiece. Apart from possible operator errors, the deburring process itself is said to add up to 10% of the total manufacturing cost [1]. Due to the health risk associated with grinding and the possible added cost of operator errors, there is a strong incentive to explore new automated solutions. Industrial robots are viewed as strong components for this job.

New target markets for industrial robots are the small and medium sized enterprises (SMEs). These companies cannot afford a fully automated solution for the above-mentioned problems, not only because of the high price of industrial robots, but also because the scalability and flexibility of automated systems are low. A small company will never have enough money for a whole manufacturing chain; only step-by-step growth is conceivable. The flexibility problem could be addressed by using open-source and open-architecture systems.

An automated system cannot fully replace a worker in the above-mentioned situations. The expertise of the worker (man in the loop) can be added to the manufacturing process, which results in a 90% automated system. The worker’s eyes are replaced with a camera and the worker’s arms are replaced with the industrial robot. This concept can be seen in Figure 1-1.
Fig. 1-1 Replacement of worker in unhealthy working environment
Programming industrial robots is traditionally done by the Teach or the Offline programming methodologies. However, both methods encounter problems in grinding/deburring operations. The Teach method is time-consuming, as it requires the operator to manually define all poses (robot end-effector position and orientation) on the workpiece. In traditional Offline programming, the robot path is generated from a CAD model of the workpiece. This CAD model holds no information on the irregularities (burrs), and thus the necessary path cannot be created. Vision-based robot programming (Figure 1-1) adds information about the real workpiece into the Offline programming environment. Thus, the location and shape of any irregularities can be identified and the necessary deburring path can be created.
1.1. Thesis objective

The objective of this thesis is to develop a new vision-based path planning methodology. This new methodology involves a camera system used for edge detection and an interactive software solution, where an operator can select edges and define the necessary material removal. Furthermore, the system accuracy has been tested in an industrial environment.
1.2. Thesis outline

The structure of the thesis is as follows:

Section 2: gives a brief overview of previous results in industrial robot programming and shows its state of the art
Section 3: describes the concept of Supervised and Adaptive Programming of Industrial Robots
Section 4: shows the steps of software development needed to meet the SAPIR requirements
Section 5: validates the created system
Section 6: contains goals for further development
Section 7: concludes the thesis
Section 8: appendix, which might be useful for better understanding
Section 9: contains the references of the thesis
1.3. Notations

The following notations are used throughout this document:

{a, b, …}: scalar values
{O, P, p, …}: points
{a, b, …} (boldface lowercase): vectors
{A, B, …} (boldface capitals): matrices
$A_{m \times n}$: indicates that the matrix A has m rows and n columns
{$I_1$, $I_2$, …}: images
{$F_1$, $F_2$, …}: image transformations, filters
i, j, k, …: indexes determining some lower-level structure (for example, the j-th element of the i-th row of matrix A is $(A)_{i,j} = a_{i,j}$)
2. Literature Overview

Industrial robots are most commonly programmed in one of two ways: Offline programming or Teaching. Both programming methodologies suffer from drawbacks in grinding or deburring applications, which are presented below.

In the case of Offline programming, the robot is instructed by a robot program, which is uploaded to the robot after it has been compiled to machine code. The robot program is written in a text-based editor, and this text is compiled to machine code. The machine code can be run on the robot, where the instructions are processed and executed step by step. In the case of grinding/deburring, the coordinates and instructions are based on the CAD model of the workpiece. The CAD model does not contain any information about the manufactured workpiece; only the surface and dimensions are known, so it cannot be used for this type of application. Offline programming is the most common type of robot programming, and newer types of Offline programming tools were developed to simulate the exact work cell of the robot. A constraint in Offline programming is the low accuracy of the robot. The accuracy of an industrial robot highly depends on its construction and load capacity. The average load capacity of an industrial robot is 400 kg, while the accuracy is about 1 mm, but both values can be improved. Ken Young and Craig G. Pickin [3] showed that the accuracy can be enhanced by a proper calibration process of the industrial robot.

In the case of Teaching, the robot is controlled by the teach pendant: the operator moves the robot along a path in the work cell. The path is built up from points. These points are stored, and velocities and orientations are assigned to them. This method is very time-consuming and is now rarely used in real industrial applications. With Predictive Robot Programming (PRP) [4], improvements have been made to achieve faster Teaching, but the results are used only in manipulator-type robots.

Many pioneering activities were started to make robot programming more efficient and flexible [5] [6] [7]. All of these activities used the Offline programming methodology. In the beginning, repeatability was emphasized: an industrial robot was used to do the same task over a thousand or a million times. As production numbers dropped, the robot
programming time rose as a constraint on fast switching between production types. Not only mass-production enterprises wanted to use industrial robots; SMEs appeared as a new market. SMEs had been expected to use industrial robots in their production chains several years earlier. In the mass production of large enterprises, a higher degree of automation and higher production numbers were achieved by using industrial robots. In 1999, industrial robot programming was made easier for SMEs with the introduction of PIN (PC-based interactive programming system for industrial robots). Wenrui Dai and Markus Kamper [8] [9] created a robot simulation environment for Offline programming of industrial robots. Their expectations for the spread of the new methodology in welding processes were high, but they were not fulfilled: industrial robots were still too expensive for SMEs. These days industrial robots are getting cheaper and robot programming is becoming easier. Specialized solutions and projects exist to deal with small batch sizes [10].
3. Concept of Supervised and Adaptive Programming of Industrial Robots (SAPIR)

An industrial robot usually works in a production line. The robot is programmed once and the program is repeated over a thousand times. In grinding and deburring applications, however, the robot usually has to be reprogrammed based on individual observation of every workpiece. Reprogramming the robot with Offline programming each time is not acceptable, as it is time-consuming and needs highly qualified operators. Grinding and deburring are usually carried out by less qualified workers, who are capable of doing tedious and monotonous work in a hard and unhealthy work environment. Using an industrial robot for grinding and deburring that is at least as accurate as the human (Section 5) would be a good alternative in the above-mentioned application. A fully automated grinding or deburring system would be expensive, and the overall result could be worse than in the case of manual grinding or deburring. Utilizing the experience from manual grinding, an operator assisted by a 90% automated system yields the best efficiency, a healthier working environment, higher production numbers and lower cost. These benefits make SAPIR powerful and low cost. SAPIR consists of the following elements (Figure 3-1):
Conveyor belt, on which the workpieces pass
Camera, which examines the workpieces; the result is shown on a PC monitor
Lighting, to ensure the best image quality
PC, with which the operator interacts and which calculates the robot program
Industrial robot, programmed by the PC
Workpieces, the result of the previous production steps
Fig. 3-1 Industrial use of SAPIR
The typical industrial use of SAPIR can be seen in Figure 3-1. The image from the camera is passed through many filters to help the operator in error identification. The operator’s task is to identify the errors (burrs, irregularities) of the workpiece in the image provided by the camera. Error identification is done by the operator by drawing lines, curves or regions on the image. The operator is involved only in this task; all other system components are fully automated (robot program generation, compilation and upload). For security reasons, the experimental setup of SAPIR is modified: the error detection and the robot program upload are separated. The experimental setup consists of the following elements:
ABB IRB 2000 industrial robot, with S3 M91 robot controller
Two digital cameras with different resolution
Two PCs, one for the communication with the robot controller and one for calculating the robot path and for error detection
Three different kinds of workpieces
In the next subsections these elements will be introduced.
3.1. ABB IRB 2000 industrial robot and S3 M91 robot controller

The ABB (Asea Brown Boveri) IRB (Industrial Robot) 2000 industrial robot [11] is primarily designed for arc welding, sealing and gluing operations, and is also usable for assembly, water jet cutting, laser cutting and material handling. The robot is designed for fast acceleration, deceleration and top-end speed, with a slender arm that has a reach of more than 1.5 m, allowing for a large working envelope. When a track or pedestal is mounted, the IRB 2000 can be utilized for production applications like palletizing. The IRB 2000 has six axes and a maximum handling capacity of 10 kg (22 lbs), depending on the distance of the load from the wrist; it thus has 6 degrees of freedom (DOF). In Figure 3-2 the movement structure and the main parts can be seen (Source: Programming Manual of Robot Control System S3, Chapter 3, page 1). The robot’s movement pattern can be described as follows:
Axis 1 (C): Turning of the complete mechanical robot arm system
Axis 2 (B): Forward and reverse movement of the lower arm
Axis 3 (A): Up and down movement of upper arm
Axis 4 (D): Turning of the complete wrist center
Axis 5 (E): Bending of wrist around the wrist center
Axis 6 (P): Turning of the mounting flange
All of the actuators are servo-controlled, brushless AC motors, specially adapted to each axis. The measurement system consists of a resolver on each motor shaft and a measurement board mounted on the robot. The resolver is used for gathering speed and position data. The incremental movement is approximately 0.125 mm, and the repeatability is less than 0.1 mm in both directions.
Fig. 3-2 Movement structure of ABB IRB 2000
The S3 (Serial 3) M91 robot controller system maneuvers the robot. The controller instructs the motors what movements they should make. Furthermore, the controller calculates all the necessary positioning instructions (velocity, position, orientation). The coordinate systems that are used in programming and programmed running of the robot system are shown in Figure 3-3 (Source: Programming Manual of Robot Control System S3, Chapter 3, page 3).
Fig. 3-3 Coordinate systems used at programming and programmed running
Position and orientation values are specified with respect to the base coordinate frame. All frames are defined in the Cartesian coordinate system, but any kind of coordinate system can be used for position input; the transformation between Cartesian and other coordinate systems is automatic. The orientation of the robot’s wrist is described by an Euler ZYX (Roll, Pitch, Yaw) sequence, which is shown in Figure 3-4. For the position input format see Table 3-1. Four types of predefined positions exist: FINE, PATH, CORNER1 or CORNER2. These designations define path-following parameters and velocities.
X coordinate   Y coordinate   Z coordinate   Roll angle   Pitch angle   Yaw angle
100.0          100.0          100.0          0.0          0.0          0.0
Tab. 3-1 Position input format
Fig. 3-4 Euler ZYX (Roll, Pitch, Yaw) Sequence
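As an illustration (not taken from the S3 manual), the orientation described by the three angles of Table 3-1 can be written as a single rotation matrix. A minimal sketch, assuming the common Euler ZYX convention in which the first rotation is about Z, the second about Y and the third about X:

$$
R = R_z(\alpha)\,R_y(\beta)\,R_x(\gamma) =
\begin{pmatrix} \cos\alpha & -\sin\alpha & 0 \\ \sin\alpha & \cos\alpha & 0 \\ 0 & 0 & 1 \end{pmatrix}
\begin{pmatrix} \cos\beta & 0 & \sin\beta \\ 0 & 1 & 0 \\ -\sin\beta & 0 & \cos\beta \end{pmatrix}
\begin{pmatrix} 1 & 0 & 0 \\ 0 & \cos\gamma & -\sin\gamma \\ 0 & \sin\gamma & \cos\gamma \end{pmatrix}
$$

Here $\alpha$, $\beta$ and $\gamma$ stand for the roll, pitch and yaw angles; the exact angle-to-axis assignment used by the S3 controller should be verified against the robot manual.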
To ensure accurate path following at high speed, the S3 controller uses a new servo-steered method: servo path following (SPF). The SPF system influences positions, straight paths, curved paths and corners. The servo-steered system holds the robot tool center point (TCP) close to the straight line between the two programmed positions, and compensates for the changing moment of inertia, gravity and robot dynamics without overshoot. When a change of direction in the TCP movement is programmed, the main processor automatically generates a parabolic corner path. The corner path starts at a defined distance from the programmed position. This distance is called the corner zone, or just zone (Figure 3-5; Source: Programming Manual of Robot Control System S3, Chapter 3, page 20). The width of the zone can be predefined and is also dependent on the current velocity.
Fig. 3-5 Path following principle
The sizes of the different zones are shown in Table 3-2. Negative values in the table denote velocity-dependent zones.

Zone type   Value   Description
PATH        -25     for path building
CORNER1     15      for taking corners in connection with wrist orientation
CORNER2     -100    for fast curves
FINE        2       for positioning followed by a pause
Tab. 3-2 Difference among zone types
The velocity between the path points and in corner paths is evaluated based on the above-mentioned properties. The actual value calculation and the cornering deviations can be found in [12], Chapters 3.6.3 – 3.6.6. In Figure 3-6 the ABB IRB 2000 robot, the robot gripper, the manual programming unit (teach pendant) and the S3 M91 robot controller can be seen in the experimental setup.
ABB IRB 2000 robot
ABB S3 M91 controller
Robot gripper
Manual programming unit
Fig. 3-6 Experimental setup of ABB IRB 2000 robot and S3 M91 robot controller
In the experimental stage, instead of a grinding tool, a pen was used to demonstrate the accuracy. All experiments have been done with this pen.
3.2. Digital cameras

Two different types of digital cameras have been used: a low-resolution (640×480 pixels, 0.3 megapixels) AXIS 206 network camera [13], to demonstrate the lowest acceptable resolution in an industrial environment, and a higher-resolution (2856×2142 pixels, 6.1 megapixels) Kodak DX7630 compact digital camera [14], to demonstrate that better resolution yields higher accuracy in image processing. Both instruments have been mounted on a stage at a height of 80 centimeters. The stage has adjustable lighting and height. The experimental test setup and both cameras can be seen in Figure 3-7 (source for the camera pictures: [13] and [14]). The communication between the PC and the AXIS camera was over Ethernet; between the PC and the Kodak camera it was USB based.
AXIS camera
Kodak camera
Camera’s stage
Fig. 3-7 Experimental setup of digital cameras
3.3. Personal Computers

For operator interaction, a desktop computer has been installed in the laboratory. The details of the operator’s computer can be found in Table 3-3.

Processor: 3 GHz
Memory: 1 GB
Hard drive: 80 GB
Monitor: 19” CRT
Operating System: Windows XP Professional SP2
Programming Environment: Microsoft .NET 2.0 C#
Tab. 3-3 Operator’s PC description
The operator’s PC was connected to the university’s 100 Mbit/s network. The AXIS network camera could be reached through this network at the following IP address: http://158.39.26.115/ with the following username / password: public / letmeseeit. The robot controller was also connected to a desktop computer. The communication link between the controller and the PC was an RS232C serial link on COM port 2. The controller’s PC details can be found in Table 3-4.

Processor: 1 GHz
Memory: 256 MB
Hard drive: 30 GB
Monitor: 17” CRT
Operating System: Windows NT 4.0 SP5
Programming environment: .NET 2.0 C#
Tab. 3-4 Controller’s PC description
The controller’s PC was also connected to a network that is completely separated from the Internet. This network is used only by the machines (CNC milling machine, SCARA robot, etc.) in the laboratory.
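Returning to the serial link mentioned above, a minimal sketch of opening the RS232C connection on COM port 2 in .NET 2.0 C# is shown below. The line parameters (baud rate, parity, data bits, stop bits) are placeholders; the real values are defined by the ADLP-10 setup in Table 4-2, not by this example.

```csharp
using System.IO.Ports;

class ControllerLink
{
    // Opens the serial port to the S3 controller.
    // 9600/8-N-1 is an assumed placeholder configuration.
    public static SerialPort Open()
    {
        SerialPort port = new SerialPort("COM2", 9600, Parity.None, 8, StopBits.One);
        port.Open();
        return port;
    }
}
```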
3.4. Workpieces

For system testing, three different types of workpieces were designed and manufactured. Each of them was created for a different kind of testing: plane drawing, freefall drawing, and drops-and-spurs drawing. The CAD models were created with the PTC Pro/ENGINEER CAD modeler [15]. The CAD models and the manufactured workpieces can be seen in Figure 3-8.
Raw workpiece CAD model
Manufactured raw workpiece
Drops and spurs workpiece CAD model
Manufactured drops and spurs workpiece
Freefall workpiece CAD model
Manufactured freefall workpiece
Fig. 3-8 Manufactured CAD models
4. Software Development

Programming a robot can be done by various methodologies, as seen before. The best, and probably the most used, is the Offline programming methodology. Using Offline programming, the time-consuming teaching can be skipped, but without knowledge of the real world new problems arise. Newer types of Offline programming tools were developed to model the environment in virtual reality and also to help programmers with robot simulations. Line-oriented, text-editor-based offline programming tools are replaced with 3D virtual reality simulators, where robot movements are designed and simulated. In virtual reality, the robot’s real working environment is also modeled. High-precision measurement of the environment is needed to create good-quality program code. This measurement cannot be automated, and the accuracy of the overall system highly relies on this step. In virtual reality the robot stands on a smooth plane, and the workspace where the robot interacts with workpieces is also a smooth plane. In the real world this cannot be guaranteed, so some simplifications must be made to limit system complexity. Any small modification in the environment must also be tracked in the virtual environment. In the virtual environment the robot movement is planned, and the movement is stored in the robot program, which can later be uploaded to the robot. The generated robot program code is hardly legible and still needs a post-processing step. The post-processing step compiles the source code to robot code, which is called machine code. The methodology described above is used in almost all of the new types of Offline programming tools [16] [17] [18]. This makes robot programming faster than teaching, but less accurate.

Instead of spending hours measuring the real world, the same result can be achieved when only the workpiece position is used for calculations. The idea behind this is that the positions of tables and machines are not important for the robot when the robot interacts only with the workpieces. The origin and the coordinate system of the workpiece are needed by the robot. The origin of the workpiece can be
calculated with many types of measuring technologies. The technologies can be split into two groups: the contact (force feedback, mechanical probing) and the contactless (electromagnetic feedback, ultrasonic sensing, thermal sensing or vision-based) technologies. The contact-based technologies are hard to automate and usually rely on a contactless technology, but this combination provides high precision. The more accurate the contactless technology used, the more has to be paid for it. Using a camera for vision-based contactless measuring is not an unusual solution. In industrial environments, it is preferably used only as a second sensor, supporting, for example, a laser-based measuring system. With the help of an operator, a camera-based system becomes very powerful and remains fast, simple and cheap.
The drawback of offline programming tools is the need for well-prepared input of the real workpiece and a preprocessed robot task. In this case the offline programming tool only needs to run a simulation, without operator interaction. Well-prepared input of the real workpiece can be achieved by using a camera as a contactless measuring system, and the robot task can also be defined on the workpiece based on the camera image. The operator is only involved in defining the task, which means drawing lines and curves on the 2D image of the workpiece. This is less complicated than instructing the robot in the offline programming tool. After the simulation in the offline programming tool, the robot program is compiled to machine code. This compiled code can be directly uploaded to the robot. In industry, remote management is a key task, and the current ABB (Asea Brown Boveri) S3 (Serial 3) controller did not support this. Using the ADLP-10 (ABB Data Link Protocol 10) and ARAP (ABB Robot Application Protocol), remote operability was also solved.
The proposed system, which is depicted in Figure 4-1, operates as mentioned above.
Fig. 4-1 System setup
The system can be split into three parts.
The first part (Section 4.1) creates the robot task (which can also be called the robot 2D path, along which the robot moves) from the image acquired by the camera. The goal is to reach the optimum of accuracy and complexity: the robot moves along points which describe a geometrical form. This form can be a line, described by a start and an end point, but it can also be a Bezier curve described by 500 or more points. It is the operator’s role to make this decision. Using an operator here is a simplification, but not at the expense of quality. Rather than causing an accuracy drop, the operator makes the system as accurate as (or more accurate than) it could be with teaching or online programming, but without the time consumption. The simplification not only keeps robot programming highly automated, but also allows it to be done remotely, far away from the robot. The scope of the thesis is to create a vision-based, cheap and easily usable robot programming environment. If we used a laser sensor to measure the origin of a workpiece, the system would become totally automated, but the total cost of ownership would be much higher.
The second part (Section 4.2) uses an existing robot simulation solution for the design and offline programming of complex workplaces fitted with several systems, called IGRIP (Interactive Graphics Robot Instruction Program), together with the ABB machine code compiler called OLP3 (Offline Programming 3). The workpiece data and the CAD (Computer Aided Design) model are imported into the virtual environment in IGRIP, and the simulation is run there. The source code provided by IGRIP is compiled by OLP3 to machine code. A detailed description of IGRIP can be found in Appendix 8.1.
The third part (Section 4.3) makes the system remotely usable.
The following subsections introduce the three parts, in the same sequence as mentioned above.
4.1. 2D track detector

The goal of the application was already presented in the previous section; now the details and the transformation steps (from raw image to 2D workpiece coordinates) will be presented. In Figure 4-2 the application use-case diagram can be seen. In this figure, the transformation steps can be clearly identified. The program starts with an initial screen, which is depicted in Figure 4-3.
Fig. 4-2 2D track detector use-case diagram
The program is controlled from the menus. The menu functions and structure are presented below:
File menu
  o Save Input Image As…: Save input image for further use
  o Save Output Image As…: Save output image for further use
  o Exit: Exit application
Configuration
  o Camera Image Source: Choosing input image type (see Section 4.1.1)
      - IP Camera
      - Web Camera
      - Digital Camera
      - Image file: Load an image file as input
  o Image processing: Modify parameters of the image processing
  o Opacity of View Window: Modify opacity of view window
Help
  o About: Displays the authors dialog
Fig. 4-3 Initial screen
After choosing the input image source, the image is shown in the left part of the application, in the Image from camera box. If the source is a live video or an IP camera, the live video is shown. We can stop the video whenever we want and process the actual image shown in the Input image box. The processed image can be seen in the right part of the application, in the Processed image box. Processing means that the picture is sent through image filters, and the resulting image appears in the Processed image box. The image processing steps, algorithms and filters are presented in Section 4.1.2. If we are satisfied with the image processing, we can step forward by pushing the View in Window button, or we can start the whole process from the beginning to get a better result by tuning some parameters of the image processing. A sample screen can be seen in Figure 4-4.
The View Window shows the same picture as the Processed image box, but at full size. If the picture does not fit on the screen, it can be resized or even cropped. However, every modification of the picture resets all curves and lines, the reference points and the origin of the workpiece. The crop and resize functions can be accessed by clicking with the right mouse button on the picture. In the View Window the operator can identify the errors of the workpiece and can instruct the robot to deburr/grind parts of the workpiece by creating lines, curves and regions for robot movement. The line, curve and region functions can be selected from the menu at the bottom of the window. A status box instructs the operator what to do and how many points are needed for a line, curve or region. The reference points (and the reference distance) are also set up through the menu. After setting up the reference distance, the image is rotated to be parallel with the window. The origin can be set up right after the reference distance has been set. A sample screen of the View Window is shown in Figure 4-5.
Fig. 4-4 Sample screen of main window
The following geometrical figures can be created on the surface of the workpiece:
Line: contains one start and one end point
Curve: contains four points, one start and one end point, and two control points
Curves: contains connected curves, joined at end and start points
Region: contains at least three points
The curves are represented as Bézier curves, whose calculation is based on the following equations.

Parametric Bézier curve equation:

$$
P(t) = \sum_{i=0}^{n} B_i \, J_{n,i}(t), \qquad 0 \le t \le 1
$$

Bézier basis or blending function:

$$
J_{n,i}(t) = \binom{n}{i} \, t^i \, (1-t)^{n-i}
$$

The parameters of the equations can easily be modified to fit the resolution needed for a curve. By default, a curve is constructed from 15 smaller line segments, which is more than sufficient for a small workpiece.
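To make the curve handling concrete, the following minimal C# sketch flattens one curve of the kind described above (one start point, two control points, one end point) into the small line segments the robot path is built from. The class and method names are illustrative assumptions, not the actual 2D track detector source; only the mathematics follows the blending function given above.

```csharp
using System;
using System.Drawing; // PointF

static class BezierFlattener
{
    // Splits a cubic Bezier curve into 'segments' straight lines
    // (15 by default in 2D track detector, as noted above).
    public static PointF[] Flatten(PointF p0, PointF p1, PointF p2, PointF p3,
                                   int segments)
    {
        PointF[] points = new PointF[segments + 1];
        for (int k = 0; k <= segments; k++)
        {
            float t = (float)k / segments;
            float u = 1 - t;
            // Cubic Bernstein weights: J(3,i)(t) = C(3,i) * t^i * (1-t)^(3-i)
            float b0 = u * u * u;
            float b1 = 3 * u * u * t;
            float b2 = 3 * u * t * t;
            float b3 = t * t * t;
            points[k] = new PointF(
                b0 * p0.X + b1 * p1.X + b2 * p2.X + b3 * p3.X,
                b0 * p0.Y + b1 * p1.Y + b2 * p2.Y + b3 * p3.Y);
        }
        return points;
    }
}
```

Increasing the segment count trades computation and path length for smoothness, which is exactly the accuracy/complexity trade-off the operator controls.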
Fig. 4-5 Sample screen of view window
36
Software Development
By clicking with the right mouse button on a point of a line, curve or region, the geometrical figure can be deleted or the thickness of the line can be modified. This application does not know anything about the CAD model of the workpiece; it is only for identifying errors. The 3D coordinate generation is done by IGRIP, which is introduced in Section 4.2. However, it is possible to match the workpiece with the original CAD model: the opacity of the View Window can be modified from the menu, and the window can be moved over the CAD modeler (e.g. Pro/E). A sample see-through picture can be seen in Figure 4-6. The final step before the 2D coordinates are generated is to set the tool diameter and the resolution of the steps for each line. Step resolution means that a line will be split into points, and the distance between these points is the step resolution in millimeters. There is also a step accuracy parameter that cannot be modified; it shows the operator the maximum of the step resolution. This parameter is defined when the reference distance and reference points are set: the distance between the reference points on the real workpiece is divided by their distance on the screen, and the result is the step accuracy (for example, a 100 mm reference distance spanning 400 pixels gives a step accuracy of 0.25 mm per pixel).
Fig. 4-6 See-through window
When all parameters are set up correctly and meet the required accuracy, pushing the Save button generates the 2D coordinate path in the program directory (positions.txt). A configuration file is also created, which is used in IGRIP (see Section 4.2). The configuration file contains the step accuracy, the tool size and workpiece information. A sample screen of the generate path window can be seen in Figure 4-7.
Fig. 4-7 Sample screen of generate path window
The usability of the 2D track detector and its transformation steps were presented in this section. The following subsections present a more detailed description of the input image types and the image processing.
4.1.1. Input image types

The 2D track detector can work with various input sources. In Figure 4-1, an example setup with a network camera can be seen. To support a wide range of input sources, many existing software components were examined and implemented in the 2D track detector. As the 2D track detector runs in a Windows environment, the built-in Windows Image Acquisition (WIA) type library is used as one type of input. A detailed introduction to WIA can be found in [19]. The WIA implementation provides the following input sources:
Web camera (USB or FireWire)
Digital camera (allows full control: remote picture take, zoom, focus)
TV-tuner card (PCI or USB)
Even more input sources are available by using the Microsoft DirectX Software Development Kit (SDK). A detailed introduction to the DirectX SDK can be found in [20]. The DirectX implementation provides the following input sources:
Live video stream (AVI or MPEG)
Scanners (USB, LPT, or FireWire)
Finally, the capability of reading IP camera images has been implemented. This means that the most common types of IP camera streams can be used:
Motion JPEG (MJPEG)
JPEG
Multi Media Stream (MMS)
With this wide range of supported input image types, using the application in an industrial environment poses no problem; a minimal frame-grabbing sketch is shown below.
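As an illustration of the IP camera input path, the sketch below grabs a single JPEG snapshot over HTTP in .NET 2.0 C#. The snapshot URL is a hypothetical example (real cameras expose vendor-specific paths), and this is not the 2D track detector implementation; it only shows how such an input source can be read with standard library calls.

```csharp
using System.Drawing;
using System.IO;
using System.Net;

static class IpCameraGrabber
{
    // Downloads one JPEG frame from the given URL and decodes it.
    public static Bitmap GrabFrame(string url)
    {
        HttpWebRequest request = (HttpWebRequest)WebRequest.Create(url);
        using (WebResponse response = request.GetResponse())
        using (Stream stream = response.GetResponseStream())
        using (MemoryStream buffer = new MemoryStream())
        {
            byte[] chunk = new byte[8192];
            int read;
            while ((read = stream.Read(chunk, 0, chunk.Length)) > 0)
                buffer.Write(chunk, 0, read);
            buffer.Position = 0;
            using (Image decoded = Image.FromStream(buffer))
                return new Bitmap(decoded); // deep copy, independent of the stream
        }
    }
}
// Example usage (hypothetical snapshot path):
// Bitmap frame = IpCameraGrabber.GrabFrame("http://camera-address/snapshot.jpg");
```

For Motion JPEG or MMS streams, the frames arrive as a continuous stream instead of single requests, but the decoding step is the same.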
4.1.2. Image processing

Before starting the detailed description of the various image filters, a short mathematical overview might be useful for better understanding.
4.1.2.1. Mathematical background

Usually an image is represented as a matrix of pixels in the computer’s memory:
$$
I = \begin{pmatrix}
C_{1,1} & C_{1,2} & \cdots & C_{1,n} \\
C_{2,1} & C_{2,2} & \cdots & C_{2,n} \\
\vdots & \vdots & \ddots & \vdots \\
C_{m,1} & C_{m,2} & \cdots & C_{m,n}
\end{pmatrix}
$$

where $C_{m,n} \in [0,255] \times [0,255] \times [0,255]$ represents the pixel’s color. The three values give the decomposition of the color into the three primary colors: red, green and blue. Almost every color that is visible to humans can be represented like this. For example, white is coded as (255, 255, 255) and black as (0, 0, 0). This representation allows an image to contain 16.8 million colors. With this decomposition, a pixel can be stored in three bytes. This is also known as RGB encoding, which is common in image processing. As the mathematical representation of an image is a matrix, matrix operations and functions are defined on an image. Image processing can be viewed as a function, where $I_1$ is the input image and $I_2$ is the result image after processing:

$$
F(I_1) = I_2
$$
where $F$ is a filter or transformation. Applying a filter changes the information content of the image. Many types of filters exist; some of them are linear, others are non-linear. They range from basic filters (color extraction, grayscale conversion, resizing, brightness, rotation, blending functions) to matrix convolution filters (edge detectors, sharpen filters). The list is of course not complete. A few example filter inputs and outputs can be seen in Figure 4-8 (input image source: http://gardening.about.com/od/yourgardenphotos/ig/Rose-Photo-Gallery/Rose-Standard.htm).
Original image
Grayscale image
Inverted image
Sharpened image
Oil painting image
Edge detected (Sobel) image
Fig. 4-8 Examples for image filters
The most important filters, and the ones giving the best results, are the matrix convolution filters. The basis of convolution filters comes from signal processing [21] [22] [23]. Given a filter, its response to an input signal $x(t)$ can be computed by convolving $x(t)$ with the response of the filter to a delta impulse, $h(t)$.
Continuous time:

$$
y(t) = h(t) * x(t) = \int_{-\infty}^{\infty} h(\tau)\, x(t-\tau)\, d\tau = \int_{-\infty}^{\infty} h(t-\tau)\, x(\tau)\, d\tau
$$

Discrete time:

$$
y(k) = h(k) * x(k) = \sum_{i} h(i)\, x(k-i) = \sum_{i} h(k-i)\, x(i)
$$

In the same way as in the one-dimensional case, convolution can easily be adapted to images. To get the result image, the original image has to be convolved with the image representing the impulse response of the image filter. The two-dimensional formula is:

$$
y(r, c) = \frac{1}{\sum_{i,j} h_{i,j}} \sum_{j=0}^{M-1} \sum_{i=0}^{M-1} h(j, i)\, x(r+j,\, c+i)
$$

where $y$ is the output image, $x$ is the input image, and $h$ is the filter, whose width and height are $M$. For a demonstration see Figure 4-9.
Fig. 4-9 Demonstration of matrix convolution
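The following C# sketch implements the two-dimensional formula above for a grayscale image stored as a byte matrix. The names are illustrative (this is not the 2D track detector source), the kernel is centered on the processed pixel, and the result is normalized by the sum of the kernel weights when that sum is non-zero, mirroring the $1/\sum h_{i,j}$ factor:

```csharp
using System;

static class Convolution
{
    public static byte[,] Convolve(byte[,] input, double[,] kernel)
    {
        int rows = input.GetLength(0), cols = input.GetLength(1);
        int m = kernel.GetLength(0);      // assume an m x m kernel, m odd
        int half = m / 2;

        // Normalization factor, as in the formula above.
        double sum = 0;
        foreach (double w in kernel) sum += w;
        double scale = (sum != 0) ? 1.0 / sum : 1.0;

        byte[,] output = new byte[rows, cols];
        for (int r = half; r < rows - half; r++)
            for (int c = half; c < cols - half; c++)
            {
                double acc = 0;
                for (int j = 0; j < m; j++)
                    for (int i = 0; i < m; i++)
                        acc += kernel[j, i] * input[r + j - half, c + i - half];
                acc *= scale;
                // Clamp to the valid pixel range.
                output[r, c] = (byte)Math.Max(0, Math.Min(255, acc));
            }
        return output;
    }
}
```

Border pixels are simply left black here; a production filter would pad or mirror the image edges instead.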
After this mathematical introduction, the filters used in 2D track detector image processing are presented in the following section.
4.1.2.2. Used filters

The goal of the image processing in the 2D track detector is to reveal the errors and show the differences to the operator. This can be achieved by using a sequence of image filters. The sequence steps can be seen in Figure 4-10.
Input image from source
Grayscale image
Sharpened image
Edge detected (Canny) image
Inverted Canny image
Fig. 4-10 Image processing sequence
The sequence starts with a grayscale conversion; this step cannot be left out, because it is faster and easier to apply filters to grayscale images. A grayscale image contains only one byte of information per pixel, which is a great reduction compared to three bytes per pixel. The next step is the sharpening filter. This is a pre-processing step for the edge detection. Sharpening is also a convolution filter and can be given by its filter matrix:
$$
\begin{pmatrix}
-1 & -1 & -1 \\
-1 & 9 & -1 \\
-1 & -1 & -1
\end{pmatrix}
$$
With a sharpening filter, the contours of the objects in the image are accentuated. Other types of pre-processing filters could also be applied, but the sharpen filter is the most commonly used. For special workpieces (special surface materials), there could be a better solution, but that needs individual experiments. Even if in special cases the sharpen filter does not provide the best result, this is acceptable. In this sequence, the Canny edge detector is used. Many kinds of edge detectors exist, but our experiments show that this type is the best for error detection. The next subsection deals with the different kinds of edge detectors. The last step in our sequence is inversion. This step could be left out if the black-based edge-detected image is better for the operator. Previous work of the author (DIMAN: Distributed Image Analyzer) [24] provided a stable background for the implementation of the image processing steps.
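A minimal sketch of the first step of this sequence, the grayscale conversion, is given below. The luminance weights are the common ITU-R BT.601 values; this is an assumption, since the thesis does not state which weighting the 2D track detector uses, and the code is illustrative rather than the actual implementation:

```csharp
using System.Drawing;

static class Grayscale
{
    // Reduces a 24-bit RGB bitmap to one byte per pixel.
    public static byte[,] Convert(Bitmap image)
    {
        byte[,] gray = new byte[image.Height, image.Width];
        for (int y = 0; y < image.Height; y++)
            for (int x = 0; x < image.Width; x++)
            {
                Color c = image.GetPixel(x, y); // simple but slow; LockBits is faster
                gray[y, x] = (byte)(0.299 * c.R + 0.587 * c.G + 0.114 * c.B);
            }
        return gray;
    }
}
```

The resulting byte matrix can be fed directly into the convolution sketch from Section 4.1.2.1, with the sharpening matrix above as the kernel.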
4.1.2.3. Edge detectors
In this section the different kinds of edge detectors are compared, with special attention to the relationship between workpiece material and illumination. The hardest problem when taking pictures of workpieces is choosing the right type of illumination. There are many types of workpiece materials, and every material has a different type of reflection properties. Two types of workpieces were used: wooden and aluminum based. The wooden workpiece is used for experimental tests, because it has no direct surface reflection, which would cause errors in edge detection. The aluminum workpiece has a much higher level of reflection, but with good lighting conditions it can also be used as input. The difference between the two materials and sample results are shown in Figure 4-11.
Fig. 4-11 Used material properties (wooden and aluminum workpieces under direct and diffuse lighting from 10 and 90 degrees)
The pictures above clearly identify the need for appropriate illumination. The two types of workpiece need different kinds of lighting conditions. The same applies to edge detection technologies.
4.1.2.3.1. Sobel
The Sobel edge detector performs a 2D spatial gradient measurement on the image. Two filter matrices are used, one estimating the gradient in the x direction and the other in the y direction:
$$G_x = \begin{bmatrix} -1 & 0 & 1 \\ -2 & 0 & 2 \\ -1 & 0 & 1 \end{bmatrix} \qquad G_y = \begin{bmatrix} 1 & 2 & 1 \\ 0 & 0 & 0 \\ -1 & -2 & -1 \end{bmatrix}$$

Gradient approximation formula: $|G| = |G_x| + |G_y|$
The Sobel edge detector works on a 3*3 pixel neighborhood at a time. The result of the edge detection can be improved by preprocessing the input image. A detailed description of Sobel edge detection can be found in [25], pages 106–112. The experimental result image can be seen in Figure 4-12.
Fig. 4-12 Sobel edge detector result (original image and Sobel edge detected image)
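A minimal sketch of the Sobel gradient computation, using SciPy for the convolution; the kernels and the |G| approximation are the ones given above, while the boundary handling is an implementation choice of the sketch, not something specified in this work.

    import numpy as np
    from scipy.signal import convolve2d

    GX = np.array([[-1, 0, 1],
                   [-2, 0, 2],
                   [-1, 0, 1]])
    GY = np.array([[ 1,  2,  1],
                   [ 0,  0,  0],
                   [-1, -2, -1]])

    def sobel_magnitude(image):
        # |G| = |Gx| + |Gy|, the gradient approximation given above
        gx = convolve2d(image, GX, mode="same", boundary="symm")
        gy = convolve2d(image, GY, mode="same", boundary="symm")
        return np.abs(gx) + np.abs(gy)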
4.1.2.3.2. Canny
The Canny edge detector is known as the optimal edge detector [26], with a low error rate and low multiple response (an edge is detected only once). Canny edge detection is built up from several steps for the best results: a smoothing filter, searching for edge strength (the gradient of the image), finding the edge direction and eliminating streaks. The detailed step descriptions can be found in [26]. Only the filter matrices and the gradient formula are presented below.
$$G_x = \begin{bmatrix} -1 & 0 & 1 \\ -2 & 0 & 2 \\ -1 & 0 & 1 \end{bmatrix} \qquad G_y = \begin{bmatrix} 1 & 2 & 1 \\ 0 & 0 & 0 \\ -1 & -2 & -1 \end{bmatrix} \qquad |G| = |G_x| + |G_y|$$
The filter matrices are the same as in the Sobel detector; only the pre- and post-processing steps differ. These differences make the Canny edge detector better than the Sobel edge detector, as depicted in Figure 4-13.
Fig. 4-13 Canny edge detector result (original image and Canny edge detected image)
4.1.2.3.3. Laplace
The Laplace edge detector uses a single convolution mask instead of a gradient approximation. The filter matrix is presented below:
$$\begin{bmatrix} -1 & -1 & -1 & -1 & -1 \\ -1 & -1 & -1 & -1 & -1 \\ -1 & -1 & 24 & -1 & -1 \\ -1 & -1 & -1 & -1 & -1 \\ -1 & -1 & -1 & -1 & -1 \end{bmatrix}$$
This makes the Laplace edge detector more sensitive to noise. A detailed description of the Laplace edge detector can be found in [25], pages 121–123. The noise sensitivity is demonstrated in Figure 4-14.
Fig. 4-14 Laplace edge detector result (original image and Laplace edge detected image)
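The noise sensitivity is easy to reproduce: convolve an image with the 5*5 mask above and threshold the absolute response. The sketch below does this in Python; the threshold value of 40 is an arbitrary demonstration choice, not a value from this work.

    import numpy as np
    from scipy.signal import convolve2d

    # The 5x5 Laplace mask given above: all -1 with 24 in the center
    LAPLACE = -np.ones((5, 5))
    LAPLACE[2, 2] = 24

    def laplace_edges(image, threshold=40):
        response = convolve2d(image, LAPLACE, mode="same", boundary="symm")
        return (np.abs(response) > threshold).astype(np.uint8) * 255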
4.2. IGRIP GSL program
The result of the previous section was the 2D coordinates of the deburring process. In this section the 2D coordinates are transformed to 3D real-world coordinates. The 2D–3D mapping is based on the CAD model of the workpiece. IGRIP is a powerful robot simulation tool, in which complete robot manufacturing cells can be constructed and controlled in virtual reality. A detailed introduction to IGRIP can be found in Appendix 8.1. In IGRIP, a simulation work cell was constructed. By default, IGRIP has many types of built-in robots, tools and objects. An ABB IRB 2000 robot and a table were imported into the simulation work cell. The work cell is illustrated in Figure 4-15.
Fig. 4-15 Simulation work cell
As the 2D–3D mapping is based on the original CAD model of the workpiece, these models needed to be imported into the IGRIP simulator. Three different kinds of workpieces were manufactured for experimental tests, thus these three models were used. The models were placed on the table, where the robot can reach them. The models can be seen in Figure 4-16. Simulations can be started by the operator or instructed from the command line. Command line execution of simulations is faster, but it is not possible to visually follow the process and it is harder to debug. The 2D–3D mapping could be automated (as mentioned above), but in the experimental stage it is important to see the results, and using an operator at this stage is acceptable. However, the use of the operator is limited to starting the simulation. Graphics Simulation Language (GSL) was used to control the simulation. GSL provides many commands for controlling robots and simulating whole work cells (movement of robot joints, movement of the tool center point (TCP), I/O channel interaction, etc.). The list of capabilities can be found in Appendix 8.2.
Fig. 4-16 Models in simulation work cell
The 2D–3D transformation steps can be seen in Figure 4-17. The figure describes the transformation from the aspect of the robot. The GSL program simulates a "hit and fall back" force sensor. Every point of the 2D coordinates is checked against the workpiece surface. A grinding pen is attached to the robot arm, and the robot tries to reach the surface of the workpiece from the predefined 2D coordinate. If the robot hits the surface, the 3D position is stored. A hit in the simulation environment means that there is a collision in the simulation work cell; the colliding model parts are painted red. In real life this hit could be measured via force sensors. In the case of a plain-surface workpiece the algorithm stops here, because there is no surface change (no change in depth). In the case of a workpiece that contains drops or spurs in the surface, the "hit and fall back" algorithm described above does not provide the best result. This is caused by the fact that the robot always approaches the workpiece from the top.
Fig. 4-17 2D – 3D transformation steps
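The "hit and fall back" loop can be summarized in pseudocode. The sketch below is written in Python only for readability; the actual implementation is a GSL program, and the helper functions move_to() and in_collision() are hypothetical stand-ins for the corresponding GSL motion and collision-check statements.

    def hit_and_fall_back(points_2d, safe_z, step=0.1):
        # points_2d: (x, y) robot coordinates produced by the image processing
        # safe_z:    height from which the workpiece is approached
        # step:      descent step size (hypothetical, as are the helpers)
        surface_points = []
        for (x, y) in points_2d:
            z = safe_z
            move_to(x, y, z)                  # hypothetical motion command
            while not in_collision():         # "hit" = collision in the work cell
                z -= step
                move_to(x, y, z)
            surface_points.append((x, y, z))  # store the 3D position of the hit
            move_to(x, y, safe_z)             # fall back above the surface
        return surface_points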
A post-processing step is needed to correct the "hit and fall back" result. A function searches for surface changes in the stored 3D coordinates. If it identifies a change, the robot moves to the specified position and rotates the robot arm's tool center point to reach the maximum depth at that position. The rotation is based on the position of the current 3D coordinate and the next 3D coordinate in the sequence. Figure 4-18 shows the calculation of the N base rotation angle and Figure 4-19 shows the actual usage of the rotation. From the 3D points, the x and y coordinates are used for calculating the rotation, and the depth (z) is left out. The z coordinate is used after the rotation of the N base is evaluated; it defines the direction of the A base rotation.
Fig. 4-18 Calculation of robot arm’s tool center point rotation
Fig. 4-19 Usage of robot arm’s tool center point rotation
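Without reproducing the exact formulas of Figures 4-18 and 4-19, the idea of the rotation can be sketched as follows: the N base angle comes from the x/y offset between consecutive path points, and the sign of the z difference selects the direction of the A base rotation. This is a simplified, assumed reading of the figures, not the actual GSL code.

    from math import atan2, degrees

    def tcp_rotation(current, nxt):
        # current, nxt: consecutive (x, y, z) points of the stored 3D path
        dx = nxt[0] - current[0]
        dy = nxt[1] - current[1]
        dz = nxt[2] - current[2]
        n_angle = degrees(atan2(dy, dx))   # N base rotation from the x/y offset
        a_direction = 1 if dz > 0 else -1  # z difference selects A base direction
        return n_angle, a_direction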
After the post-processing step, the 3D coordinates are saved in a robot coordinate file, which is based on real-life coordinates and has the structure shown in Table 4-1.

Position  X coord.  Y coord.  Z coord.  Param. type  Roll  Pitch  Yaw
#1        764.4     188.5     978.3     ZYX          0.0   1.5    66.9
#2        792.824   123.315   778.3     ZYX          0.0   1.5    66.9

Tab. 4-1 Coordinate structure in robot instructions
At the same time, the corresponding robot program is generated. "Real-life coordinates" means that the origin of the coordinate system is at the base of the robot.
4.3. Robot Controller
The final step of robot programming is the communication with the robot. The ABB IRB 2000 has an S3 M91 type robot controller (Section 3.1). This controller can be instructed via an RS232C serial port. The previously introduced Offline Programming 3 (OLP3) has the capability of uploading, downloading, compiling and decompiling robot programs, but it lacks a remote operation mode: OLP3 cannot be used to remotely monitor the state of the robot or to read register values, tool center point values, frame values, etc. In industrial and experimental environments, remote control and monitoring are key tasks [27] [28] [29], and all the programs introduced in the previous sections allow this operation mode. This justifies the need for the Robot Controller program. However, OLP3 must still be used for robot program compiling. The Robot Controller currently runs only in local mode, for security reasons. The communication over the RS232C serial port (COM port) is based on a standardized protocol; in the case of the ABB S3 controller it is called ADLP-10 (ABB Data Link Protocol 10). The lack of a protocol description made the implementation task hard. The OLP3 communication with the robot controller was eavesdropped, and by analyzing the messages sent from the PC to the controller and back, the structure of the messages and the meaning of the bytes were revealed. Not all messages were understood, but the main functions (program downloading, uploading, erasing, starting, stopping, reading register values, etc.) are functional. The main window of the Robot Controller is shown in Figure 4-20. Subsection 4.3.1 describes the message structure and Subsection 4.3.2 describes the revealed functionality.
Fig. 4-20 Robot Controller main window
4.3.1. ADLP-10 message structure
Communication over RS232C needs a preceding setup of the port. The setup has to be done only once, before sending the first message. The setup for the S3 M91 controller is the following (see Table 4-2).

Property name  Property value  Description
Baud rate      9600 bit/s      Maximum number of bits per second
Data bits      8 bits          Number of bits per block
Stop bits      1 bit           Number of bits signifying end of block
Parity         Even            Block error check type

Tab. 4-2 RS232C setup
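As an illustration, the Table 4-2 settings map one-to-one onto a serial port configuration. The sketch below uses the pySerial library (the actual Robot Controller program does not use Python); the port name "COM1" and the timeout are example values.

    import serial  # pySerial

    # RS232C setup from Table 4-2 for the ABB S3 M91 controller
    port = serial.Serial(
        port="COM1",                 # machine dependent example value
        baudrate=9600,
        bytesize=serial.EIGHTBITS,
        parity=serial.PARITY_EVEN,
        stopbits=serial.STOPBITS_ONE,
        timeout=1.0,                 # read timeout, an added convenience
    )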
The setup ensures that the communicating parties understand each other, so that any error should be caused by faulty sending or receiving. The data messages are sent as bytes (0…255), but for better readability the values are presented here in hexadecimal. Some primitives are used as signals in the communication; they can be found in Table 4-3.

Value  Short name  Full name
05     ENQ         Enquiry
06     ACK         Acknowledgement
0E     WACK        Positive acknowledgement and wait
0F     RVI         Reverse interrupt
15     NAK         Negative acknowledgement
10     DLE         Data link escape
02     STXeven     Start of even text
82     STXodd      Start of odd text
03     ETX         End of text
04     EOT         End of transaction

Tab. 4-3 Communication primitives
The architecture of the communication is similar to a client/server architecture: one of the communicating parties plays the server role and the other the client role. If one of the parties wants to send something, first an ENQ byte must be sent to the other party. This byte sets the other party to data receiving mode. The messages are encapsulated by carrier bytes: two bytes always come before the message text (DLE, then STXeven or STXodd, depending on the size of the message) and two closing bytes follow it (DLE and ETX). After sending ETX a checksum value is also sent for error checking; it is the XOR value of every byte in the message text plus the ETX byte. This structure is shown in Figure 4-21.
Fig. 4-21 Message capsule
When the message is delivered, the receiver calculates the checksum of the received message and compares it with the received checksum value. If they are equal, an ACK byte is sent back to the sender; if not, a NAK byte is sent.
After the sender receives an ACK, it sends an EOT to the receiver; if the sender receives a NAK, retransmission of the message follows. Retransmission is only allowed four times; after that an EOT byte must be sent (meaning the communication link is broken). The message text has variable length and can contain any byte value. If a DLE byte is sent, it is doubled; when a doubled DLE is received, the duplicate is skipped and the next byte is stored. This mechanism is used to securely detect the ETX byte, which marks the end of the message. The message text length is limited to 128 bytes; longer data must be split into 128-byte messages. The message text byte order is listed in Table 4-4.

Position  Possible values   Description
0         0                 Not used
1         0…80              Size of text message
2, 3      1, 0 or 0, 1      Message direction (1, 0: message from PC to controller; 0, 1: message from controller to PC)
4         0…FF              Instruction number for the controller or the PC
5         1, 2, 3, 6, 9, A  Message type (1: query; 2: response; 3: automatic message from controller; 6: warning; 9: multipart message from PC; A: multipart message from controller)
6         0                 Not used
7…127     0…FF              Data

Tab. 4-4 Byte order of message text
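Putting Figure 4-21 and the rules above together, a message frame can be built as in the following Python sketch (an illustration, not the actual Robot Controller code). One detail is an assumption: the text only states that the choice between STXeven and STXodd depends on the size of the message, so the sketch selects it by the parity of the text length.

    DLE, ETX = 0x10, 0x03
    STX_EVEN, STX_ODD = 0x02, 0x82

    def build_frame(text: bytes) -> bytes:
        # Encapsulate one ADLP-10 message text (maximum 128 bytes).
        if len(text) > 128:
            raise ValueError("split data into 128-byte message texts first")
        stx = STX_EVEN if len(text) % 2 == 0 else STX_ODD  # assumed selection rule
        body = bytearray([DLE, stx])
        for b in text:
            body.append(b)
            if b == DLE:              # DLE bytes inside the text are doubled
                body.append(DLE)
        body += bytes([DLE, ETX])
        checksum = ETX                # XOR of every text byte plus the ETX byte
        for b in text:
            checksum ^= b
        return bytes(body) + bytes([checksum])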
The sending and receiving of messages has now been introduced; the instruction commands are presented in the next subsection.
4.3.2. ADLP-10 instruction numbers
The detailed byte order of the different kinds of instructions is documented in the well-commented source code; therefore only an overview of the different functions is given here. These functions are listed in Table 4-5.
Instr.  Short description        Possible values  Detailed description
1       Upload program           0…9999           Upload a program to a specified position
2       Start program            0…9999           Start the specified program
3       Stop program             No               Stop execution of the running program
4       TCP value                0…29             Get the specified TCP values
5       Location value           0…119            Get the specified location register value
6       Register value           0…119            Get the specified register value
7       Sensor value             0…15             Get the specified sensor value
A       Configuration value      0…9999           Get the configuration data of the specified program
B       Frame value              0…5              Get the specified frame value
13      Status list              No               Retrieve robot actual status
14      Operation mode           No               Change operation mode of controller
15      Program list             No               Get program numbers and free space count
16      Erase program            0…9999           Erase the specified program
19      Arc Weld value           0…9999           Get Arc Weld data for specified program
1D      Download program         0…9999           Download the specified program
2D      ARAP version             No               Get ARAP (ABB Robot Application Protocol) version
2E      Resolver value           No               Get motor resolver values
7F      Automatic status update  No               Status message generated by controller

Tab. 4-5 Instruction bytes
There are still a few undiscovered instruction numbers, but even without those about 80% of the capabilities of the controller can be reached. This is acceptable at the experimental testing stage.
5. Validation
The test of the overall system can be split into two parts. The first part checks the error of the software solution (Section 4). The second part checks the accuracy of the positioning system of the robot controller (the ABB S3 M91 controller is introduced in Section 3.1). Based on these two parts, the overall system results are also presented. In the following subsections the part tests (Sections 5.1–5.2) and the overall system test (Section 5.3) are presented. Every test series is followed by its results. These results are summarized in Section 7.
5.1. Accuracy of the software solution and the distortion of cameras
The largest problem with vision-based systems is the distortion of cameras. Two types of distortion exist: pincushion and barrel. Pincushion distortion causes the image to be pinched at its center; barrel distortion causes horizontal and vertical lines to bend outwards toward the edges of the image. Every digital camera has these distortion effects, even the most expensive ones; the level of distortion depends on the quality of the lenses and the image sensor. To model both distortions, two types of digital cameras were used: a low resolution IP camera (AXIS 206 Network camera [13]) and a higher resolution compact camera (Kodak DX7630 digital camera [14]). The AXIS camera has a maximum resolution of 640*480 pixels, which means 0.3 megapixels; the Kodak camera has a maximum resolution of 2856*2142 pixels, which means 6.1 megapixels. The AXIS camera has a built-in wide-angle lens and the Kodak camera has a built-in zoom lens. The main effect of using wide-angle lenses is barrel distortion, while the main effect of zoom lenses is pincushion distortion. In Figure 5-1 these effects are demonstrated.
Fig. 5-1 Distortion errors of the experimental cameras (AXIS camera: barrel distortion; Kodak camera: pincushion distortion)
The experimental tests were based on the following sequence:
- Drawing a figure on a millimeter paper
- Taking a picture of the millimeter paper with both cameras, from the same height (80 cm was used)
- Using the operator to identify the figures on the picture (Section 4.1)
- Generating the 2D track for the robot
- Calculating the distances between the generated 2D track and the figures on the millimeter paper
Based on this sequence, three different kinds of tests were executed, which are presented in the next subsections. The millimeter papers were cut by machine (10*15 cm papers, a normal workpiece size), but we cannot rely on the correctness of the paper corners, thus the corners are not usable as the origin of the workpiece. An inner point on the paper was used as the origin.
5.1.1. Test case 1: straightness of lines
In this case, a rectangle was drawn on the millimeter paper. The rectangle is ideal for measuring the correctness of long lines in images. A reference distance of 40 millimeters was used. In Figure 5-2 the images of both cameras and the operator interactions can be seen.

Fig. 5-2 Test case 1 images (for both the AXIS and the Kodak camera: original image, edge detected image, and image after operator identification)
5.1.2. Test case 2: accuracy of line beginning and ending
In this case, two lines were drawn: one runs in the horizontal direction, the other vertically. With edge detection (Section 4.1.2) a line is identified by its contours. This makes start and end point recognition hard: when points are drawn and an edge detected picture is created from them, the operator knows that the point is the center of the "contour circle". The test was done with a reference distance of 40 millimeters. Figure 5-3 shows the images of both cameras and the operator interactions.

Fig. 5-3 Test case 2 images (for both the AXIS and the Kodak camera: original image, edge detected image, and image after operator identification)
5.1.3. Test case 3: accuracy of points
In this case, three points were drawn, defining the corners of a triangle. This test is expected to give the highest accuracy. A reference distance of 40 millimeters was used. In Figure 5-4 the images of both cameras and the operator interactions can be seen.
Fig. 5-4 Test case 3 images (for both the AXIS and the Kodak camera: original image, edge detected image, and image after operator identification)
5.1.4. Results
After the execution of the tests, the mean deviation for each line, point and curve was used for evaluation. The overall results for Tests 1–3 can be seen in Table 5-1.

                          AXIS camera           Kodak camera
Test 1 (Normal line)      0.877 mm / 0.295 mm   0.833 mm / 0.448 mm
Test 2 (Line ending)      2.575 mm / 0.686 mm   0.725 mm / 0.160 mm
Test 3 (Points)           0.342 mm / 1.592 mm   0.360 mm / 0.204 mm
Summarized (Average)      1.265 mm / 0.858 mm   0.640 mm / 0.270 mm
Overall (Mean deviation)  1.061 mm              0.455 mm

Tab. 5-1 Software solution error rate (each test row lists the two measured mean deviation values per camera; the overall row is their average)
The results show that by using a higher quality digital camera, higher accuracy can be reached. This result was expected before the tests were run. The accuracy could be improved further with an even higher quality digital camera (e.g. a digital SLR (Single Lens Reflex) camera), but such cameras are not common in industrial environments. The AXIS camera was used in our experiments because its resolution matches those used in industrial environments, and its results in grinding and deburring applications look promising: a mean deviation of around 1 millimeter in these types of applications is much better than the damage caused by manual grinding. One interesting thing can be noticed in Table 5-1: one of the two deviation values per camera is usually about twice as high as the other. The experimental tests showed that this might be caused by the rotation used in Section 4.1. To rotate the image, a bilinear interpolation method was used, which can be considered the best among the common options (nearest neighbour, bilinear, bicubic). When rotating the image, a small loss of image quality occurs. The rotation step could be left out if the position and orientation of every workpiece were the same, but this cannot be guaranteed; furthermore, workpiece dimensions may also differ.
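The effect of the interpolation choice is easy to reproduce with a standard library. The sketch below rotates an image with OpenCV (not the implementation used in this work); swapping the interpolation flag between nearest neighbour, bilinear and bicubic shows the quality difference mentioned above.

    import cv2

    def rotate(image, angle_deg, flag=cv2.INTER_LINEAR):
        # INTER_NEAREST / INTER_LINEAR / INTER_CUBIC correspond to the
        # nearest neighbour, bilinear and bicubic methods mentioned above.
        h, w = image.shape[:2]
        m = cv2.getRotationMatrix2D((w / 2, h / 2), angle_deg, 1.0)
        return cv2.warpAffine(image, m, (w, h), flags=flag)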
5.2. Accuracy of robot control system
The positioning precision of an industrial robot cannot be compared to that of a CNC (Computer Numerically Controlled) milling machine: their fields of application differ, and so do the industrial standards. A CNC milling machine must be capable of 0.001 millimeter accuracy, which is commonly achieved by moving the workbench of the milling machine and keeping the tool stable. In an industrial robot this cannot be achieved, as the strengths of a robot are its moving capabilities, wide working range, repeatability and speed. Another constraint in industrial robots is the resolution of the smallest step in a linear movement, which is usually provided by the robot manufacturer. The ABB IRB 2000 has 0.125 millimeter as its smallest step. This does not mean that the robot cannot achieve higher precision, but below this value the accuracy of movement cannot be predicted and depends highly on the angle and position of the previous path point of the robot. The last and probably the largest constraint in a robot control system is the programming capability of the industrial robot. In the case of offline robot programming the robot programmer enters the coordinates of the movement. If these coordinates are represented only with integers (whole numbers), the accuracy of the system is limited to millimeters. In the case of the ABB IRB 2000 the robot coordinates are represented with numbers with one decimal place. The robot control system (Section 3.1) was tested with Teaching and Offline programming. The same task, drawing a square on a millimeter paper, was executed in both cases. The square's corners were identified by Teaching, and the coordinates from Teaching were also used in Offline programming.
The tests were executed at low speed to achieve the highest accuracy; higher execution speed results in lower accuracy.
5.2.1. Results
Robot programming with Teaching proved the greatest power of an industrial robot: repeatability. The robot ran 100 cycles without any fluctuation in accuracy. Offline robot programming showed almost the same accuracy; only a minimal (0.025 millimeter) mean deviation was noticed. The results confirmed the manufacturer's statement of a minimal step of 0.125 millimeter.
5.3. Overall accuracy of system
Testing individual parts of a system does not provide exact figures for the overall system; that can only be achieved with whole-system tests. Three tests were executed to demonstrate worst case scenarios; the low resolution AXIS camera was used during the tests.
5.3.1. Test case 1: point accuracy
In this case three points were drawn on a millimeter paper. The three points represent the corners of a triangle. The robot path contained 453 points, which results in a 0.263 millimeter step size in robot movement. The result of the test can be seen in Figure 5-5. The red dots represent the three points and the green line was drawn by the robot.
Fig. 5-5 Test case 1 result image
5.3.2. Test case 2: curve following
In this case a curve was drawn by hand on a millimeter paper. The robot path contained 492 points, which results in 0.265 millimeter steps in robot movement. The result of the test can be seen in Figure 5-6. On the left side, the hand drawn curve and on the right side, the robot drawn curve can be seen.
Fig. 5-6 Test case 2 result images (hand drawn curve and robot drawn curve)
5.3.3. Test case 3: curve following with surface drop
In this case, a curve was drawn on the surface of a workpiece. This curve was redrawn by the robot. The workpiece and the result of the test can be seen in Figure 5-7. The curve was constructed from 343 robot coordinate points, which results in a 0.6815 millimeter step size in the robot movements. The red line is drawn by hand and the blue line is drawn by the robot.
Fig. 5-7 Test case 3 result images (hand drawn curve, robot drawn curve, and the workpiece in the Pro Engineering CAD modeler)
5.3.4. Results
Figures 5-5, 5-6 and 5-7 show the results of these tests. The accuracy of the overall system in these cases depends highly on the AXIS camera. The overall system error is less than a millimeter, which is acceptable in grinding and deburring applications.
6. Future Plans
Only the final step of robot code generation produces robot specific code; until that stage only world coordinates are generated, which are independent of the industrial robot type. Developing compilers for different types of robot programs would therefore result in much higher flexibility. Flexibility also implies some kind of adaptive control, which widens the area of industrial robot applications. The SAPIR concept keeps a brain in the loop. This simplification was decided after the capabilities and limitations of image processing had been studied: currently no automated solution exists with the level of intelligence that the SAPIR concept needs. The human capabilities of decision making and error correction cannot be exactly replaced by fully automated solutions. As industrial robot tasks have become more complex and the field of industrial robot applications has widened, the need for human-machine interfaces has risen; SAPIR is one solution for such interfaces. Further development of SAPIR can be done by introducing new technologies and methodologies. One of the promising technologies is cognitive vision, with which a higher degree of automation could be reached: the robot could autonomously find the origin of the workpiece based on its sensors. But the development of a cognitive sensor exceeds the limits of a master's thesis.
7. Conclusions
The thesis objective (the development of a new vision-based path planning methodology for industrial robots) was reached, while the concept of Supervised and Adaptive Programming of Industrial Robots (SAPIR), which is based on this methodology, was evaluated and validated (Section 5). The concept uses a vision system with a human in the loop and is applied to grinding/deburring applications, where industrial robots should replace the hard and monotonous work of humans. Overall system tests showed high accuracy for these applications: a positioning accuracy of 0.5 mm is acceptable compared to manual work, and the accuracy of the overall system can be improved by using higher resolution cameras. Based on these results the SAPIR concept is usable for small and medium sized enterprises (SME) or even for large companies. The developed software solution also supports remote operation, which is a key task in industrial environments; remote operation is a new feature for the industrial robot (ABB IRB 2000) used in the experimental testing. Some further steps (Section 6) were introduced, but these exceed the limits of a master's thesis. The developed software solution makes robot programming fast and flexible. The modular structure of the system enables easy integration of other types of industrial robots. Supervised robot programming allows and supports small batch size production, which is important for SMEs. Adaptive and fast industrial robot programming allows industrial robots to spread into new markets, such as SMEs. Using a brain in the loop is acceptable if it results in a system that is highly productive and provides high accuracy; SAPIR achieves this and makes the proposed system powerful.
8. Appendix
8.1. IGRIP
This subsection contains information on the Deneb Robotics product Interactive Graphics Robot Instruction Program (IGRIP). For the overall system description see [30]. IGRIP provides an interactive, 3D graphic simulation tool for design, evaluation, and analysis. Any manufacturing process may be constructed, programmed and analyzed for cycle time, collisions and motion constraints. IGRIP is divided into three primary systems: the IGRIP Menu System, Graphic Simulation Language (GSL), and Command Line Interpreter (CLI). Advanced functionality is available through the use of the Shared Library.
8.1.1. Menu
The IGRIP system provides a mouse driven, point and click approach to simulation. There are ten major components of the IGRIP menu system. These include:

CAD: provides features for creating a three-dimensional visual representation of Parts. These Parts can be modeled from scratch or geometry can be imported from other CAD systems.

DEVICE: is the Context for creating Devices using Parts which originated in the CAD system. The Devices can have kinematics. If kinematics is used, the user has the ability to give the Devices speeds, acceleration rates, dynamic properties (with the Dynamics option), travel limits, along with a variety of other Device attributes. Basic examples of Devices include robots, grippers, fixturing, etc.

LAYOUT: is used to assemble the Devices, connect the necessary I/O, create and manipulate Tag Points and Paths. Creating Paths includes any necessary auxiliary axes data or user-defined data. This is also the Context that contains the Calibration functions to adjust Paths.

MOTION: contains functions used to test, optimize, and run the Work cell. These functions include Popup for analyzing machine behavior which can then be saved and plotted, cycle time analysis, program manipulation, motion recording and playback, tool traces, defining collision lists and collision detection, Device placement optimization, Path planning, and teaching pendant-like control of the various Devices.

PROG: is used to generate programs using on-screen menus to script the syntax automatically into the Device's program.

DRAW: is a two-dimensional World with the ability to import and export data, create simple geometry that can be exported and extruded into three-dimensional objects. This Context also gives the user a medium by which they can document and plot the Work cell or CAD Worlds.

ANALYSIS: assists in identifying various items in the World, as well as determining the distances and angles between them. Properties, such as Part area and volume, are obtained within this Context. Tools to create free-space, dynamically associative dimensions are also available here. Dimensions between geometries are associated and are continuously updated to reflect the current state of that geometry. Tools to create graphs are also available within this Context.

SYS: provides the ability to define system attributes, e.g. world view (lights, grid, floor, background color, and Button colors) and files (creating directories, printing, and configuration file management). Lighting attributes (for machines equipped to handle multiple lights) and color graphs are also supported within this Context.
8.1.2. GSL
Graphic Simulation Language (GSL) is a procedural language which can be used to control the behavior of Devices in the Work cell. GSL incorporates conventions commonly used in high-level computer languages with specific enhancements for Device motion and simulation environment inquiries.
8.1.3. CLI
The Command Line Interpreter (CLI) is a powerful communication, command and control system for accessing and operating IGRIP. It is accessible both from "inside" and "outside" the IGRIP menu system. Inside the IGRIP menu system, CLI offers an alternative to the point and click approach by allowing text commands. Examples of such are SET, to set Device and Part attributes, World views, and simulation parameters; and INQUIRE, to receive statistical information on Devices and Parts. Outside the IGRIP menu system, CLI acts in two different modes: stream and socket. In both cases, the IGRIP window appears without menus. For example, the stream mode allows a text file to be used as input as well as directing output to an output file. The socket mode allows any external program residing on any machine in a TCP/IP network to invoke IGRIP and communicate with it through a socket using CLI commands and return codes.
8.1.4. Shared Library
The Shared Library is an open architecture environment that allows advanced users to extend or customize IGRIP with custom interfaces, communicate with external processes in real-time, create vertical applications, and link proprietary algorithms directly into the motion pipeline.
8.2. GSL Language
Graphic Simulation Language (GSL) is a programming language developed by Deneb Robotics, Inc. for use in graphic simulations. GSL incorporates conventions commonly used in high-level computer languages with specific enhancements for Device motion and simulation environment inquiries. GSL is a structured, Pascal-like procedural language; like in Pascal, programs are written using many of the same terms that would be used to state the solution to the original problem. GSL is used to program the actions and behavior of individual Devices in a simulation. The GSL language has many built-in variables and functions. They control the motion and simulation related behavior of a Device during program execution. All of the variables have a particular value at the start of program execution; this is the default value. For some of the system variables or functions, the default value is Device dependent. System variables are treated like any other variable when used in program functions. Most of the system variables are of real type. Functions in the GSL language are called statements. A list of statements is presented here; the overall system description can be found in [30]. Assignment Statements assign values to variables.
- Variable Assignment
- Array Assignment
- Dynamic Memory Allocation
- Free Procedure

Control Flow Statements alter the sequential flow of the program.
- Break Statement
- Continue Statement
- Exit Statement
- For Statement
- Goto Statement
- If Then Else Statement
- Label Statement
- Repeat Until Statement
- Return Statement
- Switch Case Statement
- Continue Case/Continue Test Statement
- Wait Until Statement
- While Do Statement

Motion Statements simulate the movement of the Device being programmed.
- Follow Statement
- Move About Statement
- Move Along Statement
- Move Away Statement
- Move Home Statement
- Move Joint Statement
- Move Joints Statement
- Move Near Statement
- Move Relative Statement
- Move Thru Statement
- Move To Statement
- Move Via Statement

Device Manipulation Statements control Device attachments.
- Grab Statement
- Grab Device Statement
- Release Device Statement

Remote Statements control more than one Device from a single GSL program.

Simulation Control Statements control various aspects of the simulation being created, including collision checking.
- Add to Queue Statement
- Check all Statement
- Clear Collision Queue Statement
- Delay Statement
- Exclude Statement
- Remove Cross Check Statement
- Remove From Queue Statement
- Set Collision Check Statement
- Set Minimum Distance Check Statement
- Set Near Miss Statement
- Set Super Checks
- Signal Interrupt Statement
- Simulation Update Statement

Input/Output Statements control the communication with external processes and files.
- Close File Statement
- Close Window Statement
- Open File Statement
- Open Pipe Statement
- Open Socket Statement
- Open Window Statement
- Write Statement
- Set Async

Comment Statements insert a comment into the GSL program.

Other Statements include the following:
- Cancel Statement
- Include Statement
- Set Debug Statement
- Unset Debug Statement
- With Statement
9. References

[1] S. Kalpakjian and S. R. Schmid, Manufacturing Engineering and Technology, Pearson Education, 2006
[2] T. Thomessen, T. K. Lien and B. Solvang, "Robot control system for heavy grinding applications", in Proc. 30th International Symposium on Robotics, 1999, pp. 33–38
[3] Ken Young and Craig G. Pickin, "Accuracy assessment of the modern industrial robot", Industrial Robot: An International Journal, Vol. 27, No. 6, Dec 2000, pp. 427–436, ISSN 0143-991X
[4] Kevin R. Dixon, John M. Dolan and Pradeep K. Khosla, "Predictive Robot Programming: Theoretical and Experimental Analysis", The International Journal of Robotics Research, Vol. 23, No. 9, 2004, pp. 955–973
[5] Hui Zhang, Heping Chen, Ning Xi, G. Zhang and Jianmin He, "On-Line Path Generation for Robotic Deburring of Cast Aluminum Wheels", in Proc. IEEE/RSJ International Conference on Intelligent Robots and Systems, Beijing, China, Oct 2006, pp. 2400–2405
[6] Hui Zhang, Heping Chen and Ning Xi, "Automated robot programming based on sensor fusion", Industrial Robot: An International Journal, Vol. 33, No. 6, 2006, pp. 451–459
[7] F. S. Cheng and A. Denman, "A study of using 2D vision system for enhanced industrial robot intelligence", in Proc. IEEE International Conference on Mechatronics and Automation, 2005, Vol. 3, pp. 1185–1189
[8] Wenrui Dai and M. Kampker, "PIN: a PC-based robot simulation and offline programming system using macro programming techniques", in Proc. 25th Annual Conference of the IEEE Industrial Electronics Society (IECON'99), 1999, Vol. 1, pp. 442–446
[9] Wenrui Dai and M. Kampker, "User Oriented Integration of Sensor Operations in an Offline Programming System for Welding Robots", in Proc. IEEE International Conference on Robotics & Automation, San Francisco, CA, 2000, pp. 1563–1567
[10] J. N. Pires, "Robotics for small and medium enterprises: control and programming challenges", Industrial Robot: An International Journal, Vol. 33, No. 6, 2006, Emerald
[11] ABB Asea Brown Boveri, ABB Robotics Homepage, [Online], Available: http://www.abb.com/robotics
[12] ABB Robotics, "ABB Programming Manual Robot Control System 3", Jan 1991
[13] AXIS Communications, AXIS 206 Network Camera Homepage, [Online], Available: http://www.axis.com/products/cam_206/
[14] Eastman Kodak Company, Kodak DX7630 Digital Camera Homepage, [Online], Available: http://www.kodak.com/global/en/service/products/ekn028434.jhtml?pq-path=1932
[15] Parametric Technology Corporation, PTC Pro/Engineering Homepage, [Online], Available: http://www.ptc.com/appserver/mkt/products/home.jsp?k=403
[16] Gunnar Bolmsjo, "Programming robot welding systems using advanced simulation tools", Master thesis, 1999
[17] S. Boopathy and V. Radhakrishnan, "An approach to robot off-line programming and simulation for flexible manufacturing systems", in Proc. IEEE/IAS International Conference on Industrial Automation and Control, Jan 5–7, 1995, pp. 461–466
[18] E. Freund, D. Rokossa and J. Rossmann, "Process-oriented approach to an efficient off-line programming of industrial robots", in Proc. 24th Annual Conference of the IEEE Industrial Electronics Society (IECON'98), Aug 31 – Sep 4, 1998, Vol. 1, pp. 208–213
[19] Microsoft Corporation, Microsoft Windows Image Acquisition Automation Layer, [Online], Available: http://msdn2.microsoft.com/en-us/library/ms630814.aspx
[20] Microsoft Corporation, Introducing DirectX 9.0 for Managed Code, [Online], Available: http://msdn2.microsoft.com/en-us/library/bb318659.aspx
[21] Steven Smith, Digital Signal Processing: A Practical Guide for Engineers and Scientists, Elsevier, Oct 2002, 672 pp., ISBN 075067444X
[22] Michael Seul, Lawrence O'Gorman and Michael J. Sammon, Practical Algorithms for Image Analysis, Apr 2000, 302 pp., ISBN 0521660653
[23] Rafael C. Gonzalez and Richard E. Woods, Digital Image Processing, 2nd Edition, Prentice Hall, 2002, 793 pp., ISBN 0201180758
[24] G. Sziebig, A. Gaudia, P. Korondi and N. Ando, "Video image processing system for RT-middleware", in Proc. 7th International Symposium of Hungarian Researchers on Computational Intelligence (HUCI'06), 2006, pp. 461–472
[25] Mark S. Nixon and Alberto S. Aguado, Feature Extraction and Image Processing, Elsevier, 2000, 368 pp., ISBN 0750650788
[26] J. Canny, "A computational approach to edge detection", IEEE Transactions on Pattern Analysis and Machine Intelligence, Vol. 8, No. 6, Nov 1986, pp. 679–698, ISSN 0162-8828
[27] A. J. Alvares and L. S. J. Romariz Jr., "Telerobotics: Methodology for the Development of a Through-the-Internet Robotic Teleoperated System", Journal of the Brazilian Society of Mechanical Sciences, Vol. 24, No. 2, May 2002, ISSN 0100-7386
[28] S. Rae, "Using telerobotics for remote kinematics experiments", Honours Thesis, The University of Western Australia, 2004
[29] B. Dalton, "A Distributed Framework for Telerobotics", Master Thesis, 2001
[30] IGRIP Online Documentation, Deneb Robotics, Inc., 1998