Original Article
Augmented reality–based programming, planning and simulation of a robotic work cell
Proc IMechE Part B: J Engineering Manufacture 1–17. © IMechE 2014. Reprints and permissions: sagepub.co.uk/journalsPermissions.nav. DOI: 10.1177/0954405414534642. pib.sagepub.com
Yun Suen Pai, Hwa Jen Yap and Ramesh Singh
Abstract
In this article, the development of an augmented reality–based robotic work cell is presented, consisting of a virtual robot arm, conveyor belt, pallet and computer numerical control machine that simulates an actual manufacturing plant environment. The kinematics of the robot arm is realized using the Denavit–Hartenberg method, which enables complete manipulation of the end-effector in three-dimensional space when interacting with other virtual machines. Collision detection is implemented in two areas: modifiable marker-based detection, which lets the robot arm detect nearby obstacles, and integration with object manipulation to pick and place a virtual object around the environment. In addition, an augmented heads-up display overlays live information about the current system. The case studies suggest that the proposed system can simulate a collision-free operation while displaying the coordinates of the virtual object, the current tool equipped and the speed of the conveyor belt, with a percentage error of less than 5%.
Keywords
Augmented reality, robotic work cell, kinematics, collision detection, heads-up display, robot simulation
Date received: 22 January 2014; accepted: 14 April 2014
Introduction

Current implementation of robotics in engineering is not a novel approach, as numerous industries have been utilizing automation to design and develop their products. However, the primary concern of robotics lies in the complexity of integrating and fully utilizing the capabilities of robots, since numerous aspects need to be considered, such as programming methods, algorithms, path planning, display systems and kinematics. Owing to these requirements, small and medium enterprises (SMEs) tend to avoid incorporating robotics into their production processes, even though they are fully aware of the benefits offered by automated systems. Furthermore, SMEs avoid implementing robotics due to cost and time constraints, as well as safety issues when programming robots on site. Augmented reality (AR) is a rapidly growing field of research, following the introduction of virtual reality (VR), that aims to fully integrate virtual and real environments to solve the above issues. VR utilizes a fully computer-generated environment, which is more costly and requires a higher degree of skill compared with AR. Even though AR has been present since the early 1990s, it is only recently that AR has emerged as one of the forefront technologies, owing to the rising popularity of smartphones and tablets. This indicates that AR can be applied in various research fields and consumer products. Engineers can simulate a manufacturing environment effectively by implementing AR in robotics, in which a scaled simulation of a robotic arm is possible. The integration of AR in robotics enhances a user's perception and visual senses while reducing the costs associated with expensive prototype fabrication. Path planning is complex yet necessary to avoid collisions and accidents. In addition, there is a lack of information feedback to operators, which in turn increases the chances of errors as well as time consumption. AR solves these problems through clear visualization of collision detection and a two-dimensional (2D) information overlay. Knowing the benefits of AR in robotics, the main objective of this study is to simulate an AR environment for a robotic work cell, apply marker-based collision detection and augment 2D information in a heads-up display (HUD) to enhance the user's visual, aural and proprioceptive senses.
Department of Mechanical Engineering, Faculty of Engineering, University of Malaya, Kuala Lumpur, Malaysia

Corresponding author:
Hwa Jen Yap, Department of Mechanical Engineering, Faculty of Engineering, University of Malaya, 50603 Lembah Pantai, Kuala Lumpur, Malaysia. Email: [email protected]
Related studies

Manual work by humans is generally not as efficient and flexible as robotic work and is further inhibited by other limitations, such as the randomness of walking-worker activities1 and fatigue; hence, ergonomic studies must be taken into consideration.2 However, robots require careful calibration as well as extensive testing and programming, which are achieved through simulations. There are two main programming methods currently available for industrial robots, namely, online and offline programming. Online programming requires the operator to be within the working envelope of the robot, which is undesirable for safety reasons. Conversely, offline programming uses a remote computer console, in which the input program is translated into robotic language. An AR-assisted robot simulation is considered a form of offline programming, which requires functions and precise positional values.3 Positioning and path planning involve computing a continuous sequence of configurations between a starting point and a goal point. Motion or trajectory planning can be clearly distinguished from path planning, since the path in motion planning is constrained by time. This means that computing a collision-free trajectory between the initial and end points requires both precise trajectory and path planning. Planning involves the actions taken by the robot in order to avoid obstacles and complete the programmed path. All actions taught to the robot are then transferred to the controller for execution. Kinematic modelling of an industrial robot is a vital part of this study, as the main aim is to manipulate a robotic arm effectively in an AR environment. Kinematic analysis can be classified into direct kinematics and inverse kinematics. Direct kinematics is the process of calculating the position and orientation of the end-effector from given joint positions, whereas inverse kinematics starts from the final position of the end-effector, from which the position of each joint needs to be determined. The Denavit–Hartenberg (D-H) method is the most popular approach for kinematic analysis4 and is used for the KUKA robotic arm manipulator in this study. The mathematics and notation of the robot's forward kinematics can be determined through the D-H method regardless of its sequence or complexity; the location of each joint depends on the previous joint's location and can be calculated using transformation matrices. Transformations can be represented in Cartesian, cylindrical, spherical, Euler or roll–pitch–yaw (RPY) coordinates. Another kinematic analysis method, known as screw theory, was developed in 1876.5
Screw theory represents two kinds of motion, translation and rotation, which can be converted into one another. The theory is expressed in Plücker coordinates, which comprise angular and linear velocities. Another kinematic approach, differential kinematics, studied by Rocha et al.,5 utilizes the Jacobian matrix to map joint velocities to the end-effector velocity and can be implemented with both the D-H and screw theory methods. Screw theory offers an advantage over the D-H method in modelling and analysis in that it is more flexible; however, it is more complex and difficult to implement. AR is a technology capable of seamlessly blending three-dimensional (3D) virtual objects into the real world, making it possible to visualize the D-H model of the robot and observe its interaction with the environment. AR is a subject of intensive study and is perceived as one of the 10 emerging technologies.6 The two main activities of AR are tracking and registration. Markers in the real environment (also known as fiducials) are tracked relative to the camera to obtain position and orientation data. The tracking values are utilized during registration to superimpose the 3D virtual object onto the real environment. ARToolKit, a marker-tracking library for AR applications developed by Hirokazu Kato in 1999, is used as the tracking system in this study. One study focused on the tangible user interface (TUI) aspect of ARToolKit via a prototype system for interior design,7 which includes a physical paddle that uses transparency cues in an AR interface for team collaboration. Even though the prototype is relatively simple, it exhibits potential that can be expanded further, as in this study through full 3D object manipulation. In fact, the marker-tracking system supports robot programming in an interesting manner. Robot programming using augmented reality (RPAR) was developed recently; the system enables movement of a virtual robot either along a number of start–goal paths or by making the end-effector follow a pre-defined path.8 A collision-free volume (CFV) aids the system and simultaneously constrains the movement of the robot. Although these methodologies have proven reliable for their intended tasks, a user-defined path is comparatively more flexible: the user moves an object, which automatically generates a path for the robot to follow, and this approach is adopted in this study. However, it shall be noted that ARToolKit is incapable of handling 3D computer graphics models, and thus a third-party computer graphics rendering library needs to be employed. OpenGL is a 3D graphics and modelling library with a wide application programmer's interface (API) that includes a large number of commands and functions. The programmer can create 3D graphics consisting of points, lines and primitive polygons, create inputs for display and manipulate graphics with OpenGL.9 In addition, OpenGL is capable of handling complex
modelling transformation techniques, and therefore an interactive 3D simulation system can be built for robot path planning. A recent study demonstrated a robot collision avoidance simulation achieved via OpenGL.10 The fully realized 3D virtual environment displays the robot's capability to detect collisions using axis-aligned bounding box (AABB) tests, which is quite promising. Implementing such a technology with AR will create a more immersive system, which is what ARToolKit is capable of achieving. Furthermore, OpenGL is supported by all major platforms, including Windows and Macintosh operating systems.11 This study utilizes an OpenGL application interface written in the C++ programming language, supported by the OpenGL library, to generate virtual content in the AR scene with collision detection. The final key component in the AR simulation system is a display, which enables the user to view the scene. Display systems are available in various shapes and sizes, and a head-mounted display (HMD) is the hardware typically utilized to exploit an AR-generated environment. An early study focused on developing a method for calibrating see-through HMDs using a dynamic framework that is also applicable to stereo HMDs.12 However, there are several limitations associated with the use of see-through HMDs, such as constrained transformations and increased errors due to camera movement. A stochastic solution was presented to calibrate a see-through HMD for AR applications, and it was shown that the stochastic model underlying commonly applied mathematical models is stochastically incorrect,13 whereas stochastically consistent solutions are possible. Flock of Birds (FOB) has been used as the tracking system; it is a standalone system consisting of a one-foot cube transmitting antenna, an extended range controller (ERC) and two receivers. Virtual points on the HMD screen are assigned to points in the real environment during the calibration process, while taking into account pixel accuracy and coordinate systems. It shall be highlighted that there are no studies available in the literature that provide a systematic comparison between the various calibration methods of an AR system using see-through HMDs. Hence, it is evident that intensive calibration is required to minimize errors for see-through HMDs. In addition, the ergonomic aspects of HMDs need to be considered, as they are typically bulky and uncomfortable to wear, even over a short period. In this study, a typical Windows laptop monitor is used as the display system, as it can run the required applications. Since the system is used primarily for robot programming, the hardware involved is stationary, which eliminates the unnecessarily complicated calibration problems associated with HMDs.
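For illustration, the AABB test mentioned above reduces to an interval-overlap check on each axis. The following is a minimal generic sketch, not code from the cited study; the Vec3 and Aabb types are defined here only for the example:

struct Vec3 { float x, y, z; };
struct Aabb { Vec3 min, max; };   // box given by its min/max corners

// Two axis-aligned boxes intersect only if their extents
// overlap on all three axes simultaneously.
bool aabbOverlap(const Aabb& a, const Aabb& b)
{
    return a.min.x <= b.max.x && a.max.x >= b.min.x &&
           a.min.y <= b.max.y && a.max.y >= b.min.y &&
           a.min.z <= b.max.z && a.max.z >= b.min.z;
}

Tests of this kind are cheap enough to run per frame, which is why AABBs are a common first-pass collision filter in such simulations.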
Recent advancements

Several studies have been carried out in the past 2 years on methods of implementing AR in robotics. A recent study showed that marker tracking requires accurate pose estimation, which is dependent on camera distance and viewing angle, even though it is more accurate than markerless approaches.14 Pati et al.14 proposed a method to model errors based on the scaled unscented transform (SUT), a new function for estimating nonlinear transformations. This gives a more accurate calibration, since the method is independent of the image sequences and multiple markers are not required. In comparison, ARToolKit uses a calibration dot-marking system in which multiple images of 6 × 4 dot patterns are captured at different angles. Accurate calibration leads to accurate robot simulation, and a recently developed prototype system is able to control a virtual robot in a real environment.15 The system architecture used by Girbacia et al.15 bears resemblance to the one in this article, with the exception that it does not cover work cell interaction. That system is also marker-based, which differs from another study that uses markerless gestures.16 The markerless system is achieved through a 3D motion tracking system for depth perception, coupled with a 2D camera. Bare-hand manipulation is seemingly more natural, especially in pick-and-place tasks, due to the gripping gesture, and the data are transmitted to an industrial robot controller. Spatial robot programming is a form of online programming by demonstration (PbD), even though the AR module aids in defining the program. Although spatial robot programming is quite intuitive, it is rather costly, as depth sensors are expensive simulation tools. Operations such as welding show potential for such a system, and the ability to verify the simulation system appropriately is vital to provide relevant information for further development.17 The recently developed half-silvered glass works similarly to an optical see-through display, with the exception of its aspect ratio and the size of the display monitor. Superimposition of virtual robots becomes much more intuitive and realistic. Even though software communication is still poor at present, the benefits offered are promising if the system can be scaled to fit an entire work cell or placed on a computer numerical control (CNC) machine.
Path generation and manipulation in AR

A planned set-up is typically required to implement AR. A novel approach towards planning a collision-free path in an unprepared environment was presented in Ong et al.,18 taking into account the advantages of utilizing AR in industrial robot programming. The end-effector is the main concern when an industrial robot is targeted for a simulation process, and therefore its movements are constrained along a visible path and supported by a piecewise linear parameterization (PLP) algorithm. This algorithm is essentially used to parameterize the data points by adopting an interactively generated piecewise linear approximation. Furthermore, the data points are used to generate 3D parametric curves by applying Bayesian neural networks and
reparameterization. Reparameterization is used for fitting problems, in which the parameters of the data points are updated after each iteration. A CFV is generated once the curve is obtained and is used to verify whether the end-effector collides with obstacles along the curve. The AR transformation matrix accurately positions and orients the end-effector of the robot. The CFV is essentially generated from the swept volume of a sphere and is represented as a mass of virtual spheres intersecting one another. The AR environment is primarily used to provide visual feedback, which directly improves interaction between the user and the real world and is also part of the PbD approach. Since a virtual industrial robot is placed into the real environment through AR, proper planning is crucial in order to accurately manipulate the 3D movement path in an AR work cell. A path editing method is required to evaluate the movement path of a 3D object in an AR scene.19 In general, there are three forms of path manipulation. The first involves utilizing third-party commercial software and loading the path using an AR toolkit. The second involves extending commercial software, in which an AR plug-in module is connected to conventional commercial software, whereas the third involves a TUI method. In the TUI method, the user can plan a path by hand movements and confirm the results in the AR environment. The third method is chosen in this study due to its interactive nature, although it may introduce temporal errors such as trembling of the user's hand. In this method, the translation point of the object is first examined in order to select a point as the control point. Splines are then used to reconstruct the path into a smooth line. The Catmull–Rom spline is used since it has fewer extensions compared with other conventional splines and can be easily manipulated with control points; a sketch of its evaluation is given at the end of this section. Control points are generated continuously as long as the user presses and holds a button on the object or prop. A mouse pen is used as the input device in this study, and the movement path is constructed until the button is released. This method differs from conventional graphical user interface (GUI)-based path manipulation, which emphasizes mouse and keyboard input. Since there are ongoing studies on increasing the interactivity of path manipulation, collision-free path programming is also studied in the same respect. The utilization of a heuristic beam search algorithm for path generation coupled with an AR environment was analysed by Chong et al.20 The system architecture involves markers as input to inverse kinematic modules that calculate the coordinates of the virtual robot. The forward kinematic module calculates the end-effector's reference point. An OpenGL renderer portrays images through a HMD for each video frame, which serves as feedback. A force module then calculates the work done by the robot, and the value is used as the path cost. They discovered that the size of the beam does not necessarily need to be large for a robust result and that ARToolKit is sufficient to demonstrate the proposed methodology. The ability to snap the control point onto the physical prop is useful in robot path manipulation, and a similar feature is studied in this article. Recent studies are similar in that the robot path manipulation system enables the user to create a list of control points, which are used to generate a ruled surface.21,22 The set-up involves a robot arm mounted with an end-effector, a computer, a monitor, a camera and an interactive device strapped to a marker cube. A fully AR-controlled environment with effective orientation planning is possible with the utilization of an ARToolKit-based tracking and interaction module, as well as path planning, path optimization and simulation modules. Although a series of control points within the CFV is created so that a path can be formed, the path may still not be collision free. Therefore, a Euclidean distance-based method that computes the distance between the probe and each control point is used to generate the path through interpolation, making it collision free in the joint space. The end-effector orientation is represented relative to the robot's base frame by a unit vector. The path is then optimized using a convex optimization technique, the log-barrier method. The Newton–Euler algorithm is used to model the dynamic properties of the robot's behaviour, and the trajectory is simulated using a discrete proportional–derivative (PD) control scheme. The developed system further proves that ARToolKit is more than capable of supporting robot path planning; the present article likewise demonstrates this capability, verified against KUKA Sim Pro, a commercially available simulation software.
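Returning to the Catmull–Rom smoothing described earlier in this section, a minimal sketch of the spline evaluation follows. This is the textbook formulation of the spline rather than the authors' implementation; the Vec3 type is defined here only for the example:

struct Vec3 { float x, y, z; };

// Evaluate one Catmull-Rom segment at parameter t in [0, 1].
// The curve passes through p1 and p2; p0 and p3 shape the tangents.
Vec3 catmullRom(const Vec3& p0, const Vec3& p1,
                const Vec3& p2, const Vec3& p3, float t)
{
    float t2 = t * t, t3 = t2 * t;
    auto blend = [&](float a, float b, float c, float d) {
        return 0.5f * ((2.0f * b) + (-a + c) * t +
                       (2.0f * a - 5.0f * b + 4.0f * c - d) * t2 +
                       (-a + 3.0f * b - 3.0f * c + d) * t3);
    };
    return { blend(p0.x, p1.x, p2.x, p3.x),
             blend(p0.y, p1.y, p2.y, p3.y),
             blend(p0.z, p1.z, p2.z, p3.z) };
}

Because each segment interpolates its two inner control points exactly, the reconstructed path passes through the user's recorded points, which is the property that makes this spline attractive for smoothing hand-drawn paths.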
AR application in the work cell

The concept that renders virtual manufacturing very appealing is the idea of having a factory sitting on one's desk, with total control over all aspects of production.23 Automotive companies such as Mercedes-Benz have adopted virtual manufacturing, convinced that it is an effective engineering tool. Each Mercedes line has three types of machines, namely, 'Trallfa', 'Devilmat Spray Mate' and ESTA, for interior painting, metallic painting and electrostatic painting, respectively. These are painting programs from RobCad's virtual manufacturing software and have proven to be highly efficient compared with importing data from computer-aided design (CAD) drawings into a PC. Such an offline programming method results in a 30%–40% gain in time, which increases productivity. A more streamlined approach to communication and partnership between manufacturing organizations can be achieved by implementing virtual manufacturing as the main production tool. Even though virtual factory layout planning (also known as digital manufacturing technology) may still be uncommon for most
enterprises, those who have chosen to adopt it perceive it as a way to encourage parallel processing, decrease cycle time and improve precision. One of the methods that can be used to construct a mixed reality manufacturing environment is image-based tracking.24 Image-based tracking does not involve the use of markers; instead, tracking is done by finding an arbitrary feature in the real environment, such as a safety sign. The safety sign acts as a marker and provides a fixture on the site. However, it shall be noted that not all kinds of features are suitable for use as real-life reference markers, and a circular contour shape, which is typical of safety signs, is analysed in Lee et al.24 The Hough transform, direct least-squares fitting and moment-of-inertia optimization are among the methods used to analyse an ellipse, which is how a circle appears at certain viewing angles. The virtual manufacturing software Delmia is used to apply 3D CAD geometry data of each piece of equipment and machinery for process planning. The VCollide library is used for collision detection, such that a flag is raised at the collided parts in the event of a collision. However, accuracy is an issue because safety signs are not designed specifically for camera tracking and recognition, unlike fiducial markers.
Comparison with VR

Although AR application is the main focus of this study, a comparison with VR will not only produce valuable data for ongoing studies but also allows ample room for improvement in the field of visualization. VR replaces reality: the user is totally immersed in an artificial world, and all surroundings are rendered by a computer. Meanwhile, AR enhances or modifies reality by adding a minimum of virtual objects and requires high accuracy in tracking the scenery and environment. Both VR and AR have been growing rapidly over the past few years, with continuous improvements in variety, functional capabilities and usability.25 The multitude of technologies includes computer-assisted virtual environment (CAVE) installations, power walls, holographic workbenches, HMDs and sensors. The integration of VR into CAD (called VR-CAD/E technology) has garnered much attention in scientific research, where the aim is to overcome issues such as integrating VR with commercial CAD software tools, modelling and tooling applications. The challenges in integrating VR into CAD lie in the limitations of current technology. A number of studies have been devoted to addressing these challenges, in which new AR approaches, advanced VR solutions and extensions of CAD/E systems are proposed. For instance, studies on AR have focused on the product development process using augmented technical drawings (ATDs),26 interfacing AR with custom-built 3D applications and an immersive modelling system (IMMS) for interface design.27 In contrast, studies on VR analysis are centred on VR tools for product life cycle management (PLM) systems,28 haptic feedback systems and simulation, the SIMUCAL virtual simulator for analysis of footwear comfort29 and VR-based modular fixtures. Studies that integrate VR into CAD/E systems stress immersive interaction and haptic paradigms for CAD, bi-manual 3D input for CAD modelling and human overall performance in multi-modal CAD.30 The common factor for all VR integration is that the cost is substantially higher compared with AR, owing to the need for higher computational power. Simulation has proven to be a significant research tool for the continuous development of various engineering industries, especially robotic systems, due to effective manipulation of the end-effector while accounting for limited floor availability as well as operator safety. Therefore, AR is the subject of much research due to its robust application in simulating a manufacturing environment, with its seamless interfacing between real and computer-generated content. However, there are still limitations of AR even with today's technology, especially with regard to fully integrating AR with robotics, display systems, sensors and path programming. It is evident that there are many areas for improvement in order to realize the goal of a fully digitalized factory with increased accuracy, improved registration and reduced latency. One of the factors that needs to be considered for robotic applications is the ability of the robot to recognize its path and effectively avoid collision. Collision avoidance is crucial, since a collision may result in injuries to nearby operators and increase maintenance costs for the robot arm. Studies have shown that there are several ways to prevent collision: developing a complex algorithm, generating a CFV or utilizing commercial software. These methods are challenging and may be inflexible for use in various environments. Hence, a marker-based method is devised in this study in order to develop a simpler recognition system. In this system, markers are placed at specific points, which are recognized by ARToolKit as obstructions in order to avoid collision effectively. Finally, a display system is mandatory to view the effects of AR on the system. However, a feature that portrays information will be even more useful for engineers. Even though various display systems can be used, only a few available systems can analyse and provide information feedback on current operations. Hence, this study aims to provide accurate 2D information using a heads-up display with proper calibration.
Methodology

In this study, C++ programming with ARToolKit is used to create a program that generates AR content through a marker-based tracking method. The program includes path drawing, whereby a line can be drawn between two vertices. The marker probe acts as a teach pendant in order to determine the location of the
vertex and is fabricated out of paper and cardboard. The full kinematics of the robot is then studied. Forward kinematics is studied first to understand how the coordinates of the end-effector are determined, followed by inverse kinematics, in which the joint variables are determined based on the location of the vertices. The entire formulation is implemented in the program in order to combine line drawing and kinematics effectively for simulation. A full work cell is added to the virtual environment once the AR robot functions as desired. The work cell consists of a conveyor belt, CNC machine and pallet, which are called by other static markers. A HUD is added in order to provide more effective information feedback, showing the speed of the conveyor, the current tool equipped on the machine and the coordinates of the manipulated object. The object serves as a visual cue to show how a full operation is carried out in the work cell and requires collision detection and object manipulation algorithms in the code. The combination of these elements creates a fully functional AR robotic work cell with a pick-and-place system. However, the work cell is not constrained to this operation alone.
The experimental set-up, comprising hardware and software, is relatively low cost and can be constructed virtually anywhere provided that there is proper lighting and sufficient space. The initial testing of the program is carried out with a simple set-up consisting of a laptop, a Webcam and a 'Hiro' marker. The experimental set-up is then transferred to the Robotics Laboratory, Department of Mechanical Engineering, University of Malaya, after the program is compiled. A photograph of the experimental set-up is shown in Figure 1. The main purpose is to position the virtual robot over the physical one in order to demonstrate the feasibility of running a real work cell in the laboratory. The layout plan is measured prior to the experimental set-up, as this will influence the accuracy of the results. Positioning of the markers and camera is also crucial. The markers must be placed so that they are clearly visible to the camera under direct lighting. The camera must be positioned at a suitable height, as higher locations will result in blurry marker images. An additional marker (labelled 'Star') is added for the purpose of a case study. The 'Hiro' marker serves as the global origin, and every other marker's position is calculated relative to it. A 3D marker cube is used as the teach pendant because its function resembles that of the actual teach pendant of the KUKA robot: it can move the robot arm around and save the coordinates of a point in space. The position of the tip of the teach pendant is calculated relative to the 'Hiro' marker and is registered whenever one of the four marker faces is viewable. The primary reason the teach pendant is constructed as a marker cube is so that it is viewable to the camera regardless of how it is held. The set of static markers as well as the marker cube teach pendant are shown in Figure 2, and the overall system architecture is shown in Figure 3.
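The relative-positioning step can be sketched with the standard ARToolKit C API, as below. This is a minimal illustration only: the marker-info structures, centres and widths are assumed to have been filled by arDetectMarker() earlier in the frame loop, and the variable names are hypothetical.

#include <AR/ar.h>

// Marker data assumed to come from arDetectMarker() in the frame loop.
ARMarkerInfo hiroInfo, starInfo;
double hiroCentre[2] = {0.0, 0.0}, starCentre[2] = {0.0, 0.0};
double hiroWidth = 80.0, starWidth = 80.0;   // marker sizes in mm (assumed)

void computeRelativePose()
{
    double hiroTrans[3][4], starTrans[3][4];   // camera-space poses
    double hiroInv[3][4], starRelHiro[3][4];

    // Pose of each marker relative to the camera.
    arGetTransMat(&hiroInfo, hiroCentre, hiroWidth, hiroTrans);
    arGetTransMat(&starInfo, starCentre, starWidth, starTrans);

    // Pose of 'Star' expressed in the 'Hiro' (global origin) frame:
    // T_star_in_hiro = inv(T_hiro) * T_star.
    arUtilMatInv(hiroTrans, hiroInv);
    arUtilMatMul(hiroInv, starTrans, starRelHiro);
}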
Figure 1. Experimental set-up at the Robotics Laboratory.
Figure 2. Set of static markers (left) and marker cube teach pendant (right).

Figure 3. Overall system architecture.

Kinematic modelling

Figure 4. (a) The 6-DoF KUKA robot (KUKA industrial robots, http://www.kuka-robotics.com/res/sps/e6c77545-9030-49b1-93f54d17c92173aa_Spez_KR_6_KS_en.pdf) and (b) robot configuration with D-H frames.

Each compartment of the robot is first assigned a coordinate frame. The 6-DoF KUKA robot31 and the assigned D-H frames are shown in Figure 4. The top surface of the pedestal, a solid cube onto which the robot is mounted, acts as the world coordinate origin, or point (0, 0, 0), of the robot. It shall be noted that angles A, B and C rotate about the z-axis, y-axis and x-axis, respectively.
The D-H coordinate frame is described by four parameters (a, α, θ, d), which represent the link length, link twist, joint angle and link offset, respectively. The link length and link twist are the link parameters that describe the relative position between joints. The joint angle is the variable when the joint is revolute.
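For reference, the homogeneous transform between consecutive D-H frames can be written directly from these four parameters. The sketch below is a generic implementation of the standard D-H matrix, not the authors' code:

#include <cmath>
#include <array>

using Mat4 = std::array<std::array<double, 4>, 4>;

// Homogeneous transform from frame i-1 to frame i for D-H parameters
// (a = link length, alpha = link twist, theta = joint angle, d = link offset).
Mat4 dhTransform(double a, double alpha, double theta, double d)
{
    const double ct = std::cos(theta), st = std::sin(theta);
    const double ca = std::cos(alpha), sa = std::sin(alpha);
    Mat4 T{};
    T[0] = { ct, -st * ca,  st * sa, a * ct };
    T[1] = { st,  ct * ca, -ct * sa, a * st };
    T[2] = { 0.0,      sa,       ca,      d };
    T[3] = { 0.0,     0.0,      0.0,    1.0 };
    return T;
}

Chaining these transforms from the base through Joint 6 yields the forward kinematic solution; the geometric derivations that follow solve the inverse problem.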
Figure 5. Calculation of θ1 in the X–Y plane, rotation about the Z-axis.

Figure 6. Calculation of θ3 in the X–Z plane, rotation about the Y-axis.
Angle θ1. Joint 1 is a twist joint, and therefore it changes the orientation of its subsequent frame while the position of the subsequent frame is kept constant. In fact, there are three twist joints present, namely, Joints 1, 4 and 6. In the calculations, only the x- and y-axes, as well as the lengths of the joints, are considered. Angle θ1 can be determined from the position of Joint 5, as shown in Figure 5. Frames 4 and 5 share the same position but different orientations, and hence $\vec{p}_{04}$ and $\vec{p}_{05}$ have the same coordinates. It is assumed that the end of the arm will always face downwards, parallel to the floor; therefore, the robot links will always form a straight line when viewed from the top. Thus, the position of $\vec{p}_{04}$ can be obtained using the following equation

$$\begin{pmatrix} p_{04,x} \\ p_{04,y} \\ p_{04,z} \end{pmatrix} = \begin{pmatrix} p_x \\ p_y \\ p_z \end{pmatrix} + \begin{pmatrix} 0 \\ 0 \\ d_6 \end{pmatrix} \tag{1}$$

$$\theta_1 = \operatorname{arctan2}\left(p_{04,y},\; p_{04,x}\right)$$

The arctan2 function is used here rather than arctan, since arctan2 resolves angles into the correct quadrant.

Angle θ3. In this case, Link 4 is different from the other links, as illustrated in Figure 6. This link is a special L-shaped link attached to Joint 3. An offset line parallel to Link 4 is required in order to determine θ3. This angle can be solved from the auxiliary angle φ using the law of cosines. $|p_{24}|$ represents the length of the side opposite φ in the new triangle formed and is obtained by determining the positions of Joint 2 ($\vec{p}_{02}$) and Joint 5 ($\vec{p}_{05}$) with reference to the base. The position of Joint 2 is obtained from the forward kinematic transformation ($^0_2T$), whereas the position of Joint 5 ($\vec{p}_{05}$) is the same as that of Joint 4 ($\vec{p}_{04}$). The difference between these two positions yields $\vec{p}_{24}$, and thus the hypotenuse $|p_{24}|$ can be determined

$$\begin{pmatrix} p_{24,x} \\ p_{24,y} \\ p_{24,z} \end{pmatrix} = \begin{pmatrix} p_{04,x} \\ p_{04,y} \\ p_{04,z} \end{pmatrix} - \begin{pmatrix} p_{02,x} = c_1 a_1 \\ p_{02,y} = s_1 a_1 \\ p_{02,z} = d_1 \end{pmatrix} \tag{2}$$
$$|p_{24}|^2 = p_{24,x}^2 + p_{24,y}^2 + p_{24,z}^2 \tag{3}$$
The L-shaped link introduces a small angular offset, which can be solved using the arctan function

$$\sigma = \operatorname{arctan2}\left(a_3,\; d_4\right) \tag{4}$$

The link which connects Joints 3 and 5 is not $d_4$; rather, it is the hypotenuse of a right-angled triangle ($d'_4$) with edges $d_4$ and $a_3$. Hence, Pythagoras' theorem can be used to determine $d'_4$ as follows

$$d'_4 = \sqrt{a_3^2 + d_4^2} \tag{5}$$

Applying the law of cosines gives

$$\varphi = \arccos\left(\frac{a_2^2 + (d'_4)^2 - |p_{24}|^2}{2\, a_2\, d'_4}\right) \tag{6}$$

$$\theta'_3 = \pi - \varphi - \sigma \quad \text{(magnitude)} \tag{7}$$

$$\theta'_3 = -\left(\pi - \varphi - \sigma\right) \quad \text{(direction)}$$

$$\theta_3 = \theta'_3 + \frac{\pi}{2} \tag{8}$$
Angle θ2. From Figure 7, it can be seen that angle θ2 can be calculated using angles β1 and β2. Starting from angle β1, $\vec{p}^{(2)}_{24}$ (the vector $\vec{p}_{24}$ referenced to Frame 2) is used. From forward kinematics, it can be noted that $\vec{p}^{(2)}_{24}$ consists of only two components (x and y), as Joint 2 is not a twist joint, and it will always form a right-angled triangle. Hence, Pythagoras' theorem is used to solve for the edges of the triangle

$$\begin{pmatrix} p^{(2)}_{24,x} \\ p^{(2)}_{24,y} \\ p^{(2)}_{24,z} \end{pmatrix} = \begin{pmatrix} p_{24,z} \\ \sqrt{p_{24,x}^2 + p_{24,y}^2} \\ 0 \end{pmatrix} \tag{9}$$

$$\beta_1 = \operatorname{arctan2}\left(p^{(2)}_{24,x},\; p^{(2)}_{24,y}\right) \tag{10}$$
Figure 7. Calculation of θ2 in the X–Z plane, rotation about the Y-axis.

Figure 8. Calculation of θ5 in the X–Z plane, rotation about the Y-axis.
The law of cosines is applied to the second triangle in order to obtain β2

$$\beta_2 = \arccos\left(\frac{a_2^2 + \left|\vec{p}^{(2)}_{24}\right|^2 - (d'_4)^2}{2\, a_2 \left|\vec{p}^{(2)}_{24}\right|}\right) \tag{11}$$

There are two solutions for θ2, as follows:

Case 1. If β1 is negative and β2 is positive. The direction of the 90° rotation is clockwise from the home position and is therefore negative. If a part of Link 3 remains above the x-axis of Joint 2 and the position of the end of the arm moves downwards, giving a negative β1, only angle β2 is required to define θ2

$$\theta'_{2:1} = \frac{\pi}{2} - \beta_2 \quad \text{(magnitude)}$$

$$\theta'_{2:1} = -\left(\frac{\pi}{2} - \beta_2\right) \quad \text{(direction)} \tag{12}$$

$$\theta_{2:1} = \frac{\pi}{2} + \theta'_{2:1}$$

Case 2. Otherwise, θ'2 is always equal to the sum of the angles β1 and β2 (i.e. β1 + β2) subtracted from 90°; the D-H solution for θ2 is then obtained by adding 90°. Hence

$$\theta'_{2:2} = \frac{\pi}{2} - \left(\beta_1 + \beta_2\right) \quad \text{(magnitude)}$$

$$\theta'_{2:2} = -\left(\frac{\pi}{2} - \left(\beta_1 + \beta_2\right)\right) \quad \text{(direction)} \tag{13}$$

$$\theta_{2:2} = \frac{\pi}{2} + \theta'_{2:2}$$

Angle θ4

$$\theta_4 = 0$$

Since Joint 4 is a twist joint, this angle does not change the consecutive joint's coordinate. It is assumed that the orientation of Link 4 is constant throughout this study, and therefore θ4 is 0.

Angle θ5. The law of cosines is applied twice, once for each of the two triangles, in order to find θ5, as shown in Figure 8. λ1 can be determined from the following equation by substituting the triangle side length ($d'_4$) obtained previously

$$\lambda_1 = \arccos\left(\frac{(d'_4)^2 + |p_{24}|^2 - a_2^2}{2\, d'_4\, |p_{24}|}\right) \tag{14}$$

Following this, the length connecting Joint 2 to Joint 7 needs to be determined. Knowing that this vector is equal to the position of the end of the arm minus the position of Joint 2 referenced to the base, the length $|p_{27}|$ can be obtained as follows

$$\begin{pmatrix} p_{27,x} \\ p_{27,y} \\ p_{27,z} \end{pmatrix} = \begin{pmatrix} p_x \\ p_y \\ p_z \end{pmatrix} - \begin{pmatrix} p_{02,x} = c_1 a_1 \\ p_{02,y} = s_1 a_1 \\ p_{02,z} = d_1 \end{pmatrix} \tag{15}$$

$$|p_{27}|^2 = p_{27,x}^2 + p_{27,y}^2 + p_{27,z}^2 \tag{16}$$

The law of cosines is applied to solve for λ2 using the length obtained from equation (16). The direction of rotation for angle θ5 is downwards from the home position and therefore carries a negative sign

$$\lambda_2 = \arccos\left(\frac{|p_{24}|^2 + d_6^2 - |p_{27}|^2}{2\, |p_{24}|\, d_6}\right) \tag{17}$$

$$\theta_5 = -\left(\pi - \left(\lambda_1 - \sigma\right) + \lambda_2\right)$$

Angle θ6

$$\theta_6 = 0$$

θ6 remains zero throughout the simulation, as this angle depends on the type of end-effector used; the position of the end of the arm is unaffected.
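As a compact illustration, the closed-form steps of equations (1)–(8) for θ1 and θ3 translate directly into code. This is a sketch of the derivation above rather than the authors' listing, and it assumes the end of the arm faces downwards, as stated earlier:

#include <cmath>

// Joint angles theta1 and theta3 from the end-of-arm position (px, py, pz),
// following equations (1)-(8); a1, a2, a3, d1, d4, d6 are the D-H constants.
void solveTheta1Theta3(double px, double py, double pz,
                       double a1, double a2, double a3,
                       double d1, double d4, double d6,
                       double& theta1, double& theta3)
{
    const double pi = 3.14159265358979323846;

    // Equation (1): wrist centre, with the tool pointing downwards.
    double p04x = px, p04y = py, p04z = pz + d6;
    theta1 = std::atan2(p04y, p04x);

    // Equation (2): vector from Joint 2 to the wrist centre.
    double p24x = p04x - std::cos(theta1) * a1;
    double p24y = p04y - std::sin(theta1) * a1;
    double p24z = p04z - d1;
    double p24sq = p24x * p24x + p24y * p24y + p24z * p24z;  // eq. (3)

    // Equations (4)-(8): offset of the L-shaped link, then theta3.
    // Note: the acos argument leaves [-1, 1] for unreachable targets.
    double sigma = std::atan2(a3, d4);                        // eq. (4)
    double d4p   = std::sqrt(a3 * a3 + d4 * d4);              // eq. (5)
    double phi   = std::acos((a2 * a2 + d4p * d4p - p24sq) /
                             (2.0 * a2 * d4p));               // eq. (6)
    theta3 = -(pi - phi - sigma) + pi / 2.0;                  // eqs (7)-(8)
}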
Figure 9. CAD model of the robot sketched using Pro-Engineer software.
CAD models

The actual 3D CAD model of the KUKA robot is rather complex, and therefore a simplified version is used, while still adhering to the kinematic study of the robot. Pro-Engineer software is used to sketch the robot, which is drawn to scale and follows the dimensions of the real KR 16 KS model, as shown in Figure 9. This means that each joint of the robot must be sketched individually to scale. The parts are then assembled to create the full robot arm as a 3D model in which each joint can be moved when dragged. The black base represents the pedestal. However, it shall be noted that the assembly file cannot be imported into the program, and thus each joint needs to be individually imported as an STL file, then re-assembled and rendered using OpenGL, as sketched below. Hence, the placement of the origin of the coordinate system for each joint needs to be accurate in order to reduce complications when assembling parts in the program. Three additional parts are added in order to create the robotic work cell. Since the robot must interact and perform operations with other machines, a CNC machine, conveyor belt and pallet are added around the robot. The basic operation performed by the KUKA robot is pick and place, which is highly dependent on the tool fitted to the end-effector and is not defined in this study; the purpose is to demonstrate that the AR work cell is applicable to a variety of work cell operations. Unlike the robot arm, the drawings for the CNC machine, conveyor belt and pallet are single part files, and no assemblies are required. Since these components act as dummy machines without specific operations, the files are drawn to full scale and remain static throughout the study. They are also rendered and coloured using OpenGL.
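The per-joint re-assembly maps naturally onto the OpenGL matrix stack, where each joint's transform is pushed before its child links are drawn. The sketch below is a simplified illustration: drawJointMesh(), linkOffset and jointAxis are hypothetical stand-ins for the rendered STL geometry and the D-H offsets, not the authors' actual code.

#include <GL/gl.h>

// Hypothetical helpers: mesh renderer for one STL part, plus the
// per-joint translation offsets and rotation axes from the D-H table.
void drawJointMesh(int joint);
extern double linkOffset[6][3];
extern double jointAxis[6][3];

// Render the arm as a kinematic chain: each joint's transform is
// composed on the OpenGL matrix stack before its successors are drawn.
void drawRobot(const double jointAngleDeg[6])
{
    glPushMatrix();
    drawJointMesh(0);                        // static pedestal/base
    for (int i = 0; i < 6; ++i) {
        glTranslated(linkOffset[i][0], linkOffset[i][1], linkOffset[i][2]);
        glRotated(jointAngleDeg[i],
                  jointAxis[i][0], jointAxis[i][1], jointAxis[i][2]);
        drawJointMesh(i + 1);                // STL mesh of link i+1
    }
    glPopMatrix();
}

Because the transforms accumulate down the chain, moving one joint automatically carries every subsequent link with it, matching the D-H dependency of each frame on its predecessor.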
HUD implementation

The addition of a HUD is extremely useful whenever virtual content is involved. The use of a HUD not only extends the user's knowledge of the
current operation but also updates itself continuously with the current situation. The HUD is purely AR generated and is therefore solely dependent on the code added to the program. The information needs to be portrayed in such a way that it is viewable regardless of the display system, which can be either a HMD or a simple PC monitor such as the one used in this study. The information overlay includes the current tool equipped on the CNC machine, the speed of the conveyor belt (albeit not specific) and the current coordinates of the manipulated object in the virtual workspace. It is relatively simple to print text in the virtual environment using OpenGL; however, this poses a problem, as the global origin is set to the 'Hiro' marker beforehand, which causes the text to appear on the marker and move with it when the marker is moved. Text that appears on the marker is merely a label rather than a HUD. For the HUD, the text is a 2D overlay located in the top left corner of the screen. The following calls are utilized to achieve this:

glMatrixMode(GL_PROJECTION);
glMatrixMode(GL_MODELVIEW);

These calls switch between the projection and model-view matrix stacks so that the information can be projected onto a 2D space, with the coordinate origin at the centre of the screen. A semi-transparent background is drawn before the information is displayed on the screen in order to increase the visibility of the text. The following functions are added in the init() function of the program:

glEnable(GL_BLEND);
glBlendFunc(GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA);

These functions enable blending, in which incoming primitive colours are blended with the colours stored in the frame buffer. The relevant information is then printed on the screen using the printw function. Figure 10 shows the real-time HUD display, with the cube's coordinates updated in accordance with changes in position.
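Pulling these calls together, a fuller sketch of the overlay pass might look as follows. This is an illustration assembled around the calls quoted above, not the authors' listing; printw is their own helper, so a GLUT bitmap-font stand-in, drawText(), is used here instead, and blending is assumed to have been enabled in init() as described:

#include <GL/glut.h>

// Stand-in for the printw helper: draw a string at window coordinates.
void drawText(float x, float y, const char* s)
{
    glRasterPos2f(x, y);
    for (; *s; ++s)
        glutBitmapCharacter(GLUT_BITMAP_HELVETICA_18, *s);
}

// Render the 2D HUD after the 3D scene, in screen-space coordinates.
void drawHud(int width, int height)
{
    glMatrixMode(GL_PROJECTION);
    glPushMatrix();
    glLoadIdentity();
    gluOrtho2D(0, width, 0, height);        // 2D screen-space projection
    glMatrixMode(GL_MODELVIEW);
    glPushMatrix();
    glLoadIdentity();

    glColor4f(0.0f, 0.0f, 0.0f, 0.5f);      // semi-transparent backdrop
    glBegin(GL_QUADS);
    glVertex2f(5, height - 80); glVertex2f(300, height - 80);
    glVertex2f(300, height - 5); glVertex2f(5, height - 5);
    glEnd();

    glColor3f(1.0f, 1.0f, 1.0f);
    drawText(10, height - 25, "Tool: gripper   Conveyor: running");

    glPopMatrix();                           // restore 3D matrices
    glMatrixMode(GL_PROJECTION);
    glPopMatrix();
    glMatrixMode(GL_MODELVIEW);
}

Pushing and popping both matrix stacks keeps the HUD pass from disturbing the 3D registration of the AR scene drawn in the same frame.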
Results and discussion

The code is initially tested with a downscaled version of the robot in order to observe the accuracy with which the end-effector follows the teach pendant, according to the angles calculated for each arm. This is made possible by assembling the joints individually in the AR environment based on the kinematic model. A vertex is drawn at a point by right-clicking the mouse button, and the coordinates and angles of each arm are automatically saved in separate files. Drawing additional vertices creates lines for the linear movement of the robot arm.
Figure 10. Dynamic HUD comparison. CNC: computer numerical control.
Figure 11. Calibration and error-checking test.
It is also imperative that the program be error-free or within a specific error tolerance prior to carrying out the case studies. This is because 3D coordinates often require careful calibration,32 and therefore the error-checking test acts as a calibration method. Hence, a ruler is placed in front of the camera and a virtual line is drawn for this purpose, as shown in Figure 11. The line is drawn to precisely 10 cm, as indicated on the ruler. The coordinates of the two vertices saved in the
output file are then used to calculate the drawn length and determine whether the value is indeed 10 cm. This test is carried out to ensure that the distance between the vertices is correctly registered; the procedure does not involve calculating the angles of the actual robot arm. The actual robot remains static during this test, since this is an offline robot programming method. No additional operations are performed, and the RPY values are not considered at this stage. The line is drawn horizontally, and thus only the x-coordinate is inspected.
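The check itself reduces to simple arithmetic on the two saved vertices; a minimal sketch follows, where the 100 mm reference comes from the ruler test described above:

#include <cmath>

// Percentage error of a drawn line against the 100 mm reference length.
double lineErrorPercent(double x1, double y1, double z1,
                        double x2, double y2, double z2)
{
    double dx = x2 - x1, dy = y2 - y1, dz = z2 - z1;
    double length = std::sqrt(dx * dx + dy * dy + dz * dz);   // mm
    return std::fabs(length - 100.0) / 100.0 * 100.0;         // percent
}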
The values obtained from the calibration and error-checking test are tabulated in Table 1. It can be seen that the average error is 3.29%, which is less than 5%. This confirms that the length of the line drawn is reasonably accurate with respect to the real-world scale.
Case studies

The first case study investigates the repeatability of the virtual lines in order to determine the degree of precision. Virtual lines are drawn repeatedly in this case study, as shown in Figure 12. However, no comparisons are made with respect to the real-world scale, unlike the error-checking test. The distance along all three axes is also tested.
Table 1. Calibration and error-checking test.

Line       Distance, X (mm)    Error (%)
1          103.4849            3.4849
2          102.9585            2.9585
3          102.6574            2.6574
4          104.2947            4.2947
5          103.2485            3.2485
6          103.0858            3.0858
Average    103.29              3.29
The edges of a wooden block are used as reference points when drawing the vertices in order to ensure straight lines during the test. As in the error-checking test, the first case study is conducted without any physical movement of the actual robot arm, and the RPY values are not considered. The root mean square error (RMSE) values are presented in Table 2, which reveals that the total average RMSE is only 1.2026 mm, with an error percentage of less than 5%. The second case study investigates the full application of the KUKA robot's kinematics, object manipulation, heads-up display and collision detection in order to develop a simple work cell. The operation involves picking and placing a random item around the various virtual machines. For this purpose, additional static markers are placed around the 'Hiro' marker, which acts as the global origin. The 'AR', 'Lightning', 'T' and '5' markers (shown in Figure 2) are recognized by the camera and are used to render the CNC machine, wireframe box, conveyor belt and pallet, respectively. This forms the virtual work cell, in which the wireframe box is the virtual object. The KUKA robot arm picks and places the wireframe box at each virtual machine, as shown in Figure 13. The coordinates and angles of the arms at each point are saved into separate files during the pick-and-place operation.
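For reference, the RMSE quoted in the tables follows its standard definition: for n measured coordinates $x_i$ compared against reference values $\hat{x}_i$ on a given axis (the per-axis grouping matches how Tables 2–4 are organized)

$$\mathrm{RMSE} = \sqrt{\frac{1}{n} \sum_{i=1}^{n} \left(x_i - \hat{x}_i\right)^2}$$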
Figure 12. Virtual edges.
Figure 13. Sequence of pick-and-place operation whereby a virtual object is picked from the conveyor and placed at the CNC machine. The virtual object is then picked and placed at the pallet. The process is repeated three times.
Table 2. RMSE values for the first case study.

Line                  RMSE (mm)
X-axis                1.4658
                      4.1841
                      1.4715
Average               2.3738
Y-axis                0.4993
                      1.7741
                      2.1489
Average               1.1412
Z-axis                0.3588
                      0.8161
                      0.1786
Average               0.0929
Total average RMSE    1.20

RMSE: root mean square error.
The values of the angles are then input into KUKA Sim Pro, a commercially available software package used to verify the points. KUKA Sim Pro is an offline programming tool that can fully simulate the activities of the robot arm and is used to determine the relative difference between the values generated by the AR environment and the software. The RMSE values obtained for the second case study are tabulated in
Table 3. It can be seen that the maximum error is 3.83 mm, which verifies the simulation. Since the coordinates are validated using KUKA Sim Pro, the third case study checks the reliability and effectiveness of collision detection and avoidance when an obstacle is placed within the vicinity of the robot arm during operation. Collision avoidance is particularly important, as a collision may damage the robot arm and result in injuries to the operator in real applications. The same code is used for this purpose, with a few modifications. A new marker is added, which represents an obstacle. This marker renders a rectangular box with a length, width and height of 100, 100 and 150 mm, respectively, as shown in Figure 14. GL_LINES and GL_POLYGON primitives are used to replace the glutWireCube() function; this replaces the original cube block with a rectangular box with black edges for ease of identification. The parameters w, d and e refer to the origin of the rectangular box. The 'Star' marker is placed at a static location together with the 'Hiro' marker within the camera's field of view. The maximum acceptable squared distance is set to 75,000 mm², the square of the real-world distance. In other words, the virtual KUKA robot arm will register that it has collided with the rectangular box if its distance from
Table 3. RMSE values for the second case study.

Coordinate           X-axis, RMSE (mm)   Y-axis, RMSE (mm)   Z-axis, RMSE (mm)
Point 1              0.4141              1.9285              0.3501
Point 2              4.2164              4.2203              0.6262
Point 3              3.5474              4.4712              1.3002
Point 4              2.7827              2.1014              0.861
Point 5              0.4047              3.5775              1.9736
Point 6              4.1637              4.6139              1.2302
Point 7              3.9777              4.8049              1.3513
Point 8              2.4066              2.6216              1.3885
Point 9              0.1017              4.18                2.4686
Point 10             3.111               5.4389              2.5826
Point 11             3.0399              5.6196              2.7081
Point 12             2.639               2.3947              1.18
Average RMSE (mm)    2.57                3.83                1.50

RMSE: root mean square error.
Figure 14. Colour change of obstacle from (a) to (b) during collision.
the end-effector is less than 27.4 cm. Figure 14 shows the colour change of the obstacle during collision. This case study is also carried out to obtain the coordinates of the linear movement of the robot arm, followed by verification using KUKA Sim Pro. The same environment is duplicated in the software, whereby the obstacle is placed at the same location as that of the virtual box. If the software yields the same results, this indicates that the robot arm can effectively avoid the obstacle based on the coordinates obtained in the AR environment, which validates the case study. Each coordinate in space must be captured while the block is blue, as this indicates the absence of a collision, as shown in Figure 15; otherwise, a point further away is used.
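The threshold test described above is typically implemented on squared distances, avoiding a square root per frame. A minimal sketch consistent with the 75,000 mm² threshold quoted in the text (an illustration, not the authors' listing) is:

// Collision flag: true when the end-effector is within 27.4 cm
// (sqrt(75,000 mm^2)) of the obstacle's origin.
bool nearObstacle(double ex, double ey, double ez,
                  double ox, double oy, double oz)
{
    double dx = ex - ox, dy = ey - oy, dz = ez - oz;
    return dx * dx + dy * dy + dz * dz < 75000.0;   // squared mm
}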
The RMSE values for the third case study are presented in Table 4. It can be seen from the results of each case study, which are verified using the KUKA Sim Pro software, that there is a slight deviation in the coordinates between the two simulations. However, this deviation is found to be less than 4 mm, with a percentage error of less than 5%, which is acceptable. A deviation of 5% is acceptable because many factors can influence the final readings. One of these factors is jitter of the hands: human hands naturally jitter, which causes a slight discrepancy in the results even when the teach pendant is pointed at the same location with the same orientation. Marker design is another factor that influences the results.
Figure 15. (a) Colour change of the obstacle when it is approached by the end-effector, (b) generated collision-free path and (c) validation of the generated path using KUKA Sim Pro software.
Table 4. RMSE values for the third case study.

Coordinate           X-axis, RMSE (mm)   Y-axis, RMSE (mm)   Z-axis, RMSE (mm)
Point 1              5.1414              4.1776              2.3259
Point 2              4.4938              3.9035              2.1921
Point 3              2.9118              4.0837              3.1284
Point 4              0.4397              4.0662              4.055
Point 5              2.7178              3.3678              4.3264
Point 6              5.5576              1.8706              3.1766
Point 7              5.9301              0.0047              1.2556
Average RMSE (mm)    3.88                3.07                2.92

RMSE: root mean square error.
As discussed previously, some markers cannot be used if they have similar characteristics or design aspects. However, there are other causes that need to be investigated; for example, marker '5' appears to be the most difficult marker to register, even though no similar markers are used in this study. Camera resolution is an additional factor that needs to be considered. Even though the camera used in this study supports 1080p full high-definition recording, the resolution is found to be insufficient to capture the markers at larger distances. The distance used for the case studies is the maximum allowable distance for the camera; a larger distance results in blurry images, which hinders registration of the markers. Poor lighting conditions also affect the results. Extreme care should be exercised when selecting a location for the experiments, as sufficient lighting is required in order to register the markers accurately. Direct lighting is not advisable, since even a small amount of reflection on the markers influences the threshold image, despite the markers being made of standard A4 paper.
Conclusion and recommendations for future work

A fully functional robotic work cell has been developed in this study, which is user-friendly and eliminates the need for extensive programming knowledge. Users can simply click at the desired location of the machines and observe the movement of the virtual robot as it interacts with the object being manipulated in the workspace. The user's experience is greatly enhanced by visual cues such as colour changes during collision, as well as the 2D HUD, which updates the location of the object in real time. HMDs can be used in the near future to enhance visual realism, in which the HUD will appear as though it is right in front of the user's eyes, coupled with the parallax phenomenon that provides higher accuracy compared with the monocular vision of regular webcams. This will eliminate the occlusion present throughout this study, whereby the virtual object
always appears to be in front of physical objects regardless of its position.
Acknowledgements

The authors thank the Department of Mechanical Engineering, Faculty of Engineering, University of Malaya, for providing the necessary facilities for this study.
Declaration of conflicting interests

The authors declare that there is no conflict of interest.
Funding

This study was financially supported by the University of Malaya Research Collaborative Grant Scheme (PRP-UM), Grant No. CG006-2013.
References
1. Wang Q, Pan L, Mileham AR, et al. Modelling randomness of a manual assembly system with walking workers. Int J Ind Syst Eng 2007; 2: 195–210.
2. Broeders IAMJ. Robotics: the next step? Best Pract Res Clin Gastroenterol 2014; 28(1): 225–232.
3. Pan Z, Polden J, Larkin N, et al. Recent progress on programming methods for industrial robots. Robot Comput Integrated Manuf 2012; 28: 87–94.
4. Steinparz FX. Co-ordinate transformation and robot control with Denavit-Hartenberg matrices. J Microcomput Appl 1985; 8: 303–316.
5. Rocha CR, Tonetto CP and Dias A. A comparison between the Denavit–Hartenberg and the screw-based methods used in kinematic modeling of robot manipulators. Robot Comput Integrated Manuf 2011; 27: 723–728.
6. Jonietz E. 10 Breakthrough technologies. MIT Technology Review, 2007. Available at: http://www2.technologyreview.com/article/407473/tr10-augmented-reality/
7. Kato H, Billinghurst M, Poupyrev I, et al. Virtual object manipulation on a table-top AR environment. In: IEEE and ACM international symposium on augmented reality, 2000 (ISAR 2000) proceedings, Munich, 5–6 October 2000, pp.111–119. New York: IEEE.
8. Ong SK, Chong JWS and Nee AYC. Methodologies for immersive robot programming in an augmented reality environment. In: Proceedings of the 4th international conference on computer graphics and interactive techniques in Australasia and Southeast Asia, Kuala Lumpur, Malaysia, 29 November–2 December 2006, pp.237–244. Kuala Lumpur, Malaysia: ACM.
9. Wright RS, Haemel N, Sellers GM, et al. OpenGL SuperBible: comprehensive tutorial and reference. 5th ed. Upper Saddle River, NJ: Addison-Wesley, 2010.
10. Wang P. Application of OpenGL-based virtual collision avoiding technology in robot path planning simulation. In: 2011 international conference on multimedia technology (ICMT), Hangzhou, China, 26–28 July 2011, pp.2964–2967. New York: IEEE.
11. Angel E and Shreiner D. Interactive computer graphics: a top-down approach with shader-based OpenGL. 6th ed. Boston, MA: Pearson, 2012.
12. McGarrity E and Tuceryan M. A method for calibrating see-through head-mounted displays for AR. In: Proceedings of the 2nd IEEE and ACM international workshop on augmented reality, San Francisco, CA, 20–21 October 1999, p.75. New York: IEEE Computer Society.
13. Leebmann J. A stochastic analysis of the calibration problem for augmented reality systems with see-through head-mounted displays. ISPRS J Photogramm Remote Sens 2003; 57: 400–408.
14. Pati S, Erat O, Wang L, et al. Accurate pose estimation using single marker single camera calibration system. Proc SPIE 2013; 8671: 867126.
15. Girbacia F, Mogan G and Paunescu T. AR-based off-line programming of the RV-M1 robot. In: Gogu G, Maniu I, Lovasz EC, et al. (eds) Mechanisms, mechanical transmissions and robotics. Zurich: Trans Tech Publications Ltd, 2012, pp.344–351.
16. Lambrecht J, Kleinsorge M, Rosenstrauch M, et al. Spatial programming for industrial robots through task demonstration. Int J Adv Robot Syst 2013; 10: Article 254.
17. Novak-Marcincin J, Janak M, Barna J, et al. Verification of a program for the control of a robotic workcell with the use of AR. Int J Adv Robot Syst 2012; 9: Article 54.
18. Ong SK, Chong JWS and Nee AYC. A novel AR-based robot programming and path planning methodology. Robot Comput Integrated Manuf 2010; 26: 240–249.
19. Ha T, Billinghurst M and Woo W. An interactive 3D movement path manipulation method in an augmented reality environment. Interact Comput 2012; 24: 10–24.
20. Chong JWS, Ong SK, Nee AYC, et al. Robot programming using augmented reality: an interactive method for planning collision-free paths. Robot Comput Integrated Manuf 2009; 25: 689–701.
21. Fang HC, Ong SK and Nee AYC. Interactive robot trajectory planning and simulation using augmented reality. Robot Comput Integrated Manuf 2012; 28: 227–237.
22. Fang HC, Ong SK, Nee AYC, et al. End-effector orientation planning using augmented reality. Proced CIRP 2012; 3: 191–196.
23. Gilad L. Making virtual manufacturing a reality. Ind Robot Int J 1995; 22: 16–17.
24. Lee J, Han S and Yang J. Construction of a computer-simulated mixed reality environment for virtual factory layout planning. Comput Ind 2011; 62: 86–98.
25. Talaba D, Horváth I and Lee KH. Special issue of computer-aided design on virtual and augmented reality technologies in product design. Comput Aided Des 2010; 42: 361–363.
26. Uva AE, Cristiano S, Fiorentino M, et al. Distributed design review using tangible augmented technical drawings. Comput Aided Des 2010; 42: 364–372.
27. Lee Y-G, Park H, Woo W, et al. Immersive modeling system (IMMS) for personal electronic products using a multi-modal interface. Comput Aided Des 2010; 42: 387–401.
28. Mahdjoub M, Monticolo D, Gomes S, et al. A collaborative design for usability approach supported by virtual reality and a multi-agent system embedded in a PLM environment. Comput Aided Des 2010; 42: 402–413.
29. Rupérez MJ, Monserrat C, Alemany S, et al. Contact model, fit process and foot animation for the virtual simulator of the footwear comfort. Comput Aided Des 2010; 42: 425–431.
30. Bourdot P, Convard T, Picon F, et al. VR–CAD integration: multimodal immersive interaction and advanced haptic paradigms for implicit edition of CAD models. Comput Aided Des 2010; 42: 445–461.
31. KUKA. KR 16-2 KS-F specification. Augsburg: KUKA Industrial Robots, 2004. Available at: http://www.kuka-robotics.com/res/sps/e6c77545-9030-49b1-93f54d17c92173aa_Spez_KR_6_KS_en.pdf
32. Muelaner JE, Wang Z, Martin O, et al. Estimation of uncertainty in three dimensional coordinate measurement by comparison with calibrated points. Meas Sci Tech 2010; 21: 025106.
Appendix 1

Notation

a    link length
c, s    when followed by an angle, these lower-case letters represent the cosine and sine functions
d    link offset
$\vec{p}_{ij}$    an arrow above indicates a position vector with start point i and end point j
$^{i-1}_{i}T$    the upper-left index on a transformation matrix represents the departure frame and the lower-left index represents the destination frame
θ    joint angle
α    link twist
σ, φ, λ, β, γ    angle variables
=    is equal to
≅    is approximately equal to
Downloaded from pib.sagepub.com at KEIO UNIV MEDIA CENTER on November 24, 2015