sensors Article
Development and Performance Evaluation of Image-Based Robotic Waxing System for Detailing Automobiles Chi-Ying Lin *
ID
and Bing-Cheng Hsu
Department of Mechanical Engineering, National Taiwan University of Science and Technology, No. 43, Keelung Rd., Sec. 4, Taipei 106, Taiwan;
[email protected] * Correspondence:
[email protected]; Tel.: +886-2-2737-6484 Received: 2 April 2018; Accepted: 12 May 2018; Published: 14 May 2018
Abstract: Waxing is an important aspect of automobile detailing, aimed at protecting the finish of the car and preventing rust. At present, this delicate work is conducted manually due to the need for iterative adjustments to achieve acceptable quality. This paper presents a robotic waxing system in which surface images are used to evaluate the quality of the finish. An RGB-D camera is used to build a point cloud that details the sheet metal components to enable path planning for a robot manipulator. The robot is equipped with a multi-axis force sensor to measure and control the forces involved in the application and buffing of wax. Images of sheet metal components that were waxed by experienced car detailers were analyzed using image processing algorithms. A Gaussian distribution function and its parameterized values were obtained from the images for use as a performance criterion in evaluating the quality of surfaces prepared by the robotic waxing system. Waxing force and dwell time were optimized using a mathematical model based on the image-based criterion used to measure waxing performance. Experimental results demonstrate the feasibility of the proposed robotic waxing system and image-based performance evaluation scheme. Keywords: robotic waxing system; force control; image-based performance evaluation; parameter optimization
1. Introduction As a common procedure in auto detailing, waxing removes the fine scratches in automotive sheet metal and reduces surface roughness, which in turn prevents impurities such as dust or moss from sticking to the outer shell of sheet metal, reduces resistance and undesired attachments, and thereby decreases fuel consumption. However, waxing by hand means long durations of applying force, being subjected to machine vibrations, and remaining in poor posture, which cause muscle aches, bone injuries, and damage to the joints in the hand [1]. There are also the issues of uneven force application and inconsistent quality during the waxing process, so waxing quality cannot be guaranteed. This study therefore developed a robotic system for waxing in auto detailing to increase the efficiency and cost-effectiveness of waxing procedures. At present, most of the existing literature associated with waxing automation lies in patent reports [2–5], and there are few technical articles that provide data reference. Thus, we based our design on existing robotic systems developed to perform motions similar to those used in waxing, such as grinding [6], deburring [7,8], and polishing [9–14]. The objective of those engineering applications was not to alter the geometric shape of the workpiece, but rather to adjust the surface roughness to within set standards. This means that the need for precision in force control far exceeds that required for position control. In previous studies, industrial robotic arms were equipped with force
Sensors 2018, 18, 1548; doi:10.3390/s18051548
www.mdpi.com/journal/sensors
Sensors 2018, 18, 1548
2 of 20
sensors to achieve force control and surface finishing procedures [15,16]. These same devices were also used to measure surface roughness as an indicator of surface quality [10,17]. The researchers in [18] investigated how various parameters pertaining to grinding and polishing processes, including the size of abrasives, the linear velocity of the belt, contact force, and feed rate affected the final surface roughness of the workpiece. Their results revealed that surface roughness is positively correlated with machining force and feed rate. We therefore adopted waxing force and waxing speed (defined as the amount of time spent working on a given spot) as our primary parameters of waxing quality. One prerequisite for waxing sheet metal is that the surface roughness of the underlying material must surpass a given threshold. Coating quality can be determined using three indices: durability, appearance, and protection against the environment [19]. Protection can be quantified by measuring the contact angle of water drops on the surface of the workpiece [20], wherein a higher angle means that water drops are less likely to adhere to the surface. The measurement of durability involves observing the severity of rust formation following exposure to test environment for extended durations. Appearance (in terms of the numerical classification of paint coatings) can be gauged using tele-spectroradiometers and multi-angle spectrophotometers [21]. Protection and durability depend on the type of coating, but are beyond the scope of this study. We therefore based our assessment of the proposed automated waxing system solely on appearance. Most auto detailers manually assess and adjust the appearance quality of waxing jobs. Optical microscopes could be used to evaluate the status of workpiece surfaces [10,17,18]; however, this would be time-consuming and expensive. Thus, for the quantification of waxing surface quality, we opted for non-contact image sensing technology, which is easily implemented and inexpensive to install and operate. In [22], image processing algorithms, including thresholding, noise removal, edge detection, and segmentation were used to identify surface defects such as holes, scratches, coil breaks, and rust with a reasonable degree of accuracy. In [23], multilevel thresholds were used to divide grayscale images into various regions representing the roughness of workpiece surfaces as an indicator of surface quality. Although several advanced machine vision based measurement methods have been proposed for assessing surface roughness recently [24–27], they all suffer from the drawbacks of computational burden or high-cost hardware. In [28], a computer vision system was integrated with a robotic arm to measure surface defects in polished workpieces before and after processing. However, this automated polishing system did not use image feedback to adjust system parameters, such as force or speed to improve processing quality. In this study, we incorporated computer vision technology within a robotic arm to enable automated waxing while simultaneously monitoring the quality and adjusting system parameters. We developed an image-based assessment method for waxing quality based on image analysis results of wax jobs performed manually by an auto detailer, and used this feedback information to optimize system parameters. 
Comparing to the existing image-based assessment methods, the proposed method has the advantages of high computational efficiency and easy implementation for the development of autonomous robotic waxing systems. The feasibility of the image assessment criteria and overall performance of the proposed system were evaluated using an engine hood as a waxing target. A depth camera was used to establish point cloud data of the sheet metal in order to plan the waxing path of the system. Waxing procedures were performed using a three-axis robotic arm with an xyz platform. Waxing force control was achieved using impedance control and feedback information from a multi-axis force sensor at the end of an effector attached to the robotic arm. The waxing speed was adjusted by altering the dwell time along identical waxing paths. Experimental results confirm the effectiveness of the proposed image assessment criteria in detecting differences in waxing quality. Parameter optimization using this feedback information enabled the proposed waxing system to achieve waxing quality on par with that of professional auto detailers.
Sensors 2018, 18, 1548
3 of 20
2. System Description Figure the overall framework of the proposed system. We adopted an xyz platform Sensors 2018, 1 18,displays x FOR PEER REVIEW 3 of 20 and a self-built three-axis robotic arm equipped with a waxing motor as the end-effector to wax sheet sheet The metal. xyz platform is responsible for large-stroke movement. effective travel range metal. xyzThe platform is responsible for large-stroke movement. TheThe effective travel range ofof the the platform x-y-axes and y-axes 500 mm × 500 mm, with a screw lead of 10 mm/rev, whereas platform alongalong the x-the and is 500ismm × 500 mm, with a screw lead of 10 mm/rev, whereas the the effective the z-axis is mm, 400 mm, a screw 5 mm/rev. The resolution effective traveltravel rangerange alongalong the z-axis is 400 withwith a screw leadlead of 5ofmm/rev. The resolution ofofthe the motor encoders for axis eachisaxis is 10,000/rev. The three-axis robotic served to wax the sheet motor encoders for each 10,000/rev. The three-axis robotic arm arm served to wax the sheet metal metal and exercise force control. DC motors (models IG-42CGM and IG32RGM) manufactured by and exercise force control. DC motors (models IG-42CGM and IG32RGM) manufactured by Sha Yang Yang YeCo. Industrial Co. Ltd.Taiwan) (Taoyuan, Taiwan) were forFor thethe joints. For the end-effector, YeSha Industrial Ltd. (Taoyuan, were used for theused joints. end-effector, we used thewe DC used the DC motor (model GBP30) made by Soho Precision Machinery Co., Ltd. (New Taipei City, motor (model GBP30) made by Soho Precision Machinery Co., Ltd. (New Taipei City, Taiwan) to meet Taiwan) to greater meet the need for speed greaterand rotational and torque.arm To use a robotic arm to waxmetal, the the need for rotational torque.speed To use a robotic to wax the target sheet target sheet metal, we first had to understand the external environment, which includes elements we first had to understand the external environment, which includes elements such as posture, position, such as posture, position, and color. To do so, we installed an Intel RealSence F200 RGB-D camera and color. To do so, we installed an Intel RealSence F200 RGB-D camera (Santa Clara, CA, USA) (color (Santa Clara, CA, USA) (color camera with the resolution of 1920 × 1080 pixels; depth camera with camera with the resolution of 1920 × 1080 pixels; depth camera with the resolution of 640 × 480 pixels) the resolution of 640 × 480 pixels) on the second axis of the robotic arm (eye-in-hand setup) to on the second axis of the robotic arm (eye-in-hand setup) to estimate the 3D point cloud and world estimate the 3D point cloud and world coordinates of the sheet metal to be waxed and capture coordinates of the sheet metal to be waxed and capture post-waxing images to assess waxing quality. post-waxing images to assess waxing quality. We installed a HEX-58-RE-400 N force sensor from We installed a HEX-58-RE-400 N force sensor from OptoForce Ltd. (Budapest, Hungary) on the OptoForce Ltd. (Budapest, Hungary) on the end-effector to measure waxing force and achieve force end-effector to system measure waxing force and achieve the force control.command The system uses aofcomputer to change control. The uses a computer to change position settings the xyz platform the position command settings of the xyz platform and process the images from the depth camera. and process the images from the depth camera. 
The measurement information derived from the The measurement information derived from the force is sent to an Arduino control board via a force sensor is sent to an Arduino control board via asensor USB port to control the position of the robotic USB port to control the position of the robotic arm and perform the waxing process. arm and perform the waxing process.
Figure 1. Proposed robotic waxing system. Figure 1. Proposed robotic waxing system.
3. Stereo Vision and Path Planning 3. Stereo Vision and Path Planning To obtain the spatial relationship among the target sheet metal, the camera, and the robotic arm To obtain the relationship among the target sheet metal, the camera, and roboticthe arm and facilitate thespatial waxing process, we first had to establish a coordinate system to the calculate and facilitate the waxing process, we first had to establish a coordinate system to calculate the correct correct location of the sheet metal and plan the waxing path [29]. We employed a depth camera to location of the sheet plan theof waxing path [29].The Weparts employed depth camera tosheet obtain obtain the color andmetal depthand information the sheet metal. that doa not belong to the the colorare and depth information of the sheet metal. The parts that doreflection not belong to the sheet metal metal removed via image preprocessing, and then the infrared principle of depth are removed via image preprocessing, and then the infrared reflection principle of depth cameras cameras is utilized to achieve stereo vision, estimate the point cloud plot of the target, and plan the is utilized to waxing achievepath. stereo vision, estimate the point cloud plot of the target, and plan the system’s system’s waxing path. 3.1. Image Processing 3.1. Image Processing Estimation of the sheet metal’s location is based on color images from the depth camera, which the the sheet metal’s location is based onthe color images fromtothe depth camera, which areEstimation integrated of with information measured from camera plane establish the point cloudare integrated information camera plane establish point cloud plot plot of thewith sheetthe metal. However,measured each pointfrom cloudthe plot contains threetochannels ofthe image information and depth information for all the coordinates in the images, so the information content is substantial. To reduce the computation time, the color images from the depth camera are subjected to color filtering, thresholding, and morphology processing. The resulting black-and-white images serve as
Sensors 2018, 18, 1548
4 of 20
of the sheet metal. However, each point cloud plot contains three channels of image information and depth information for all the coordinates in the images, so the information content is substantial. Sensors 2018, 18, x FOR PEER REVIEW 4 of 20 To reduce the computation time, the color images from the depth camera are subjected to color filtering, thresholding, and the morphology Theto resulting black-and-white imageswhich serve reduces as masksthe to masks to remove areas thatprocessing. do not belong the sheet metal in the images, remove the areas that do not belong to the sheet metal in the images, which reduces the system’s system’s computation time and decreases the chance of miscalculations. The mask operation computation and decreases the2.chance miscalculations. operation procedure is as procedure is time as shown in Figure Beforeofcapturing images,The we mask covered the sheet metal with shown in Figure 2. Before capturing images, we covered the sheet metal with non-reflective khaki non-reflective khaki paper to prevent incomplete depth information extraction due to reflections, as paper to incomplete information due to reflections, as shownThe in Figure shown inprevent Figure 2a. Figure 2bdepth displays the depthextraction image information that was captured. shade 2a. of Figure 2b displays the depth image information that was captured. The shade of the color indicates the the color indicates the distance of a spot from the camera. The scattered back spots were due to the distance ofofa spot fromlight the camera. spots due to the diffusion of infrared light diffusion infrared caused The by scattered the sheetback metal, so were the camera could not receive the light caused byback the sheet so the which cameraresulted could not thedepth light reflected back from these posts, reflected from metal, these posts, inreceive incorrect values. Figure 2c presents the which resulted in incorrect depth values. Figure 2c presents the mask created via image processing. mask created via image processing. The white area is the part that the system needs to wax and the The white area is the part that whereas the system to wax and target coordinateAfter conversion, target of coordinate conversion, theneeds black area does notthe need to beofprocessed. placing whereas the black area does not need to be processed. After placing this mask on the depth plane, this mask on the depth plane, what remains is the part needed by the system, as shown in Figure 2d. what remains is the part needed by the system, as shown in Figure 2d. Converting the coordinates of Converting the coordinates of the depth information and location of each spot then produces the the depth information location point cloud plot of the and sheet metal. of each spot then produces the point cloud plot of the sheet metal.
(a)
(b)
(c)
(d)
Figure 2. Mask Figure 2. Mask operation. operation. (a) (a)color color image; image; (b) (b) depth depth image; image; (c) (c) mask mask computing; computing; (d) (d) mask mask to to depth image. depth image.
3.2. Stereo Vision 3.2. Stereo Vision The locations of the robotic arm, camera, and target are not fixed, so the relationships of the The locations of the robotic arm, camera, and target are not fixed, so the relationships of the three three with the world coordinates have to be established [30]. We installed a depth camera on the with the world coordinates have to be established [30]. We installed a depth camera on the second axis second axis of the robotic arm to create a stereo vision system. Based on different descriptive of the robotic arm to create a stereo vision system. Based on different descriptive backgrounds, the backgrounds, the system has to define three different coordinate systems to describe the system has to define three different coordinate systems to describe the relationships among the target, relationships among the target, the camera, and the robotic arm. These coordinate systems include the camera, and the robotic arm. These coordinate systems include the world coordinate system (xw , yw , the world coordinate system (xw, yw, zw), the camera coordinate system (xc, yc, zc), and the pixel zw ), the camera coordinate system (xc , yc , zc ), and the pixel coordinate system (up , vp ). The conversion coordinate system (up, vp). The conversion relationship is as shown in Figure 3. The world coordinate system refers to a specific origin in the workspace, so any object in the space has coordinates in relation to the defined origin so that the world coordinate system and robotic arm coordinate system can be defined with an identical origin and directions. The camera coordinate system is a coordinate system that describes other objects based on the location of the camera. Generally speaking, there is no deflection in the light that travels through the center of the lenses. For this reason, we set the
Sensors 2018, 18, 1548
5 of 20
relationship is as shown in Figure 3. The world coordinate system refers to a specific origin in the workspace, so any object in the space has coordinates in relation to the defined origin so that the world coordinate system and robotic arm coordinate system can be defined with an identical origin and directions. The camera coordinate system is a coordinate system that describes other objects based on the location of the camera. Generally speaking, there is no deflection in the light that travels through Sensors 2018, 18, x FOR PEER REVIEW 5 of 20 the center of the lenses. For this reason, we set the center of the lenses as the origin of the camera coordinate system. origin of the coordinate systemsystem. is the upper left corner the image plane center of the lensesThe as the origin of pixel the camera coordinate The origin of theofpixel coordinate and describes location of the of target the image plane. The system must convert location system is the the upper left corner the on image plane and describes the location of the the target on theof the target into The world coordinates so thatthe thelocation robotic arm can trackinto the world planned path. Theso coordinate image plane. system must convert of the target coordinates that the conversion achieved byplanned first converting world coordinates cameraby coordinates and then robotic armiscan track the path. Thethe coordinate conversion into is achieved first converting the converting them into pixel coordinates. world coordinates into camera coordinates and then converting them into pixel coordinates.
Figure conversion. Figure 3. 3. Coordinate conversion.
The conversion conversion relationship relationship between between the the pixel pixel coordinates The coordinates and and the the world world coordinates coordinatesisisas as presented in Equation (1): presented in Equation (1):
up f u f p SS v v p p =0 0 1 0 0
1
00 ff 00
00 00 11
xw uui " # i R 3R33×3T3T1 3×1yw vvii 0 1 0 1×3 1 zw 00 13 1 ,
xw yw zw 1
,
(1) (1)
where S is a magnification factor; (up , vp ) are the target coordinates as described by the pixel coordinate where S is a magnification factor; (up, vp) are the target coordinates as described by the pixel system, and (xw , yw , zw ) are the coordinates of the target in the world coordinate system; f, ui , and vi coordinate system, and (xw, yw, zw) are the coordinates of the target in the world coordinate system; f, are intrinsic parameters of the camera, which are fixed upon manufacturing; f denotes the focus of the ui, and vi are intrinsic parameters of the camera, which are fixed upon manufacturing; f denotes the camera; (ui , vi ) are the coordinates of the camera center. Note that these intrinsic parameter values can focus of the camera; (ui, vi) are the coordinates of the camera center. Note that these intrinsic be obtained via a standard camera calibration process [31]. R3×3 and T3×1 are the extrinsic parameters parameter values can be obtained via a standard camera calibration process [31]. R3×3 and T3×1 are the of the camera, including a translation matrix and a rotation matrix. However, extrinsic parameters extrinsic parameters of the camera, including a translation matrix and a rotation matrix. However, vary with world coordinate definitions and the position in which the camera is installed. We installed extrinsic parameters vary with world coordinate definitions and the position in which the camera is the camera on the second axis of the robotic arm in this study, so the conversion relationships between installed. We installed the camera on the second axis of the robotic arm in this study, so the coordinate systems will change as the robotic arm moves. Thus, a camera coordinate system had to be conversion relationships between coordinate systems will change as the robotic arm moves. Thus, a defined to calculate the extrinsic parameters. camera coordinate system had to be defined to calculate the extrinsic parameters. The axes of the robotic arm and the camera coordinate system are as shown in Figure 4. Frames 0 The axes of the robotic arm and the camera coordinate system are as shown in Figure 4. Frames and 1 are the coordinate systems of the first and second axes of the robotic arm and change with motor 0 and 1 are the coordinate systems of the first and second axes of the robotic arm and change with rotations. Frames 2 and 3, respectively, represent the position of the camera and the center coordinates motor rotations. Frames 2 and 3, respectively, represent the position of the camera and the center coordinates of the camera and do not change with the posture of the robotic arm. According to this coordinate system, we can derive the conversion relationship in Equation (2), where E is the extrinsic parameter of the camera; Pr and Pc are the locations of the target as described by the coordinate systems of the robotic arm and the camera, respectively. With this coordination matrix, we can convert the target location estimated by the camera coordinate system into coordinates in the robotic
Sensors 2018, 18, 1548
6 of 20
of the camera and do not change with the posture of the robotic arm. According to this coordinate system, we can derive the conversion relationship in Equation (2), where E is the extrinsic parameter of the camera; Pr and Pc are the locations of the target as described by the coordinate systems of the robotic arm and the camera, respectively. With this coordination matrix, we can convert the target location estimated by the camera coordinate system into coordinates in the robotic arm coordinate system, perform 3D reconstruction using the depth and color planes obtained using the depth camera, Sensors 2018, 18, x FOR PEER REVIEW 6 of 20 and construct a point cloud plot of the target sheet metal to plan the waxing path:
PrPr EP = cEP . c.
(2) (2)
Figure 4. 4. Camera Camera coordinate coordinate system system (Frame (Frame 0–1: 0–1: motor motor rotation rotation axis, axis, Frame Frame 2: 2: position position of camera, Figure of camera, Frame 3: center of camera). Frame 3: center of camera).
3.3. Planning Planning of of Waxing Waxing Path Path 3.3. The planar planar dimensions dimensions of of the the target target plane plane on on the the engine engine hood hood to to be be waxed waxed in in this this study study was was The 400 mm mm × × 300 waxing procedure, procedure, each each swipe swipe overlapped overlapped the the 400 300 mm, mm, as as shown shown in in Figure Figure 5. 5. During During the the waxing previous swipe to a certain degree so as to prevent missing any spots. In order to evenly cover each previous swipe to a certain degree so as to prevent missing any spots. In order to evenly cover each spot on onthe thetarget target area, the system divided the waxing path into six paths, and waxing was spot area, the system divided the waxing path into six paths, and waxing was performed performed from the bottom The upwards. The relative of positions of each 40 mm, mm, 170 105 mm, mm, from the bottom upwards. relative positions each path werepath at 40were mm,at105 170 mm, 235 mm, 300 mm, and 360 mm, as shown in Figure 5, and waxing was performed with 235 mm, 300 mm, and 360 mm, as shown in Figure 5, and waxing was performed with a waxinga waxing 80 sponge 80diameter, mm in diameter, so theoverlapped paths overlapped about 1/5 ofother. each We other. We conducted sponge mm in so the paths about 1/5 of each conducted curve curve fitting of the different paths three times based on the side view of the yz plane of the point fitting of the different paths three times based on the side view of the yz plane of the point cloud plot cloud plot converted from the coordinates (as shown in Figure 6). The results are as shown in Table 1. We then calculated the derivative of the curve equation:
z'(y) tanθ ,
(3)
with the equation above, we could calculate the angle θ between the horizontal plane and the tangents at different spots on the paths on the surface. With inverse kinematics, we established the
Sensors 2018, 18, 1548
7 of 20
converted from the coordinates (as shown in Figure 6). The results are as shown in Table 1. We then calculated the derivative of the curve equation: z0(y) = tan θ,
(3)
with the equation above, we could calculate the angle θ between the horizontal plane and the tangents at different spots on the paths on the surface. With inverse kinematics, we established the posture limitations of the robotic arm to ensure that the wax can coat the sheet metal evenly without angle deviations that prevent the robotic arm from applying force vertically to the curved surface of the Sensors 2018, 18, x FOR PEER REVIEW 7 of 20 Sensors 18, x FOR PEER REVIEW 7 of 20 sheet 2018, metal.
(a) (a)
(b) (b)
Figure5.5.Path Pathplanning planningfor forthe thesheet sheetmetal. metal.(a) (a)area areato tobe bewaxed; waxed;(b) (b)waxing waxingtrajectories. trajectories. Figure Figure 5. Path planning for the sheet metal. (a) area to be waxed; (b) waxing trajectories.
(a) (a)
(b) (b)
Figure 6. Posture constraint for the robot end-effector. (a) point cloud 3D plot; (b) slope estimation. Figure Figure6.6.Posture Postureconstraint constraintfor forthe therobot robotend-effector. end-effector.(a) (a)point pointcloud cloud3D 3Dplot; plot;(b) (b)slope slopeestimation. estimation. Table 1. Curve fitting results from the side view of sheet metal (Paths 1 to 6). Table 1. Curve fitting results from the side view of sheet metal (Paths 1 to 6). Table 1. Curve fitting results from the side view of sheet metal (Paths 1 to 6).
Path # #ofofPath # of PathPath 1
Path 1
Path 1 Path 2 Path 2 Path 2 Path 3 Path 3Path 3 Path4 4 Path 4Path Path 5 Path 5 Path 6Path 5
Path6 6 Path
PathFunction Function(after (afterCubic CubicCurve CurveFitting) Fitting) Path
(after Fitting) z ( y ) 2 .86 Path 10 6Function y 3 0 .0055 y 2 Cubic 2 .135Curve y 126 .0215 z 1 (1y ) 2 .86 10 6 y 3 0 .0055 y 2 2 .135 y 126 .0215 5 3 2 − 6 3 2 )= −2.86y ×10 y + 126.0215 z ( y ) z 1(.y02 10 0 .0223 15 .037−y2.135y 3154−.08 2y 0.0055y z 2 (2y ) 11.02 10 5 y 3 0 .0223 15 .0372 y 3154 .08 −5 y3y+ 0.0223y z2 (y) = −1.02 − 15.037y − 3154.08 6 3× 10 2 z ( y ) 5 .49 106 3y 0 .−0117 y 1134 .38 2y 6 .9988 6 y3y+ z 3 (3y ) z53.49 10−5.49 y .0117 6 .99882 − y 6.9988y 1134 .38 (y)= ×010 0.0117y − 1134.38 6 3 2 − 6 3 2 z 4 ( y ) z47(.y69 10 y 0 . 0167 y 10 . 716 y 2051 − .362051.36 6 3 2 ×010 y y+ 0.0167y z 4 ( y ) 7 .69)= 10−7.69 y .0167 10 .716 − y 10.716y 2051 .36 6 3× 10−6 y3 + 20.0156y2 − 9.7094y − 1746.57 z ( y ) = − 7.29 z ( y ) 57 .29 106 3y 0 .0156 2y 9 .7094 y 1746 .57 z 5 (5y ) z7 .29 10 y 0 .0156 −5 3y 9 .7094 2 y 1746 .57 6 ( y ) = −1.17 × 10 y + 0.0256y − 17.241y − 3607.75 z 6 ( y ) 1 .17 1055 3y 3 0 .0256 2y 2 17 .241 y 3607 .75 z 6 ( y ) 1 .17 10 y 0 .0256 y 17 .241 y 3607 .75
ForceControl ControlofofRobotic RoboticArm Arm 4.4.Force Forthe therobotic roboticarm armtotomaintain maintainconstant constantforce forcewhile whilewaxing waxingand andavoid avoiddamaging damagingthe thesheet sheet For metal and the structure of the robotic arm, we adopted an impedance control method that metal and the structure of the robotic arm, we adopted an impedance control method that isis commonly seen in the literature to control the waxing force of the robotic arm [32]. The concept is to
Sensors 2018, 18, 1548
8 of 20
4. Force Control of Robotic Arm For the robotic arm to maintain constant force while waxing and avoid damaging the sheet metal and the structure of the robotic arm, we adopted an impedance control method that is commonly seen in the literature to control the waxing force of the robotic arm [32]. The concept is to achieve force tracking control with the expected force of the system (Fd ) and an impedance control algorithm. Furthermore, if the sponge at the end-effector of the robotic arm is not completely flat on the surface of the sheet metal, it will affect the evenness of force application, increase the chance of structural damage in the robotic arm, and create an additional moment Mx in the robotic arm, as shown in Figure 7. In view of this, we also considered orientation impedance control to keep the sponge flat Sensors 2018, 18, xbased FOR PEER REVIEW 8 of 20 on the surface on the external moment. Figure 8 displays a block diagram of the overall force control system. Considering the position terms py and pz of the end-effector in the y- and z-directions . .. z-directions and rotation angle x around the x-axis,Tx = [py pz θx]T (position of end-effector); and x the and rotation angle θ x around theθx-axis, x = [py pz θ x ] (position of end-effector); x and x denote T x denote the velocity and acceleration of the end-effector, xdthe = [pdesired yd pzd θxd] is the desired position of velocity and acceleration of the end-effector, xd = [pyd pzd θ xd ]T is position of the end-effector. the end-effector. The mass and moment of inertia matrix of the end-effector = diag(M y, Mz, Jθ), The mass and moment of inertia matrix of the end-effector is M = diag(My , Mz ,isJ θM ), and the matrices the matrices of damping between the robotic armare and environment are ofand damping and stiffness betweenand the stiffness robotic arm and the environment B =the diag(B y , Bz , Bθ ) and B = diag(B y, Bz, Bθ) and K = diag(Ky, Kz, Kθ), respectively. The force that comes into contact with the K = diag(Ky , Kz , Kθ ), respectively. The force that comes into contact with the external environment environment is the Fexternal force/torque Fext = [Fy Fz Mx]T. As the force sensor is installed on T isexternal the external force/torque ext = [Fy Fz Mx ] . As the force sensor is installed on the end-effector, Fssinθ and Fz = −Fthe scosθ exist between the measured the end-effector, conversion conversion relationships Fy = Fsrelationships sinθ and Fz = F−y F=s cosθ exist between measured waxing force Fs and waxing force F s and the coordinate system of the point cloud. Note that θ can be derived using the coordinate system of the point cloud. Note that θ can be derived using Equation (3). Let the waxing Equation Let the force applied to the target be Fw; its components Fyd = Fwsinθ, Fzd = force being(3). applied to waxing the target be Fbeing w ; its components Fyd = Fw sinθ, Fzd = −Fw cosθ, and Mxd = 0 can T wcosθ, and Mxd = 0 can be expressed using expected force be−Fexpressed using expected force vector Fd = [Fyd Fzd Mxd ]T . vector Fd = [Fyd Fzd Mxd] . Theprinciple principlebehind behindFigure Figure8 8isistotouse usethe thedifference difference between Fextand andF Fdtotochange changethe theposition position The between Fext d responsesofofthe theoverall overallsystem. system.When Whenthe thedifference differenceisiszero, zero,ititmeans meansthat thatthe theforce forceapplied appliedby bythe the responses robotic arm to the sheet metal has reached the system's target value. 
Furthermore, impedance robotic arm to the sheet metal has reached the system's target value. Furthermore, impedance control used tothe alleviate impact the force created by the when end-effector when it comeswith into iscontrol used toisalleviate impactthe of the forceof created by the end-effector it comes into contact contact the sheet metal. the sheet with metal.
(a)
(b)
Figure7.7.Use Useofofrobotic roboticarm armininwaxing waxingforce forcecontrol. control.(a) (a)normal normalwaxing waxingforce forcecontrol; control;(b) (b)end-effector end-effector Figure not flat on sheet metal. not flat on sheet metal.
Impedance control
(a)
(b)
Figure 7. Use of robotic arm in waxing force control. (a) normal waxing force control; (b) end-effector Sensorsnot 2018, 18,on 1548 flat sheet metal.
9 of 20
Impedance control
Figure 8. Block diagram of robot force control system in this study. Figure 8. Block diagram of robot force control system in this study.
5. Image Based Performance Evaluation 5. Image Based Performance Evaluation To effectively test the waxing results and verify the feasibility of the proposed system, we divided To effectively test the waxing results and verify the feasibility of the proposed system, we the automotive sheet metal used in our experiment into two parts as shown in Figure 9. The left divided the18,automotive sheet metal used in our experiment into two parts as shown in Figure99.of The Sensors 2018, x FOR PEER REVIEW 20 part was waxed by the robotic arm, while the right part was waxed by an experienced auto detailer, left part was waxed by the robotic arm, while the right part was waxed by an experienced auto which served as the control group. Using the auto detailer’s waxing results, we applied an external detailer, which served as the control group. Using the auto detailer’s waxing results, we applied an light source for analysis to develop assessment criteria for waxing quality. external light source for analysis to develop assessment criteria for waxing quality. 5.1. Selection of External Light Source 5.1. Selection of External Light Source White point light sources have a condensation effect that shows any fine scratches on paint White point light sources have a condensation effect that shows any fine scratches on paint surfaces. For this reason, we adopted eight 5050 LED strips 50 cm in length for the light source. surfaces. For this reason, we adopted eight 5050 LED strips 50 cm in length for the light source. Each Each strip contained 36 LED point lights, so there were 144 point lights on either side, as shown strip contained 36 LED point lights, so there were 144 point lights on either side, as shown in in Figure 9. The right shows the reflection results of the waxing done by the auto detailer. As the Figure 9. The right shows the reflection results of the waxing done by the auto detailer. As the fine fine scratches on the surface of the sheet metal have been filled in, the result is an almost mirror-like scratches on the surface of the sheet metal have been filled in, the result is an almost mirror-like reflection of the white point light sources, and each point light can be distinguished clearly. In contrast, reflection of the white point light sources, and each point light can be distinguished clearly. In the fine scratches in the unwaxed area on the left are still very visible, so the reflection results are dim, contrast, the fine scratches in the unwaxed area on the left are still very visible, so the reflection and some reflected light spots are indistinguishable with only a glimmer remaining. results are dim, and some reflected light spots are indistinguishable with only a glimmer remaining.
Figure 9. Comparison of LED lighting results. Figure 9. Comparison of LED lighting results.
5.2. Waxing Assessment Criteria 5.2. Waxing Assessment Criteria To evaluate the quality of waxing in the waxed areas, we utilized the condensation effects of the To evaluate the quality of waxing in the waxed areas, we utilized the condensation effects of the round LED point lights. Image preprocessing was conducted on the area waxed by the auto detailer round LED point lights.distribution Image preprocessing was light conducted area waxed included by the auto detailer to calculate the radius of the reflected spots. on Thethe preprocessing grayscale conversion, Canny edge detection [33], and the Hough circle transform [34]. Next, we calculated the radius of each reflected light spot for statistical analysis to establish the assessment criteria of the proposed waxing system. 5.2.1. Image Preprocessing
Sensors 2018, 18, 1548
10 of 20
to calculate the radius distribution of the reflected light spots. The preprocessing included grayscale conversion, Canny edge detection [33], and the Hough circle transform [34]. Next, we calculated the radius of each reflected light spot for statistical analysis to establish the assessment criteria of the proposed waxing system. 5.2.1. Image Preprocessing Once the camera captures a color image of the waxed surface, it must be converted to grayscale for edge detection. Grayscale conversion merges the three-color channels into a single channel, and it is also a way of expressing the brightness values of each pixel. The image processing results are as shown in Figure 10. Figure 10a shows the image captured by the camera of the surface waxed by the auto detailer; Figure 10b displays the image following grayscale conversion; Figure 10c presents the results of edge detection. As can be seen, the halos in the image were completely removed after edge detection. Next, the Hough circle transform determines whether the reflected light spots are uniform in size and presents their distribution conditions. Sensors 2018, 18, x FOR PEER REVIEW 10 of 20
(a)
(b)
(c) Figure 10. 10. Results image preprocessing preprocessing (area waxed by by auto auto detailer). detailer). (a) Figure Results of of image (area waxed (a) captured captured RGB RGB image; image; (b) RGB to grey image; (c) after edge detection. (b) RGB to grey image; (c) after edge detection.
Figure 11 presents the radius distributions of the reflected LED light spots following the Hough Figure 11 presents the radius distributions of the reflected LED light spots following the Hough circle transform. All 144 reflected light spots were circular and 4 pixels to 5 pixels in diameter. The circle transform. All 144 reflected light spots were circular and 4 pixels to 5 pixels in diameter. maximum error was less than 1 pixel, which means that the auto detailer had filled in the fine The maximum error was less than 1 pixel, which means that the auto detailer had filled in the fine scratches on the surface of the sheet metal, so the resulting surface facilitated reflection. The scratches on the surface of the sheet metal, so the resulting surface facilitated reflection. The influence influence of the poor surface conditions on the image analysis results was also reduced. of the poor surface conditions on the image analysis results was also reduced. 25
# of circles
20
15
10
5
Histogram of circle detection (waxing by car detailer) Total: 144 samples
(b) RGB to grey image; (c) after edge detection.
Figure 11 presents the radius distributions of the reflected LED light spots following the Hough circle transform. All 144 reflected light spots were circular and 4 pixels to 5 pixels in diameter. The maximum error was less than 1 pixel, which means that the auto detailer had filled in the fine Sensorsscratches 2018, 18, 1548 11 of 20 on the surface of the sheet metal, so the resulting surface facilitated reflection. The influence of the poor surface conditions on the image analysis results was also reduced. Histogram of circle detection (waxing by car detailer) Total: 144 samples
25
# of circles
20
15
10
5
0 3.6
3.8
4
4.2 4.4 4.6 radius of circles (pixel)
4.8
5
5.2
Figure 11. Histogram of reflected LED light spot radiuses (area waxed by auto detailer).
Figure 11. Histogram of reflected LED light spot radiuses (area waxed by auto detailer).
5.2.2. Establishment of Image Assessment Criteria
5.2.2. Establishment of Image Assessment Criteria
To effectively analyze the radius distributions resulting from circle detection, we used the mean
To analyze to theconstruct radius distributions from circle we used the andeffectively standard deviation the probabilityresulting density function of thedetection, normal distribution andmean express the degree of dispersion in the radiuses of the reflected LED light spots. The distribution and standard deviation to construct the probability density function of the normal distribution and conditions of thisoffunction served theradiuses assessment If the image analysis results of the area express the degree dispersion inas the of criteria. the reflected LED light spots. The distribution waxed by the proposed robotic arm meet the preset threshold, it means that the proposed system conditions of this function served as the assessment criteria. If the image analysis results of the area has successfully completed the waxing process. waxed by the proposed robotic arm meet the preset threshold, it means that the proposed system has Sensors 2018, 18, x FOR PEER 11 of 20 successfully completed theREVIEW waxing process. Figure 12 presents the normal distribution function plotted based on the radius distribution Figure 12 presents the normal distribution function plotted based on the radius distribution from the area waxed by the auto detailer. The mean (x) and standard deviation (σ) of the function from the area waxed by the auto detailer. The mean ( x ) and standard deviation (σ) of the function are 4.40 and 0.20, respectively. As the standard deviation is very small, it is clear that the function are 4.40 and 0.20, respectively. As the standard deviation is very small, it is clear that the function distribution is very closeclose to the of the ThisThis means thatthat the area waxed by the autoauto detailer distribution is very tomean the mean ofseries. the series. means the area waxed by the was very smooth and could the LED light above to the camera plane; no fine detailer was very smoothcompletely and could reflect completely reflect the source LED light source above to the camera scratches caused diffusion. plane; no finelight scratches caused light diffusion. 25
Distribution of circle radius (waxing by car detailer) mean :4.40 std :0.20
# of circles
20
15
10
5
0 3.6
3.8
4
4.2 4.4 4.6 radius of circles (pixel)
4.8
5
5.2
Figure 12. Normal distributionofofreflected reflected light light spot waxed by by auto detailer). Figure 12. Normal distribution spotradiuses radiuses(area (area waxed auto detailer).
The data above was obtained from a camera angle of 30.To confirm whether these statistical characteristics can serve as the assessment criteria of the system, we analyzed the area waxed by the auto detailer using images taken from camera angles of 15, 30 and 45 and used the data from the 30 images for quality screening. In the following experiments, we used an intermediate value of camera angle (30) to ensure desired reflection effects and avoid unexpected overexposure for consistent image analysis results. Table 2 displays the radius analysis results from the different
Sensors 2018, 18, 1548
12 of 20
The data above was obtained from a camera angle of 30◦ .To confirm whether these statistical characteristics can serve as the assessment criteria of the system, we analyzed the area waxed by the auto detailer using images taken from camera angles of 15◦ , 30◦ and 45◦ and used the data from the 30◦ images for quality screening. In the following experiments, we used an intermediate value of camera angle (30◦ ) to ensure desired reflection effects and avoid unexpected overexposure for consistent image analysis results. Table 2 displays the radius analysis results from the different angles, and Figure 13 shows the corresponding image analysis results. The red circles indicate that the circle meets the threshold standard. As can be seen, the results from the different angles are very similar, which demonstrates that the data can serve as the assessment criteria for waxing quality. We set the mean plus and minus two standard deviations (±2σ) as the acceptable range for reflected light spot radiuses. Waxing results that are closer to the average value of the ones in Table 2 (134 samples) indicate greater quality, which is the objective of the proposed system. Table 2. Radius distributions of reflected light spots from different camera. Samples in Different Range (x = 4.40, œ = 0.20) Camera View (θ)
Total Samples
x±œ
x ± 2œ
15◦ 30◦ 45◦
144 144 144
96 samples 98 samples 97 samples
130 samples 135 samples 137 samples
Sensors 2018, 18, x FOR PEER REVIEW
(a)
(b)
(c)
(d)
(e)
(f)
12 of 20
Figure 13. Image Imageanalysis analysisresults results from different camera angles. (a) circle detection from Figure 13. from different camera angles. (a) circle detection results results from camera ◦ ◦ camera angle 15°; (b) circle screening results from camera angle 15° (130 samples); (c) circle detection angle 15 ; (b) circle screening results from camera angle 15 (130 samples); (c) circle detection results from results angle screening 30°; (d) circle results from angle 30° (135 samples); camera from angle camera 30◦ ; (d) circle resultsscreening from camera angle 30◦ camera (135 samples); (e) circle detection ◦ ◦ (e) circle detection results from camera angle 45°; (f) circle screening results from camera angle 45° results from camera angle 45 ; (f) circle screening results from camera angle 45 (137 samples). (137 samples).
6. Experiments and Performance Analysis The experiment in this study was divided into two parts: force tracking and wax image analysis. Due to the fact that force tracking performance will affect the evenness of the wax covering
Sensors 2018, 18, 1548
13 of 20
6. Experiments and Performance Analysis The experiment in this study was divided into two parts: force tracking and wax image analysis. Due to the fact that force tracking performance will affect the evenness of the wax covering the sheet metal while the robotic arm is waxing, the impedance control parameters must be adjusted in advance so that the waxing force of the robotic arm will remain constant and thereby prevent excessive error from affecting waxing quality or damaging the robotic arm or target sheet metal. Once suitable impedance parameters are obtained, a waxing experiment is conducted using these parameters with waxing force and dwell time as the impact factors of waxing quality so as to analyze and investigate the waxing results derived from different parameters and the pre-established assessment criteria. 6.1. Analysis of Force Tracking Performance The impedance control parameters (M, B and K) in Figure 8 all influence the force tracking error and response speed. By adjusting the impedance parameters, we change the response behavior of the system to achieve force tracking performance with stability, a short settling time, and no overshoot. If the damping ratio (ζ) is set as 1 and the settling time as less than 0.5 s, then, with the transient responses of a standard second-order dynamic system [35], we can derive that the natural frequency (ω n ) of the system must be greater than 8 rad/s to fulfill this constraint. In this experiment, we set the natural frequency as 10 rad/s; in terms of Fy , if the mass coefficient selected for the impedance controller is M = 3.5 kg, then the damping and stiffness coefficients are B = 70 N s/m and K = 350 N/m, respectively. We also chose the same set of parameters to achieve the impedance control of components Fy and Mx and used a single target point to perform waxing force tracking of the robotic arm and analyze its response results. As previously shown in Figure 7, when the system begins the waxing process, the end-effector must maintain vertical contact with the surface of the sheet metal to ensure that the waxing sponge is completely flat on the sheet metal. Thus, we must consider whether Fs , the force that is vertical to the sheet metal, reaches the target value Fw set by the system. At the same time, we must suppress the torque Mx created when the end-effector is not vertical to the sheet metal. Figure 14 presents the response results of force tracking when Fw is set as 15 N. As can be seen, the responses resulting from this set of parameters can quickly reach the target value, the settling time is significantly shorter than 0.5 s, no overshoot exists, and the absolute value of the steady state error is less than 0.3 N. The robotic arm showed some posture deviations during the time from 0.1 s to 0.4 s, which prevented the end-effector from being completely flat on the sheet metal surface and created a torque of approximately 3 N mm. However, posture compensation eliminated this after 0.4 s, so it did not affect the results of the waxing experiment. When Fw is set between 10 N and 20 N, the force tracking results of other fixed values present an identical trend.
this set of parameters can quickly reach the target value, the settling time is significantly shorter than 0.5 s, no overshoot exists, and the absolute value of the steady state error is less than 0.3 N. The robotic arm showed some posture deviations during the time from 0.1 s to 0.4 s, which prevented the end-effector from being completely flat on the sheet metal surface and created a torque of approximately 3 N mm. However, posture compensation eliminated this after 0.4 s, so it did not the1548 results of the waxing experiment. When Fw is set between 10 N and 20 N, the force tracking Sensorsaffect 2018, 18, 14 of 20 results of other fixed values present an identical trend.
Figure 14. Force tracking control results therobotic roboticwaxing waxing system system (F MM 3×3 3= 3.5, 3.5, Figure 14. Force tracking control results ofofthe (Fww= =15,15, ×diag(3.5, 3 = diag(3.5, 3×3 = diag(70, 70, 70), K 3×3 = diag(350, 350, 350)). B 3.5), B3.5), = diag(70, 70, 70), K = diag(350, 350, 350)). 3×3 3×3
6.2. Examination of Waxing Parameters
6.2. Examination of Waxing Parameters
In this experiment, we divided the area to be waxed on the sheet metal into six paths. Based on
In this experiment, we divided the area to be waxed on the sheet metal into six paths. Based on the previously established waxing paths, we adjusted the dwell time of each spot to change the the previously established waxing paths, weparameters adjusted the dwell time of each spot to change waxing speed. We also used the impedance established in Section 6.1 for waxing force the waxing speed. Wethe also used task the impedance parameters established Section fortowaxing control. After waxing was completed, we waited for 5 to 10 in min for the6.1 wax dry andforce control. AfterTothe waxing wasexperimental completed,process, we waited for 5cotton to 10 buffing min forcloth the wax to dry solidify. simplify thetask overall a normal was used to and remove the excess wax (the dewaxing/polishing process). With the camera angle set at 30, images of solidify. To simplify the overall experimental process, a normal cotton buffing cloth was used to thethe reflections of external point lights resulting from With different experimental were of remove excess wax (the dewaxing/polishing process). the camera angle conditions set at 30◦ , images processed,ofand the radius of each reflected light spot was calculated. conditions The resultswere wereprocessed, then the reflections external point lights resulting from different experimental compared with the assessment criteria that we established to verify the feasibility of the proposed and the radius of each reflected light spot was calculated. The results were then compared with the system. The accepted range of circle radiuses and the target number are indicated in Table 3, the assessment criteria that we established to verify the feasibility of the proposed system. The accepted latter being 134 circles and the threshold set at a difference under 10 circles. range of circle radiuses and the target number are indicated in Table 3, the latter being 134 circles and the threshold set at a difference under 10 circles. Table 3. Waxing image assessment criteria adopted in this study. Estimation of Radius
Criterion Definition
Mean value (x) Standard deviation (σ) Range of circle radius Average result from Table 2 Threshold
4.4 pixel 0.2 pixel x ± 2σ 134 samples 10 samples
6.2.1. Experiment Conditions and Parameter Settings We used Carnauba wax, which is the most common type of wax used in the auto detailing industry, for the waxing experiment. The waxing sponge, a fine waxing sponge specifically made for auto detailing, was 25 mm thick to prevent the vibrations caused by waxing from damaging the structure of the robotic arm, facilitate heat dissipation needed for high-speed motor rotations, and reduce the possibility of burning the wax and the sheet metal surface. Table 4 presents the waxing parameter settings of the system. As the force required in most waxing procedures ranges from 10 N
Sensors 2018, 18, 1548
15 of 20
to 20 N, we experimented with waxing forces 10 N, 13 N, 15 N, 17 N, and 19 N and chose a motor with rotational speed 850 rpm and torque output reaching 30 N-m for the waxing motor. For dwell time, we experimented with 0.1 s/pt, 0.2 s/pt, 0.3 s/pt, and 0.4 s/pt to facilitate subsequent investigations. In the experiment, we divided the y-direction into 500 equally spaced points (as shown in Figure 6) and used the curve fitting equation and Equation (3) in Section 3.3 to derive the target location of the end-effector, xd . Via inverse kinematics, we obtained the motor position command and then realized point-to-point motion control in the motor with the proportional-derivative (PD) control algorithm. Waxing of each fixed point was then performed from the bottom upwards according to each of the planned paths. The movements between paths were achieved using the xyz platform, coordinated with impedance control to prevent the robotic arm from applying excessive force when it comes into contact with the sheet metal. Table 4. Waxing experiment parameter settings. Waxing Motor Specifications
Waxing Force (From 10 to 19 N)
Dwell Time
Executing Time for all Paths (Approx.)
24 V—850 rpm (30 N-m)
10 N 13 N 15 N 17 N 19 N
0.1 s/pt 0.2 s/pt 0.3 s/pt 0.4 s/pt
5~6 min 10~11 min 15~16 min 20~21 min
6.2.2. Discussion of Waxing Results Table 5 presents the image analysis results from the five different waxing forces above, and Figure 15 displays the corresponding 3D plot. Observation revealed that, as the waxing force increased from 10 N to 15 N, the number of reflected light spots increased significantly because the wax filled the fine scratches more fully as the waxing force increased. Once the waxing force reached 17 N; however, the number of assessment criteria that met standards stagnated and began to decline. This is because the optimal waxing force for the Carnauba wax chosen for this system is around 15 N; excessive force affects the stacking effect of the wax on the sheet metal, preventing the wax from filling in the fine scratches normally and leading to the diffusion of the reflected light spots on the sheet metal surface. As a result, the image analysis results were poorer than those at 15 N. Table 5. Experimental results of different waxing parameters. Waxing Force
Waxing Force   Dwell Time   Executing Time   Estimated Result (Total: 144 Samples)
10 N           0.1 s/pt     5 min 28 s       70 samples
               0.2 s/pt     10 min 35 s      82 samples
               0.3 s/pt     15 min 24 s      89 samples
               0.4 s/pt     20 min 19 s      93 samples
13 N           0.1 s/pt     5 min 36 s       92 samples
               0.2 s/pt     10 min 21 s      102 samples
               0.3 s/pt     15 min 18 s      110 samples
               0.4 s/pt     20 min 30 s      115 samples
15 N           0.1 s/pt     5 min 23 s       110 samples
               0.2 s/pt     10 min 29 s      116 samples
               0.3 s/pt     15 min 34 s      121 samples
               0.4 s/pt     20 min 36 s      127 samples
17 N           0.1 s/pt     5 min 31 s       112 samples
               0.2 s/pt     10 min 33 s      117 samples
               0.3 s/pt     15 min 29 s      115 samples
               0.4 s/pt     20 min 24 s      110 samples
19 N           0.1 s/pt     5 min 36 s       115 samples
               0.2 s/pt     10 min 21 s      117 samples
               0.3 s/pt     15 min 18 s      115 samples
               0.4 s/pt     20 min 30 s      113 samples
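For reference, a plot like Figure 15 can be regenerated directly from the Table 5 data; the short matplotlib sketch below is an illustration, not the original plotting code.

```python
import numpy as np
import matplotlib.pyplot as plt

# Table 5 data: rows = waxing force (N), columns = dwell time (s/pt).
forces = np.array([10, 13, 15, 17, 19])
dwells = np.array([0.1, 0.2, 0.3, 0.4])
samples = np.array([[ 70,  82,  89,  93],
                    [ 92, 102, 110, 115],
                    [110, 116, 121, 127],
                    [112, 117, 115, 110],
                    [115, 117, 115, 113]])

# Plot the sample counts as a 3D surface over (force, dwell time).
D, F = np.meshgrid(dwells, forces)
ax = plt.figure().add_subplot(projection="3d")
ax.plot_surface(F, D, samples, cmap="viridis")
ax.set_xlabel("Waxing force (N)")
ax.set_ylabel("Dwell time (s/pt)")
ax.set_zlabel("Samples meeting criterion")
plt.show()
```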
As can be seen in Figure 15, the number of circles that meet the standard increases with dwell time and peaks at 127 when a waxing force of 15 N is paired with 0.4 s/pt. This number is less than the result shown in Table 3 by seven circles and within the threshold of 10 circles, which means that the quality is already very close to that achieved by the auto detailer. When the waxing force is greater than 15 N, the image analysis results do not improve as the dwell time increases. The reason is similar: excessive force affects the stacking effect of the wax on the sheet metal, and even if the waxing speed decreases, the wax cannot fill in the scratches on the sheet metal, which reduces the repairing effects of the wax. Thus, the results of increased dwell time paired with waxing forces of 17 N and 19 N were poorer than those with a waxing force of 15 N.
Figure 15. Experimental results of different waxing parameters—3D plot.
6.3. Waxing Parameter Optimization

The surface formed by the results in Figure 15 is unimodal, with a single peak. To derive the optimal parameters, we conducted 3D surface fitting with the experimental results from the different waxing forces using least squares regression, thereby producing a mathematical model for optimization.

6.3.1. Problem Formulation

The mathematical model for optimization comprises two parts: an objective function and constraints. Both parts are functions of the design variables. We adopted a fifth-order polynomial to fit the surface using least squares regression, as shown in Equation (4). Figure 16 presents the surface fitting results of the experimental data. The R-squared value (R² = 0.996) indicates that the fitted surface is very close to the distribution of the experimental results, so this quintic equation in two variables served as the objective function of the optimization in the proposed system. The coefficients are listed in Table 6:

z = f(x, y) = \sum_{n=0}^{5} \sum_{i=0}^{n} p_{(n-i,i)} x^{n-i} y^{i}.  (4)
Figure 16. Fifth-order surface fitting results.

Table 6. Coefficient table of the surface equation.

p00 = −7917      p10 = 3081       p01 = −3663      p20 = −471.6
p11 = 915.7      p02 = 9295       p30 = 35.63      p21 = −76.58
p12 = −1923      p03 = −8127      p40 = −1.324     p31 = 2.224
p22 = 159.7      p13 = −83.42     p04 = 18140      p50 = 0.01935
p41 = −0.008048  p32 = −4.448     p23 = 12.24      p14 = −255.9
p05 = −11600
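As an illustration of this fitting step, the sketch below builds the 21-term design matrix of Equation (4) from the Table 5 data and solves for the coefficients by least squares with NumPy. This is a reconstruction under assumptions, not our original MATLAB code; note that with 20 measurements and 21 terms the system is underdetermined, so lstsq returns the minimum-norm solution.

```python
import numpy as np

# x: waxing force (N), y: dwell time (s/pt), z: samples meeting the criterion.
# Values taken from Table 5.
x = np.repeat([10.0, 13.0, 15.0, 17.0, 19.0], 4)
y = np.tile([0.1, 0.2, 0.3, 0.4], 5)
z = np.array([ 70,  82,  89,  93,   92, 102, 110, 115,
              110, 116, 121, 127,  112, 117, 115, 110,
              115, 117, 115, 113], dtype=float)

# One design-matrix column per term p_(n-i,i) * x^(n-i) * y^i,
# n = 0..5, i = 0..n, matching Equation (4).
cols = [x ** (n - i) * y ** i for n in range(6) for i in range(n + 1)]
A = np.column_stack(cols)

# Least-squares estimate of the 21 polynomial coefficients (cf. Table 6).
coeffs, *_ = np.linalg.lstsq(A, z, rcond=None)

def f(xq, yq):
    """Evaluate the fitted surface z = f(x, y) at a query point."""
    terms = [xq ** (n - i) * yq ** i for n in range(6) for i in range(n + 1)]
    return float(np.dot(coeffs, terms))

print(f(15.0, 0.4))   # should be close to the measured 127 samples
```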
Table 7 presents the parameter constraints of the optimization problem. As mentioned in the previous section, excessive force is not suitable for the car wax chosen for this system. For this reason, we limited the waxing force to between 10 N and 17 N. In consideration of the overall efficiency of the system, we limited the dwell time to between 0.1 s/pt and 0.5 s/pt, so as to prevent an overly long dwell time from burning or wearing down the wax, which would then prevent the image analysis results from corresponding to the optimized parameters.

Table 7. Constraints of optimization.

Optimization Parameters of the Waxing System    Lower and Upper Bounds
Force (N)                                       From 10 N to 17 N
Dwell time (s/pt)                               From 0.1 s/pt to 0.5 s/pt
6.3.2. Optimization Results and Verification

We chose constrained nonlinear minimization [36] for parameter optimization and used the Matlab Optimization Toolbox (version 8.0, The MathWorks, Inc., Natick, MA, USA) to derive the optimal combination of waxing parameters. Constrained nonlinear minimization searches for the minimum value of the objective function within the constraints. Thus, we multiplied Equation (4) by −1 to flip the surface, as shown in Equation (5), to facilitate optimization:

Objective function: z = -f(x, y) = -\sum_{n=0}^{5} \sum_{i=0}^{n} p_{(n-i,i)} x^{n-i} y^{i}.  (5)
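The minimization step can be reproduced along the following lines. We used the MATLAB Optimization Toolbox; the sketch below substitutes SciPy's bound-constrained L-BFGS-B solver as a stand-in and reuses the fitted-surface evaluator f from the previous sketch, so the printed numbers are illustrative rather than the exact solver trace.

```python
import numpy as np
from scipy.optimize import minimize

# Negate the fitted surface so that a minimizer finds the peak, as in Equation (5).
# `f` is the fitted-surface evaluator from the previous fitting sketch.
def objective(v):
    force, dwell = v
    return -f(force, dwell)

# Bound constraints from Table 7: 10-17 N and 0.1-0.5 s/pt.
bounds = [(10.0, 17.0), (0.1, 0.5)]
x0 = np.array([13.0, 0.3])          # illustrative starting point

res = minimize(objective, x0, method="L-BFGS-B", bounds=bounds)
force_opt, dwell_opt = res.x
print(f"optimal force {force_opt:.1f} N, dwell time {dwell_opt:.2f} s/pt, "
      f"predicted {-res.fun:.0f} samples")
```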
The optimization process above included six iterations and 26 calculations. The results indicate that a waxing force of 15 N and a dwell time of 0.5 s/pt is the optimal parameter combination for the system, with a predicted 132 reflected light spot radii reaching the assessment criterion. Next, we performed the waxing experiment using this parameter combination, conducted image analysis of the results, and compared them with the assessment criteria.

Figure 17 presents the image analysis results of waxing with the optimal parameters. As can be seen in Figure 17a, the resulting reflection is very close to that in the area waxed by the auto detailer. The sheet metal displayed a mirror-like reflection of the LED light spots, and the number of circles that reached the assessment criterion was 130, which is extremely close to the 132 derived in optimization. Table 8 compares the execution results of the system with those of the auto detailer. Although the optimal parameters were used for waxing, the execution time needed by the system was still far longer than that needed by the auto detailer. This is because we used a smaller robotic arm and a smaller waxing sponge (diameter 80 mm) for the waxing procedure, so as to prevent overburdening the xyz platform during operation. In contrast, the sponge used by the auto detailer was 150 mm in diameter, which covered a much larger area, and this resulted in the difference in execution time. The experimental results verify the feasibility of the proposed system and demonstrate that the parameter optimization analysis can further improve actual waxing quality.

Table 8. Comparison of waxing results (proposed system: waxing force 15 N, dwell time 0.5 s/pt).

                     Executing Time    Estimated Result (Total: 144 Samples)
Car detailer         About 15 min      134 samples
The waxing system    25 min 16 s       130 samples
Figure 17. Waxing image analysis following system parameter optimization (waxing force 15 N, dwell time 0.5 s/pt). (a) Waxing results comparison (proposed system vs. car detailer); (b) captured image with the proposed system; (c) Hough circle transform results (130 circles).
7. Conclusions

This study developed a robotic waxing system for auto detailing and an image-based assessment method for performance evaluation and improvement. We adopted a depth camera to obtain the color and depth information of the sheet metal and then established waxing paths using image processing and coordinate conversion. In order for the robotic arm to maintain constant waxing force and better compliance, we adopted a position-based impedance control framework to realize waxing force control. With the waxing results of an experienced auto detailer as the standard, we analyzed images of LED point light reflections on the waxed sheet metal to develop a set of assessment criteria for the waxing system. We performed experiments and parameter optimization analysis with two parameters, waxing force and dwell time, which served as the primary impact factors of waxing quality. The results indicate that the proposed system can successfully achieve the waxing quality of an experienced auto detailer and provide a reference for the further development of fully automated waxing systems in auto detailing.

Author Contributions: C.-Y.L. conceived and designed the robotic waxing system, analyzed the results, and wrote the paper. B.-C.H. performed the automatic waxing experiments and the data collection for performance evaluation.

Funding: This research was funded by the Ministry of Science and Technology, Taiwan, under grant number MOST-105-2628-E-011-004.

Conflicts of Interest: The authors declare no conflict of interest.
References

1. Kumar, S. Theories of musculoskeletal injury causation. Ergonomics 2001, 44, 17–47.
2. Wagner, H.R. Apparatus for Washing and Waxing Cars. U.S. Patent 3447505, 3 June 1969.
3. Falls, J.W. Automatic Car Waxer. U.S. Patent 5076202, 31 December 1991.
4. Hung, V. Car Waxing Machine with Driving Handle. U.S. Patent 6725491, 27 April 2004.
5. Wu, X.; Wu, X.; Guo, H. Combination Automobile Detailing Machine. U.S. Patent 11518864, 11 September 2006.
6. Ning, F.; Cong, W.; Wang, H.; Hu, Y.; Hu, Z.; Pei, Z. Surface grinding of CFRP composites with rotary ultrasonic machining: A mechanistic model on cutting force in the feed direction. Int. J. Adv. Manuf. Technol. 2017, 92, 1217–1229.
7. Sallinen, M.; Heikkila, T.; Sirvio, M. Robotic Deburring System of Foundry Castings Based on Flexible Workobject Localization; SPIE: Boston, MA, USA, 2001; pp. 476–487.
8. Hsu, F.Y.; Fu, L.C. Intelligent robot deburring using adaptive fuzzy hybrid position/force control. IEEE Trans. Robot. Autom. 2000, 16, 325–335.
9. Mohsin, I.; He, K.; Cai, J.; Chen, H.; Du, R. Robotic polishing with force controlled end effector and multi-step path planning. In Proceedings of the IEEE International Conference on Information and Automation, Macau, China, 18–20 July 2017; pp. 344–348.
10. Chen, F.; Hao, S.; Miao, X.; Yin, S.; Huang, S. Numerical and experimental study on low-pressure abrasive flow polishing of rectangular microgroove. Powder Technol. 2018, 327, 215–222.
11. Han, G.; Sun, M. Compound control of the robotic polishing process basing on the assistant electromagnetic field. In Proceedings of the IEEE International Conference on Mechatronics and Automation, Changchun, China, 9–12 August 2009; pp. 1551–1555.
12. Sharma, K.; Shirwalkar, V.; Das, A.P.; Pal, P.K. Robotic polishing of pilger-die. In Proceedings of the Conference on Advances in Robotics, Goa, India, 2–4 July 2015.
13. Jamisola, R.; Ang, M.H.; Oetomo, D.; Khatib, O.; Ming, T.; Lim, S.Y. The operational space formulation implementation to aircraft canopy polishing using a mobile manipulator. In Proceedings of the IEEE International Conference on Robotics and Automation, Washington, DC, USA, 11–15 May 2002; pp. 400–405.
14. Zhang, D.; Yun, C.; Song, D.; Lan, X.; Wu, X. Design and application for robotic polishing system. In Proceedings of the IEEE International Conference on Mechanic Automation and Control Engineering, Wuhan, China, 26–28 June 2010; pp. 2470–2474.
15. Guvenc, L.; Srinivasan, K. Force controller design and evaluation for robot-assisted die and mould polishing. Mech. Syst. Signal Process. 1995, 9, 31–49.
16. Lu, Y.Y.; Dong, J.H. The study of force control with artificial intelligence in ceramic grinding process. In Proceedings of the IEEE International Symposium on Intelligence Information Processing and Trusted Computing, Huanggang, China, 28–29 October 2010; pp. 208–211.
17. Liu, H.; Wan, Y.; Zeng, Z.; Xu, L.; Zhao, H.; Fang, K. Freeform surface grinding and polishing by CCOS based on industrial robot. In Proceedings of the International Symposium on Advanced Optical Manufacturing and Testing Technologies, Suzhou, China, 26–29 April 2016.
18. Zhao, T.; Shi, Y.; Lin, X.; Duan, J.; Sun, P.; Zhang, J. Surface roughness prediction and parameters optimization in grinding and polishing process for IBR of aero-engine. Int. J. Adv. Manuf. Technol. 2014, 74, 653–663.
19. Akafuah, N.K.; Poozesh, S.; Salaimeh, A.; Patrick, G.; Lawler, K.; Saito, K. Evolution of the automotive body coating process—A review. Coatings 2016, 6, 24.
20. Placido, F.; Birney, R.; Kavanagh, J. Investigation of automotive detailing products by ellipsometry and contact angle analysis. Acta Phys. Pol. A 2009, 116, 712–714.
21. Gomez, O.; Perales, E.; Chorro, E.; Burgos, F.J.; Viqueira, V.; Vilaseca, M.; Martinez-Verdu, F.M.; Pujol, J. Visual and instrumental assessments of color differences in automotive coatings. Color Res. Appl. 2016, 41, 384–391.
22. Sharifzadeh, M.; Alirezaee, S.; Amirfattahi, R.; Sadri, S. Detection of steel defect using the image processing algorithms. In Proceedings of the IEEE International Multitopic Conference, Karachi, Pakistan, 23–24 December 2008; pp. 125–127.
23. Besari, A.R.A.; Zamri, R.; Rahman, K.A.A.; Palil, M.D.M.; Prabuwono, A.S. Surface defect characterization in polishing process using contour dispersion. In Proceedings of the IEEE International Conference on Soft Computing and Pattern Recognition, Malacca, Malaysia, 4–7 December 2009; pp. 707–710.
24. Samtaş, G. Measurement and evaluation of surface roughness based on optic system using image processing and artificial neural network. Int. J. Adv. Manuf. Technol. 2014, 73, 353–364.
25. Khoo, S.W.; Karuppanan, S.; Tan, C.S. A review of surface deformation and strain measurement using two-dimensional digital image correlation. Metrol. Meas. Syst. 2016, 23, 461–480.
26. Simunovic, G.; Svalina, I.; Simunovic, K.; Saric, T.; Havrlisan, S.; Vukelic, D. Surface roughness assessing based on digital image features. Adv. Prod. Eng. Manag. 2016, 11, 93–140.
27. Liu, E.; Liu, J.; Gao, R.; Yi, H.; Wang, W.; Suo, X. Designing indices to measure surface roughness based on the color distribution statistical matrix (CDSM). Tribol. Int. 2018, 122, 96–107.
28. Besari, A.R.A.; Prabuwono, A.S.; Zamri, R.; Palil, M.D.M. Computer vision approach for robotic polishing application using artificial neural networks. In Proceedings of the IEEE Student Conference on Research and Development, Kuala Lumpur, Malaysia, 13–14 December 2010; pp. 281–286.
29. Luh, J.Y.S.; Lin, C.S. Optimum path planning for mechanical manipulators. J. Dyn. Syst. Meas. Control 1981, 103, 142–151.
30. Tsai, L.W. Robot Analysis: The Mechanics of Serial and Parallel Manipulators; Wiley: New York, NY, USA, 1999.
31. Zhang, Z. A flexible new technique for camera calibration. IEEE Trans. Pattern Anal. Mach. Intell. 2000, 22, 1330–1334.
32. Jung, S.; Hsia, T.C.; Bonitz, R.G. Force tracking impedance control of robot manipulators under unknown environment. IEEE Trans. Control Syst. Technol. 2004, 12, 474–483.
33. Canny, J. A computational approach to edge detection. IEEE Trans. Pattern Anal. Mach. Intell. 1986, 8, 679–698.
34. Ballard, D.H. Generalizing the Hough transform to detect arbitrary shapes. Pattern Recognit. 1981, 13, 111–122.
35. Norman, S.N. Control Systems Engineering; Wiley: New York, NY, USA, 2011.
36. Luenberger, D.G.; Te, Y. Linear and Nonlinear Programming; Springer: New York, NY, USA, 2008.

© 2018 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (http://creativecommons.org/licenses/by/4.0/).