
A Multiple Sensors Platform Method for Power Line Inspection Based on a Large Unmanned Helicopter

Xiaowei Xie 1, Zhengjun Liu 1,*, Caijun Xu 2 and Yongzhen Zhang 1

1 Institute of Photogrammetry and Remote Sensing, Chinese Academy of Surveying and Mapping, Beijing 100830, China; [email protected] (X.X.); [email protected] (Y.Z.)
2 School of Geodesy and Geomatics, Wuhan University, Wuhan 430079, China; [email protected]
* Correspondence: [email protected]; Tel.: +86-010-6388-0536

Academic Editors: Cheng Wang, Julian Smit, Ayman F. Habib and Michael Ying Yang Received: 31 March 2017; Accepted: 22 May 2017; Published: 26 May 2017

Abstract: Many theoretical and experimental studies have been carried out in order to improve the efficiency and reduce the labor required for power line inspection, but problems related to stability, efficiency, and comprehensiveness still exist. This paper presents a multiple sensors platform method for overhead power line inspection based on the use of a large unmanned helicopter. Compared with existing methods, multiple sensors can realize synchronized inspection of all power line components and surrounding objects within one sortie. Flight safety of the unmanned helicopter, scheduling of the sensors, and exact tracking of power line components are very important aspects when using the proposed multiple sensors platform. Therefore, this paper introduces in detail the planning method for the flight path of the unmanned helicopter and the tasks of the sensors before inspecting power lines, and the method used for tracking power lines and insulators automatically during the inspection process. To validate the method, experiments on a transmission line at Qingyuan in Guangdong Province were carried out; the results show that the proposed method is effective for power line inspection.

Keywords: unmanned helicopter; power line inspection; automatic target tracking

1. Introduction

With the development of the global economy and population, the power demand keeps rising, leading to an increasing expansion of overhead transmission lines, a major part of which needs to cross regions with relatively adverse natural environments and complex terrain, such as rivers, mountains, forests and even depopulated zones. Because the power lines and towers suffer from long-term exposure to environmental conditions in the field, combined with the influences of material aging, lightning strikes, and interference from trees, malfunctions of electric power equipment are highly likely to happen, including broken conductors, damaged or overheating insulators, tilted towers, and discharges caused by trees [1,2]. Therefore, electric power departments need to conduct regular inspections of electric transmission lines to detect potential abnormal conditions and to conduct timely maintenance. The current inspection of electric transmission lines mainly depends on manpower inspection and manned helicopter inspection. However, for manpower inspection the labor intensity is quite high, the working conditions are poor and the efficiency is low, whereas for manned helicopter inspection the cost is quite high and the operation is risky.

In recent years, a variety of schemes have been proposed to replace manual inspection and manned helicopter inspection, which can be divided into three categories. The first category is the use of various kinds of robots for power line inspection [3–9], for example, the "LineScout" developed by the Hydro-Québec research institution (Varennes, QC, Canada), the "Expliner" developed by the Tokyo Institute of Technology (Tokyo, Japan), and the "TI" developed by the Electric Power Research Institute (Palo Alto, CA,
USA), etc. These robots are characterized by high safety and low cost and can be used in continuous inspection for fault diagnosis of power lines; the deficiency is that they cannot support inspection of other power line components and surrounding objects. The second category is inspection methods using cameras or video cassette recorders (VCRs) based on unmanned aerial vehicles (UAVs); examples can be seen in [10–19]. Power line components and surrounding objects can be inspected by using different sensors in different sorties. However, limited by the short flight time of UAVs, it is unfeasible to conduct long-distance inspection of power lines. The third category is inspection methods using typical photogrammetric or LiDAR techniques based on manned aerial vehicles, which are more effective and can be used for long-distance inspection, but the details of the power line components cannot be obtained by these methods [20–23].

In this paper, we propose a multiple sensors platform method for power line inspection. Compared to the schemes mentioned above, the use of multiple sensors can support inspection of all power line components and surrounding objects within one sortie. A large unmanned helicopter is used here to meet the requirements for carrying the multiple sensors platform and to support power line inspection over a long distance. Therefore, it is more effective and more comprehensive for power line inspection. When the multiple sensors platform is used for power line inspection, the flight safety of the unmanned helicopter, the scheduling of the sensors and the exact tracking of power line components are very important aspects. This paper presents the method of using the multiple sensors platform, which includes two steps: the first is to plan the flight path of the unmanned helicopter and the coordinates needed to perform the tasks for each sensor; the second is to complete all tasks automatically based on an exact target tracking algorithm, through which we can acquire different data about power line components or surrounding objects efficiently and accurately with different sensors during the inspection process.

This paper is organized as follows: Section 2 introduces the components of the multiple sensors platform. Section 3 introduces the planning of the flight paths of the unmanned helicopter and the tasks of each sensor before inspecting the power lines. Section 4 introduces the automatic tracking method used during the power line inspection process. Section 5 validates the dependability of the method presented in this paper on the basis of experiments, followed by conclusions in Section 6.

2. The Composition of the Multiple Sensors Platform

The multiple sensors platform includes two parts, an inertial stabilized platform and a sensor pod, as shown in Figure 1a. A two-axis inertial stabilized platform is adopted here [24], which is used to adjust the attitude of the sensor pod so that the sensors' line of sight (LOS) can track the targets accurately in real time. The sensor pod is the unit where all the sensors are placed; it integrates six sensors in total, including the position and orientation system (POS), light detection and ranging (LiDAR), a thermal camera, an ultraviolet camera, a long-focus camera and a short-focus camera, as shown in Figure 1b. Among the sensors, the lines of sight of the thermal camera, the ultraviolet camera, the long-focus camera and the short-focus camera are parallel to each other.
POS obtains the real-time attitude and coordinates of the sensor pod, and it is used for real-time target tracking during the power line inspection and for data analysis after the inspection. The high-precision point cloud acquired by the LiDAR can be used for monitoring the distance between the electric transmission lines and the surroundings (trees in particular). The thermal infrared data collected by the thermal camera can be used for monitoring the heat status of facilities including electric power fittings, power lines and insulators. The ultraviolet videos acquired by the ultraviolet camera can be used for monitoring abnormal discharges of electric power fittings, power lines and insulators. Images acquired by the short-focus camera can be used to inspect whether there are broken conductors in the shield wires and tilted towers. High-resolution images acquired by the long-focus camera can be used for diagnosing anomalies including missing pins, burst insulators and corroded electric power fittings. Parameters of the sensors are shown in Table 1, including sensor name, product model, focal length, field of view (FOV), and data type.

Figure 1. Appearance of the multiple sensors platform: (a) Overall picture of the multiple sensors platform; (b) Appearance of the sensor pod.

Table 1. Parameters of the sensors.

Sensor Name              Product Model       Focal Length (mm)   FOV (H × V)    Data Type
LiDAR                    Riegl VZ-400        -                   -              Point cloud
Thermal camera           customized          100                 9° × 6°        Video
Ultraviolet camera       customized          -                   9° × 6.75°     Video
The short-focus camera   Canon 5D Mark II    35                  54° × 38°      Image
The long-focus camera    Canon 5D Mark II    180                 11° × 7.6°     Image

3. Planning of Flight Paths and Tasks of the Sensors

The planning of the flight paths of the unmanned helicopter and the tasks of the sensors includes two parts: the first is to design the flight paths of the unmanned helicopter while inspecting the electric transmission lines; the second is to plan the tasks of all sensors while the unmanned helicopter is flying along the flight paths. Current UAV power line inspection generally adopts one sensor; its flight path planning mostly involves connecting and smoothing a series of flight-path segments referring to the towers of the power lines, which fails to consider other influencing factors, and the sensor mostly works continuously in time-lapse mode, so there is no need to plan sensor tasks [25–27]. Different from the current flight path planning, the flight control system of the unmanned helicopter presented in this paper can automatically start the smoothing function according to the waypoints, so we only need to provide waypoints, with no smoothing. In addition, while planning the waypoints, we also take into account the influence of complex terrain (such as mountains) and non-target power lines. On the other hand, using multiple sensors can support inspection of all power line components and surrounding objects within one sortie, so the scheduling of the sensors is a very important aspect. Therefore, this paper presents the planning method for the tasks of all sensors during the inspection process.

Before planning the flight paths and the tasks of the sensors, it is assumed that the high-precision 3D point cloud of the entire electric transmission line has been acquired (the LiDAR on the multiple sensors platform can be used to acquire the point clouds); thus the precise 3D coordinates of the towers, insulators and power lines can be obtained, as well as the digital surface model (DSM) data of the entire transmission line corridor, of which the resolution and elevation accuracy are 15 m and 3 m, respectively.

3.1. Waypoints Planning of the Unmanned Helicopter

There are three reasons for planning the waypoints of the unmanned helicopter. First, unmanned helicopters have a maximum flight time, which is limited by their fuel carrying capacity. Second, the distance between the unmanned helicopter and the power lines must be larger than a safe distance. Finally, the closer the unmanned helicopter is to the electric transmission lines, the better the data collected by the sensors is, and to better acquire data from the insulators on each tower, it is necessary to slow down at each tower. Therefore, it is necessary to plan the waypoints in a way that not only guarantees a safe and effective inspection, but also ensures good data quality. The planning of the waypoints for inspecting electric transmission lines with the unmanned helicopter includes three steps. The first step is to create the referential waypoints according to the coordinates of the towers of the electric transmission lines that need to be inspected. The second step is to optimize the referential waypoints based on the DSM data and the coordinates of non-target electric transmission lines. Finally, based on the waypoints of the unmanned helicopter, the operating parameters of the LiDAR can be calculated.

3.1.1. Referential Waypoints Planning

Assume that D1 is the horizontal safe distance between the unmanned helicopter and the power lines, and D2 is the vertical safe distance between the unmanned helicopter and the power lines. As shown in Figure 2, T(x, y, z, w) represents the n towers on the electric transmission lines, where (x, y, z, w) are the centric coordinates at the top and the width of each tower; these data can be acquired from the 3D point cloud data of the electric transmission lines. First, we connect the top points of every two neighboring towers to obtain a series of line segments T1T2, T2T3, ..., Tn−1Tn. Secondly, all the line segments are moved horizontally by D1 + w/2 to obtain a series of cross points C = {C1, C2, ..., Cn}, and then C is moved vertically by D2 to obtain the temporary waypoints {R1, R2, ..., Rn}. Finally, we extend a distance S along R2R1 to obtain point R0 as the inlet point, and extend a distance S along Rn−1Rn to obtain point Rn+1 as the appearance point; R = {R0, R1, ..., Rn, Rn+1} are the final waypoints, where S is set according to the practical situation.

Figure 2. The principle of referential waypoints planning.
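To make the geometry above concrete, the following Python sketch builds the referential waypoints from a list of tower records. It is only an illustration under simplifying assumptions (a fixed offset side, at least two towers, and the helper name referential_waypoints chosen by us), not the implementation used in the paper.

import numpy as np

def referential_waypoints(towers, D1, D2, S):
    """Sketch of Section 3.1.1: towers is an ordered list of (x, y, z, w)
    tower-top records; D1/D2 are the horizontal/vertical safe distances and
    S is the inlet/appearance extension distance."""
    pts = np.array([[x, y, z] for x, y, z, w in towers], dtype=float)
    widths = np.array([w for _, _, _, w in towers], dtype=float)

    waypoints = []
    for i in range(len(pts)):
        # local along-line direction taken from the neighbouring towers
        a, b = pts[max(i - 1, 0)], pts[min(i + 1, len(pts) - 1)]
        d = (b - a)[:2]
        d /= np.linalg.norm(d)
        n = np.array([-d[1], d[0]])          # horizontal normal (one side only)
        r = pts[i].copy()
        r[:2] += n * (D1 + widths[i] / 2.0)  # cross point Ci: offset by D1 + w/2
        r[2] += D2                           # waypoint Ri: lift by D2
        waypoints.append(r)

    waypoints = np.array(waypoints)
    d0 = waypoints[0] - waypoints[1]
    dn = waypoints[-1] - waypoints[-2]
    inlet = waypoints[0] + S * d0 / np.linalg.norm(d0)        # R0
    appearance = waypoints[-1] + S * dn / np.linalg.norm(dn)  # Rn+1
    return np.vstack([inlet, waypoints, appearance])

In practice the side of the line (the sign of the normal) would be chosen per flight leg, and the result would then go through the optimization described in Section 3.1.2.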

3.1.2. Optimization of Waypoints

Unavoidably, besides the target electric transmission lines that need to be inspected, sometimes there are also non-target transmission lines in the same area. Therefore, we also need to ensure a safe distance between the unmanned helicopter and the non-target transmission lines. In addition, given the complex environment of the transmission line corridor, it is also necessary to keep a safe distance between the unmanned helicopter and the ground to guarantee the safety of the unmanned helicopter. We assume that D3 is the safe distance between the unmanned helicopter and the ground.

The optimization of the waypoints is to check and handle security risks in the referential waypoints, based on the DSM data and the coordinates of non-target transmission lines. The detailed steps of optimizing the waypoints are as follows. For the auxiliary waypoints obtained in the following steps, the unmanned helicopter does not slow down at these waypoints; therefore, each flight-path segment RiRi+1 is treated here as a whole together with the auxiliary waypoints included in it, as Ri...Rp...Ri+1:

(a) For any segment RiRi+1, disperse it into n temporary points according to a step length.
(b) Starting from Ri, we take each temporary point p as the center and D3 as the radius to draw a circle. Based on the DSM data, it is estimated whether the distance between the coordinates of point p and the ground within the circle is greater than D3. If it is greater, the following steps are continued; if it is less, an auxiliary waypoint Rp is added at point p. The height of Rp should make the distance between Rp and all ground points within the circle greater than D3. When point p is too close to a certain waypoint, this waypoint can be heightened to ensure that the height of point p meets the requirement, without the necessity of adding new auxiliary waypoints.
(c) Taking point p as the center and D1 as the radius, re-draw a circle to estimate whether there are non-target power lines falling into the new circle, as shown in Figure 3.
(d) If there are non-target power lines falling into the new circle and crossing axle b, and the distance between point p and the power lines is r′, it is necessary to add an auxiliary waypoint Rp acquired by moving p outward by D1 − r′.
(e) If there are power lines falling into this circle and crossing axle a, it is necessary to estimate whether the vertical distance between point p and the power line is greater than D2. If it is greater, nothing needs to be done; if it is less, it is necessary to add the auxiliary waypoint Rp, of which (x, y) can be acquired from the intersection point between the power lines and axle a, and z is the height of the power lines plus D2. Similarly, when point p is too close to a certain waypoint, it is feasible to directly heighten this waypoint to ensure that the safe distance between point p and the lines meets the requirement.
(f) Repeat Steps b–e to complete the optimization of all waypoints. It needs to be noted that, once a new auxiliary waypoint is added in Steps b, d or e, it is necessary to recalculate the dispersed temporary points from the last waypoint, and then repeat Steps b–e.

Figure 3. The interference of non-target power lines: (a) Power lines in parallel; (b) Power lines cross over.
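A minimal sketch of the ground-clearance part of this procedure (Steps a and b) is given below, assuming a dsm_height(x, y) lookup function and a fixed sampling of the circle; the non-target line checks of Steps c–e and the recalculation of Step f are omitted, so this is an illustration rather than the authors' implementation.

import numpy as np

def optimize_segment(r_i, r_next, D3, step, dsm_height, sample=20):
    """Sketch of Steps (a)-(b) in Section 3.1.2: walk along one flight-path
    segment, check the clearance to the DSM inside a circle of radius D3
    around every temporary point, and add an auxiliary waypoint where the
    clearance is violated.  dsm_height(x, y) is an assumed DSM lookup."""
    r_i, r_next = np.asarray(r_i, float), np.asarray(r_next, float)
    length = np.linalg.norm(r_next - r_i)
    n_pts = max(int(length // step), 1)
    auxiliary = []
    for k in range(1, n_pts):
        p = r_i + (r_next - r_i) * k / n_pts
        # sample the ground inside the circle of radius D3 around point p
        ang = np.linspace(0.0, 2.0 * np.pi, sample, endpoint=False)
        gx = p[0] + D3 * np.cos(ang)
        gy = p[1] + D3 * np.sin(ang)
        ground = max(dsm_height(x, y) for x, y in zip(gx, gy))
        ground = max(ground, dsm_height(p[0], p[1]))
        if p[2] - ground < D3:
            # lift an auxiliary waypoint so that the clearance becomes D3
            auxiliary.append(np.array([p[0], p[1], ground + D3]))
    return auxiliary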

3.1.3. Operating Parameters of LiDAR

In order to ensure that the data acquired by the LiDAR meet the requirements, reasonable operating parameters need to be calculated before power line inspection. The longitudinal and lateral distances between laser footprints (dlon, dlat) can be calculated according to the following formulas:

dlon = v / (2 fs)    (1)

dlat = 4 h tan(θ/2) / F    (2)

h = sqrt(D1² + D2²)    (3)

θ = tan−1((B − D1)/H) + tan−1((B + D1 + w′)/H)    (4)

θs = π/2 + tan−1((D1 + w′/2)/D2) − tan−1((B + D1 + w′)/H)    (5)

θe = θs + θ    (6)

In the formulas above, v represents the flight speed of the unmanned helicopter, fs represents the scanning speed, h represents the distance between the LiDAR and the power lines, F represents the pulse repetition rate, θ represents the scanning angle, B represents the distance that needs to be covered on each side of the power lines, w′ represents the average width of the power lines, H represents the average flying height relative to the ground, θs represents the starting scanning angle, and θe represents the terminational scanning angle, as shown in Figure 4. According to the requirements on dlon and dlat for power line inspection, (fs, F, θs, θe) can be determined.

Figure 4. The scanning angle of LiDAR.
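As a quick illustration, Equations (1)–(6) can be evaluated with a few lines of Python; the function name and the values in the example call below are ours and purely illustrative, not the settings used in the experiments.

import math

def lidar_footprint_spacing(v, fs, F, D1, D2, B, w_line, H):
    """Evaluate Equations (1)-(6) of Section 3.1.3."""
    h = math.hypot(D1, D2)                                                # Eq. (3)
    theta = math.atan((B - D1) / H) + math.atan((B + D1 + w_line) / H)    # Eq. (4)
    d_lon = v / (2.0 * fs)                                                # Eq. (1)
    d_lat = 4.0 * h * math.tan(theta / 2.0) / F                           # Eq. (2)
    theta_s = (math.pi / 2.0
               + math.atan((D1 + w_line / 2.0) / D2)
               - math.atan((B + D1 + w_line) / H))                        # Eq. (5)
    theta_e = theta_s + theta                                             # Eq. (6)
    return d_lon, d_lat, math.degrees(theta_s), math.degrees(theta_e)

# illustrative call only: 5 m/s flight speed, 40 lines/s, 300 kHz pulse rate
print(lidar_footprint_spacing(v=5.0, fs=40.0, F=300e3,
                              D1=30.0, D2=40.0, B=30.0, w_line=20.0, H=60.0))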

3.2. Planning of the Tasks of the Sensors

Task planning is the realization of the scheduling of the sensors. The LiDAR, thermal camera and ultraviolet camera can start work directly when the unmanned helicopter reaches the starting point and can stop when it reaches the ending point; it is only necessary to set task points at the starting point and the ending point for starting and stopping data collection. Therefore, the major part of the planning of the tasks of the sensors is to plan the task points for the short-focus camera to track the power lines and for the long-focus camera to track the insulators. Task points represent the initial positions for the cameras to carry out the photo-taking tasks; each task point corresponds to a target position. Here the thermal camera and the ultraviolet camera are not taken into consideration, because the lines of sight of the thermal camera, ultraviolet camera, short-focus camera and long-focus camera are parallel to each other.


As shown in Figure 5a, assume that TiTi+1 is a segment of power lines composed of two neighboring towers, and the length is TL. The corresponding flight segment of the unmanned helicopter is RiRi+1, and the length is RL; Rc is a temporary point close to Ri+1.

Figure 5. The principle of task planning: (a) The layout of task points for two cameras; (b) The order to shoot insulators of the long-focus camera.

To ensure that the short-focus camera and the long-focus camera can accomplish all tasks when the unmanned helicopter is flying through the flight segment, task points for the short-focus camera to track the power lines of TiTi+1 are set on RiRc, and task points for the long-focus camera to track the insulators on tower Ti+1 are set on RcRi+1. The detailed steps are as follows:

(a) According to the number of insulators on the right side of tower Ti+1, we can get the number M of task points for the long-focus camera. The centric coordinates of the insulators are used as target positions, and the shooting order is from bottom to top and from left to right, as shown in Figure 5b. The insulators on the other side of the towers are inspected through bilateral flying.
(b) To ensure that there is enough time for the implementation, we set the minimum distance between every two task points as ts; multiplied by M, this gives RL2 = ts × M, which is the flight distance of the unmanned helicopter while the long-focus camera is tracking the insulators. The start point Rc of the task points of the long-focus camera can be obtained based on RL2 and Ri+1, and the spacing between every two task points is ts.
(c) Calculate the image width W according to the FOV of the short-focus camera and the distance between the camera and the power lines; the distance here adopts D1 + w/2, which is mentioned in Section 3.1. Starting from Ti, calculate the number of photos the short-focus camera needs to take between TiTi+1 in the horizontal direction; with the overlap q, N = TL/((1 − q) × W) + 1. The centric position of each image is used as the target position for tracking.
(d) Then plan N uniformly-spaced task points for the short-focus camera on the flight segment RiRc, of which the length is RL1. However, when RL < (N + M) × ts, all the task points of the short-focus camera need to be completed starting from Ri based on a separation distance of ts. The task points of the long-focus camera are then put off in order, and the exceeded task points are all set at the position of Ri+1. Meanwhile, the unmanned helicopter is set to hover at the position of Ri+1, and the hovering duration is determined according to the number of exceeded tasks and the minimum duration needed to implement the tasks.

According to the steps above, we can obtain the coordinates of all task points in the flight segment RiRi+1 and the corresponding targets.
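The arithmetic of Steps (a)–(d) can be summarized in a short sketch. The fragment below assumes a straight flight segment and takes M, ts, W and q directly as inputs; it only illustrates how the task points are spaced and how the overflow case is handled, and is not the authors' planning code.

import numpy as np

def plan_task_points(r_i, r_next, TL, M, ts, W, q):
    """Sketch of Section 3.2: short-focus task points on Ri..Rc and
    long-focus task points on Rc..Ri+1 for one flight segment."""
    r_i, r_next = np.asarray(r_i, float), np.asarray(r_next, float)
    RL = np.linalg.norm(r_next - r_i)
    u = (r_next - r_i) / RL                 # unit vector along the segment

    N = int(TL / ((1.0 - q) * W)) + 1       # number of short-focus photos
    RL2 = ts * M                            # length reserved for the insulators
    RL1 = RL - RL2                          # length left for the power line

    if RL >= (N + M) * ts:
        # enough room: spread the N short-focus points uniformly over Ri..Rc
        short_pts = [r_i + u * (RL1 * k / max(N - 1, 1)) for k in range(N)]
        long_start = RL1
    else:
        # overflow: pack the short-focus points at spacing ts from Ri, push the
        # long-focus points back and clamp the excess to Ri+1 (hovering case)
        short_pts = [r_i + u * (ts * k) for k in range(N)]
        long_start = ts * N
    long_pts = [r_i + u * min(long_start + ts * (k + 1), RL) for k in range(M)]
    return short_pts, long_pts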

4. Automatic Target Tracking

This paper adopts a double closed-loop control method to achieve automatic target tracking. The outer loop adopts the distance control method, enabling the camera to start aiming and tracking when the helicopter is close enough to the task point, to ensure the orderly implementation of the inspection tasks. The inner loop adopts the attitude control method, which uses the real-time coordinates of the projection center of the camera obtained from POS and the coordinates of the targets to be tracked to calculate the pitch and heading, adjusting the posture of the camera by gradually adjusting the stabilized platform; the requirement is met when the difference between the posture of the camera and the calculated pitch and heading is small enough. Real-time DGPS is adopted here to ensure the precision of the coordinates of the camera.

The principle of the algorithm is shown in Figure 6. The first step is to obtain the coordinates of the current task point by initializing the task list. The second is to check in real time, according to the distance control method, whether the current task has been entered. The third is to calculate the heading and pitch angle of the target from the coordinates of the target point and the centric coordinates of the camera, which are corrected for installation errors, and to send the angles to the platform to adjust the posture of the camera. The last is to estimate in real time whether the current attitude of the camera meets the requirements for photo-taking, send the take-photo instruction to the camera, and continue to the next task if it meets the requirements.


Figure 6. The principle of automatic target tracking.

One important to note this method is that there a time limit executionofofeach One important thingthing to note aboutabout this method is that there is aistime limit forfor thethe execution each task because the distance between each two task points is fixed based on the tasks planning. task because the distance between each two task points is fixed based on the tasks planning. Therefore, Therefore, once the task has not been completed within the specified time, it will be given up and the once the task has not been completed within the specified time, it will be given up and the next task will be started. There are three critical parts in the entire task implementation, which are real-time correction of installation errors, distance control and attitude control, the details in Sections 4.1–4.3.


4.1. Real-Time Correction of Installation Errors

During the inspection process on power lines, the real-time coordinates of the phase center of the GPS antenna output by POS are obtained directly. To ensure the precision of the automatic tracking, it is necessary to transfer the real-time coordinates of the phase center of the GPS antenna to the real-time coordinates of the photographing center of the cameras by correcting the installation errors:

(x_c^e, y_c^e, z_c^e)^T = R_n^e(L, B) R_l^n(R, P, H) R_g^l(α, β, γ) [ R_c^g(φ, ω, k) (Δx_w^c, Δy_w^c, Δz_w^c)^T + (Δx_g, Δy_g, Δz_g)^T ] + (x_e, y_e, z_e)^T    (7)

In the equation above, R_n^e(L, B) is the rotation matrix from the navigation coordinate system to the geocentric coordinate system. R_l^n(R, P, H) is the rotation matrix from the IMU coordinate system to the navigation coordinate system. R_g^l(α, β, γ) is the rotation matrix from the reference coordinate system of the stabilized platform to the IMU coordinate system. R_c^g(φ, ω, k) is the rotation matrix from the camera coordinate system to the reference coordinate system of the stabilized platform. (x_c^e, y_c^e, z_c^e) are the 3D coordinates of the photographing center. (x_e, y_e, z_e) are the GPS coordinates. (Δx_g, Δy_g, Δz_g) is the eccentric component between the center of the stabilized platform and the phase center of the GPS antenna. (Δx_w^c, Δy_w^c, Δz_w^c) is the eccentric component from the photographing center of the camera to the center of the stabilized platform.
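For readers who prefer code, the chain of transformations in Equation (7) can be written as a few matrix products. The sketch below assumes the four rotation matrices are already available as 3 × 3 NumPy arrays and follows the reconstruction of the equation given above; it only illustrates the order of the transformations, not the authors' implementation.

import numpy as np

def camera_center_ecef(R_ne, R_ln, R_gl, R_cg,
                       lever_cam_to_platform, lever_platform_to_gps, gps_ecef):
    """Sketch of Equation (7): propagate the camera photographing center
    through platform -> IMU -> navigation -> geocentric frames and add the
    GPS position.  All rotation matrices are given as 3x3 arrays."""
    offset_platform = R_cg @ lever_cam_to_platform + lever_platform_to_gps
    offset_ecef = R_ne @ R_ln @ R_gl @ offset_platform
    return offset_ecef + gps_ecef

# illustrative identity-rotation call (all offsets in metres)
I = np.eye(3)
print(camera_center_ecef(I, I, I, I,
                         np.array([0.10, 0.00, -0.05]),
                         np.array([0.00, 0.20, -1.00]),
                         np.array([-2.3e6, 5.0e6, 3.2e6])))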

4.2. Distance Control

As shown in Figure 7, the unmanned helicopter has to pass the task points planned in Section 3.2 in ideal conditions. However, influenced by various factors, the unmanned helicopter cannot fly strictly according to the planned flight paths, causing difficulties in estimating directly whether the position of the unmanned helicopter satisfies the conditions to implement the tasks. The usual solution is to set a distance threshold D: when the distance between the unmanned helicopter and the task point is less than D, it meets the requirement. However, the value of D is not easy to set; when D is too small and the unmanned helicopter deviates too much from the planned flight paths, the possibility of failing to implement the tasks is quite high, and when D is too great, the tracking effect will be influenced. This paper presents a method combining distance and angle to check whether the requirement to perform tasks is satisfied or not.

It is assumed that the current position of the unmanned helicopter is p, the location of the last task point is C1, and the location of the next task to be carried out is C2. Firstly, a greater distance threshold D is set, and we ensure that the distance between p and C2 will certainly meet the threshold D. Secondly, ∠pC2C1 is calculated when the distance between p and C2 is smaller than D. From Figure 7 it can be seen that when ∠pC2C1 reaches 90°, the unmanned helicopter flies over the cross section of C2 exactly. Lastly, an angle threshold θ is set (θ < 3°); the requirement is satisfied when p meets |90° − ∠pC2C1| ≤ θ, because ∠pC2C1 cannot reach 90° exactly due to data errors.

Figure 7. The principle of distance control.
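The combined distance-and-angle check can be expressed compactly; the sketch below is a minimal illustration of the gate described above (the function and threshold names are ours), not the onboard implementation.

import numpy as np

def task_entered(p, c1, c2, D, angle_tol_deg=3.0):
    """Sketch of Section 4.2: the task at C2 is entered when the helicopter
    position p is within the (large) distance threshold D of C2 and the angle
    p-C2-C1 is within angle_tol_deg of 90 degrees."""
    p, c1, c2 = (np.asarray(v, float) for v in (p, c1, c2))
    if np.linalg.norm(p - c2) > D:
        return False
    u = p - c2
    v = c1 - c2
    cos_a = np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v))
    angle = np.degrees(np.arccos(np.clip(cos_a, -1.0, 1.0)))
    return abs(90.0 - angle) <= angle_tol_deg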


4.3. Attitude Control

After the position of the unmanned helicopter meets the requirements for implementing the current task, we calculate the heading and the pitch angle between the photographing center of the camera and the target, and send an adjustment instruction to the stabilized platform. When the difference between the posture of the camera and the calculated angles is smaller than an angle threshold, we perform the current task. The steps of the attitude control are as follows:

(a) Motion compensation: If the time delay of the camera shooting is t, before calculating the heading and the pitch angle between the photographing center of the camera and the target, the distance the unmanned helicopter flies forward within t needs to be taken into consideration. The current coordinates of the photographing center of the camera are (x0, y0, z0), the current speeds provided by POS are (vx, vy, vz), and after the time t the photographing center of the camera reaches the coordinates (x1, y1, z1). According to Equation (8), (x1, y1, z1) can be obtained:

(x1, y1, z1)^T = R_l^c(α′, β′, γ′) (vx t, vy t, vz t)^T + (Δx, Δy, Δz)^T + (x0, y0, z0)^T    (8)

In the formula above, R_l^c(α′, β′, γ′) is the rotation matrix from the IMU coordinate system to the coordinate system of the camera. (Δx, Δy, Δz) is the eccentric component between the IMU center and the center of the camera.

(b) Angle calculation: The coordinates of the photographing center of the camera are (x1, y1, z1); the coordinates of the target are (x2, y2, z2). We can calculate the heading and pitch according to Equations (9) and (10):

heading = π/2 − ε    (x2 > x1, y2 ≠ y1)
heading = 3π/2 − ε   (x2 < x1, y2 ≠ y1)
heading = 0          (x2 = x1, y2 > y1)
heading = π          (x2 = x1, y2 < y1)    (9)

pitch = sin−1((z1 − z2)/dist)    (10)

k = (y2 − y1)/(x2 − x1)    (11)

ε = tan−1(k)    (12)

dist = sqrt((x2 − x1)² + (y2 − y1)² + (z1 − z2)²)    (13)

(c) Attitude control: Send the angle adjustment instructions to the stabilized platform. An angle threshold is set to estimate in real time whether the attitude of the platform meets the threshold requirement. If it meets the requirement, photographing instructions are sent to the cameras; if it does not meet the requirement, we repeat Steps a–c until the requirement is met.
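Equations (9)–(13), together with a simplified version of the forward prediction of Equation (8), reduce to a few lines of code. The sketch below keeps everything in a single coordinate frame (the IMU-to-camera rotation and the lever arm are omitted), so it illustrates the geometry rather than reproducing the authors' implementation.

import math

def heading_pitch_to_target(cam_xyz, target_xyz, vel_xyz, t_delay):
    """Sketch of Equations (8)-(13).  The forward prediction of Eq. (8) is
    done here in the navigation frame only; the full equation also applies
    the IMU-to-camera rotation and the IMU-camera lever arm."""
    x1 = cam_xyz[0] + vel_xyz[0] * t_delay
    y1 = cam_xyz[1] + vel_xyz[1] * t_delay
    z1 = cam_xyz[2] + vel_xyz[2] * t_delay            # simplified Eq. (8)

    x2, y2, z2 = target_xyz
    dist = math.sqrt((x2 - x1) ** 2 + (y2 - y1) ** 2 + (z1 - z2) ** 2)   # Eq. (13)
    pitch = math.asin((z1 - z2) / dist)                                  # Eq. (10)

    if x2 != x1:                                       # Eqs. (9), (11), (12)
        eps = math.atan((y2 - y1) / (x2 - x1))
        heading = math.pi / 2 - eps if x2 > x1 else 3 * math.pi / 2 - eps
    else:
        heading = 0.0 if y2 > y1 else math.pi
    return heading, pitch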

5. Experiments and Analyses

A 110 kV electric transmission line at Qingyuan in Guangdong Province was selected for experiments to validate the feasibility of the method presented in this paper. The total length of this line is 4.2 km; it has 13 towers, numbered from 316 to 328, and 78 insulators. The geographical environment of the power line corridor is complex, with quite large topographic relief, as shown in Figure 8. The 3D point cloud data of this electric transmission line was obtained by APLIS. We adopted the method introduced in Section 3 to plan the waypoints of the unmanned helicopter and the tasks of the sensors.

The operating parameters of the LiDAR are shown in Table 2, and the parameters adopted in the planning are shown in Table 3.

Figure 8. Appearance of the power line for the experiments. The elevation in this area is color-coded.

Table 2. The operating parameters of LiDAR.

Parameter                        Value
Scanning speed                   40 lines/s
Pulse repetition rate            300 kHz
Starting scanning angle          40°
Terminational scanning angle     105°

Table 3. The parameters adopted in planning the flight path and task points.

Parameter                                               Value
Safe distance to power line in horizontal direction     30 m
Safe distance to power line in vertical direction       40 m
Safe distance of unmanned helicopter to the ground      50 m
The distance S                                          50 m
The overlap of images                                   30%
The minimum distance for a task                         5 m
The minimum time for a task                             2 s

Figure 9a shows the planned waypoints of the unmanned helicopter. There are 30 waypoints, of which the green ones are starting points and the purple ones are ending points. After taking off from around tower No. 328, the unmanned helicopter first flew along the right side of the power lines from tower No. 328 to tower No. 316, then it flew along the left side of the power lines from tower No. 316 to tower No. 328. At a place near tower No. 323, there is a crossing between a non-target power line and the experimental power line, so we increased the height of the corresponding waypoint. In addition, we added an auxiliary waypoint between the two waypoints corresponding to tower No. 324 and tower No. 325, because of the large terrain variation. Figure 9b presents the task points of the cameras. For the short-focus camera, there are 163 task points, which are displayed in light blue, and the average distance between every two task points is about 30 m. For the long-focus camera, there are 78 task points, which are displayed in pink, and the distance between every two task points is 5 m. The distance between every two towers is long enough, therefore the unmanned helicopter does not need to hover at any of the towers to extend the time to take photos of the insulators. Figure 9c shows the task points at tower No. 320. Figure 9d presents the comparison between the actual flight path and the planned flight path of the unmanned helicopter; the deep green represents the actual waypoints, and it can be seen that the unmanned helicopter cannot strictly fly according to the planned flight paths. The line-pressing accuracy in this experiment is about ±5 m.


Figure 9. The result of the planned flight path and tasks, and the true flight path; the elevation in this area is color-coded. (a) The planned flight path; (b) The planned task points; (c) The zoomed image of task points at tower No. 320; (d) The true flight path during power line inspection.

Dataacquired acquiredin inthis thisexperiment experimentincludes includes956 956M Mlaser-point laser-pointcloud clouddata dataof ofthe theentire entirepassages passagesof of Data thepower powerlines, lines,15.8 15.8G GHD HDinfrared infraredvideos videosof ofthe theelectric electrictransmission transmissionline, line,51.4 51.4G Gultraviolet ultravioletvideos videos the of the electric transmission line, 163 images of the power lines taken by the short-focus camera, and of the electric transmission line, 163 images of the power lines taken by the short-focus camera, and 73images imagesof ofthe theinsulators insulatorstaken takenby bythe thelong-focus long-focuscamera. camera. 73 The scheduling scheduling method method (task (task planning planning of of sensors) sensors) of of multiple multiple sensors sensors can can be beevaluated evaluated by bythe the The success rates of the tasks. Compared with the planned task points, it can be found that the success success rates of the tasks. Compared with the planned task points, it can be found that the success rateof oftasks tasksfor forthe theshort-focus short-focus camera camera and and the the long-focus long-focus camera camera are 100% 100% and 93.5% respectively, respectively, rate as shown shown in Table 4. From that thethe distance between each twotwo tasktask points will as Fromthe theSection Section4,4,we weknow know that distance between each points limit the time for task execution, therefore there are several failed tasks for the long-focus camera will limit the time for task execution, therefore there are several failed tasks for long-focus camera affectedby bysome somefactors factors(adjusting (adjustingspeed speedof ofthe thestable stableplatform, platform, response responsetime timeof ofcamera, camera,and andflight flight affected speed of unmanned helicopter, etc.), while the success rate of tasks for the short-focus camera is higher than the long-focus camera because the distance between each two task points of which is longer. However, longer distance between each two task points for the long-focus camera will affect the image resolution and viewing angle, so we should take a balance between data quality and task execution time based on actual power lines information when conducting power line inspection.

Table 4. The success rate of tasks for the two cameras.

Sensor                    Planned Task Number    Image Number    Success Rate
The short-focus camera    163                    163             100%
The long-focus camera     78                     73              93.5%
Total                     241                    236             98%
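The trade-off between task-point spacing and execution time described above can be made explicit with a simple feasibility check. The sketch below is only illustrative: the gimbal slew, slew rate, camera response time, and flight speed are assumed values, not the parameters of the platform used in this experiment.

```python
def task_feasible(spacing_m, flight_speed_mps,
                  gimbal_slew_deg, gimbal_rate_dps, camera_response_s):
    """Return True if the time between two consecutive task points is
    long enough to re-point the stabilized platform and trigger the camera."""
    time_available = spacing_m / flight_speed_mps              # seconds between task points
    time_needed = gimbal_slew_deg / gimbal_rate_dps + camera_response_s
    return time_available >= time_needed

# Example with hypothetical numbers: 30 m spacing at 5 m/s, a 40 deg slew
# at 15 deg/s, and a 1 s camera response time.
print(task_feasible(30.0, 5.0, 40.0, 15.0, 1.0))  # True -> the task can be executed
```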

The effect of the automatic target tracking can be evaluated by analyzing the offset distance between the photo center and the target object on the images. Because the FOV of the long-focus camera is smaller than that of the short-focus camera, and it is therefore more difficult to track the target with the long-focus camera, we use the results of the long-focus camera to evaluate the effect of the automatic target tracking. Figure 10 shows the horizontal and vertical offsets between the photo center and the target object on the images; the mean values are −0.05 m and 0.04 m, and the standard deviations are 0.71 m and 0.69 m, as shown in Table 5. Since the image coverage is 6.7 m in the horizontal direction and 10 m in the vertical direction, which is greater than the offset distances, the targets can be found in all images. Thus we can conclude that the automatic target tracking method performs quite well.

Figure 10. Offset between the image center and the target. (a) Offset in the horizontal direction; (b) Offset in the vertical direction.

Table 5. The mean and standard deviation of the offset in the two directions.

Direction      Mean (m)    Standard Deviation (m)
Horizontal     −0.05       0.71
Vertical        0.04       0.69
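A minimal sketch of how such offset statistics could be derived is given below, assuming the pixel offset of the tracked target from the image center is known for each image and is converted to metres with an assumed ground sample distance (GSD). All numbers here are illustrative, not measurements from this experiment.

```python
import numpy as np

def offset_statistics(pixel_offsets, gsd_m_per_px):
    """Convert (dx, dy) pixel offsets to metres with the GSD and
    return the per-direction mean and standard deviation."""
    offsets_m = np.asarray(pixel_offsets, dtype=float) * gsd_m_per_px
    return offsets_m.mean(axis=0), offsets_m.std(axis=0)

# Hypothetical offsets of the target from the image center in a few long-focus images.
pixel_offsets = [(-120, 60), (35, -80), (210, 150), (-60, -40)]
mean, std = offset_statistics(pixel_offsets, gsd_m_per_px=0.003)
print("mean offset (m):", mean, "std (m):", std)
```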

Figure 11 shows the data acquired by the multiple sensors platform. Figure 11a shows the point cloud data of the power lines acquired by the LiDAR; the compass on the left displays elevation, while the compass on the right displays azimuth. Figure 11b shows the thermal infrared data acquired by the thermal camera; H, L, M and C represent the maximum temperature, minimum temperature, temperature at the image center, and temperature at the crosshair position respectively. Figure 11c shows the photons acquired by the ultraviolet camera, which are superimposed on the optical image. Figure 11d shows the image of a tower acquired by the short-focus camera. Figure 11e shows a zoomed image of an insulator acquired by the long-focus camera.

Figure 11. The data acquired by the sensors during power line inspection. (a) The LiDAR point cloud; the compass on the left displays elevation, while the compass on the right displays azimuth; (b) The thermal infrared data; H, L, M and C represent the maximum temperature, minimum temperature, temperature at the image center, and temperature at the crosshair position respectively; (c) The photons acquired by the ultraviolet camera, superimposed on the optical image; (d) The image of a tower acquired by the short-focus camera; (e) The zoomed image of an insulator acquired by the long-focus camera.

Different data acquired by the sensors can be synchronized based on the POS data. After manual analysis of the data, two anomalies were found: the first is a broken insulator on tower No. 319, and the second is that one of the torsional dampers had fallen off at tower No. 327, as shown in Figure 12.

Figure 12. Two anomalies on the power line. (a) Broken insulator on tower No. 319; (b) A torsional damper that has fallen off at tower No. 327.
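A minimal sketch of the kind of time-based synchronization mentioned above is given below, assuming each sensor record carries a capture timestamp and the POS log provides position (and, analogously, attitude) at a fixed rate; the data layout and numbers are hypothetical.

```python
import numpy as np

def interpolate_pos(pos_times, pos_values, capture_times):
    """Linearly interpolate POS records (e.g. lat, lon, alt columns) at the
    capture timestamps of a sensor, so all sensors share one reference."""
    pos_values = np.asarray(pos_values, dtype=float)
    return np.column_stack([
        np.interp(capture_times, pos_times, pos_values[:, i])
        for i in range(pos_values.shape[1])
    ])

# Illustrative example: POS logged at 1 Hz, one camera frame taken at t = 2.4 s.
pos_times = np.array([0.0, 1.0, 2.0, 3.0])
pos_values = np.array([[23.70, 113.03, 150.0],
                       [23.71, 113.04, 152.0],
                       [23.72, 113.05, 151.0],
                       [23.73, 113.06, 153.0]])
print(interpolate_pos(pos_times, pos_values, np.array([2.4])))
```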

6. Conclusions

In this paper, a multiple sensors platform method based on the use of a large unmanned helicopter was proposed for power line inspection. A LiDAR, a thermal camera, an ultraviolet camera, and two cameras with different focal lengths are used to acquire information about the power line components and surrounding objects. Scheduling of the sensors, exact tracking of power line components, and flight safety when the multiple sensors platform is used for power line inspection were introduced in detail. Experiments were carried out on a 110 kV electric transmission line at Qingyuan in Guangdong Province, and the results show that the proposed method is effective for power line inspection. In the future, the method needs to be optimized to further improve the success rate and accuracy of task execution. In addition, we will also start research on the automatic processing and analysis of the data acquired by the multiple sensors platform for power line inspection.

Acknowledgments: We are grateful to the two anonymous reviewers for their insightful comments, which led us to improve the manuscript. This work was supported by the National Natural Science Foundation of China (No. 41431069) and the State's Key Project of Research and Development Plan (Grant No. 2016YFC0803105). We thank Guangdong Electric Power Science Research Institute, Beijing University of Aeronautics and Astronautics, and the 60th Institute of General Staff of the Chinese People's Liberation Army for providing help in the course of the experiment.

Author Contributions: Xiaowei Xie wrote the majority of the paper; Xiaowei Xie, Zhengjun Liu, Caijun Xu, and Yongzhen Zhang provided the ideas and verified the methods; Xiaowei Xie, Zhengjun Liu and Caijun Xu revised the manuscript.

Conflicts of Interest: The authors declare no conflict of interest.
