Journal of Imaging
Article

Comparison of Small Unmanned Aerial Vehicles Performance Using Image Processing

Esteban Cano, Ryan Horton, Chase Liljegren and Duke M. Bulanon *

Department of Physics and Engineering, Northwest Nazarene University, Nampa, ID 83686, USA; [email protected] (E.C.); [email protected] (R.H.); [email protected] (C.L.)
* Correspondence: [email protected]; Tel.: +1-208-467-8047
Academic Editor: Gonzalo Pajares Martinsanz
Received: 1 October 2016; Accepted: 4 January 2017; Published: 11 January 2017

Abstract: Precision agriculture is a farm management technology that involves sensing and then responding to the observed variability in the field. Remote sensing is one of the tools of precision agriculture. The emergence of small unmanned aerial vehicles (sUAV) has paved the way to accessible remote sensing tools for farmers. This paper describes the development of an image processing approach to compare two popular off-the-shelf sUAVs: the 3DR Iris+ and the DJI Phantom 2. Both units are equipped with a camera gimbal carrying a GoPro camera. The comparison of the two sUAVs involves a hovering test and a rectilinear motion test. In the hovering test, each sUAV was allowed to hover over a known object and images were taken every quarter of a second for two minutes. For the image processing evaluation, the position of the object in the images was measured and used to assess the stability of the sUAV while hovering. In the rectilinear test, the sUAV was allowed to follow a straight path while images of a lined track were acquired. The lines in the images were then measured to determine how accurately the sUAV followed the path. The hovering test results show that the 3DR Iris+ had a maximum position deviation of 0.64 m (0.126 m root mean square (RMS) displacement), while the DJI Phantom 2 had a maximum deviation of 0.79 m (0.150 m RMS displacement). In the rectilinear motion test, the maximum displacements for the 3DR Iris+ and the DJI Phantom 2 were 0.85 m (0.134 m RMS displacement) and 0.73 m (0.372 m RMS displacement), respectively. These results demonstrate that the two sUAVs performed well in both the hovering test and the rectilinear motion test, and thus that both sUAVs can be used for civilian applications such as agricultural monitoring. The results also show that the developed image processing approach can be used to evaluate the performance of an sUAV and has the potential to serve as another feedback control parameter for autonomous navigation.
Keywords: digital image processing; machine vision; unmanned aerial vehicle (UAV)

1. Introduction

In the past decade, there has been an increased use of small unmanned aerial vehicles (sUAV) in the United States. The advent of the sUAV was brought about by advances in miniaturized sensor systems, microprocessors, control systems, power supply technologies, and global positioning systems (GPS). The recent release of the operational rules for commercial use of sUAV by the Federal Aviation Administration [1] will open the door to integrating sUAV into the national airspace. Civilian applications of sUAV include real estate photography, fire scouting, payload delivery, and agriculture. Agriculture is one of the areas that will be most affected by the use of sUAV. For a farmer, knowing the health and current state of the year's crop is essential for effective crop management. Precision agriculture, which is a spatial-based technology, is used by farmers to monitor and manage

J. Imaging 2017, 3, 4; doi:10.3390/jimaging3010004

www.mdpi.com/journal/jimaging


their crop production [2,3]. Precision agriculture can not only improve crop productivity, but it also provides the means for efficient use of resources (e.g., water, fertilizer, and pesticides). One of the tools of precision agriculture is remote sensing. Remote sensing is an approach in which information is obtained without requiring a person to be physically present to collect the data. Combined with Geographic Information Systems (GIS) and the Global Positioning System (GPS), remote sensing provides farmers the technologies needed to maximize the output of their crops [4–6]. Satellite imaging has been used to remotely monitor crops [7]. However, the prohibitive cost, low image resolution, and low sampling frequency are some of the disadvantages of satellite imaging. Manned aircraft have also been used to implement remote sensing in agriculture [8]. Similar to satellite imaging, the cost and the frequency of sampling are the inhibiting factors. Furthermore, a comparative study between using sUAV and manned aircraft to image citrus tree conditions showed that the sUAV produced higher spatial resolution images [9]. One of the most useful yet affordable remote sensing systems can now be obtained with the purchase of a small quadcopter or drone [10,11], which can then be flown over a farmer's fields. The drone can take images of the crop through a variety of camera filters, providing the farmer with imagery in multiple spectral bands. Not only is an sUAV useful in providing current aerial images of the entire crop, it also opens the opportunity for image processing and analysis, which can give even more information about crop health and identify areas that require specific forms of attention. A study by Bulanon et al. [12] used a combination of sUAV and image processing to evaluate different irrigation systems in an apple orchard.
Small drones can be easily flown and maintained with little training, making them a great option for farmers looking to merge agriculture with the technology of remote sensing. With the price of remote sensing sUAVs becoming much more affordable, and thus a realistic option for today's farmers, the challenge is selecting the sUAV that will be suitable for the specific application. Chao et al. [13] conducted a survey of the different off-the-shelf autopilot packages for small unmanned aerial vehicles available on the market. The comparative review looked at different autopilot systems and their sensor packages, which include GPS receivers and inertial sensors to estimate 3-D position and attitude information. They also suggested possible sensor configurations, such as infrared sensors and vision sensors, to improve basic autonomous navigation tasks. An example of this is a vision-guided flight stability and autonomy system that was developed to control micro air vehicles [14]. A forward-looking vision-based horizon detection algorithm was used to control a fixed-wing unmanned aerial vehicle, and the system appeared to produce more stable flights than those remotely controlled by a human pilot. The research work by Carnie et al. [15] investigated the feasibility of using computer vision to provide a level of situational awareness suitable for the sense-and-avoid capability of a UAV. Two morphological filtering approaches were compared for target detection using a series of image streams featuring real collision-course aircraft against a variety of daytime backgrounds. These studies demonstrate the potential of using vision systems and image processing to add another level to the control hierarchy of UAV systems. In this paper, the vision systems of two popular off-the-shelf sUAVs were used to compare their performance in multiple aerial competence tests.
A customized image processing algorithm was developed to analyze the acquired images and evaluate the performance of the sUAVs. The results of this study should be helpful for choosing a particular off-the-shelf sUAV for a given application such as agricultural monitoring. The objectives of this paper are:

1. To compare the flight performance of two off-the-shelf sUAVs: the 3DR Iris+ and the DJI Phantom 2.
2. To develop image processing algorithms to evaluate the performance of the two sUAVs.


2. Materials and Methods

2.1. Small Unmanned Aerial Vehicles

Unmanned aerial vehicles (UAV) can be classified according to size into micro UAV, small UAV, medium UAV, and large UAV [16]. Micro UAVs are extremely small, ranging from about the size of an insect to 30–50 cm long. Small UAVs (sUAV) have dimensions greater than 50 cm and less than 2 m; the Federal Aviation Administration defines an sUAV as an aircraft that weighs more than 0.25 kg but less than 25 kg [17]. Medium UAVs have dimensions ranging from 5 to 10 m and can carry payloads of up to 200 kg, while large UAVs are those used mainly for combat operations by the military. In this paper the focus is on the sUAV and its application to agriculture. While most people are able to build their own sUAV using do-it-yourself (DIY) kits, off-the-shelf ready-to-fly sUAVs are also available. The advantages of an off-the-shelf sUAV are that it is ready to fly and there is not much tuning involved compared with the DIY kits. In addition, these off-the-shelf sUAVs come with camera gimbals that can easily be used for agricultural surveying.

Two of the most popular sUAVs on the market were used in this study: (1) the 3DR Iris+ [18] and (2) the DJI Phantom 2 [19]. Some specifications of the two drones are provided in Table 1. An image of the 3DR Iris+ is shown in Figure 1a; it is noticeably wider than the DJI Phantom 2, pictured in Figure 1b. The greater width of the Iris+ makes the distance between the front and back props less than the distance from side to side. In contrast, the DJI Phantom 2 has prop locations set symmetrically in a square around the center of the drone. Both drones use functionally similar gimbals to operate a GoPro camera for in-flight imaging.

Table 1. Iris+ and Phantom 2 Specifications.

Features                   | IRIS+               | Phantom 2
Motors                     | 4                   | 4
Max Payload                | 400 g               | 300 g
Flight Time                | 16–22 min           | 14–25 min
Max Flight Speed           | 22.7 m/s            | 15 m/s
Motor to Motor Dimensions  | 550 mm              | 350 mm
Flight Controller          | Pixhawk             | NASA-MV2
Software (Ground Station)  | Mission Planner     | DJI Ground Station
Flight Modes               | Manual, Hover, Auto | Manual, Hover, Auto
Battery                    | 5100 mAh            | 5200 mAh
Gimbal                     | Tarot Go-Pro Gimbal | DJI Go-Pro Gimbal


Figure 1. 3DR Iris+ and DJI Phantom 2 quadcopter drones. (a) 3DR Iris+; (b) DJI Phantom 2.

2.2. Image Acquisition for Stability Evaluation

The image acquisition system used for both sUAVs was a GoPro Hero 3 camera, which was attached to the sUAV using a customized camera gimbal. The GoPro Hero 3 can shoot both video and still pictures; in this paper it was used to acquire still images. The camera was set to capture images at 11 megapixel resolution with the white balance set to 5500 K.


Figure 2. sUAV camera frame and world frame relationship.

To acquire the images for stability evaluation, the focal plane of the camera was set parallel to the ground, which was accomplished with the camera gimbal (Figure 2). The field of view in real-world coordinates with planar dimensions (x_world, y_world) was mapped to the camera's discrete sensor elements (x_camera, y_camera). Ground control points, which include a PVC square and lined tracks, were used to calculate the spatial resolution (∆x_world, ∆y_world). The spatial resolution was determined by the number of pixels in the sensor array and the field of view of the camera [20]:

$$\Delta x_{world} = \frac{x_{world}}{x_{camera}}, \qquad \Delta y_{world} = \frac{y_{world}}{y_{camera}}$$

Since the camera was set parallel to the ground, the relationship between the two-dimensional image and the three-dimensional scene [21,22] can be expressed using a simple perspective transformation given by the following equation:

$$\begin{bmatrix} x_{camera} \\ y_{camera} \end{bmatrix} = \frac{f_{camera}}{z_{world}} \begin{bmatrix} x_{world} \\ y_{world} \end{bmatrix}$$

where $f_{camera}$ is the focal length of the camera.

2.3. Performance Evaluation Tests

The application of sUAV for civilian purposes such as agricultural surveying and real estate photography involves taking a single picture at a certain altitude or taking multiple pictures following a waypoint path generated by the user. Based on these applications, two performance evaluation tests were conducted to compare the two off-the-shelf sUAVs: a hovering test and a rectilinear motion test. In both tests, a GoPro camera was attached to the sUAV camera gimbal and used to acquire images, which were then used to evaluate the flight performance of the sUAV through image processing and analysis. Xiang and Tian evaluated the hovering performance of an autonomous helicopter using onboard sensors [23].


2.3.1. Hovering Test

In the hovering test, the sUAV was flown over a 2-m square PVC pipe at three different altitudes: 5, 15, and 25 m. At each altitude, the sUAV was allowed to hover for two minutes and images were taken every 0.5 s. The pixel resolutions at the three altitudes were 116 pixels/m (5 m), 71 pixels/m (15 m), and 51 pixels/m (25 m). The two sUAVs were tested on the same day with a wind speed of 2 miles per hour (ESE). The time-lapse images of the PVC square were used to measure the stability of the sUAV while hovering. Figure 3 shows one of the hovering tests for the DJI Phantom.

Figure 3. Hovering test for the DJI Phantom.
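Because the pixel resolution changes with altitude, pixel displacements measured in the imagery must be converted to metres using the per-altitude resolutions given above (116, 71, and 51 pixels/m at 5, 15, and 25 m). A minimal sketch of that conversion, with a hypothetical helper name:

```python
# Pixel resolutions reported for each test altitude (pixels per metre).
PIXELS_PER_METRE = {5: 116.0, 15: 71.0, 25: 51.0}

def pixels_to_metres(displacement_px, altitude_m):
    """Convert an image-plane displacement in pixels to metres on the ground."""
    return displacement_px / PIXELS_PER_METRE[altitude_m]
```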

2.3.2. Rectilinear Motion Test

In the rectilinear motion test, the sUAV was programmed to fly a straight path over the running track at Northwest Nazarene University (NNU). The waypoint path of the sUAV was based on the straight line markings of the running track, which constrain the straight line motion of the sUAV. The wind condition during this test was 4 miles per hour (ESE). Similar to the hovering test, the camera was programmed to acquire images every 0.5 s as the sUAV moved over the track. The acquired images were then used to measure the stability of the straight line motion. Figure 4 shows one of the tests for the 3DR Iris+.



Figure 4. Rectilinear motion test for the 3DR Iris+.

2.4. Image Processing for Performance Evaluation

2.4.1. Hovering Test


To evaluate the hovering test using image processing [24], the center of the area of the PVC square inside the image was used as the stability parameter. The stability was measured based on the change of the center of the area. The image processing was based on position tracking of the center of the PVC square. Figure 5 demonstrates this concept. The solid black line is the segmented PVC square from the first acquired image, and this was used as the set point. The gray lines are the segmented PVC squares from the subsequent images. The position of the center of the area for each subsequent image was then compared with that of the set point image, and this was used to evaluate the stability in the hovering test. The position displacement from the set point image was calculated for each image.


Figure 5. Concept of the hovering test evaluation using image processing.
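The set-point tracking concept described above can be sketched as follows. This is a minimal illustration, not the paper's code: it assumes the PVC square has already been segmented into binary masks, and the helper names are hypothetical.

```python
import numpy as np

def centroid(mask):
    """Centre of area (x, y) of a binary mask, in pixels."""
    ys, xs = np.nonzero(mask)
    return xs.mean(), ys.mean()

def hover_displacement(set_point_mask, current_mask, metres_per_pixel):
    """Euclidean displacement of the target centre from the set point, in metres."""
    x0, y0 = centroid(set_point_mask)
    x1, y1 = centroid(current_mask)
    return float(np.hypot((x1 - x0) * metres_per_pixel,
                          (y1 - y0) * metres_per_pixel))
```

Applying this to every frame in the two-minute sequence yields the displacement series plotted in Figures 9 and 10.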

2.4.2. Rectilinear Motion Test

Figure 6 shows the concept of the rectilinear motion test evaluation. The black sUAV is the start position, and the square enclosing the sUAV represents the field of view (FOV) of the camera. As the sUAV follows the programmed straight path, the actual position of the sUAV deviates from the directed path, as shown by the gray sUAVs with their respective FOVs. The features in the running track were then used to measure the deviation from the programmed straight path by comparing the line positions with those in the image acquired at the start position. Similar to the hovering test, the position displacement of the line was calculated for each image.



Figure 6. Concept of rectilinear motion test evaluation using image processing.
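The line-comparison idea can be sketched in the same style as the hovering evaluation: the apparent sideways shift of the track line between the start frame and a later frame measures the lateral deviation of the sUAV. A hedged sketch, assuming the line has already been segmented into a binary mask; names are hypothetical:

```python
import numpy as np

def line_x_position(mask):
    """Mean column index of the segmented track-line pixels."""
    _, xs = np.nonzero(mask)
    return xs.mean()

def lateral_deviation(set_point_mask, current_mask, metres_per_pixel):
    """Sideways deviation of the sUAV inferred from the apparent line shift, in metres."""
    shift_px = line_x_position(current_mask) - line_x_position(set_point_mask)
    return shift_px * metres_per_pixel
```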

3. Results and Discussion

3.1. Hovering Test

Figure 7 shows the original image of the PVC square and the segmented image. The high contrast between the PVC and the grass facilitated the segmentation of the PVC from the background, and a simple thresholding operation was implemented. After thresholding, the object features such as centroid position and box length were calculated. This image and its features were used as the set point image; the position of the PVC square in each subsequent image was compared to this image and the position displacement calculated.


Figure 7. Set point image for the hovering test. (a) Original image for hovering test; (b) Segmented image of PVC square.
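The thresholding and feature extraction step can be illustrated with a short sketch. This is not the authors' implementation; the threshold value and function name are hypothetical, and the input is assumed to be a grayscale image in which the bright PVC square stands out against darker grass.

```python
import numpy as np

def segment_features(gray, threshold=128):
    """Threshold a grayscale image and return the object's centroid
    and bounding-box lengths along each axis (in pixels)."""
    mask = gray > threshold
    ys, xs = np.nonzero(mask)
    return {
        "centroid": (float(xs.mean()), float(ys.mean())),
        "box_length": (int(xs.max() - xs.min() + 1),
                       int(ys.max() - ys.min() + 1)),
    }
```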

Figure 8 shows the comparison between the set point image and a subsequent image. The overlaid images show that the position of the PVC square changed, which means that the sUAV was moving even though it was in hover mode. The overlaid images demonstrate that a simple image processing approach, such as image subtraction, can be used to evaluate the stability of the sUAV in hover mode.



Figure 8. Comparison between the set point image and subsequent image. (a) Overlay of image 1 and image 2; (b) Image subtraction of images 1 and 2.
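The image-subtraction check in Figure 8b amounts to differencing the two frames and flagging pixels whose intensity changed. A minimal sketch, assuming 8-bit grayscale frames; the tolerance and fraction thresholds are hypothetical tuning values, not taken from the paper:

```python
import numpy as np

def subtraction_mask(set_point, subsequent, tol=30):
    """Pixels whose intensity changed by more than `tol` between the two frames."""
    diff = np.abs(subsequent.astype(np.int16) - set_point.astype(np.int16))
    return diff > tol

def has_moved(set_point, subsequent, tol=30, min_changed_fraction=0.01):
    """Flag movement when enough of the frame differs from the set point image."""
    changed = subtraction_mask(set_point, subsequent, tol)
    return bool(changed.mean() > min_changed_fraction)
```

The signed cast to int16 avoids the wrap-around that unsigned 8-bit subtraction would produce.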

Figure 9 shows the position displacement of the 3DR Iris+ from its starting position while in hover mode at the three different altitudes. The position displacement is the difference between the centers of the PVC square in the set point image and the subsequent images for both the x and y axes. The hovering results of the 3DR Iris+ showed that the maximum position displacement was 0.64 m while hovering at the lowest height of 5 m, and 0.34 m while hovering at 25 m. The lowest altitude had a mean position deviation of 0.29 m (RMS displacement of 0.126 m) and the highest altitude had a mean position deviation of 0.14 m (RMS displacement of 0.052 m). A similar trend can be observed from the hovering results of the DJI Phantom (Figure 10), which showed a maximum displacement of 0.79 m while hovering at 5 m and a maximum displacement of 0.34 m at a hovering height of 25 m. The mean position displacement values for 5 and 25 m were 0.36 m (RMS displacement of 0.15 m) and 0.11 m (RMS displacement of 0.09 m), respectively. In both sUAVs, the lower altitude showed the highest position deviation. It is noted that as the altitude increased, the pixel resolution decreased, which could result in higher uncertainty in the deviation variability as the height was increased.

Although both sUAVs were in hover mode, it was expected that they would deviate from their set position. The position displacement is brought about by a number of factors. The first is that the GPS sensors found in most off-the-shelf sUAVs have hover accuracies in the ±2 m range, which is consistent with the deviations observed in the hover images. The second factor is the disturbance caused by wind. The third is the feedback control system that holds the position of the sUAV. The Proportional Integral Derivative (PID) controller is the most commonly used in off-the-shelf sUAVs, and the gains of the PID controller affect the response of the sUAV to disturbance.

The results of this hovering test will be very useful when performing image mosaicking, which is a process of stitching images to form an image with a much larger field of view. Based on these results, the imaging altitude should be taken into account when configuring the image overlaps in the mission planning. As mentioned by Xiang and Tian [23], as the sUAV continuously varies around the hover point, it affects the coverage of a single image and also impacts the required overlap amount of the images during the flight. It was recommended to investigate such variations to determine the effects on the overlaps. Hunt et al. [25] described the different overlaps used at varying heights for an unmanned aircraft monitoring winter wheat fields.
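The summary statistics quoted above (maximum, mean, and RMS displacement) can be computed from the per-frame displacement series in a few lines. A sketch with a hypothetical function name; the example values below are illustrative, not the paper's data:

```python
import math

def displacement_stats(displacements_m):
    """Maximum, mean, and RMS of a series of position displacements (metres)."""
    n = len(displacements_m)
    mean = sum(displacements_m) / n
    rms = math.sqrt(sum(d * d for d in displacements_m) / n)
    return max(displacements_m), mean, rms
```

Note that RMS weights large excursions more heavily than the mean, which is why the two summaries are reported separately.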


Figure 9. 3DR Iris+ position displacement during hovering test using image processing.



Figure 10. DJI Phantom 2 deviation during the hovering test using image processing.



3.2. Rectilinear Motion Test

Figure 11 shows an example of the image processing for evaluating the rectilinear motion test. The first step is to segment the feature that will be used for evaluation. In this case, the feature used was the green bleacher. A color-based segmentation method was used to segment the green bleacher because of its high color contrast as compared with the other parts of the image. After segmentation, a size filter was applied to remove the salt-and-pepper noise. Following the filtering was an operation to fill the holes and to extract the large object in the image, which was the bleacher. The position of the bleacher was then used to measure the deviation of the sUAV from the start position.
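The segmentation steps described above can be sketched as follows. This is a minimal illustration of the described pipeline using SciPy, not the authors' implementation; the green-dominance threshold and minimum blob size are assumed values, not those used in the study.

```python
import numpy as np
from scipy import ndimage

def segment_largest_green(rgb, min_size=50):
    """Sketch of the pipeline: color-based segmentation, size
    filtering, hole filling, then largest-object extraction."""
    r = rgb[..., 0].astype(int)
    g = rgb[..., 1].astype(int)
    b = rgb[..., 2].astype(int)
    # Color-based segmentation: keep green-dominant pixels
    # (the +20 margin is an illustrative threshold).
    mask = (g > r + 20) & (g > b + 20)
    # Size filter: drop salt-and-pepper blobs smaller than min_size.
    labels, n = ndimage.label(mask)
    sizes = ndimage.sum(mask, labels, np.arange(1, n + 1))
    keep = np.isin(labels, 1 + np.flatnonzero(sizes >= min_size))
    # Fill interior holes, then keep only the largest component.
    filled = ndimage.binary_fill_holes(keep)
    labels, n = ndimage.label(filled)
    if n == 0:
        return filled, None
    sizes = ndimage.sum(filled, labels, np.arange(1, n + 1))
    largest = labels == (1 + int(np.argmax(sizes)))
    # Centroid (row, col) of the extracted object, used to
    # measure the sUAV's deviation from the start position.
    return largest, ndimage.center_of_mass(largest)

# Synthetic check: a green rectangle with one stray green pixel.
img = np.zeros((100, 100, 3), dtype=np.uint8)
img[30:60, 40:70, 1] = 200   # the "bleacher"
img[5, 5, 1] = 200           # salt noise, removed by the size filter
mask, centroid = segment_largest_green(img)
```

The returned centroid plays the role of the bleacher position used in the deviation measurement.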


Figure 11. Example image processing for the rectilinear motion evaluation. (a) Original image of running track; (b) Segmented image; (c) Filling hole operation; (d) Extraction of line.

Figure 12 shows the path of the sUAVs calculated using image processing. For the time that the sUAVs were tested, the maximum position deviation from the center line was less than 1 m. It can be noted that both sUAVs deviated from the center line and moved with a sinusoidal characteristic, which is typical for a position control system trying to correct itself. The 3DR Iris+ had a maximum position displacement of 0.85 m (RMS displacement of 0.314 m) while the DJI Phantom 2 had a maximum displacement of 0.73 m (RMS displacement of 0.372 m). Similar to the hover test, this deviation is attributed to the GPS receiver's hover accuracy and the PID gains for the controller. The segmentation of the image features also affected the calculation of the position deviation. The line features on the ground were acquired at different positions as the sUAV was moving forward, thus affecting the segmentation of the line features.
As the images were acquired every 0.5 s, the overall image intensity would vary for every image and thus affected the segmentation performance. It is noted that the values of the mean position deviation for the two sUAVs were within the reported GPS receiver's accuracy. These results show the current state of the art of the control systems of off-the-shelf sUAVs and demonstrate their capabilities in surveying tasks for civilian applications.
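The deviation reported above can be computed as the perpendicular (cross-track) distance of each estimated sUAV position from the reference center line. The sketch below uses a hypothetical sinusoidal path to echo the oscillatory behavior described; it is not the actual flight data, and the track length and amplitude are assumed.

```python
import numpy as np

def cross_track_deviation(path_xy, line_p0, line_p1):
    """Signed perpendicular distance of each path point from the
    reference center line (all coordinates in meters).

    Returns the per-point deviations plus the maximum and RMS
    displacement, the two statistics used in the rectilinear test.
    """
    p0 = np.asarray(line_p0, dtype=float)
    p1 = np.asarray(line_p1, dtype=float)
    d = (p1 - p0) / np.linalg.norm(p1 - p0)   # unit direction of the line
    n = np.array([-d[1], d[0]])               # unit normal to the line
    dev = (np.asarray(path_xy, dtype=float) - p0) @ n  # signed offsets
    return dev, np.abs(dev).max(), np.sqrt(np.mean(dev ** 2))

# Hypothetical sinusoidal path around a 40 m straight reference line,
# sampled every 0.5 m, with a 0.6 m oscillation amplitude.
x = np.linspace(0.0, 40.0, 81)
path = np.stack([x, 0.6 * np.sin(2 * np.pi * x / 10.0)], axis=1)
dev, max_dev, rms_dev = cross_track_deviation(path, (0, 0), (40, 0))
```

For a pure sinusoid the RMS deviation is close to the amplitude divided by √2, which matches the relationship between the maximum and RMS figures reported for both sUAVs.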



Figure 12. Estimated path of the sUAVs for the rectilinear motion test using image processing.

Although the image processing algorithm in this study was developed to evaluate and compare the basic performance of two popular off-the-shelf sUAVs, it can be extended to controlling the sUAV, similar to the "sense and avoid" algorithm developed by Carnie et al. [15]. In addition to the other sensors in the sUAV, a vision sensor can be utilized to estimate the sUAV attitude combined with other inertial measurements and GPS. In this case, features of the captured image would be used to estimate the relative position of the image and serve as feedback information to regulate the sUAV's position, such as hovering or moving in a straight line (Figure 13). The future direction of this study will be to extend the developed image processing algorithm and use it as an additional sensor for the sUAV.

+

Position Position Deviation Deviation

Controller

sUAV Dynamics

Image Processing

Onboard Camera

(Position,Velocity,Attitude) (Position,Velocity,Attitude)

Extracted Extracted features features

(Image (Image feature feature analysis) analysis)

Figure 13. Concept of vision system as a feedback sensor for controlling the sUAV.
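The feedback loop of Figure 13 can be sketched as follows: an image-derived position offset is fed back through a PID controller whose output is treated as a velocity command. The plant model, gains, and time step below are illustrative assumptions, not the parameters of either sUAV's autopilot.

```python
class PID:
    """Textbook discrete PID; the gains used here are illustrative."""
    def __init__(self, kp, ki, kd, dt):
        self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
        self.integral = 0.0
        self.prev_err = 0.0

    def update(self, err):
        self.integral += err * self.dt
        deriv = (err - self.prev_err) / self.dt
        self.prev_err = err
        return self.kp * err + self.ki * self.integral + self.kd * deriv

def hover_loop(start_offset_m, steps=200, dt=0.25):
    """Drive the image-measured position offset toward zero.

    The 'plant' is a toy kinematic model: the commanded velocity
    simply integrates into position once per image-processing cycle
    (dt = 0.25 s matches the paper's image acquisition rate).
    """
    pid = PID(kp=0.8, ki=0.2, kd=0.3, dt=dt)
    pos = start_offset_m  # offset of reference feature from setpoint, m
    for _ in range(steps):
        err = -pos                 # setpoint: zero offset
        vel_cmd = pid.update(err)  # controller output as velocity command
        pos += vel_cmd * dt        # toy plant: position integrates velocity
    return pos

residual = hover_loop(1.0)  # start 1 m off the hover point
```

In a real implementation the error would come from the extracted image features (the "Image Processing" block of Figure 13) rather than a simulated position, and the command would go to the autopilot's velocity or attitude setpoints.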


4. Conclusions

Two popular sUAVs in the market, the 3DR Iris+ and the DJI Phantom 2, were compared and evaluated using image processing. The comparison included a hovering test and a rectilinear motion test. For the hovering test, the sUAV took images of an object and the position displacement of the object in the images was used to evaluate the stability of the sUAV. The rectilinear motion test evaluated the performance of the sUAVs as they followed a straight line path. Image processing algorithms were developed to evaluate both tests. Results showed that for the hovering test, the 3DR Iris+ had a maximum position deviation of 0.64 m (RMS displacement of 0.126 m) while the DJI Phantom 2 had a maximum deviation of 0.79 m (RMS displacement of 0.15 m). In the rectilinear motion test, the maximum displacements for the 3DR Iris+ and the DJI Phantom 2 were 0.85 m (RMS displacement of 0.314 m) and 0.73 m (RMS displacement of 0.372 m), respectively. These results show that both sUAVs are capable of surveying applications such as agricultural field monitoring.

Acknowledgments: This research was supported by the Idaho State Department of Agriculture (Idaho Specialty Crop Block Grant 2014) and Northwest Nazarene University.

Author Contributions: Duke M. Bulanon conceived the study, made the literature review, designed the experiment, processed the data, interpreted the results, and wrote the paper. Esteban Cano, Ryan Horton, and Chase Liljegren acquired the images, developed the image processing algorithm, and processed the data.

Conflicts of Interest: The authors declare no conflict of interest.

Abbreviations

The following abbreviations are used in this manuscript:

DIY	Do it yourself
GIS	Geographical Information System
GPS	Global Positioning System
NGB	Near-infrared, Green, Blue
NNU	Northwest Nazarene University
PVC	Polyvinyl Chloride
RMS	Root Mean Square
RGB	Red, Green, Blue
sUAV	Small Unmanned Aerial Vehicle
UAV	Unmanned Aerial Vehicle
VI	Vegetation Index

References

1. Federal Aviation Administration. Summary of Small Unmanned Aircraft Rule. 2016. Available online: https://www.faa.gov/uas/media/Part_107_Summary.pdf (accessed on 6 January 2017).
2. Robert, P.C. Precision agriculture: A challenge for crop nutrition management. Plant Soil 2002, 247, 143–149.
3. Lee, W.S.; Alchanatis, V.; Yang, C.; Hirafuji, M.; Moshou, D.; Li, C. Sensing technologies for precision specialty crop production. Comput. Electron. Agric. 2010, 74, 2–33.
4. Koch, B.; Khosla, R. The role of precision agriculture in cropping systems. J. Crop Prod. 2003, 9, 361–381.
5. Seelan, S.K.; Laguette, S.; Casady, G.M.; Seielstad, G.A. Remote sensing applications for precision agriculture: A learning community approach. Remote Sens. Environ. 2003, 88, 157–169.
6. Sugiura, R.; Noguchi, N.; Ishii, K. Remote-sensing technology for vegetation monitoring using an unmanned helicopter. Biosyst. Eng. 2005, 90, 369–379.
7. Albergel, C.; De Rosnay, P.; Gruhier, C.; Munoz-Sabater, J.; Hasenauer, S.; Isaken, L.; Kerr, Y.; Wagner, W. Evaluation of remotely sensed and modelled soil moisture products using global ground-based in situ observations. Remote Sens. Environ. 2012, 118, 215–226.
8. Lan, Y.; Thomson, S.J.; Huang, Y.; Hoffmann, W.C.; Zhang, H. Current status and future directions of precision aerial application for site-specific crop management in the USA. Comput. Electron. Agric. 2010, 74, 34–38.
9. Garcia-Ruiz, F.; Sankaran, S.; Maja, J.M.; Lee, W.S.; Rasmussen, J.; Ehsani, R. Comparison of two aerial imaging platforms for identification of Huanglongbing-infected citrus trees. Comput. Electron. Agric. 2013, 91, 106–115.
10. Pajares, G. Overview and current status of remote sensing applications based on Unmanned Aerial Vehicles (UAVs). Photogramm. Eng. Remote Sens. 2015, 81, 281–329.
11. Thomasson, J.A.; Valasek, J. Small UAS in agricultural remote-sensing research at Texas A&M. ASABE Resour. Mag. 2016, 23, 19–21.
12. Bulanon, D.M.; Lonai, J.; Skovgard, H.; Fallahi, E. Evaluation of different irrigation methods for an apple orchard using an aerial imaging system. ISPRS Int. J. Geo-Inf. 2016, 5, 79.
13. Chao, H.Y.; Cao, Y.C.; Chen, Y.Q. Autopilots for small unmanned aerial vehicles: A survey. Int. J. Control Autom. Syst. 2010, 8, 36–44.
14. Ettinger, S.; Nechyba, M.; Ifju, P.; Waszak, M. Vision-guided flight stability and control for micro air vehicles. Adv. Robot. 2003, 17, 617–640.
15. Carnie, R.; Walker, R.; Corke, P. Image Processing Algorithms for UAV "Sense and Avoid". In Proceedings of the 2006 IEEE International Conference on Robotics and Automation, Orlando, FL, USA, 15–19 May 2006; pp. 2828–2853.
16. Classification of the Unmanned Aerial Systems. Available online: https://www.e-education.psu.edu/geog892/node/5 (accessed on 29 June 2016).
17. Federal Aviation Administration. Available online: https://registermyuas.faa.gov (accessed on 29 June 2016).
18. 3D Robotics (3DR). Available online: https://store.3dr.com/products/IRIS+ (accessed on 29 June 2016).
19. Da-Jiang Innovations (DJI). Available online: http://www.dji.com/product/phantom-2 (accessed on 29 June 2016).
20. Cetinkunt, S. Mechatronics with Experiments, 2nd ed.; Wiley & Sons Ltd.: West Sussex, UK, 2015.
21. Pajares, G.; García-Santillán, I.; Campos, Y.; Montalvo, M.; Guerrero, J.M.; Emmi, L.; Romeo, J.; Guijarro, M.; Gonzalez-de-Santos, P. Machine-vision systems selection for agricultural vehicles: A guide. J. Imaging 2016, 2, 34.
22. Stadler, W. Analytical Robotics and Mechatronics; McGraw-Hill: New York, NY, USA, 1995.
23. Xiang, H.; Tian, L. Development of a low-cost agricultural remote sensing system based on an autonomous unmanned aerial vehicle (UAV). Biosyst. Eng. 2011, 108, 174–190.
24. Gonzalez, R.C.; Woods, R.E. Digital Image Processing, 3rd ed.; Pearson: New York, NY, USA, 2007.
25. Hunt, E.R., Jr.; Hively, W.D.; Fujikawa, S.J.; Linden, D.S.; Daughtry, C.S.T.; McCarty, G.W. Acquisition of NIR-Green-Blue digital photographs from unmanned aircraft for crop monitoring. Remote Sens. 2010, 2, 290–305.

© 2017 by the authors; licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC-BY) license (http://creativecommons.org/licenses/by/4.0/).