
Animal Detection Using Thermal Images and Its Required Observation Conditions

Yu Oishi 1,2,*, Hiroyuki Oguma 3, Ayako Tamura 4, Ryosuke Nakamura 1 and Tsuneo Matsunaga 5

1 Artificial Intelligence Research Center, National Institute of Advanced Industrial Science and Technology, 2-4-7 Aomi, Koto, Tokyo 135-0064, Japan; [email protected]
2 Department of Planning and Coordination, Headquarters, National Agriculture and Food Research Organization, 3-1-1 Kannondai, Tsukuba 305-8517, Ibaraki, Japan
3 Center for Environmental Biology and Ecosystem Studies, National Institute for Environmental Studies, 16-2 Onogawa, Tsukuba 305-8506, Ibaraki, Japan; [email protected]
4 Research and Survey Department, Nakanihon Air Service Co., Ltd., 17-1 Wakamiya, Nishikasugai 480-0202, Aichi, Japan; [email protected]
5 Center for Global Environmental Research, National Institute for Environmental Studies, 16-2 Onogawa, Tsukuba 305-8506, Ibaraki, Japan; [email protected]
* Correspondence: [email protected]; Tel.: +81-29-838-6756

Remote Sens. 2018, 10, 1050; doi:10.3390/rs10071050

Received: 28 May 2018; Accepted: 2 July 2018; Published: 3 July 2018

Abstract: Information about changes in the population sizes of wild animals is extremely important for conservation and management. Wild animal populations have been estimated using statistical methods, but it is difficult to apply such methods to large areas. To address this problem, we have developed several support systems for the automated detection of wild animals in remote sensing images. In this study, we applied one of these algorithms, the computer-aided detection of moving wild animals (DWA) algorithm, to thermal remote sensing images. We also performed several analyses to confirm that the DWA algorithm is useful for thermal images and to clarify the optimal conditions for obtaining thermal images (during predawn hours and on overcast days). We developed a method based on the algorithm to extract moving wild animals from thermal remote sensing images. Its accuracy was then evaluated by applying the method to airborne thermal images covering a wide area. We found that the producer's accuracy of the method was approximately 77.3% and the user's accuracy was approximately 29.3%. This means that the proposed method can reduce the person-hours required to survey moving wild animals in large numbers of thermal remote sensing images. Furthermore, we examined the extracted sika deer candidates in a pair of images and found 24 moving objects that had not been identified by visual inspection by an expert. Therefore, the proposed method can also reduce oversights when identifying moving wild animals. The detection accuracy is expected to increase by setting suitable observation conditions for surveying moving wild animals; accordingly, we also discuss the required observation conditions. This discussion should be extremely useful for anyone monitoring animal population changes using thermal remote sensing images.

Keywords: movement; observation conditions; survey; thermal image; wild animal

1. Introduction

Problems related to wild animals in Japan are classified as human–wildlife conflict, endangered species, invasive alien species, or overabundant species. Human–wildlife conflict refers to sudden attacks resulting from unexpected encounters with wild animals and to agricultural damage by animals.


Endangered species are threatened with extinction by hunting or by human activities such as land-use changes. Invasive alien species have been introduced, intentionally or unintentionally, by humans from overseas or from other areas within the country. Overabundant species cause serious damage to agriculture and forestry and affect ecosystems through their considerable population increases and expanding distributions. These problems are not independent, but rather are connected [1]. In particular, conflict between wildlife and humans is a serious problem where the boundary between human and wildlife habitats has become blurred near urban areas in Japan. These imprecise boundaries are caused by abandoned farmland and by the declining number of hunters that accompanies the rapidly decreasing and ageing rural population. 'Satoyama' is defined as a unique human-influenced natural environment that has been shaped and sustained over a long period by diverse human activity. Studies are needed to determine how to manage satoyama landscapes in such a way that the balance between the needs of humans and nature can be restored, particularly in regard to damage to forests and farmland caused by wild animals [2].

In particular, sika deer (Cervus nippon) have caused serious damage to the agricultural and forestry industries through tree bark-stripping and ecosystem changes. The damage to agriculture by wild animals totaled about 166 million US dollars in 2014, of which about 57 million US dollars was caused by sika deer (1 US dollar = 115 Japanese yen) [3]. To address these problems, the conservation and management of wild animals is necessary. Adaptive management of wild animals, a systematic approach for improving resource management by learning from management outcomes [4,5], is essential. Adaptive management consists of predicting population increases and decreases, planning hunting goals, adjusting abundance, and monitoring abundance using population indices to determine changes in the population size of wild animals. However, there is insufficient population information for sika deer, which are crepuscular animals with large habitat areas [6]. Although wild animal populations have been estimated using statistical methods, it is difficult to apply such methods to large areas because they require considerable manpower and are tremendously laborious.

To resolve these issues, remote sensing is a promising approach. However, even in open areas, it is difficult to identify animals in remote sensing images because the shapes of objects may differ markedly when viewed from above rather than from the side, as humans are accustomed to seeing them. Moreover, there is the potential for oversight because an enormous amount of data must be analyzed [7]. To address this issue, we have developed several support systems for the automated detection of wild animals in remote sensing images. These systems reduce the number of person-hours required to survey wild animals from large numbers of remote sensing images. One of these is the computer-aided detection of moving wild animals (DWA) algorithm [8]. The algorithm automatically extracts moving wild animals using the time difference between two overlapping visible images. The advantages of the algorithm are as follows. (i) Almost no detection errors occur, even in sparse forest areas: in forest areas, the tip of a tree appears in different positions when viewed from different points, so simple differencing of two images can cause detection errors.
The DWA algorithm can be applied to sparse forests because relief displacement effects do not cause false detections. (ii) Applicability to large areas: the DWA algorithm does not require fixed cameras because it uses the overlapping areas of photographs taken from a flying platform.

Thus far, the use of the algorithm has been limited to the daytime because visible and near-infrared images have been used. However, many large mammals, such as sika deer, are crepuscular. For this reason, we used thermal images, in which animals can be identified in semi-dark conditions. Very few studies have used thermal remote sensing images to monitor wild animals [9–12]. Furthermore, it is difficult to distinguish animals from trees in thermal images under some observation conditions [9,10], because surface temperature contrast between the detection targets and the background is essential for extracting targets from thermal images. Therefore, existing applications of thermal remote sensing images to monitoring wild animals [11,12] are limited to open, cool areas. Urban areas contain many hot spots, such as streetlights; thus, we attempted to use pairs of overlapping thermal images obtained at different times to automatically extract only moving animals.

The three major goals of the present study were as follows. First, we evaluated the applicability of the DWA algorithm: whether it could extract moving wild animals from thermal images, and the conditions necessary to apply the algorithm to thermal images. Secondly, we designed an experimental method based on the DWA algorithm to extract moving wild animals from thermal remote sensing images and applied the method to airborne thermal images covering a wide area around Nara Park. Because sika deer are important for Japan's native religion, Shinto, they are sometimes kept on the grounds of Shinto shrines, such as Nara Park in Nara [6]; therefore, the sika deer at Nara Park have been preserved for a long time. However, agricultural damage near Nara Park has become more serious, and abundance adjustment of sika deer near Nara Park was initiated on 17 August 2017. Thirdly, we established the required observation conditions for monitoring animal population changes using thermal remote sensing images through the results of the airborne experiment.

2. Methods

2.1. Applicable Evaluation

We evaluated whether the DWA algorithm can automatically extract moving wild animals from thermal images. First, swimming wild ducks were captured at intervals of 30 s using a fixed thermal camera (Thermo Shot F30; NEC Avio Infrared Technologies Co., Ltd., Tokyo, Japan) mounted on the parapet of a bridge in central Tokyo, Japan, on the night of 10 June 2010. The pixel resolution of the images was approximately 3 cm, and the image area was 7 m × 10 m.

We then performed an unmanned air vehicle (UAV) experiment at the National Institute for Environmental Studies, which is located in Tsukuba, Japan (Figure 1); the details of the UAV experiment are described later. We collected thermal images on the morning of 6 April 2012 at an altitude of 30 m, with 60% image overlap and a 4 s interval, using the Thermo Shot F30 mounted on a UAV (Grass HOPPER; Information & Science Techno-System Co., Ltd., Tsukuba, Japan; Figure 2). Grass HOPPER has two automatic modes: an automatic hovering mode and an automatic return-and-landing mode. The UAV can be operated from the ground to capture images. The pixel resolution of the images was approximately 5 cm, and the image area was 15 m × 11 m.

Furthermore, we measured the change in the surface temperature of several objects during the day using a radiation thermometer (i-square ii-1064; HORIBA, Ltd., Kyoto, Japan) to evaluate the conditions necessary to obtain thermal images.
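As a quick consistency check on these settings, the shooting interval, overlap, and image footprint together constrain the UAV ground speed. The sketch below is illustrative only; it assumes that the 11 m side of the footprint lies along the flight direction, which the text does not state.

```python
# Back-of-the-envelope check of the UAV shooting settings reported above
# (30 m altitude, 60% forward overlap, 4 s interval, 15 m x 11 m footprint).
# Assumption: the 11 m side is the along-track extent; the paper does not say.

footprint_along_track_m = 11.0   # assumed along-track extent of one image
forward_overlap = 0.60           # 60% overlap between consecutive images
interval_s = 4.0                 # shooting interval

# Ground distance the UAV may advance between exposures while keeping the overlap.
max_advance_m = footprint_along_track_m * (1.0 - forward_overlap)   # 4.4 m
max_ground_speed_ms = max_advance_m / interval_s                     # 1.1 m/s

print(f"max advance per frame: {max_advance_m:.1f} m")
print(f"max ground speed for 60% overlap: {max_ground_speed_ms:.2f} m/s")
```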

Figure 1. Study area of the unmanned air vehicle (UAV) experiment. (a) Photo of the area; (b) Land cover of the area.

Figure 2. Photography instruments used for the UAV experiment.


2.2. Airborne Platform Imagery

Four pairs of airborne thermal images were collected using a thermal sensor (ITRES TABI-1800) by Nakanihon Air Service Co., Ltd. at Nara Park in Nara, Japan at 19:22–20:22 h on 11 September 2015 (Figure 3). The air temperature at the time was approximately 20 °C. Images were obtained twice, at altitudes of about 1000 m and 1300 m, with a shooting time difference of 30 min. The pixel resolutions of the images were approximately 40 cm and 50 cm, the image area was 2.9 km × 1.9 km, and the image size was approximately 11,000 × 8000 pixels. The airborne thermal images had already been map projected by the following procedure: (i) positioning of the aircraft using the Global Navigation Satellite System (GNSS) and an Inertial Measurement Unit (IMU); (ii) map projection using the free 5 m resolution Digital Elevation Model (DEM) provided by the Geospatial Information Authority of Japan [13], after resampling every pixel to 40 cm. We also used the results of visual inspection of the airborne thermal images by an expert, who examined pairs of the thermal images, to evaluate the proposed method. The visual inspection results were verified against ground observation counts made at the same time in a previous study; in that comparison, the average accuracy of the visual inspection was over 88% [14].
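The map projection itself was carried out by the data provider with GNSS/IMU positioning and the GSI DEM; only the final resampling-to-40-cm step is sketched below as an illustration, using rasterio, with hypothetical file names and an assumed projected CRS for the Nara area.

```python
# Minimal sketch of the resampling step only: warping an already georeferenced
# thermal image onto a 40 cm grid. This is not the provider's full GNSS/IMU + DEM
# orthorectification chain; file names and the target CRS are hypothetical.
import rasterio
from rasterio.warp import calculate_default_transform, reproject, Resampling

SRC = "thermal_scene.tif"       # hypothetical input (already map projected)
DST = "thermal_scene_40cm.tif"  # output resampled to 0.4 m pixels
DST_CRS = "EPSG:6674"           # JGD2011 / Japan Plane Rectangular CS VI, assumed for Nara

with rasterio.open(SRC) as src:
    transform, width, height = calculate_default_transform(
        src.crs, DST_CRS, src.width, src.height, *src.bounds, resolution=0.4
    )
    profile = src.profile.copy()
    profile.update(crs=DST_CRS, transform=transform, width=width, height=height)
    with rasterio.open(DST, "w", **profile) as dst:
        reproject(
            source=rasterio.band(src, 1),
            destination=rasterio.band(dst, 1),
            src_transform=src.transform, src_crs=src.crs,
            dst_transform=transform, dst_crs=DST_CRS,
            resampling=Resampling.bilinear,  # bilinear keeps temperature fields smooth
        )
```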

Figure 3. Location of the study area of the airborne experiment; the red rectangle on a Landsat 8 pan-sharpened image shows the range used for aerial photography at Nara Park, Japan. Nara Park is located on the boundary between an urban region and a mountainous region.

2.3. Methods for the Ground Experiment

First, we confirmed that the DWA algorithm can automatically extract moving targets from thermal images. We captured swimming wild ducks with a thermal camera fixed to a bridge; the shooting time difference was 30 s.

2.4. Methods for the Unmanned Air Vehicle (UAV) Experiment

To evaluate the application of the method, it is necessary to confirm the following two points:

1. Detection of moving targets: it is necessary to confirm that the DWA algorithm can automatically extract moving targets from thermal UAV images.
2. Detection error of non-moving objects: the DWA algorithm was designed to avoid the extraction of non-moving objects. Accordingly, it is necessary to confirm that the DWA algorithm does not extract non-moving objects that only change in shape in thermal images.

We used two pairs of thermal images obtained by the following shooting procedure (Figure 4).

1. A walking human: a standing human on the road was photographed by a hovering UAV. The same human, now walking on the road, was photographed again by the hovering UAV after the position of the UAV was moved.


2. A standing human and dog that only change their poses: a standing human and dog on the grass were photographed by the hovering UAV. After only changing their poses, they were photographed again by the hovering UAV after the position of the UAV was moved.

Figure 4. Shooting plan for a pair of thermal images in the UAV experiment.

2.5. Outline of the Computer-Aided Detection of Moving Wild Animals (DWA) Algorithm

Several studies have been conducted to extract moving objects from images captured with a fixed camera, using logical operations on positional differences [15]. However, because these methods misidentify trees as moving objects due to displacement caused by the movement of the observer, such as a UAV or aircraft, they cannot be applied to aerial images. Therefore, to eliminate trees as candidate moving objects, we developed a logical operation that considers displacement.

Figure 5 shows the relief displacement effects and the outline of the DWA algorithm. The top of a tree (A) is observed at different positions (A1 and A2) in overlapping images due to the movement of the observer; this is referred to as the relief displacement effect. However, the base of the tree (B) does not move. In the case of a moving animal, the toes of the animal (C1 and C2) appear at different positions. If part of a candidate object overlaps in different images, then this candidate is rejected; the object is not a moving object. This is a simple, but highly effective method [8].
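Stated compactly (the set notation below is ours, not the paper's):

```latex
% Rejection rule of the DWA algorithm, in set notation (notation ours).
% M_1 : union of hot-object candidate pixels in image 1
% C   : one connected candidate object in image 2
\[
  C \text{ is kept as a moving-object candidate} \iff C \cap M_1 = \emptyset .
\]
% i.e., a candidate sharing even one pixel with a candidate in the other image
% (such as the base of a tree) is treated as a fixed object and rejected.
```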

Figure 5. Difference between a fixed object and a moving animal in the time-series images by the movement of the observer. (a) Schematic diagram: The top of a tree (A) captured from point O1 is located at A1, whereas the top of the same tree captured from point O2 is located at A2. The base of the tree (B) captured from points O1 and O2 is always located in the same position; (b) In the case of a moving animal, the toes of the animal appear in different positions when viewed from points O1 and O2; (c) Schematic diagram of the computer-aided detection of moving wild animals (DWA) algorithm.

  


2.6. Application of the Method to Thermal Images

Figure 6 shows examples of thermal images and their corresponding images for comparison. It is impossible to identify wild animals using only a single thermal image because there are several kinds of hot objects and many local hot spots. Therefore, we must extract hot moving objects using the gap in a pair of thermal images obtained at different time points.

Figure 6. Example input images and comparative images. Expanded image size is approximately 900 × 1000 pixels. (a) An original image. The red rectangle indicates the location of (b–d) and the green rectangle indicates the location of (e–g); (b) Input image 1 in an urban area; (c) Input image 2, corresponding to image 1; (d) Results obtained by the visual inspection of input image 2. Red circles indicate sika deer identified by an expert. Blue circles indicate local hot spots that are not sika deer. Green circles likely indicate streetlights; (e) Input image 3 in a forest area; (f) Input image 4, corresponding to image 3; (g) Input image 4 in color. The red circle indicates sika deer identified by an expert.

Figure 7 shows the flowchart of moving animal detection from thermal remote sensing images. The method can be used to extract moving sika deer in the thermal images as follows:


1. Make binary images: initially, binary images are constructed using the Laplacian histogram method [16] with a moving window function, the P-tile method [17], and the Otsu method [18] to extract objects whose surface temperature is higher than the surrounding temperature. The binarization is performed only for pixels in the possible surface temperature range of sika deer. In this study, we set the threshold surface temperature range to 15.0–20.0 °C, based on a previous study which showed that the surface temperature range of sika deer in airborne thermal images was 17.0–18.0 °C [14].
2. Edge detection: although the surface temperatures in many regions are higher than the surrounding temperature, the curve of the change in surface temperature is typically gentle, whereas the contours of sika deer are more distinct. Therefore, edge detection using the Laplacian filter is performed.
3. Extract only overlapping objects: to integrate the results of (1) and (2), we extract only objects that overlap between (1) and (2).
4. Classify moving animal candidates according to area: we reject extracted objects whose size differs from that of the target objects in the thermal images, such as cars, other artifacts, and humans. In this study, we set the threshold size range to 3–25 pixels, based on the pixel resolution of the airborne thermal images (40 cm and 50 cm) and the head and body length of sika deer, 90–190 cm [6].
5. Compare two images: if part of a candidate object overlaps in different images, or if the surface temperature of the candidate object is almost equal for the same pixels in different images, then this candidate is rejected and the object is not considered a moving object. In this study, we set the threshold surface temperature gap to 0 °C. Furthermore, we changed the threshold surface temperature gap from 0 to 0.5 °C according to the gap in the average surface temperatures of each image. (A simplified sketch of this pipeline is given below.)
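To make the five steps concrete, a condensed sketch follows. It is an illustration, not the published implementation: the inputs are assumed to be co-registered surface-temperature arrays in °C, step 1 is reduced to the fixed 15.0–20.0 °C window (standing in for the Laplacian-histogram, P-tile, and Otsu thresholding), and the edge-strength threshold is our own placeholder.

```python
import numpy as np
from scipy import ndimage

# Tunable parameters taken from the text; variable names are ours.
TEMP_RANGE = (15.0, 20.0)   # possible surface temperature of sika deer (deg C)
SIZE_RANGE = (3, 25)        # candidate area in pixels at 40-50 cm resolution
TEMP_GAP = 0.0              # max temperature gap allowed between the two images

def candidates(temp: np.ndarray) -> np.ndarray:
    """Steps 1-4 for a single surface-temperature image (deg C)."""
    # 1. Binary image: pixels within the plausible deer temperature range
    #    (stand-in for the Laplacian-histogram / P-tile / Otsu stage).
    hot = (temp >= TEMP_RANGE[0]) & (temp <= TEMP_RANGE[1])
    # 2. Edge detection with a Laplacian filter: deer have sharp contours.
    edges = np.abs(ndimage.laplace(temp)) > 0.5      # edge threshold: our assumption
    labels, n = ndimage.label(hot)                   # connected hot objects
    keep = np.zeros_like(hot)
    for i in range(1, n + 1):
        obj = labels == i
        area = obj.sum()
        # 3. Keep only hot objects that also produce an edge response,
        # 4. and whose area matches the expected deer size.
        if SIZE_RANGE[0] <= area <= SIZE_RANGE[1] and np.any(obj & edges):
            keep |= obj
    return keep

def moving_deer(temp1: np.ndarray, temp2: np.ndarray) -> np.ndarray:
    """Step 5: compare the two co-registered images and keep moving candidates."""
    cand1, cand2 = candidates(temp1), candidates(temp2)
    labels, n = ndimage.label(cand2)
    moving = np.zeros_like(cand2)
    for i in range(1, n + 1):
        obj = labels == i
        overlaps = np.any(obj & cand1)                               # fixed object
        same_temp = np.any(np.abs(temp1[obj] - temp2[obj]) <= TEMP_GAP)
        if not overlaps and not same_temp:
            moving |= obj
    return moving
```

The per-candidate loop mirrors the candidate-by-candidate rejection of Figure 5c; relaxing TEMP_GAP from 0 to 0.5 °C corresponds to the threshold change examined in Section 3.2.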

Figure 7. Flowchart of the proposed method to extract moving animals from thermal remote sensing images.

3. Results

3.1. Applicability Results

We applied the DWA algorithm to thermal images of wild ducks swimming in the river to confirm whether the algorithm automatically extracts moving wild animals in thermal images. Figure 8 shows that the algorithm successfully extracted two wild ducks.

Next, we performed a UAV experiment using two pairs of thermal images obtained by the two shooting plans. Figure 9 shows the input images and Figure 10 shows the results. We were able to automatically extract a walking human in the area of overlap in a pair of thermal images by applying the algorithm (Figure 10a), and the algorithm did not extract a human and a dog that changed their poses but did not move (Figure 10b).


Figure 8. Thermal images of two wild ducks swimming in the river, captured with a thermal camera (320 × 240 pixels) fixed to a bridge in Tokyo. (a,b) Thermal images 1 and 2; (c) Two ducks were automatically extracted by the DWA algorithm.

Figure 9. Two pairs of thermal images used for the UAV experiment. (a) White circles indicate a walking human; (b) White circles indicate a standing human and a dog. A red circle indicates a human who is a non-target for detection because the human is not in an overlapping area.

 

 

Figure 10. Results of the UAV experiment. (a) A walking human was automatically extracted by the DWA algorithm; (b) A standing human and dog were not extracted by the DWA algorithm.

In this experimental case, it was easy to distinguish the detection target from the background because the surface temperatures of understory plants and tree leaves were lower than those of a human and a dog. However, thermal contrast might be small depending on the observation conditions. We examined difficult cases for the application of thermal images by measuring the surface temperature of background objects using a radiation thermometer. Figure 11a shows data obtained on a rainy spring day; the surface temperature of the human is approximately 25 °C, and animals can be identified in thermal images throughout the day. Figure 11b,c show data obtained on two clear spring days; on both days, the surface temperature of the human is over 25 °C, and in these cases animals can be identified in thermal images in the morning or evening. Figure 11d shows data obtained on a summer day; the surface temperature of the human was approximately 35 °C in the early morning and over 40 °C at 10:00 h, so animals can be identified in thermal images only in the morning.

Figure 12 shows visible and thermal images of an elephant and deer at Inokashira Park Zoo in Tokyo, Japan. The surface temperature of the elephant is higher than that of the deer. Because the hair covering an animal suppresses heat radiation, the animal's surface temperature is close to the outside air temperature and lower than its body temperature (Figure 12d). However, even in the deer, the surface temperature of the eyes and nose was higher than that of the other body parts (Figure 12d).


Figure 11. The change in air temperature and surface temperature of the background. (a) 11 April 2012. Conditions were rainy and cool. The surface temperature of the human was approximately 25 °C; (b) 12 April 2012. Conditions were clear and warm in the daytime. The surface temperature of the human was over 25 °C; (c) 18 May 2012. Conditions were clear after torrential rain in the morning. The surface temperature of the human was over 25 °C; (d) Late July 2012. Conditions were clear and hot. The surface temperature of the human was approximately 35 °C in the early morning and over 40 °C at 10:00 h.

Figure 12. Visible and thermal images of an elephant and deer at Inokashira Park Zoo in Tokyo, Japan. (a) Visible image of an elephant; (b) Thermal image of an elephant; (c) Visible image of deer; (d) Thermal image of deer.

3.2. Airborne Examination Results

We applied the proposed method to four pairs of airborne thermal images (Figure 13a,b) and then compared the processed results with the results obtained by visual inspection by an expert (Figure 13d). Figure 13e shows an extracted result obtained by the proposed method; red dots represent the extracted moving animal candidates. The producer's accuracies, which correspond to the error of omission, were 75.3%, 77.5%, 78.8%, and 77.7%. Moreover, we checked the objects extracted by the proposed method for one pair of thermal images (Table 1).

Remote Sens. 2018, 10, 1050

10 of 13

Remote Sens. 2018, x, x FOR PEER REVIEW   

10 of 13 

The user's accuracy, which corresponds to the error of commission, was 29.3%. Additionally, we found 24 moving objects that had not been identified by visual inspection by an expert. These oversights were mainly explained by the threshold surface temperature gap being too restrictive for the comparison between the two images. There were many cases in which objects were distinguishable only by the expert. The main causes of detection errors were noise, aberrant positions, and local hot spots.

Furthermore, we changed the threshold surface temperature gap from 0 to 0.5 °C, according to the gap in the average surface temperatures of each image (Figure 13a,b), because there are tradeoffs between maximizing accuracy and minimizing oversights and overestimation. In this analysis, the producer's accuracy was 84.2%, but the user's accuracy was 3.4%.
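For readers unfamiliar with these measures, they can be written out as follows; the notation is ours, and the counts are those reported in Table 1.

```latex
% Producer's and user's accuracy (standard remote-sensing definitions; notation ours).
% N_match   : extracted candidates matching moving deer found by visual inspection
% N_vis     : moving deer identified by visual inspection (299 in Table 1)
% N_correct : extracted candidates that are truly moving animals
% N_ext     : all candidates extracted by the proposed method (849 in Table 1)
\[
  \text{producer's accuracy} = \frac{N_{\mathrm{match}}}{N_{\mathrm{vis}}}
  = \frac{225}{299} \approx 75.3\%,
  \qquad
  \text{user's accuracy} = \frac{N_{\mathrm{correct}}}{N_{\mathrm{ext}}}.
\]
```

With N_ext = 849, the reported user's accuracy of 29.3% corresponds to roughly 249 correct candidates, which is consistent with the 225 matched deer plus the 24 additional moving objects noted above; the paper does not state this breakdown explicitly, so this reading is ours.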

Figure 13. A pair of airborne thermal images. (a) The average surface temperature of thermal image 1 was 17.0 °C and (b) the average surface temperature of thermal image 2 was 16.5 °C; (c) Gap in the surface temperature between (a,b); (d) Visual inspection results obtained by an expert. Red dots represent the moving sika deer; (e) Extraction results obtained by the proposed method. Red dots represent the extracted results as moving animal candidates.

Table 1. Producer's accuracy and user's accuracy calculated by comparisons between visual inspection results and extraction results using a pair of airborne thermal images.

                              Number of Sika Deer Identified    Number of Sika Deer Extracted
                              by Visual Inspection              by the Proposed Method
Total                         357                               849
Moving sika deer              299                               225
Stopping sika deer            155                               -
Not in an overlapping area    58                                -
                              Producer's accuracy: 75.3%        User's accuracy: 29.3%

4. Discussion

In this section, we discuss the required observation conditions for monitoring animal population changes using thermal remote sensing images, through the issues encountered and the methods for improving detection. The main causes of overdetection were noise, aberrant positions, and local hot spots. This indicates that some pre-processing for noise reduction is necessary. There are two potential explanations for the aberration of a position: registration errors between the two images and the relief displacement effect. With respect to registration errors, the addition of pre-processing may be useful. The direction of relief displacement is determined by the geometry between the sensor position and the target position. However, to determine the distance of relief displacement, height information for the target is necessary; thus, there is almost no case in which the distance can be determined from thermal images alone. This is particularly difficult for objects such as streetlights, for which only the top is hot (although streetlights were rejected in this study by their surface temperature, which was over 20 °C). Therefore, a major area for improvement is to use the same observation geometry to obtain both images. The overdetections, which include local hot spots, are related to the oversights caused by the threshold surface temperature gap used when comparing the two images. To solve this problem, it is necessary to optimize the thresholds and perform additional processing.

The factors that determine the extraction of moving wild animals from remote sensing images have been discussed previously [8]. These previous analyses indicated the importance of the following factors:

3.

4.

The spatial resolution must be finer than one-fifth of the body length of the target species to automatically extract targets from remote sensing images. Objects under tree crowns do not appear in aerial images. The possibility of extracting moving wild animals decreases as the area of tree crowns in an image increases. Although a correction of the number of extracted moving wild animals using the proportion of forest is necessary for population estimates, the correction is not necessary to grasp the population change by using the number of extracted animals as a population index. Wild animals exhibit well-defined activity patterns, such as sleeping, foraging, migration, feeding, and resting. To extract moving wild animals, the target species should be moving when the survey is conducted. When shooting intervals are too long, targets can move out of the area of overlap between two images. In contrast, when shooting intervals are too short, targets cannot be extracted because the movement distance in a given interval must be longer than the body length. Thus, shooting intervals are decided after a survey of the movement speed of the target species in observation period.
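The planning sketch referenced above turns factors 1 and 4 into back-of-the-envelope numbers. Only the one-fifth body-length rule and the 90–190 cm head-and-body length of sika deer come from the text; the movement speed and overlap width are placeholder assumptions.

```python
# Back-of-the-envelope observation planning based on the factors above.
# Values other than the 1/5 body-length rule are illustrative assumptions.

body_length_m = 0.9          # shortest head-and-body length of sika deer (90 cm)
required_resolution_m = body_length_m / 5.0
print(f"Required pixel resolution: <= {required_resolution_m:.2f} m")   # 0.18 m

# Shooting-interval bounds, assuming an average movement speed and an
# along-track overlap width for the image pair (both are placeholders).
movement_speed_m_per_s = 0.3     # assumed walking speed during the survey window
overlap_width_m = 300.0          # assumed overlap of the image pair

min_interval_s = body_length_m / movement_speed_m_per_s    # must move > body length
max_interval_s = overlap_width_m / movement_speed_m_per_s  # must stay inside overlap
print(f"Shooting interval between {min_interval_s:.0f} s and {max_interval_s:.0f} s")
```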

The pixel resolution of the airborne thermal images in this study (40 cm and 50 cm) was rather coarse for this purpose, because the head-and-body length of sika deer varies widely, at 90–190 cm [6], so one-fifth of the shortest body length is only about 18 cm. Meanwhile, approximately one-third of the sika deer were not moving. Thus, we should reconsider the best time for obtaining images.

Moreover, we must consider another factor, radiative cooling, when determining shooting intervals. The surface temperature of animals covered with hair differs from the air temperature because the hair insulates against external heat. Although the gap between the surface temperature of sika deer and the background was not large, radiative cooling during the shooting interval creates a thermal gap between the two images. Furthermore, the rate of cooling is related to the thermal conductivity of the surface, which differs among materials (Figure 12c). Therefore, we firmly recommend that shooting be performed in the predawn hours or early morning. Taking these factors into consideration, the proposed method is potentially capable of extracting moving wild animals from thermal remote sensing images in order to monitor animal population changes.

In the image recognition field, some related studies have been presented [19,20]. These studies concern saliency detection methods for thermal images; such methods should be useful for extracting moving animal candidates from thermal images of open areas, unlike the scene in Figure 6. The latter study applies steering kernel regression [21], which is robust to random noise in a thermal image. In the future, we will try to use image texture to screen out noise, because we were able to judge noise from the image texture during visual inspection, and we will also apply steering kernel regression. Furthermore, some existing studies use deep learning methods for saliency detection [22]. Therefore, we will try to use machine learning methods, including deep learning, to resolve these issues and to achieve quasi-real-time processing.

Nakanihon Air Service Co., Ltd. (Aichi, Japan) operates a combination system, which consists of a hyperspectral sensor, a thermal sensor, a Light Detection and Ranging (LiDAR) sensor, and an RGB camera. In the future, we will consider data fusion in the morning, because CAST can obtain these data and images simultaneously. We might be able to detect non-moving animals by fusing thermal images with LiDAR data. Data acquisition of animal spectra is being carried out by our team at the National Institute of Advanced Industrial Science and Technology in collaboration with Ueno Zoological Gardens for the purpose of discriminating animal species; hence, we also plan to consider the use of hyperspectral data.
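Returning to the radiative-cooling point above, the following sketch uses Newton's law of cooling to show why a long shooting interval lets the background temperature drift between the two images. All numerical values (initial and ambient temperatures, cooling constant, and the 30 min interval) are assumptions chosen only for illustration.

```python
import math

def newtons_cooling(t_seconds, temp_initial_c, temp_ambient_c, k_per_s):
    """Newton's law of cooling: T(t) = T_amb + (T_0 - T_amb) * exp(-k * t)."""
    return temp_ambient_c + (temp_initial_c - temp_ambient_c) * math.exp(-k_per_s * t_seconds)

# Illustrative values only: how much a sun-warmed background surface cools
# over an example 30-minute shooting interval.
t0, t_amb, k = 18.0, 5.0, 2.0e-4        # assumed initial/ambient temps and cooling rate
for minutes in (0, 30):
    print(minutes, round(newtons_cooling(minutes * 60, t0, t_amb, k), 1))

# The background in the second image is cooler than in the first, so a fixed
# temperature-gap threshold can misclassify pixels; this is consistent with the
# recommendation to shoot in the predawn hours, when the thermal environment
# is comparatively stable.
```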


5. Conclusions

In this study, we applied an existing algorithm, the DWA algorithm, to airborne thermal remote sensing images. We found that the producer's accuracy of the method was approximately 77.3% and the user's accuracy was approximately 29.3%. This means that the proposed method can reduce the person-hours required to survey moving wild animals in large numbers of thermal remote sensing images. Furthermore, we checked the extracted sika deer candidates in a pair of images and found 24 moving objects that had not been identified by visual inspection by an expert. Therefore, the proposed method can also reduce oversights when identifying moving wild animals. The detection accuracy is expected to increase if suitable observation conditions are set for surveying moving wild animals. Accordingly, we also discussed the required observation conditions, which should be extremely useful for anyone monitoring animal population changes using thermal remote sensing images. The factors that determine the extraction of moving wild animals from thermal remote sensing images are the following:

1. The same observation geometry should be used to obtain each pair of thermal images.
2. The spatial resolution must be finer than one-fifth of the body length of the target species.
3. Wild animals exhibit well-defined activity patterns, such as sleeping, foraging, migration, feeding, and resting. To extract moving wild animals, the target species should be moving when the survey is conducted.
4. When shooting intervals are too long, targets can move out of the area of overlap between two images. In contrast, when shooting intervals are too short, targets cannot be extracted because the movement distance within a given interval must be longer than the body length. Thus, shooting intervals should be decided after a survey of the movement speed of the target species during the observation period.
5. The shooting intervals cause a thermal gap between the two images through radiative cooling. Furthermore, the rate of cooling is related to the thermal conductivity of the surface, which differs among materials. Therefore, we firmly recommend that shooting be performed in the early morning.

Taking these factors into consideration, the proposed method is potentially capable of extracting moving wild animals from thermal remote sensing images in order to monitor animal population changes.

Author Contributions: Y.O., H.O., R.N. and T.M. conceived and designed the studies; Y.O. and H.O. performed the applicability evaluation; A.T. acquired the airborne imagery and performed its visual inspection; Y.O. performed the evaluations, analyzed the results, and wrote the paper.

Funding: This research is based on results obtained from a project commissioned by the New Energy and Industrial Technology Development Organization (NEDO), Japan.

Acknowledgments: This research is based on results obtained from a project commissioned by the New Energy and Industrial Technology Development Organization (2017). The authors would like to thank Information & Science Techno-System Co., Ltd., Y. Watabe, K. Sakurai, A. Kamei, and S. Kato for help obtaining thermal images using Grass HOPPER, and Nakanihon Air Service Co., Ltd. for providing airborne images.

Conflicts of Interest: The authors declare no conflict of interest. The funding sponsor had no role in the design of the study; in the collection, analyses, or interpretation of data; in the writing of the manuscript; or in the decision to publish the results.

References

1. Oishi, Y. The use of remote sensing to reduce friction between human and wild animals and sustainably manage biodiversity. J. Remote Sens. Soc. Jpn. 2016, 36, 152–155. (In Japanese with English Abstract) [CrossRef]
2. Watanabe, T.; Okuyama, M.; Fukamachi, K. A review of Japan's environmental policies for Satoyama and Satoumi landscape restoration. Glob. Environ. Res. 2012, 16, 125–135.
3. Annual Report on Food, Agriculture and Rural Area in Japan FY 2015. Available online: http://www.maff.go.jp/e/data/publish/attach/pdf/index-34.pdf (accessed on 17 August 2017).
4. Adaptive Management. Available online: http://www2.usgs.gov/sdc/doc/DOI-AdaptiveManagementTechGuide.pdf (accessed on 21 August 2017).
5. Lancia, R.A.; Braun, C.E.; Collopy, M.W.; Dueser, R.D.; Kie, J.G.; Martinka, C.J.; Nichols, J.D.; Nudds, T.D.; Porath, W.R.; Tilghman, N.G. ARM! For the future: Adaptive resource management in the wildlife profession. Wildl. Soc. Bull. 1996, 24, 436–442.
6. Ohdachi, S.D.; Ishibashi, Y.; Iwasa, M.A.; Saito, T. The Wild Mammals of Japan; Shoukadoh Book Sellers: Kyoto, Japan, 2009; ISBN 9784879746269.
7. Oishi, Y.; Matsunaga, T.; Nakasugi, O. Automatic detection of the tracks of wild animals in the snow in airborne remote sensing images and its use. J. Remote Sens. Soc. Jpn. 2010, 30, 19–30. (In Japanese with English Abstract) [CrossRef]
8. Oishi, Y.; Matsunaga, T. Support system for surveying moving wild animals in the snow using aerial remote-sensing images. Int. J. Remote Sens. 2014, 35, 1374–1394. [CrossRef]
9. Kissell, R.E., Jr.; Tappe, P.A. Assessment of thermal infrared detection rates using white-tailed deer surrogates. J. Ark. Acad. Sci. 2004, 58, 70–73.
10. Chretien, L.; Theau, J.; Menard, P. Visible and thermal infrared remote sensing for the detection of white-tailed deer using an unmanned aerial system. Wildl. Soc. Bull. 2016, 40, 181–191. [CrossRef]
11. Christiansen, P.; Steen, K.A.; Jorgensen, R.N.; Karstoft, H. Automated detection and recognition of wildlife using thermal cameras. Sensors 2014, 14, 13778–13793. [CrossRef] [PubMed]
12. Terletzky, P.A.; Ramsey, R.D. Comparison of three techniques to identify and count individual animals in aerial imagery. J. Signal Inf. Process. 2016, 7, 123–135. [CrossRef]
13. Maps and Geospatial Information. Available online: http://www.gsi.go.jp/kiban/index.html (accessed on 4 December 2017).
14. Tamura, A.; Miyasaka, S.; Yoshida, N.; Unome, S. New approach of data acquisition by state of the art airborne sensor: Detection of wild animal by airborne thermal sensor system. J. Adv. Surv. Technol. 2016, 108, 38–49. (In Japanese)
15. Steen, K.A.; Villa-Henriksen, A.; Therkildsen, O.R.; Green, O. Automatic detection of animals in mowing operations using thermal cameras. Sensors 2012, 12, 7587–7597. [CrossRef] [PubMed]
16. Weszka, J.S.; Nagel, R.N.; Rosenfeld, A. A threshold selection technique. IEEE Trans. Comput. 1974, C-23, 1322–1326. [CrossRef]
17. Doyle, W. Operation useful for similarity-invariant pattern recognition. J. Assoc. Comput. Mach. 1962, 9, 259–267. [CrossRef]
18. Otsu, N. A threshold selection method from gray-level histograms. IEEE Trans. Syst. Man Cybern. 1979, SMC-9, 62–66. [CrossRef]
19. Li, Y.; Zhang, Y.; Yu, J.; Tan, Y.; Tian, J.; Ma, J. A novel spatio-temporal saliency approach for robust dim moving target detection from airborne infrared image sequences. Inf. Sci. 2016, 369, 548–563. [CrossRef]
20. Li, Y.; Zhang, Y. Robust infrared small target detection using local steering kernel reconstruction. Pattern Recognit. 2018, 77, 113–125. [CrossRef]
21. Takeda, H.; Farsiu, S.; Milanfar, P. Kernel regression for image processing and reconstruction. IEEE Trans. Image Process. 2007, 16, 346–366. [CrossRef]
22. Imamoglu, N.; Zhang, C.; Shimoda, W.; Fang, Y.; Shi, B. Saliency detection by forward and backward cues in deep-CNN. In Proceedings of the Computer Vision and Pattern Recognition 2017, Honolulu, HI, USA, 21–26 July 2017.

© 2018 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (http://creativecommons.org/licenses/by/4.0/).
