Data Fusion of Acoustics, Infrared, and Marine Radar for Avian Study

Golrokh Mirzaei¹, Mohsin M. Jamali²
Department of Computer Science and Engineering
¹Ohio State University, Marion, USA; ²University of Toledo, Toledo, USA

Jeremy Ross, Peter V. Gorsevski, Verner P. Bingman
Geospatial Science, School of Earth & Environment
Bowling Green State University, Bowling Green, USA

Corresponding Author Email: [email protected]

Abstract— A multi-sensor data fusion approach using acoustics, an infrared (IR) camera, and marine radar is proposed and described for the application of avian monitoring. The ultimate goal of avian monitoring is to preserve populations of birds and bats, especially those on the endangered list, by observing their activity and behavior over the migration period. With the significant attention given to the construction of offshore/onshore wind farms in recent decades, wind turbines increasingly threaten avian life by raising the risk of bird/bat collisions with turbine blades. To address this problem, this paper proposes a fuzzy-Bayesian multi-sensor data fusion approach that provides activity information about the targets for avian monitoring. The developed technique is used to process data from the spring and fall 2011 migration periods.

Keywords: Sensor, Acoustics, Infrared, Radar, Data Fusion

I. INTRODUCTION

Multi-sensor data fusion is an automated process that integrates information from a variety of sources and provides an overall meaningful description of the targets of interest. In effect, data fusion converts raw data drawn from multiple sources into a unified set of inferences. The major benefit of the discipline is obtaining information that may not be available from any single sensor alone. This work focuses on the application of data fusion to avian monitoring at wind farms. Wind power is one of the leading forms of renewable energy and has received significant attention over recent decades. Wind turbines are effective power sources that play an important role in harnessing the power of the wind, and this proven effectiveness has led to significant increases in the number of turbines under construction. Despite the environmental benefits of wind energy, wind turbine developments are not without potential adverse impacts on wildlife and the environment. Birds and bats are susceptible to mortality from collision with turbine blades, and reports indicate large numbers of bird and bat fatalities at wind farms [1]. Collision with blades is not the only cause of death: "barotrauma", caused by the rapid drop in atmospheric pressure adjacent to moving turbine blades, is also a significant cause of bat fatalities at wind farms. The aim of this work is to demonstrate the potential of a monitoring system comprising acoustic detectors, an infrared camera, and marine radar, and to develop a data fusion approach that aggregates the different sensory information and produces a unified depiction of the activity and behavior of birds/bats in areas considered for the construction of wind farms.

This work is partially supported by DOE Contract #DE-FG36-06G086096. Facilities are provided by the US Fish and Wildlife Service (USFWS).

The fusion of acoustics, IR, and radar produces a combined result that provides the most detailed and reliable information possible, as well as an efficient representation of the data. Acoustic data are beneficial for obtaining the identity of birds/bats at the class/species level, since the distinctive characteristics of songs/calls can be used to differentiate between varieties of individuals. The IR camera provides information on the flight pattern, x-y coordinates, direction, velocity, heat intensity, and flight straightness index. Finally, marine radar helps in the detection and tracking of birds/bats, especially at night and in situations with poor visibility due to fog or clouds. Radar can also detect targets over a wider range and at longer distances, and provides altitude information (z-coordinate), range, PPI area, PPI perimeter, intensity, and other statistics. The targets in this work are birds and bats, which are referred to as "targets" in the rest of the paper.

II. BACKGROUND AND RELATED WORK

Data fusion has been used in different applications such as target recognition and tracking, traffic control, remote sensing, road obstacle detection, maintenance engineering, mine detection, robotics, biometrics, and medical imaging [2][3][4][5]. Several data fusion models are available in the literature [6][7][8]. The Joint Directors of Laboratories (JDL) model [6] is a generalized multi-level model that has undergone several revisions. The Data Fusion Information Group (DFIG) [7] introduced a data fusion model that incorporates human decision making and resource management into the higher levels of the fusion model. The State Transition Data Fusion (STDF) model [8] rejected the separation of levels into machine labor and human decision making and provided a mixed inference of both. The Thomopoulos architecture [9] is a data fusion architecture containing three modules: signal-level, evidence-level, and dynamic-level fusion; a single level or a combination of levels can be used in different applications. Luo and Kay [10] proposed multi-sensor integration and differentiated between multi-sensor integration and fusion. Data fusion is also used for detecting and tracking targets in surveillance applications. Yang et al. [11] proposed a technique for making synergistic decisions from data produced by an Infrared Search and Track (IRST) system and an intermittent-working radar; they aimed to decrease the chance of the radar being

locked on by an adverse Electronic Support Measure (ESM). A distributed data fusion technique was presented by Akselrod et al. [12] for multi-sensor multi-target tracking; they described a decision mechanism that provides the data required for the fusion process while reducing redundancy in the information. Jesus et al. [13] developed a surveillance system using acoustic radar and video capture modules. In the application of ornithology, Gauthreaux et al. [14] fused data from a fixed-beam radar and a thermal camera; the camera and radar were aligned so that altitude information was obtained from the radar and locational information from the IR camera, and the two sets of data were compared manually.


III. SENSORS

Three different sensors are used in this work: acoustic detectors, an infrared (IR) camera, and marine radar. Two types of acoustic detectors from Wildlife Acoustics [15] are used, the SM2 and the SM2BAT, for the detection of bird and bat calls, respectively; two detectors are needed because of the difference in the frequency ranges of bird songs and bat calls. The SM2BAT, with a sampling frequency of 192 kHz, is able to detect the echolocation calls of Ohio's local bats (frequency range of 15 kHz-96 kHz). The SM2 package is employed for monitoring a diverse set of bird classes with calls in the frequency range of 200 Hz-15 kHz. A thermal infrared camera (FLIR SR-19) [16] is used to collect infrared data. The camera has a focal length of 19 mm (36° HFOV) and a standard-resolution Focal Plane Array (FPA) of 320 (H) × 240 (V) pixels; the recording rate is 33 frames per second. Thermal images are acquired by sensing the radiation in the IR spectrum emitted by the object; however, thermal images have a very low SNR, which provides limited information for proper detection and tracking. Radar data are collected using a Furuno 1500 Mark 3 marine radar [17]. The radar is a 25 kW Furuno X-band radar (frequency 9410 MHz, wavelength 3 cm, model # FAR2127BB, Furuno Electric Company, USA). The radar can be equipped with a T-bar antenna (XN20AF) or a parabolic dish. The T-bar antenna, a 6.5-ft array antenna rotating in the vertical plane to extract altitude information, produces an electromagnetic beam 1.23° wide × 20° high. The parabolic antenna, with a 3.5° beamwidth and 15° angle of elevation, provides a conical beam. Fig. 1 shows the sensors used in this work. Single-sensor data are processed separately to detect and track the targets and acquire their features; the target information and features are then used in the fusion process. Sensory processing is performed with different techniques because of the variety of sensor technologies. Acoustic processing is performed in two steps: feature extraction and classification. Feature extraction transforms the input data into a set of features that preserve the relevant information of the signal; different signal-processing-based feature extraction techniques are used to derive the important information from the signals [18][19]. Classification is performed using an Evolutionary Neural Network [20] to identify the targets at the species/class level.
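As a rough illustration of this acoustic front end, the sketch below (Python, assuming SciPy is available) band-limits a recording to the bat echolocation band and summarizes its spectrogram with a few simple features; the actual feature extraction and the Evolutionary Neural Network classifier of [18][19][20] are more elaborate, and the function name and feature choices here are purely illustrative.

import numpy as np
from scipy.signal import butter, sosfiltfilt, spectrogram

def extract_call_features(x, fs, band=(15e3, 95e3)):
    """Band-pass to the bat echolocation range and summarize the spectrogram.
    The upper edge is kept just below the Nyquist frequency of a 192 kHz recording."""
    sos = butter(4, band, btype="bandpass", fs=fs, output="sos")
    y = sosfiltfilt(sos, x)
    f, t, S = spectrogram(y, fs=fs, nperseg=256, noverlap=128)
    peak_freq = f[np.argmax(S.max(axis=1))]              # frequency bin with maximum energy
    centroid = float((f[:, None] * S).sum() / S.sum())   # spectral centroid over the call
    duration = t[-1] - t[0]                              # coarse call duration
    return np.array([peak_freq, centroid, duration])     # features passed on to a classifier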

Figure 1: Different types of sensors used in this work: (a) marine radar with parabolic dish, (b) IR camera (FLIR SR-19), (c) acoustic detector (Song Meter), (d) marine radar with T-bar antenna [17]

Infrared imagery processing is performed using background subtraction, blip detection, thresholding, and noise suppression. Noise suppression is based on opening and closing morphological filters that remove clutter and noise from the resulting foreground regions. Tracking is used to maintain the temporal consistency of the foreground objects across frames [21]. Radar data are processed for noise suppression and blip detection using radR. Target tracking is then performed in two parts, estimation and data association. Estimation predicts the next position of the target and is implemented with a particle filter. The most likely measured particle location is used to update the target's state estimator; this is generally known as the data association problem, and it is performed using the Nearest Neighbor (NN) method [22].
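A minimal sketch of such a frame-processing chain is given below, using OpenCV's MOG2 background subtractor followed by thresholding and opening/closing morphology; the concrete detector and tracker of [21] may differ in parameters and structure, and the function name is illustrative.

import cv2

bg = cv2.createBackgroundSubtractorMOG2(history=200, varThreshold=16, detectShadows=False)
kernel = cv2.getStructuringElement(cv2.MORPH_ELLIPSE, (3, 3))

def detect_blips(frame_gray):
    """Return candidate target centroids (blips) for one grayscale IR frame."""
    fg = bg.apply(frame_gray)                            # background subtraction
    _, fg = cv2.threshold(fg, 127, 255, cv2.THRESH_BINARY)
    fg = cv2.morphologyEx(fg, cv2.MORPH_OPEN, kernel)    # opening removes speckle clutter
    fg = cv2.morphologyEx(fg, cv2.MORPH_CLOSE, kernel)   # closing fills small holes
    n, labels, stats, centroids = cv2.connectedComponentsWithStats(fg)
    return [tuple(c) for c in centroids[1:]]             # skip label 0 (background)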

IV. DATA FUSION ARCHITECTURE

The fusion system is implemented based on an autonomous model, which allows each sensor to perform its specific processing and generate state vectors and/or identity declarations. The resulting vectors are transmitted to a fusion node where the data are integrated. Pre-processing is the processing of raw data from the single sensors. Fusion nodes 1 and 2 perform the integration of IR with radar, and of IR/radar with acoustics, respectively. A two-level fusion hierarchy is used, with feature-level fusion at level 1 (L1) and decision-level fusion at level 2 (L2), as shown in Fig. 3. Feature-level fusion is the extraction of features from different sensors whose measurements are not commensurate. IR and radar are integrated at the feature level: their features are combined into composite feature vectors, named L1 feature vectors. Each constructed feature vector is an end-to-end concatenation of the individual sensors' feature vectors and serves as input to the next level, the decision level.
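A minimal sketch of such an L1 composite vector, assuming the field names listed in Fig. 5 and plain float types, is shown below; it only illustrates the end-to-end concatenation, not the authors' actual data structures.

from dataclasses import dataclass, astuple

@dataclass
class IRFeatures:                 # fields follow the IR legend of Fig. 5
    straightness_index: float
    direction: float
    heat: float
    distance: float
    velocity: float
    size: float

@dataclass
class RadarFeatures:              # fields follow the radar legend of Fig. 5
    range_m: float
    angular_span: float
    radial_span: float
    ppi_area: float
    ppi_perimeter: float

def make_l1_vector(ir: IRFeatures, radar: RadarFeatures) -> list:
    """Level-1 fusion: concatenate the two sensors' feature vectors end to end."""
    return list(astuple(ir)) + list(astuple(radar))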

Figure 3: Acoustics/IR/Radar Fusion Hierarchy Model (Level 1: feature-level fusion of the IR and radar imagery sensors; Level 2: decision-level fusion with acoustics, yielding a joint identity declaration)

Decision-level fusion uses the inferences made from different sensors and combines them to yield a final fused decision. The acoustic data and the L1 feature vectors are fused at the decision level; the resulting vectors provide information on movement rates associated with specific features and with the target's class/species. The fusion of IR and radar is a homogeneous fusion, as both sensors are imagery sensors, while the fusion of IR, radar, and acoustics is a heterogeneous fusion, an integration of sensors of different types and technologies. The latter is the more challenging fusion, requiring more effort, and its use is proposed in this work.

V. L1 FEATURE LEVEL FUSION

The L1 fusion process deals with the integration of IR and radar data at the feature level. It is implemented through fusion modules and procedures that involve data alignment and data association, explained in the following sections.

A. Data Alignment and Association

The goal of data alignment is to provide a common representational format for the data from the different sensors. Data alignment is divided into spatial and temporal alignment. The process involves geo-referencing the location and field of view of the IR camera and the radar. Experiments were performed to implement the alignment so that both sensors refer to the same target and track-to-track association is feasible. The alignment relies on the vertical mode of the radar, whose coverage area overlaps the field of view of the IR camera. In this work, the radar recording system uses Greenwich Mean Time (GMT), while the IR data are recorded in Eastern Standard Time (EST). EST is therefore used as the time reference and all GMT-based data are converted to EST, so the resulting data are aligned to a common EST format. Data association is implemented through gating, an association matrix, and assignment. Gating is applied to remove unlikely target-to-target pairs while keeping the most plausible pairs. An association matrix is created to measure the similarity between IR and radar target pairs, and assignment is performed to define track-to-track pairs. Timestamp and locational information are used as the association properties in the L1 fusion. Temporal and locational gates are defined by a user-specified threshold $\varepsilon$, where $[p^i - \varepsilon,\ p^i + \varepsilon]$ is the gate range for property $p^i$. The association matrix is generated for putative targets satisfying the gating criteria. Let the sets of properties of sensor A (IR) and sensor B (radar) be denoted $S_A = \{p_A^1, p_A^2, \dots, p_A^m\}$ and $S_B = \{p_B^1, p_B^2, \dots, p_B^n\}$, respectively, where

- $p_A^i$ and $p_B^i$ are the $i$th properties of sensors A and B, respectively; each is a pair of temporal and locational properties, $p_A^i = (t_A, l_A(x, y))$ and $p_B^i = (t_B, l_B(x, y))$;
- $t_A$ and $t_B$ are the timestamps of sensors A and B, respectively;
- $l_A(x, y)$ and $l_B(x, y)$ are the locational information of sensors A and B, respectively;
- $m$ and $n$ are the total numbers of sample properties available for sensors A and B, respectively.

The gating function $g_{B_j}(p_A^i)$ determines whether $p_B^j$, for $j = 1, 2, \dots, n$, falls within the gate of $p_A^i$, for $i = 1, 2, \dots, m$. For each sample, the gating function returns "1" (presence status) if the criterion is satisfied and "0" (absence status) otherwise. Thus, an $n \times m$ Status Matrix (SM) is created whose elements are the presence/absence statuses for the $n$ and $m$ properties of sensors B and A, respectively. For properties with presence status, score measures are computed from local and global distances. The local distances measure the similarity between the timestamps of a pair and between their locational information, while the global distance measures the similarity between the overall properties of the two targets. The Euclidean distance is used as the score measure:

$d_1(t_i, t_j) = |t_i - t_j|$   (1)

$d_2(l_i, l_j) = \|l_i - l_j\| = \sqrt{(x_A^i - x_B^j)^2 + (y_A^i - y_B^j)^2}$   (2)

The global distance between two properties is defined as:

$D(p_B, p_A) = \|p_B - p_A\|$   (3)

where $p = (t, l(x, y))$ is the property of a target, $d_1(\cdot)$ and $d_2(\cdot)$ are the local distances measuring the similarity of the temporal and locational information, respectively, and $D(\cdot)$ is the global distance. Dissimilarity increases with distance; a distance of zero indicates maximum similarity. An Index Score Matrix (ISM) is then generated whose elements are pairs of index and score (distance) values for the presence statuses. It is an $n \times k$ matrix, where $k$ ($k \le m$) is the number of presence-status values and $m$ and $n$ are the total numbers of sample properties for sensors A and B, respectively. Since a given $p_B^j$ may have presence status for several $p_A^i$, the best candidate, i.e., the one with the minimum score, is selected and the other candidates are replaced with null values. The ISM is transformed into an $n \times k$ Association Matrix (AM) whose dimensions and elements correspond to unique presence statuses. The AM is then modified by selecting the minimum score in each row, which provides a one-to-one assignment between the $p_A^i$ and the presence-status values.
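The sketch below illustrates the gating and scoring of Eqs. (1)-(3) under assumed gate widths eps_t and eps_l; the full method additionally builds the SM/ISM/AM matrices and enforces a strict one-to-one assignment, so this greedy per-target version is only a simplified illustration.

import numpy as np

def associate(ir_props, radar_props, eps_t=1.0, eps_l=20.0):
    """ir_props, radar_props: lists of (t, x, y) tuples. Returns (IR index, radar index) pairs."""
    pairs = []
    for i, (t_a, x_a, y_a) in enumerate(ir_props):
        best_j, best_score = None, np.inf
        for j, (t_b, x_b, y_b) in enumerate(radar_props):
            d1 = abs(t_a - t_b)                      # local temporal distance, Eq. (1)
            d2 = np.hypot(x_a - x_b, y_a - y_b)      # local locational distance, Eq. (2)
            if d1 <= eps_t and d2 <= eps_l:          # gating: keep only plausible pairs
                score = float(np.hypot(d1, d2))      # global distance, Eq. (3)
                if score < best_score:
                    best_j, best_score = j, score
        if best_j is not None:
            pairs.append((i, best_j))                # best radar candidate for this IR target
    return pairs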

The overall process of association is shown in Fig. 4. In this figure, the SM matrix contains the presence and absence statuses $s_{ij}$ from the $i$th property of sensor B to the $j$th property of sensor A. The ISM matrix contains pairs of index values $i_{lk}$ and score values $c_{lk}$ from the $l$th presence-status element of the SM and the $k$th property of sensor B. The AM matrix contains elements $a_{ij}$ of unique status elements, and in the modified AM each row is replaced with its minimum score. Thus, the modified AM contains the one-to-one assignments of IR-to-radar targets. The features of the assigned targets are integrated to create one L1 fusion feature vector, as shown in Fig. 5.

Figure 4: Data Association (Status Matrix (SM) → Index Score Matrix (ISM) → Association Matrix (AM) → Modified Association Matrix)

Figure 5: L1 Fusion Feature Vector (IR features: I1 = Straightness Index, I2 = Direction, I3 = Heat, I4 = Distance, I5 = Velocity, I6 = Size; radar features: R1 = Range, R2 = Angular Span, R3 = Radial Span, R4 = PPI Area, R5 = PPI Perimeter)

VI. L2 DECISION LEVEL FUSION

L2 fusion is the integration of the acoustic data and the L1 features, which are processed separately. The L1 features contain the fused IR/radar information about the targets, and the results of the acoustic processing, named acoustic features, contain the identity of the targets at the species/class level. The acoustic and L1 features are fed into the L2 fusion, which is a decision-level fusion; at this level, inferences about the targets, including quantity, identity, and other information, are obtained. Since no locational information is available for targets detected through acoustics, no spatial alignment is possible in the L2 fusion. A timestamp is assigned to the targets, and data association is performed for acoustic/L1 targets with the same logic as in Fig. 4. A fuzzy-Bayesian technique is proposed to perform the L2 fusion: in avian monitoring, a priori information about the targets is usually not exact and is more or less fuzzy. The fuzzy-Bayesian decision-making approach is developed in two steps: (a) a fuzzy system provides the a priori probability of the avian category, and (b) Bayesian inference provides the posterior probability of the species/class of the target.




A. Avian Category Fuzzy System (FS)

Different avian categories share some common features to greater or lesser degrees. For instance, the flapping rate of birds is generally higher than that of bats, which results in a higher heat intensity for birds; in some situations, however, they have the same range of heat intensity. Also, the flight straightness index values of birds are higher than those of bats, since birds fly in a straighter pattern while bats fly in a zigzag pattern. The degree of certainty in these features is not well defined. Fuzzy logic approximates and provides an inference structure that allows human-like reasoning in the presence of uncertainty. Let $\chi$ denote the set of L1 feature vectors, $\chi = \{X_i, i = 1, 2, \dots, n\}$, where $n$ is the total number of L1 feature vectors. We define the set of straightness index values as $\Upsilon = \{y_i, i = 1, \dots, n\}$, heat intensity as $Z = \{z_i, i = 1, \dots, n\}$, range as $K = \{k_i, i = 1, \dots, n\}$, and velocity as $T = \{t_i, i = 1, \dots, n\}$. We are interested in the set $\tilde{A}$ of avian-category subsets, $\tilde{A} = \{\tilde{C}_j, j = 1, 2, \dots, m\}$, where $m$ denotes the total number of avian categories. Suppose we have three different categories, birds, bats, and insects:


$\tilde{C}_1 = \{X_i \mid X_i \in \chi,\ X_i \text{ is a bird}\}$
$\tilde{C}_2 = \{X_i \mid X_i \in \chi,\ X_i \text{ is a bat}\}$
$\tilde{C}_3 = \{X_i \mid X_i \in \chi,\ X_i \text{ is an insect}\}$

To specify $\tilde{C}_1$, $\tilde{C}_2$, and $\tilde{C}_3$ we have to define precisely what is meant by a "bird", "bat", or "insect"; that is, these terms must be operationalized so that the category of a target can be recognized. However, the boundaries of the members of $\tilde{A}$ are not crisp, so each $\tilde{C}_j$ becomes a fuzzy set. Let $\bar{I}$ denote the set of inputs of the system with $k$ inputs, $\bar{I} = \{\tilde{V}_t, t = 1, \dots, k\}$, where $\tilde{V}_t$ is an input subset. The fuzzy inputs with $k = 4$ are:

SI: $\tilde{V}_1 = \{v_i \mid v_i \in \Upsilon\}$
Heat: $\tilde{V}_2 = \{v_i \mid v_i \in Z\}$
Range: $\tilde{V}_3 = \{v_i \mid v_i \in K\}$
Velocity: $\tilde{V}_4 = \{v_i \mid v_i \in T\}$

where $\tilde{V}_1$, $\tilde{V}_2$, $\tilde{V}_3$, and $\tilde{V}_4$ are the fuzzy input sets of straightness index, heat, range, and velocity, respectively. Consider the fuzzy sets $\tilde{C}_1$, $\tilde{C}_2$, and $\tilde{C}_3$ on the L1 feature set $\chi$ with membership functions $\mu_{\tilde{C}_1}(x)$, $\mu_{\tilde{C}_2}(x)$, and $\mu_{\tilde{C}_3}(x)$. The functional operations on these sets, union, intersection, and complement, are defined as:

$\mu_{\tilde{C}_1 \cup \tilde{C}_2}(x) = \max[\mu_{\tilde{C}_1}(x),\ \mu_{\tilde{C}_2}(x)]$   (4)

$\mu_{\tilde{C}_1 \cap \tilde{C}_2}(x) = \min[\mu_{\tilde{C}_1}(x),\ \mu_{\tilde{C}_2}(x)]$   (5)

$\mu_{\overline{\tilde{C}_1}}(x) = 1 - \mu_{\tilde{C}_1}(x)$   (6)

The nonlinear mapping from the inputs to the output is performed through a set of fuzzy rules known as the rule base. The antecedents of each rule represent the inputs, which are based on the L1 feature vectors, and the consequents denote the outputs, which are the different categories of target signature. The avian fuzzy system is shown in Fig. 6. The notations of the input sets and their numerical ranges are given in Table I. A trapezoidal membership function is used for the input sets, with three membership functions (Low, Medium, and High) for each of the four inputs (straightness index, heat, range, and velocity); the numerical ranges in Table I were obtained from experiments and available facts. Each membership function is associated with particular rules and maps the input variables to the output through a set of nine rules. The avian fuzzy rules are based on the authors' consideration of available facts regarding the activity and behavior of birds, bats, and insects; however, the rules are subject to change based on biologists' interests. Several assumptions are made in constructing the rules (a sketch of the corresponding membership evaluation follows Table I below):

- Feeding occurs at medium or lower ranges
- Migrating occurs at high or medium ranges
- Insects occur at lower ranges than birds and bats
- Velocity is inversely proportional to range
- Heat of birds and bats is inversely proportional to range
- The straightness index of bats is lower than that of birds and insects

TABLE I: Input Fuzzy Sets

Straightness Index ($\tilde{V}_1$): Low (L) [0-0.8], Medium (M) [0.6-0.90], High (H) [0.8-1]
Heat ($\tilde{V}_2$): Low (L) [0-100], Medium (M) [80-200], High (H) [180-250]
Range ($\tilde{V}_3$): Low (L) [0-180], Medium (M) [150-350], High (H) [300-1500]
Velocity ($\tilde{V}_4$): Low (L) [0-2000], Medium (M) [1500-3500], High (H) [3000-5000]
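As a hedged illustration of the membership evaluation, the sketch below implements trapezoidal membership functions over the straightness-index ranges of Table I, the 'min' rule firing, and the centroid defuzzification listed in Table IV; the shoulder points of the trapezoids and the example values are assumptions, and the authors' nine rules are not reproduced.

import numpy as np

def trap(x, a, b, c, d):
    """Trapezoidal membership with feet a, d and shoulders b, c."""
    if x < a or x > d:
        return 0.0
    if b <= x <= c:
        return 1.0
    return (x - a) / (b - a) if x < b else (d - x) / (d - c)

# Straightness-index memberships from Table I (only the feet are given there; shoulders are assumed).
si_low    = lambda x: trap(x, 0.0, 0.0, 0.6, 0.8)
si_medium = lambda x: trap(x, 0.6, 0.7, 0.8, 0.9)
si_high   = lambda x: trap(x, 0.8, 0.9, 1.0, 1.0)

def fire_rule(*antecedent_memberships):
    return min(antecedent_memberships)                  # AndMethod = 'min' (Table IV)

def defuzzify(universe, mu):
    return float(np.sum(universe * mu) / np.sum(mu))    # DefuzzMethod = 'centroid' (Table IV)

# Example: a target with straightness index 0.85 partially supports Medium and High:
# si_low(0.85) -> 0.0, si_medium(0.85) -> 0.5, si_high(0.85) -> 0.5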

Fuzziness helps in evaluating the fuzzy rules, but the final output of a fuzzy system has to be a crisp number. The defuzzification process produces the crisp outputs that best represent the fuzzy sets: its input is the aggregate output fuzzy set and its output is a single number, obtained by taking the center of mass of the output distribution.

Figure 6: Fuzzy System

B. Species/Class Bayesian Inference

Bayesian inference reflects the conditional probability of hypotheses given prior probabilistic beliefs, based on Bayes' theorem. To simplify the expressions, the probability of an event $A$, $p(A)$, is shortened to $(A)$ in the rest of this paper. Bayes' theorem, which follows from the definition of conditional probability, gives the probability of a hypothesis $H$ given the evidence $E$ as:

$(H \mid E) = \dfrac{(E \mid H)(H)}{(E)}$   (7)

The generalized form of Bayes' rule is based on a partition of the event space. If the hypotheses $\{H_i, i = 1, \dots, n\}$ partition the event space, then for each hypothesis Bayes' theorem extends to:

$(H_i \mid E) = \dfrac{(E \mid H_i)(H_i)}{\sum_i (E \mid H_i)(H_i)} = \dfrac{(E \mid H_i)(H_i)}{(E)}$   (8)

There are two types of data (observations): from the acoustic sensor (sensor 1) and from the IR/radar pair (sensor 2). Let $E1$ and $E2$ denote the evidence, or observations, drawn from sensor 1 and sensor 2, respectively. Also let $H$ and $C$ be the hypothesis (event) spaces of the target signature from sensor 1 and sensor 2, respectively. The event space of sensor 1 is defined as the classes/species of the birds/bats under study in this work, which are also documented as species present in Ohio; the list of hypotheses $H$ is shown in Table II. The event space of the target signature from sensor 2 is $C$, the category of the avian target, which can be "Bird", "Bat", or "Insect". The species in the event space are mutually exclusive and their union covers the entire sample space:

$\sum_{j=1}^{l} (H_j \mid Bat) = 1, \quad l = 7$   (9)

$\sum_{j=1}^{m} (H_j \mid Bird) = 1, \quad m = 3$   (10)

where $l$ is the total number of bat species (Labo, Lano, Epfu, Mylu, Pesu, Nyhu, and Laci) and $m$ is the total number of bird classes (Warbler, Thrush, and Sparrow) used in this work, according to Table II.

TABLE II: Hypothesis and Categories

Hypotheses ($H_k$), k = 1, ..., 10: 1 Warbler, 2 Thrush, 3 Sparrow, 4 Labo, 5 Lano, 6 Epfu, 7 Mylu, 8 Pesu, 9 Nyhu, 10 Laci
Categories ($c_i$), i = 1, ..., 3: 1 Bird, 2 Bat, 3 Insect

The fusion node wishes to know the probabilities of the hypotheses $H$ and $C$ given the observations from the sensors $(E1, E2)$, i.e., $(H, C \mid E1, E2)$. Since $H$ and $C$ are independent, the posterior probability is defined as:

$(H, C \mid E1, E2) = (H \mid E1, E2) \times (C \mid E1, E2)$   (11)

Also, the sensor measurements are independent, so we have:

$(H \mid E1, E2) = (H \mid E1) \times (H \mid E2)$   (12)

$(C \mid E1, E2) = (C \mid E1) \times (C \mid E2)$   (13)

Equations 12 and 13 can be expressed in terms of their constituents using Bayes' rule as:

$(H \mid E1, E2) = (H \mid E1) \cdot (H \mid E2) = (H \mid E1) \times \dfrac{(E2 \mid H)(H)}{(E2)}$   (14)

$(C \mid E1, E2) = (C \mid E1) \cdot (C \mid E2) = (C \mid E1) \times \dfrac{(E2 \mid C)(C)}{(E2)}$   (15)

where $(H \mid E1)$ is the probability of the target being one of the ten species/classes in Table II given the observation from sensor 1 (the acoustic detector), and $(C \mid E1)$ is the probability of the target being one of the categories bird, bat, or insect given that same observation. $(H)$ is the prior probability of a target being one of the ten species/classes in Table II, and $(C)$ is the prior probability of the target being one of the categories bird, bat, or insect. We assume the same a priori probability for every species/class; for example, the probability of the target being in the Warbler class is the same as that of it being in the Sparrow class. Similarly, we assume the same probability for every avian category; for example, the probability of the target being a bird is the same as that of it being a bat, as shown in Table III. The probability $(H \mid E1)$ is derived from the acoustic sensors through the acoustic sensor processing, while $(C \mid E2)$ is derived from the L1 fusion (IR/radar) and its value is obtained from the fuzzy system.

TABLE III: Probability of Species/Class and Categories

Bird/bat species/class prior: $(H) = 1/10$
Avian category prior: $(C) = 1/3$

According to Bayes' rule, the probability of a target being one of the bird classes, for example a Warbler, given that the target is a bird, is calculated as:

$p(Warbler \mid Bird) = \dfrac{p(Bird \mid Warbler)\, p(Warbler)}{p(Bird)}$   (16)

Equation 16 can be extended to the generalized form of Bayes' rule:

$p(Warbler \mid Bird) = \dfrac{p(Bird \mid Warbler)\, p(Warbler)}{\sum_{k=1}^{3} p(Bird \mid H_k)\, p(H_k)} = \dfrac{p(Bird \mid Warbler)\, p(Warbler)}{p(Bird \mid Thrush)\, p(Thrush) + p(Bird \mid Warbler)\, p(Warbler) + p(Bird \mid Sparrow)\, p(Sparrow)}$   (17)

Similarly, the probability of a target being one of the bat species, for example a Labo, given that the target is a bat, is calculated as:

$p(Labo \mid Bat) = \dfrac{p(Bat \mid Labo)\, p(Labo)}{\sum_{k=1}^{7} p(Bat \mid H_k)\, p(H_k)}$   (18)

Equation 18 can be extended to all samples in the event space of bat species. If the posterior probability is less than a predefined threshold, the target assignment is discarded and it is assumed that the acoustic target has no match in the L1 vectors. If the posterior probability exceeds the threshold, there is a match between the acoustic target and the L1 feature vector, and an identity tag describing the class/species of the target is added to the L1 feature vector. This combined vector, named the L2 feature vector, contains the target-signature information from IR, radar, and acoustics.
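A minimal numerical sketch of Eqs. (16)-(18) under the uniform priors of Table III is given below; the likelihood values are placeholders standing in for the outputs of the acoustic classifier and the fuzzy system.

def posterior(hypotheses, priors, likelihood, category):
    """p(H_k | category) for every H_k via the generalized Bayes rule, Eqs. (17)-(18)."""
    evidence = sum(likelihood[(category, h)] * priors[h] for h in hypotheses)
    return {h: likelihood[(category, h)] * priors[h] / evidence for h in hypotheses}

bird_classes = ["Warbler", "Thrush", "Sparrow"]
priors = {h: 1 / 10 for h in bird_classes}                 # p(H_k) = 1/10 (Table III)
likelihood = {("Bird", "Warbler"): 0.8,                    # placeholder values for p(Bird | H_k)
              ("Bird", "Thrush"): 0.7,
              ("Bird", "Sparrow"): 0.6}
print(posterior(bird_classes, priors, likelihood, "Bird")) # e.g. p(Warbler | Bird) ≈ 0.38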

VII. EXPERIMENTS AND RESULTS

The multi-sensor monitoring system was deployed along the coastline of the western Lake Erie basin at the Ottawa National Wildlife Refuge (ONWR: N 41.633°, W 83.200°) and at the University of Toledo [UToledo (acoustics only): N 41.656°, W 83.606°] during the spring (May-July) and fall (Aug-Oct) migration seasons of 2011. The SM2BAT/SM2 detectors, the FLIR infrared camera, and the Furuno marine radar were used as the acquisition sensors. The radar operated in vertical mode, which provides the ascending and descending rates of the birds/bats. The IR camera was located 25 meters from the radar. The sensors were run through the night, and the data were analyzed to study the movement and activity of migrants. Two different databases, from Sonobat [23] and Cornell University, were used to train on the bat calls and bird songs, respectively; the databases contain the calls/songs of the bat species and bird classes present in NW Ohio. Although the three types of sensors (IR, radar, and acoustics) have different detection ranges [15][16][17], only targets detected by all three sensors are used for data fusion.


Two fusion nodes are used for the two levels of fusion. The feature vector $FV_{L1}$ results from the fusion of the IR and radar features and is fed to fusion node 2. At this level, the previously processed acoustic data, in the form of acoustic features, are integrated with $FV_{L1}$ through the fuzzy-Bayesian technique. The results are the L2 feature vectors ($FV_{L2}$), which combine the data of all three sensors.

Figure 7: Feature vectors produced by the three sensors (IR features such as straightness index, direction, and heat; radar features such as range, PPI area, PPI perimeter, angular span, and radial span; acoustic species/class)
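The toy sketch below shows how the two fusion nodes chain together, with dictionaries standing in for the feature vectors and a posterior threshold as described in Section VI.B; the function names, field names, and numeric values are illustrative only.

def fusion_node_1(fv_ir, fv_rad):
    """L1 (feature level): merge an associated IR/radar pair into one composite vector."""
    return {**fv_ir, **fv_rad}

def fusion_node_2(fv_l1, species_tag, posterior, threshold=0.5):
    """L2 (decision level): attach the acoustic species/class tag if the posterior passes the threshold."""
    if posterior < threshold:
        return None                                  # acoustic target has no match in the L1 vectors
    return {**fv_l1, "species": species_tag}

# Example: one IR/radar pair fused at L1, then tagged as a Warbler at L2.
fv_l1 = fusion_node_1({"heat": 157, "straightness_index": 0.99}, {"range": 345, "ppi_area": 15.3})
fv_l2 = fusion_node_2(fv_l1, "Warbler", posterior=0.8)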

Figure 8: Fusion Scheme (single-sensor feature vectors $FV_{rad}$, $FV_{IR}$, and $FV_{ac}$; fusion node 1 produces $FV_{L1}$, fusion node 2 applies the fuzzy system and Bayesian inference; $D_{Mono}$, $D_{Dual}$, and $D_{tri}$ denote mono-, dual-, and tri-sensor data)

TABLE IV: Parameters in Fuzzy System

Name: AvianFIS; Type: 'mamdani'; Number of inputs: 4; Number of outputs: 1; Number of rules: 9; AndMethod: 'min'; OrMethod: 'max'; ImpMethod: 'min'; AggMethod: 'max'; DefuzzMethod: 'centroid'

TABLE V: L2 Vector Samples from Real Data

Sensor      Feature        Target #1   Target #2   Target #3   Target #4   Target #5
IR          Distance       342.6       640.58      327.78      156.01      346.2
IR          Velocity       623         213         425         343         535
IR          Direction      N           NE          NW          E           NW
Radar       Range          345         654         324         758         433
Radar       SI*            0.99        0.96        1           0.99        0.7
Radar       Angular Span   8.32        11.2        9.1         8.1         10
Radar       Radial Span    15.3        17.021      16.2        18.1        14
Acoustics   Category       Bird        Bird        Bird        Bird        Bat
Acoustics   Species        Sparrow     Warbler     Sparrow     Thrush      Epfu

*SI = Straightness Index

Figure 9: Sample tracks of fused data. (a) Radar images of targets for some sample nights in fall 2011; (b) IR tracks with fused data from radar and acoustics (sample tracks: Altitude 12 m, Category Insect, Heat 112, Direction East, Straightness Index 0.83; Altitude 205 m, Category Bird, Heat 157, Direction North, Straightness Index 0.99; Altitude 75 m, Category Bat, Heat 120.7, Direction Northwest, Straightness Index 0.47)

The L2 feature vector derived from the features of the three sensors is shown in Fig. 7. The overall fusion based on the fuzzy-Bayesian technique is shown in Fig. 8. $D_{Mono}$, $D_{Dual}$, and $D_{tri}$ are the three types of data drawn from a single sensor, a combination of two sensors, and a combination of three sensors, respectively. $FV_{rad}$, $FV_{IR}$, and $FV_{ac}$ are the single-sensor feature vectors of radar, IR, and acoustics, respectively. The parameters used in the fuzzy system are shown in Table IV. Table V shows a number of sample L2 vectors obtained from the real data. Fig. 9(a) shows sample radar images of data collected on three different nights in fall 2011, and the corresponding IR tracks for the same nights are illustrated in Fig. 9(b). The information resulting from the fused data (acoustics, IR, and radar) is shown for each track: altitude and category are obtained from radar and acoustics, respectively, while heat, direction, and straightness index are obtained from the IR camera. Part of the result of the data fusion for the fall and spring 2011-2012 migration periods is shown in Figures 10-12. Figure 10 shows the direction of bat passes and bird flights in the area in fall 2011; 53% of bat passes and 55% of bird flights were toward the south. This information is obtained from the data fusion of acoustics and IR/radar. Figure 11 shows the direction of flights of bats and birds at the species/class level.


Epfus and warblers are the largest groups of migrants toward the south, followed by Thrushes and Labos. This information was obtained from the data fusion of acoustics and IR: the identity of the targets is provided by acoustics and the direction is obtained by IR/radar. Figure 12 compares the flight altitudes of two bird classes (Warbler and Thrush); this information is provided by the data fusion of acoustics (identity) and radar (altitude). These graphs help biologists estimate the altitude and flight ranges of different species/classes.

Figure 10: Direction of Flights in Fall 2011 (pie charts of the direction of bat passes and of bird flights)

Figure 11: Direction of Different Bat Species and Bird Classes (Epfu, Mylu, Labo, Nyhu; Warbler, Sparrow, Thrush)

Figure 12: Flight Altitude of Warblers and Thrushes (Bird Classes in the Area)

VIII. CONCLUSION

A comprehensive multi-sensor data fusion framework using acoustics, an infrared camera, and marine radar is developed for the application of avian monitoring. It is organized in a hierarchical model of feature and decision levels. The sensors were deployed in the western basin of Lake Erie in Ohio. Acoustics is useful for target identification at the taxonomic level; infrared sensing and IR video processing provide quantitative information and the x-y coordinates of the target; marine radar gives target altitude (z-coordinate) as well as other information. The developed system was used to process the spring and fall 2011 data collected at the Ottawa National Wildlife Refuge. The processing was performed in both single-sensor and fusion modes. The fusion results provide complementary information about the migrant birds/bats during the migration seasons. Our approach provides a detailed panorama of local avian species diversity and their nocturnal flight behavior while migrating. The research/system is potentially of enormous value to biologists and conservation decision makers to rapidly but effectively assess bird and bat density, diversity, and, most importantly, behavior within natural areas or proposed wind development sites.

REFERENCES

[1] J. Rydell, L. Bach, et al., "Bat Mortality at Wind Turbines in Northwestern Europe", Acta Chiropterologica, 2010, pp. 261-274

[2] J. A. Besada, G. Frontera, et al., "Adaptive Data Fusion for Air Traffic Control Surveillance", Hybrid Artificial Intelligent Systems, Springer, Vol. 6679, 2011, pp. 118-126
[3] C. Dambra, F. Relit, et al., "Remote Sensing Data Fusion by Means of a Region-Overlapping Technique", IEEE Geoscience and Remote Sensing Symposium, 1991, pp. 1091-1094
[4] Y. Wang, F. Chu, et al., "Multisensor Data Fusion for Automotive Engine Fault Diagnosis", Tsinghua Science and Technology, 9(3), 2004, pp. 262-265
[5] G. Bokade, A. Sapkal, "Feature Level Fusion of Palm and Face for Secure Recognition", International Journal of Computer and Electrical Engineering, 4(2), 2012, pp. 157-160
[6] F. E. White, "Data Fusion Lexicon", The Data Fusion Subpanel of the Joint Directors of Laboratories, Technical Panel for C3, 1991
[7] E. Blasch, S. Plano, "DFIG Level 5: Issues Supporting Situational Assessment Reasoning", International Conference on Information Fusion, 2005, pp. xxxv-xliii
[8] D. A. Lambert, "STDF Model Based Maritime Situation Assessment", IEEE International Conference on Information Fusion, 2007, pp. 1-8
[9] S. C. A. Thomopoulos, "Sensor Integration and Data Fusion", SPIE Sensor Fusion II: Human and Machine Strategies, 1989, pp. 178-191
[10] R. C. Luo, M. G. Kay, "Multisensor Integration and Fusion: Issues and Approaches", SPIE Sensor Fusion, 1989, pp. 42-49
[11] G. Yang, L. Duo, et al., "Synergy Decision in the Multi-Target Tracking Based on IRST and Intermittent-Working Radar", Information Fusion, Elsevier, 2001, pp. 243-250
[12] D. Akselrod, A. Sinha, et al., "Information Flow Control for Collaborative Distributed Data Fusion and Multisensor Multitarget Tracking", IEEE Transactions on Systems, Man, and Cybernetics, Part C: Applications and Reviews, 42(4), 2012, pp. 501-517
[13] J. D. de Jesus, J. J. V. Calvo, et al., "Surveillance System Based on Data Fusion From Image and Acoustic Array Sensors", IEEE Aerospace and Electronic Systems Magazine, 15(2), 2000, pp. 9-6
[14] S. A. Gauthreaux, J. Livingston, "Monitoring Bird Migration with a Fixed-beam Radar and a Thermal-Imaging Camera", Journal of Field Ornithology, 77(3), 2006, pp. 319-328
[15] Wildlife Acoustics, http://www.wildlifeacoustics.com/
[16] FLIR, http://www.flir.com
[17] Furuno, http://www.furunousa.com
[18] G. Mirzaei, W. Majid, et al., "The Bio-Acoustic Feature Extraction and Classification of Bat Echolocation Calls", IEEE International Conference on Electro/Information Technology, 2012, pp. 1-4
[19] S. Bastas, M. Majid, G. Mirzaei, et al., "A Novel Feature Extraction Algorithm for Classification of Bird Flight Calls", IEEE International Symposium on Circuits and Systems, 2012, pp. 1676-1679
[20] G. Mirzaei, M. Majid, et al., "The Application of Evolutionary Neural Network for Bat Echolocation Calls Recognition", IEEE International Joint Conference on Neural Networks, 2011, pp. 1106-1111
[21] G. Mirzaei, M. Majid, et al., "Avian Detection and Tracking Algorithm using Infrared Imaging", IEEE International Conference on Electro/Information Technology, 2012, pp. 1-4
[22] G. Mirzaei, M. Majid, et al., "Radar-based Monitoring System for Nocturnal Assessment", IEEE International Conference on Electro/Information Technology, 2014, pp. 252-255
[23] Sonobat, http://www.sonobat.com