Feature Extraction in FLIR Imagery Using the Continuous Wavelet Transform (CWT)

Romain Murenzi (1), Lance Kaplan (1,2), Fernando Mujica (1,3), and Sandor Der (4)

[email protected], [email protected], [email protected], [email protected]

(1,2) Center for Theoretical Studies of Physical Systems, Department of Physics (1) and Department of Engineering (2), Clark Atlanta University, Atlanta, GA 30314; Phone: (404) 880-8655
(3) Center for Signal and Image Processing, School of Electrical Engineering, Georgia Institute of Technology, Atlanta, GA 30332; Phone: (404) 894-2969
(4) Army Research Laboratory, ARL-SE-SE, MD 20783; Phone: (301) 394-0807

Abstract

This paper investigates the use of the Continuous Wavelet Transform (CWT) for detection of targets in cluttered environments. We first define the multiple 2D CWT energy densities in the feature space. We then use the spatial energy density as the computational core of our target-detection algorithm. Results from FLIR images are presented at the end of the paper. We used both the Mexican hat wavelet and target-adapted wavelets in the simulations.

Introduction

Automatic target detection and recognition (ATD/R) consists of computer processing to detect and recognize target signatures embedded in a cluttered environment. Typical targets include planes and tanks; clutter includes grass, trees, topographical features, and atmospheric phenomena (e.g., clouds and smoke). In general, the problem can be modeled using the following equation:

$$s(\vec{x}) = n(\vec{x}) + \sum_{l=1}^{N} T_l(\vec{x} - \vec{x}_{0l}), \qquad (1)$$

where $n(\vec{x})$ represents additive noise (clutter plus measurement noise), $T_l(\vec{x})$ are the targets to be detected and recognized, and $s(\vec{x})$ is the accessible measured signal.

Automatic or assisted target detection and identification for FLIR imagery requires the ability to extract the essential features of an object from a cluttered environment when the range to the target is unknown. Multiscale techniques, such as the multidimensional continuous wavelet transform (MCWT), are highly desirable because they can extract and normalize both the unknown scale and the unknown orientation of the target. The typical features to be extracted from the image of a target in a cluttered environment are the position of the target, its spatial extent (i.e., its scale), and its shape, including orientation and symmetry. To extract such target features, it is preferable not to work in the image space, but rather to map the image into a feature space of position, scale, and orientation; the MCWT performs exactly this mapping, as discussed in detail in [1, 2].

The 2D CWT is a decomposition [3, 4] of an image onto a set of dilated, rotated, and translated versions of a single function called the mother wavelet. The scale dependence allows sensitivity to variations in sensor resolution as well as determination of target size, while the rotation dependence leads to robust behavior under varying target orientations. Integration over all the parameters of the squared modulus of the CWT gives the total energy in the signal, which is Parseval's energy-conservation condition. Consequently, integration over a subset of the parameters gives an energy density in the remaining variables. A total of 14 such energy densities exist (four 1D densities, six 2D densities, and four 3D densities); we refer to these as CWT features.

In the next section we define the 2D CWT energy densities. We then describe an algorithm for target detection in cluttered environments using the position energy density, and present results of the algorithm on FLIR data from the TRIM2 database, using the anisotropic Mexican hat wavelet as well as target-adapted wavelets.
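
As a concrete illustration of the additive model in Eq. (1), the sketch below synthesizes a toy scene as clutter plus translated target templates. All names and parameter values are hypothetical placeholders chosen for illustration, not values from the paper.

```python
import numpy as np

def synthesize_scene(shape=(128, 128), target_positions=((40, 40), (90, 70)),
                     target_size=7, noise_sigma=0.3, seed=0):
    """Toy instance of Eq. (1): s(x) = n(x) + sum_l T_l(x - x_0l).

    The 'target' T_l is a bright square patch; clutter plus sensor noise
    is modeled crudely as Gaussian noise. Purely illustrative.
    """
    rng = np.random.default_rng(seed)
    s = noise_sigma * rng.standard_normal(shape)              # n(x)
    half = target_size // 2
    for (r, c) in target_positions:                           # sum over l
        s[r - half:r + half + 1, c - half:c + half + 1] += 1.0  # T_l(x - x_0l)
    return s

scene = synthesize_scene()
```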

Continuous Wavelet Transform and Energy Densities in Feature Space

The 2D CWT is a linear transformation from $L^2(\mathbb{R}^2, d^2\vec{x})$ to $L^2\!\left(\mathbb{R}^+ \times [0, 2\pi[ \times \mathbb{R}^2,\; \frac{da}{a^3}\, d\theta\, d^2\vec{b}\right)$:

$$S(a, \theta, \vec{b}) = \int_{\mathbb{R}^2} d^2\vec{x}\; \frac{1}{a}\, \psi^*\!\left(\frac{1}{a}\, r_{-\theta}(\vec{x} - \vec{b})\right) s(\vec{x}), \qquad (2)$$

where $s$ is the input image, $S$ is the CWT, $\psi$ is the mother wavelet, $a$ is the dilation parameter, $\theta$ is the rotation parameter (with $r_\theta$ the corresponding 2D rotation matrix), $\vec{b} = [b_x, b_y]'$ is the translation vector, the asterisk denotes complex conjugation, and the hat denotes the Fourier transform. Considering the energy densities $|s(\vec{x})|^2$ and $|\hat{s}(\vec{k})|^2$, we have the following energy-conservation theorem:

$$\int_{\mathbb{R}^2} d^2\vec{x}\, |s(\vec{x})|^2 = \int_{\mathbb{R}^2} d^2\vec{k}\, |\hat{s}(\vec{k})|^2 = \iiint \frac{da}{a^3}\, d\theta\, d^2\vec{b}\; |S(a, \theta, \vec{b})|^2. \qquad (3)$$

This theorem leads to the interpretation of $|S(a, \theta, \vec{b})|^2$ as an energy density of the signal $s$ in the position, scale, and orientation parameters. Any partial integration of this CWT energy density over a subset of the parameter space gives an energy density in the remaining variables, yielding four 1D densities, six 2D densities, and four 3D densities. We restrict the analysis here to the six 2D energy densities:

- Space (range and aspect) energy density:
$$E_{34}(b_x, b_y) = \int_0^{\infty} \frac{da}{a^3} \int_0^{2\pi} d\theta\; |S(a, \theta, b_x, b_y)|^2 \qquad (4)$$

- Scale-angle energy density:
$$E_{12}(a, \theta) = \int_{-\infty}^{+\infty} db_x \int_{-\infty}^{+\infty} db_y\; |S(a, \theta, b_x, b_y)|^2 \qquad (5)$$

- Anisotropy angle and aspect energy density, with $(b_x, b_y) = |\vec{b}|(\cos\phi, \sin\phi)$:
$$E_{24}(\theta, \phi) = \int_0^{\infty} \frac{da}{a^3} \int_0^{+\infty} d|\vec{b}|\; |\vec{b}|\; |S(a, \theta, |\vec{b}|, \phi)|^2 \qquad (6)$$

- Scale-range energy density:
$$E_{13}(a, |\vec{b}|) = \int_0^{2\pi} d\theta \int_0^{2\pi} d\phi\; |S(a, \theta, |\vec{b}|, \phi)|^2 \qquad (7)$$

- Anisotropy angle-range energy density:
$$E_{23}(\theta, |\vec{b}|) = \int_0^{\infty} \frac{da}{a^3} \int_0^{2\pi} d\phi\; |S(a, \theta, |\vec{b}|, \phi)|^2 \qquad (8)$$

- Scale-aspect energy density:
$$E_{14}(a, \phi) = \int_0^{\infty} d|\vec{b}|\; |\vec{b}| \int_0^{2\pi} d\theta\; |S(a, \theta, |\vec{b}|, \phi)|^2. \qquad (9)$$

We are currently investigating these energy densities in order to build a detector based on 2D CWT features of a given target type under various conditions. These features will then be used as inputs to a convolutional neural network (CNN) algorithm [5]. The densities will also be used to design an ATR algorithm for detection, classification, and recognition of targets in FLIR and SAR imagery.
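
To make the construction above concrete, the following sketch computes the CWT of Eq. (2) on a small grid of scales and angles by FFT-domain correlation, and accumulates a discrete stand-in for the position energy density $E_{34}$ of Eq. (4) by summing the squared modulus over that grid. The anisotropic Mexican hat expression, the parameter values, and all function names are our own illustrative assumptions, not the paper's implementation.

```python
import numpy as np

def mexican_hat_2d(shape, a, theta, eps=5.0):
    """Sampled anisotropic Mexican hat wavelet (one common form, assumed here):
    psi(x) = (2 - x1^2 - x2^2/eps) * exp(-(x1^2 + x2^2/eps) / 2),
    evaluated on rotated, dilated coordinates (1/a) * r_{-theta} x."""
    ny, nx = shape
    y, x = np.mgrid[-ny // 2:ny - ny // 2, -nx // 2:nx - nx // 2]
    x1 = (np.cos(theta) * x + np.sin(theta) * y) / a
    x2 = (-np.sin(theta) * x + np.cos(theta) * y) / a
    q = x1**2 + x2**2 / eps
    return (2.0 - q) * np.exp(-q / 2.0) / a   # 1/a prefactor as in Eq. (2)

def cwt_position_slices(s, scales, angles, eps=5.0):
    """|S(a_j, theta_j, bx, by)|^2 for each (scale, angle), via FFT correlation."""
    S_hat = np.fft.fft2(s)
    slices = {}
    for a in scales:
        for th in angles:
            psi = np.fft.ifftshift(mexican_hat_2d(s.shape, a, th, eps))
            # correlation of s with the (real) wavelet, evaluated at every b
            W = np.fft.ifft2(S_hat * np.conj(np.fft.fft2(psi))).real
            slices[(a, th)] = W**2
    return slices

def position_energy_density(slices, scales):
    """Crude Riemann-sum version of E_34(bx, by): sum over angles and scales,
    weighting each scale slice by da / a^3."""
    scales = np.asarray(sorted(scales))
    da = np.gradient(scales)
    E34 = 0.0
    for (a, th), W2 in slices.items():
        j = int(np.where(scales == a)[0][0])
        E34 = E34 + W2 * (da[j] / a**3)
    return E34
```

Using, for example, the grid quoted later in the paper (scales 1/8, 1/4, 1/2, 1 and orientations 0, pi/4, pi/2, here in pixel units), one would call `slices = cwt_position_slices(scene, [0.125, 0.25, 0.5, 1.0], [0.0, np.pi/4, np.pi/2])` followed by `E34 = position_energy_density(slices, [0.125, 0.25, 0.5, 1.0])`.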

Detection Algorithm Using the Position (Range-Aspect) Energy Density

Consider the FLIR images shown in figures 1 and 2: they contain a certain number of targets embedded in a cluttered environment. To enhance these images, we compute a 2D CWT energy density over position while suppressing clutter at the same time. First, we compute the CWT in the position representation at all relevant scales $a = a_j$ and angles $\theta = \theta_j$, that is, $S(a_j, \theta_j, b_x, b_y)$. For detection, we take the image obtained for each fixed $a = a_j$ and $\theta = \theta_j$, threshold it, and add all the thresholded images together. Thresholding is performed in an adaptive way, becoming more severe for smaller $a$; the effect of this adaptation is to suppress the clutter information while preserving the target information. Other nonlinear transformations (e.g., enhancement, morphological operators) may also be applied. After this wavelet image integration, the target information is reinforced and becomes visually enhanced. Figure 3 shows a block diagram of the proposed target detection algorithm.

To complete the detection algorithm, we compute the centroids $\vec{b} = \vec{b}_i$, $i = 1, \ldots, L$, in the resulting composite image. These centroids correspond to the positions of potential targets; one controls the false-alarm rate by adjusting the thresholds to eliminate ambiguous targets.

Figure 3: Detection algorithm: each slice $|S(a_j, \theta_j, \vec{b})|^2$ is thresholded, and the thresholded slices are summed to form $E_{34}(\vec{b})$.
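
The following sketch illustrates these steps under simple assumptions: a scale-dependent threshold proportional to each slice's standard deviation (the paper does not spell out its adaptive rule), summation into a composite image, and centroid extraction by connected-component labeling. The function names, the threshold schedule, and the use of scipy.ndimage are our own choices, not the authors' implementation.

```python
import numpy as np
from scipy import ndimage

def detect_targets(slices, base_k=3.0, alpha=1.0, min_area=4):
    """Threshold each |S(a_j, theta_j, b)|^2 slice, sum the results, and
    return the composite image plus candidate target centroids.

    Assumed threshold rule: mean + k_j * std of the slice, with k_j growing
    as the scale shrinks, i.e. more severe thresholding for smaller a.
    """
    composite = None
    for (a, th), W2 in slices.items():
        k = base_k * (1.0 + alpha / a)            # more severe for smaller a
        thresh = W2.mean() + k * W2.std()
        kept = np.where(W2 > thresh, W2, 0.0)     # suppress sub-threshold clutter
        composite = kept if composite is None else composite + kept
    # connected components of the composite's support -> candidate targets
    labels, n = ndimage.label(composite > 0)
    areas = ndimage.sum(composite > 0, labels, index=range(1, n + 1))
    keep = [i + 1 for i, area in enumerate(areas) if area >= min_area]
    centroids = ndimage.center_of_mass(composite, labels, keep)
    return composite, centroids
```

With the `slices` dictionary produced by the earlier sketch, `composite, centroids = detect_targets(slices)` returns the enhanced image and the candidate target positions; raising `base_k` or `min_area` plays the role of tightening the thresholds to reduce false alarms.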

Application to Detection of Targets in FLIR Images

We apply the CWT-based algorithm to the two images shown in figures 1 and 2, taken from the TRIM2 database, and compute the composite images as described above. Figures 4 and 5 show the results for the anisotropic Mexican hat wavelet (with anisotropy parameter 5), and figures 6 and 7 show the results using a mother wavelet adapted to the targets. All calculations were performed at scales $a = 1/8, 1/4, 1/2, 1$ and orientations $\theta = 0, \pi/4, \pi/2$.

Figure 1: A FLIR image consisting of targets (M1 tanks) and clutter, from the TRIM2 database.

Figure 2: A FLIR image consisting of targets (helicopters) and clutter, from the TRIM2 database.

Figure 4: Results for figure 1, using an anisotropic Mexican hat wavelet with anisotropy parameter 5.0.

Figure 5: Results for figure 2, using an anisotropic Mexican hat wavelet with anisotropy parameter 5.0.

Figure 6: Results for figure 1, using a target-adapted wavelet.

Figure 7: Results for figure 2, using a target-adapted wavelet.

As the figures show, the CWT-based detection algorithm makes the targets stand out from their backgrounds with more definition than the original images. This suggests that the CWT may be used for target detection as described above, and that CWT-based features might be appropriate as input to a convolutional neural network [5].

Conclusions

In this paper, we presented a family of wavelet features based on the CWT, and we examined the effectiveness of the CWT in separating targets from background in synthetic FLIR imagery. Target signatures extracted from a training set can be mapped into the CWT feature space, allowing targets to be detected through the nonlinear matching algorithm described in [5]. Our future work will include automated learning of wavelet features, using artificial neural networks on a large set of real FLIR and SAR images.

Acknowledgments

The authors wish to thank Drs. Marvin Cohen and Mark J. T. Smith for their valuable assistance in our research.

References

[1] J.-P. Antoine, K. Bouyoucef, P. Vandergheynst, and R. Murenzi, "Target Detection and Target Recognition Using Two-Dimensional Continuous Isotropic and Anisotropic Wavelets," in Automatic Object Recognition V, Proc. SPIE, Vol. 2485, 20-31 (1995).

[2] S.-I. Park, M. J. T. Smith, and R. Murenzi, "Multidimensional Wavelets for Target Detection and Recognition," in SPIE International Symposium on Aerospace/Defense Sensing and Controls, Orlando, Florida, April 8-12 (1996).

[3] J.-P. Antoine and R. Murenzi, "Two-Dimensional Directional Wavelets and the Scale-Angle Representation," Signal Processing 52, 259-281 (1996).

[4] J.-P. Antoine, R. Murenzi, and P. Vandergheynst, "Two-Dimensional Directional Wavelets in Image Processing," International Journal of Imaging Systems and Technology, Vol. 7, 152-165 (1996).

[5] S. A. Rizvi and V. Mirelli, "Automatic Target Recognition Using a Multi-layer Convolution Neural Network," preprint (1996).
