Data Fusion of Multiple Polarimetric SAR Images Using the Discrete Wavelet Transform (DWT)

Sahyun Hong a, Wooil M. Moon a,b, Hong-Yul Paik c, Gi-Hyuk Choi c

a School of Earth & Environmental Sciences, Seoul National University, Seoul, 151-742, Korea ([email protected], [email protected])
b Geophysics, The University of Manitoba, Winnipeg, Canada R3T 2N2 ([email protected])
c Satellite Operation & Application Center, Korea Aerospace Research Institute, Taejon, 305-600, Korea ([email protected], [email protected])

Abstract - Data fusion is an effective technique that can be applied to many remote sensing tasks such as classification, environmental monitoring and man-made target tracking. In this paper, we tested the fusion of multi-frequency (C- and L-band), multi-polarization (HH, HV and VV) and multiresolution data sets. Through the image fusion process, polarimetric SAR data with enhanced spatial resolution can be obtained. In order to fuse multiple SAR data sets with high spatial resolution data, the images must first be geometrically co-registered over the same target area and resampled to the same pixel size (spatial registration). At this stage, nearest-neighbor resampling was used to avoid the spectral distortion introduced by interpolation. After spatial registration, multiresolution polarimetric SAR image fusion was performed with a multiscale technique, the Discrete Wavelet Transform (DWT). To evaluate the spectral fidelity of the fused polarimetric SAR data, the spectral dissimilarity was calculated at each wavelet decomposition level. The resulting classification map based on a polarimetric feature vector shows better class separation with fusion than without it. The polarimetric SAR data over the Gong-ju area used in this research were acquired during the NASA/JPL AIRSAR PACRIM-II experiment in 2000.

I. INTRODUCTION

Data fusion is an effective technique for target-oriented classification of remote sensing data and plays a key role in many remote sensing applications. A variety of multisensor data fusion techniques have been developed in recent years. For example, fusing high spatial resolution data with multispectral data produces a hybrid image that retains fine terrain detail as well as useful spectral information, which in turn helps discriminate small objects or land cover types. Multisource image fusion methods such as Intensity-Hue-Saturation (IHS), Principal Component Analysis (PCA) and the Brovey transform have been tested mostly with optical image data. In this paper, we investigated multiresolution image fusion through the wavelet transform, which decomposes data into a coarser-resolution representation approximating the low-frequency content and finer representations containing the detailed high-frequency content. Polarimetric SAR data are useful for classifying land cover types according to the target's intrinsic structural and dielectric properties; however, they tend to have a relatively poor spatial resolution.

Previous studies [2] have focused on merging a high spatial resolution panchromatic image with low spatial resolution multispectral image data. In this research, we investigated and tested the fusion of multiple SAR image data with a high spatial resolution panchromatic image.

II. STUDY AREA AND DATASET

This study was carried out over the Gong-ju area of the South Korean peninsula, where the topography is relatively flat. The test area contains simple land cover classes such as farm land, small mountainous forests, man-made roads and bridges. Multiple polarimetric airborne SAR data (C- and L-band; HH, HV and VV polarizations) were collected over Korea during the NASA/JPL PACRIM-II AIRSAR experiment in 2000. KOMPSAT-1 (Korea Multi-Purpose Satellite 1) EOC high spatial resolution optical data were used along with the polarimetric SAR data. Because random polarimetric speckle noise influences the data fusion results, averaging with a 7 x 7 window was applied to reduce speckle effects.

III. MULTIRESOLUTION ANALYSIS

A. Discrete Wavelet Transform (DWT)

Before multisource data can be fused, geometric co-registration must be performed. The SAR images were registered geometrically onto the KOMPSAT-1 EOC panchromatic image so that all data sets cover the same geographical area and have the same pixel size. Decomposition was then performed using the Discrete Wavelet Transform (DWT). Let a_j be a finite image with N x N pixels; the decomposition algorithm is then separable along rows and columns (Fig. 1). The approximation of a two-dimensional finite-energy function a_j in L2(R2) at resolution 2^j, where j is the decomposition level, is characterized by coefficients obtained by convolution with the wavelet function [1]. The approximation and detail coefficients can be calculated with a pyramidal algorithm [1]; each level of the pyramid is an approximation of the original image.
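As a rough illustration of the preprocessing and pyramidal decomposition just described, the following Python sketch applies a 7 x 7 boxcar average to one SAR intensity band and then computes a multi-level 2-D DWT with the PyWavelets package. The file name, the Daubechies order (db4) and the three-level depth are illustrative assumptions, not values taken from the paper.

```python
import numpy as np
import pywt
from scipy.ndimage import uniform_filter

# Hypothetical input: one polarimetric SAR intensity band as a 2-D array.
sar_hh = np.load("airsar_c_hh.npy")            # assumed file name

# 7 x 7 boxcar averaging to suppress speckle, as described in Section II.
sar_hh_filtered = uniform_filter(sar_hh.astype(float), size=7)

# Multi-level 2-D DWT (pyramidal decomposition); wavelet order and depth
# are illustrative choices.
coeffs = pywt.wavedec2(sar_hh_filtered, wavelet="db4", level=3)

# coeffs[0] is the coarsest approximation a_j; coeffs[1:] hold the
# (horizontal, vertical, diagonal) detail coefficients at each finer level.
approximation = coeffs[0]
coarsest_details = coeffs[1]
print(approximation.shape, [d.shape for d in coarsest_details])
```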


The DWT employs two sets of functions, the scaling function and the wavelet function, which are associated with the lowpass (g~) and highpass (h~) filters, respectively. The decomposition of the signal into different frequency bands is obtained simply by successive highpass and lowpass filtering of the signal [1],[2]. Here, the Daubechies wavelet basis was used to decompose the signal into wavelet coefficients.

B. Image Fusion with the Wavelet Transform

Each SAR image and the EOC panchromatic image were decomposed into wavelet representations at the same coarser resolution. The (j+1)th approximation coefficients derived from the EOC panchromatic image were then replaced by the corresponding coefficients derived from the polarimetric SAR data for each band, and the inverse discrete wavelet transform (IDWT) of the combined wavelet representation was performed. This simple image fusion scheme is shown in Fig. 1.
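A minimal sketch of this coefficient-substitution scheme, assuming PyWavelets and already co-registered, equally sized images (function and variable names are hypothetical):

```python
import pywt

def wavelet_fuse(pan, sar, wavelet="db4", level=2):
    """Fuse a high-resolution panchromatic image with a co-registered SAR band
    by swapping the coarse approximation coefficients, then applying the IDWT."""
    pan_coeffs = pywt.wavedec2(pan, wavelet=wavelet, level=level)
    sar_coeffs = pywt.wavedec2(sar, wavelet=wavelet, level=level)

    # Keep the panchromatic detail coefficients (spatial structure) and replace
    # the coarsest approximation with that of the SAR band (radiometric content).
    fused_coeffs = [sar_coeffs[0]] + list(pan_coeffs[1:])

    fused = pywt.waverec2(fused_coeffs, wavelet=wavelet)
    # waverec2 may pad by one row/column; trim back to the input size.
    return fused[: pan.shape[0], : pan.shape[1]]

# Hypothetical usage, repeated for each polarization and band:
# fused_c_hh = wavelet_fuse(eoc_pan, c_band_hh, level=2)
```

In this sketch the decomposition level is the tuning parameter behind the spectral-versus-spatial trade-off discussed later in the conclusion.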


IV. EXPERIMENTAL RESULTS

We tested the fusion of multiresolution polarimetric SAR data, three polarizations (HH, HV and VV) in each of the two bands (C- and L-band), with the KOMPSAT-1 EOC high spatial resolution panchromatic image. Fig. 2(a) displays the KOMPSAT-1 EOC panchromatic image, in which the borders between land cover types can be identified distinctly, and Fig. 2(b) and (c) show the fused polarimetric C- and L-band SAR images, respectively. The spectral quality of the fused images was evaluated quantitatively by calculating the pixel-by-pixel differences from the original images (Eq. (1), Table 1) and the signal-to-noise ratios (SNR, Table 2) [2]. The pixel-by-pixel dissimilarity is

D_k = (1/n) Σ_i Σ_j | V'_kij - V_kij |                (1)

where k is the band number, V' and V are the pixel values of the fused and the original SAR images, respectively, i and j denote the ith row and jth column, and n is the total number of pixels [2]. The differences between the merged SAR and the original SAR images are smallest at decomposition level 2. In addition, Table 2 shows that the fused data have a somewhat higher SNR than the original data, which indicates that some noise is removed during the wavelet transform [3]. To validate the utility of the wavelet transform in data fusion, a Bayesian maximum-likelihood (ML) classification was carried out using the polarimetric feature vector (2), extracted from both the fused and the original images. The classification results are shown in Fig. 3. The feature vector used here was

v = [10log(HH), 10log(HV), 10log(VV)]                (2)
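The dissimilarity of Eq. (1) is simply the mean absolute difference between the fused and original band; a small sketch is given below. The SNR definition shown is only one common choice (signal power over residual power) and is not necessarily the one used in [2].

```python
import numpy as np

def spectral_dissimilarity(fused, original):
    """D_k of Eq. (1): mean absolute pixel-by-pixel difference for band k."""
    fused = np.asarray(fused, dtype=float)
    original = np.asarray(original, dtype=float)
    return np.abs(fused - original).mean()

def snr(fused, original):
    """An illustrative SNR figure: total signal power over residual power."""
    original = np.asarray(original, dtype=float)
    residual = np.asarray(fused, dtype=float) - original
    return (original ** 2).sum() / (residual ** 2).sum()

# Hypothetical usage for one band and one decomposition level:
# print(spectral_dissimilarity(fused_c_hh, c_band_hh), snr(fused_c_hh, c_band_hh))
```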

Table 1. Spectral differences with respect to wavelet decomposition level

               C-band                              L-band
               HH        HV        VV              HH        HV        VV
  Level 2      160.081   194.338   166.494         218.943   235.279   240.869
  Level 3      160.135   194.353   166.541         218.944   235.281   240.873
  Level 4      160.227   194.372   166.601         218.947   235.285   240.876

Table 2. SNR values with fusion are higher than without fusion

                       With fusion     Without fusion
  C-band     HH        2.2881          2.2814
             HV        3.7865          3.7753
             VV        2.9568          2.9464
  L-band     HH        2.3894          2.3806
             HV        3.7794          3.7640
             VV        3.9481          3.9282

V. CONCLUSION

In this research, multiresolution analysis using the discrete wavelet transform was investigated for the fusion of multiple polarimetric SAR data sets with a high spatial resolution panchromatic image. The fused polarimetric SAR data show enhanced spatial structural features while preserving the original spectral information. The test results show that neither the polarimetric SAR data nor the KOMPSAT-1 EOC high spatial resolution data alone provide accurate land cover identification; this deficiency is reduced by fusing each data source with the other, which improves the final results. When fusion of multiresolution data is planned, two points must be kept in mind: (1) the wavelet decomposition level affects the quality of the fused image in both its spectral and its spatial aspects, so there is a trade-off between spectral and spatial information; and (2) the (high) spatial resolution of the image to be merged can affect the performance of the fusion process, which means that images of the same spatial resolution from different sensors can produce different results.

ACKNOWLEDGMENT

This research was funded by the SEES (BK21), Seoul National University, Interdisciplinary Collaborative Research Grant (SNU) and partially funded by NSERC of Canada grant No. 7400 to W. M. Moon. The AIRSAR polarimetric SAR data used in this research were acquired during the NASA/JPL PACRIM-II experiment. The KOMPSAT-1 EOC data were acquired and provided by the Korea Aerospace Research Institute (KARI).

REFERENCES

[1] S. Mallat, A Wavelet Tour of Signal Processing. San Diego, CA: Academic Press, 1998.
[2] J. Zhou, D. L. Civco and J. A. Silander, "A wavelet transform method to merge Landsat TM and SPOT panchromatic data," Int. J. Remote Sens., vol. 19, no. 4, pp. 743-757, 1998.
[3] G. Simone, A. Farina, F. C. Morabito, S. B. Serpico and L. Bruzzone, "Image fusion techniques for remote sensing applications," Information Fusion, vol. 3, no. 1, pp. 3-15, 2002.

Fig. 1. (a) Decomposition of a_j into approximations and details by convolving with the highpass and lowpass filters along the rows and columns. (b) Synthesis of the wavelet representation by the inverse discrete wavelet transform (IDWT).

Fig. 2. (a) EOC high spatial resolution panchromatic image. (b) Fused C-band polarimetric SAR image (HH: red, HV: green, VV: blue). (c) Fused L-band polarimetric SAR image (same color assignments as the C-band).

Fig. 3. Bayesian ML classification results, initialized with an unsupervised k-means clustering map, using the polarimetric feature vector [10log(HH), 10log(HV), 10log(VV)]. (a), (b) C-band classification results with and without fusion, respectively; (c), (d) L-band classification results with and without fusion, respectively.
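As a rough, hypothetical sketch of the classification workflow summarized in Fig. 3 (not the authors' implementation), the pixels could first be clustered with unsupervised k-means on the feature vector of Eq. (2) and the resulting clusters used to fit a Gaussian maximum-likelihood classifier; the number of classes, the covariance regularization and the scikit-learn dependency are assumptions.

```python
import numpy as np
from sklearn.cluster import KMeans

def polarimetric_features(hh, hv, vv, eps=1e-10):
    """Build v = [10 log(HH), 10 log(HV), 10 log(VV)] for every pixel."""
    bands = [10.0 * np.log10(np.maximum(b, eps)) for b in (hh, hv, vv)]
    return np.stack([b.ravel() for b in bands], axis=1)      # (n_pixels, 3)

def bayesian_ml_classify(X, n_classes=5, reg=1e-6):
    """Initial labels from k-means, then an equal-prior Gaussian ML classifier."""
    init_labels = KMeans(n_clusters=n_classes, n_init=10, random_state=0).fit_predict(X)

    log_lik = np.empty((X.shape[0], n_classes))
    for c in range(n_classes):
        Xc = X[init_labels == c]
        mean = Xc.mean(axis=0)
        cov = np.cov(Xc, rowvar=False) + reg * np.eye(X.shape[1])
        diff = X - mean
        maha = np.einsum("ij,jk,ik->i", diff, np.linalg.inv(cov), diff)
        log_lik[:, c] = -0.5 * (maha + np.log(np.linalg.det(cov)))
    return np.argmax(log_lik, axis=1)

# Hypothetical usage for the fused C-band scene:
# X = polarimetric_features(fused_c_hh, fused_c_hv, fused_c_vv)
# class_map = bayesian_ml_classify(X).reshape(fused_c_hh.shape)
```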
