Detecting Changes in Polarimetric SAR Data with Content-Based Image Retrieval

Matthieu Molinier∗†, Jorma Laaksonen†, Yrjö Rauste∗ and Tuomas Häme∗

∗ VTT Technical Research Centre of Finland, Digital Information Systems, Earth Observation Team, P.O. Box 1000, FI-02044 VTT, Finland – Email: [email protected]
† Helsinki University of Technology, Adaptive Informatics Research Centre, P.O. Box 5400, FI-02015 HUT, Finland
Abstract—In this study, we extended the potential of a Content-Based Image Retrieval (CBIR) system based on Self-Organizing Maps (SOMs) to the analysis of remote sensing data. A database was artificially created by splitting each image to be analyzed into small images (or imagelets). Content-based image retrieval was applied to fully polarimetric airborne SAR data, using a selection of polarimetric features. After training the system on this imagelet database, automatic queries could detect changes. Results were encouraging on airborne SAR data, and the approach may prove even more useful for spaceborne polarimetric data.
I. INTRODUCTION

Two novel polarimetric SAR satellites, TerraSAR-X and RADARSAT-2, will be launched in 2007. In addition, ALOS was launched in 2006 and is being taken into operative use. The immense amount of data generated by these satellite missions demands new approaches to manage it efficiently. There is a growing interest in the remote sensing community for Content-Based Image Retrieval (CBIR), which allows management of large image archives, as well as satellite image annotation and interpretation. Our work extends the potential of PicSOM [1], a CBIR system based on Self-Organizing Maps (SOMs) [2], to polarimetric SAR image analysis. The key idea of our study [3] is to artificially generate a database of small images – or imagelets – from each full satellite image to be analyzed. Imagelets can be extracted from one scene for the detection of man-made structures or other targets, or from two (or more) scenes to detect changes. In this paper we present our experiments on change detection in fully polarimetric SAR data using PicSOM.

II. PRE-PROCESSING AND DATABASE PREPARATION

Due to the unavailability of fully polarimetric spaceborne datasets suitable for change detection, polarimetric airborne data was considered. The data consists of two EMISAR single-look complex (SLC, in scattering matrix format) scenes acquired in March and July 1995. The March scene was then registered to the July scene – details on the data can be found in [4]. 3-by-3 coherent averaging was applied to form 9-look images of 1280 × 1616 pixels, using the PolSARPro software [5]. A scene is typically divided into several thousands of imagelets, so that PicSOM produces a relevant indexing. By this operation, the number of target classes within an imagelet is reduced compared to the original full scene. The extracts were cut into small images of 16 × 16 pixels, forming a database of 8080 imagelets per scene. For the purpose of method evaluation, a ground truth was created by classifying each scene into 5 classes {mountain, forest, water, ice, shadow} [4]. A supervised Wishart classification was performed in PolSARPro, after delineating training areas over the Pauli decomposition RGB image. The lack of optical data for creating the ground truth resulted in a reduced number of classes and lower classification reliability.
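As an illustration of the imagelet extraction step, the sketch below tiles a multilooked scene into non-overlapping 16 × 16 pixel imagelets; the function and array names and the use of NumPy are our own assumptions and not part of the original processing chain. With a 1280 × 1616 pixel scene this yields 80 × 101 = 8080 imagelets, matching the database size quoted above.

```python
import numpy as np

def extract_imagelets(scene, size=16):
    """Tile a scene into non-overlapping (size x size) imagelets.

    scene: array of shape (rows, cols, channels), e.g. a stack of
           polarimetric channels of a 9-look 1280 x 1616 image.
    Returns an array of shape (n_imagelets, size, size, channels)
    and the (row, col) grid index of each imagelet.
    """
    rows, cols = scene.shape[:2]
    n_r, n_c = rows // size, cols // size          # 80 x 101 for 1280 x 1616
    imagelets, indices = [], []
    for i in range(n_r):
        for j in range(n_c):
            tile = scene[i * size:(i + 1) * size, j * size:(j + 1) * size]
            imagelets.append(tile)
            indices.append((i, j))                 # kept as the xy-coordinate feature
    return np.stack(imagelets), np.array(indices)

# Example with a hypothetical 3-channel scene (e.g. |HH|, |HV|, |VV| amplitudes)
scene = np.zeros((1280, 1616, 3), dtype=np.float32)
imagelets, coords = extract_imagelets(scene)
print(imagelets.shape)   # (8080, 16, 16, 3)
```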
III. FEATURES

Features were extracted from the polarimetric SAR imagelets to allow their indexing by the Self-Organizing Maps. The original PicSOM features were developed for RGB optical images: they are standard low-level measures of texture and color information, not suitable for polarimetric SAR images. Table I summarizes the features included in PicSOM for polarimetric data analysis. Four Touzi polarimetric discriminators [6] were considered: $R_{0\,max}$ the maximum scattered intensity, $NDR_0$ the normalized difference of the scattered intensity, $p_{max}$ the maximum degree of polarization and $\Delta p$ the dynamic range of the degree of polarization. The polar azimuthal polarimetric signature [7] presents several advantages over the original polarimetric signature [8], mainly the continuity of range in orientation angle, and a less ambiguous mapping of horizontally and vertically polarized targets. In addition, a coordinate feature was carried along the whole analysis process, both to keep track of the position of any imagelet within the full image, and to complete the framework for change detection.

Table II lists the feature groups extracted from the imagelets, and their dimensionality. For all features except the polarimetric signature and xy-coordinates, the imagelets were divided into 4 quadrants, from which the basic features were extracted after averaging the coherency matrix. For example, an imagelet of 16 × 16 pixels was divided into 4 quadrants of 8 × 8 pixels, over which the coherency matrix was averaged before extracting the LOGRATIOS features – thus generating a feature vector of dimension 4 × 3 = 12. The copolarized and crosspolarized signatures were calculated on the average coherency matrix over a whole imagelet, then aggregated into a single 1444-dimensional vector.
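The sketch below illustrates the quadrant-based extraction described above for the LOGRATIOS group. It works directly on single-look scattering-matrix channels and averages intensities per quadrant, which is only an approximation of the coherency-matrix averaging used in the paper; all function and variable names are our own.

```python
import numpy as np

def logratio_features(s_hh, s_hv, s_vv, n_quadrants=2):
    """Quadrant-wise log-ratio features for one imagelet.

    s_hh, s_hv, s_vv: complex arrays (16 x 16) of scattering-matrix channels.
    Each imagelet is split into 2 x 2 quadrants; intensities are averaged per
    quadrant (a stand-in for coherency-matrix averaging) before forming the
    three log-ratios HV/VV, VV/HH and HV/HH in dB.
    Returns a vector of dimension 4 quadrants x 3 ratios = 12.
    """
    h = s_hh.shape[0] // n_quadrants
    w = s_hh.shape[1] // n_quadrants
    features = []
    for i in range(n_quadrants):
        for j in range(n_quadrants):
            sl = (slice(i * h, (i + 1) * h), slice(j * w, (j + 1) * w))
            p_hh = np.mean(np.abs(s_hh[sl]) ** 2)      # <|S_HH|^2>
            p_hv = np.mean(np.abs(s_hv[sl]) ** 2)
            p_vv = np.mean(np.abs(s_vv[sl]) ** 2)
            features += [10 * np.log10(p_hv / p_vv),   # HV/VV ratio in dB
                         10 * np.log10(p_vv / p_hh),   # co-polarized ratio in dB
                         10 * np.log10(p_hv / p_hh)]   # cross-polarized ratio in dB
    return np.array(features)                          # shape (12,)
```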
TABLE I. Features considered in the study. ⟨·⟩ indicates spatial averaging (multilooking), S is the scattering matrix, R is the Stokes vector. Application domains of features in the reference publications are given.

| Feature [ref.] | Expression | Application |
| Amplitude of HH-VV correlation coeff. [9], [10] | $\rho_{HH-VV} = \langle S_{HH} S_{VV}^* \rangle / \sqrt{\langle |S_{HH}|^2 \rangle \langle |S_{VV}|^2 \rangle}$ | Winter / spring crops discrimination |
| Phase difference HH-VV [10], [11] | $\phi_{HH-VV} = \arg(\langle S_{HH} S_{VV}^* \rangle)$ | Sea ice classification |
| Ratio HV/VV in dB [9] | $10 \cdot \log \left( |S_{HV}|^2 / |S_{VV}|^2 \right)$ | Bare soil / vegetation discrimination |
| Co-polarized ratio in dB [12] | $10 \cdot \log \left( |S_{VV}|^2 / |S_{HH}|^2 \right)$ | Vegetation types discrimination |
| Cross-polarized ratio in dB [12] | $10 \cdot \log \left( |S_{HV}|^2 / |S_{HH}|^2 \right)$ | Vegetation types discrimination |
| Co-polarized HH backscattering coeff. [10], [11] | $\sigma^0_{HH} = \langle S_{HH} S_{HH}^* \rangle$ | Sea ice classification |
| Co-polarized VV backscattering coeff. [10], [11] | $\sigma^0_{VV} = \langle S_{VV} S_{VV}^* \rangle$ | Sea ice classification |
| Cross-polarized HV backscattering coeff. [9], [10] | $\sigma^0_{HV} = \langle S_{HV} S_{HV}^* \rangle$ | Sea ice classification |
| Co-polarization ratio [10], [11] | $\gamma = \sigma^0_{VV} / \sigma^0_{HH} = \langle S_{VV} S_{VV}^* \rangle / \langle S_{HH} S_{HH}^* \rangle$ | Agriculture (crops), sea ice classification |
| Depolarization ratio [11] | $\delta = \sigma^0_{HV} / (\sigma^0_{HH} + \sigma^0_{VV}) = \langle S_{HV} S_{HV}^* \rangle / (\langle S_{HH} S_{HH}^* \rangle + \langle S_{VV} S_{VV}^* \rangle)$ | Sea ice classification |
| H-α decomposition [13] | $H = \frac{1}{\log(3)} \sum_{i=1}^{3} -P_i \cdot \log(P_i)$, $\alpha = \sum_{i=1}^{3} P_i \alpha_i$ | Target scattering properties |
| Degree of polarization [6] | $D_{pol} = \sqrt{R_1^2 + R_2^2 + R_3^2} / R_0$ | Target scattering properties |
| Polar azimuthal polarimetric signature [7] | – | Target characterization / separation |
TABLE II. Feature groups extracted from the imagelets.

| Feature group | Features | Dim |
| HHVV | HH-VV correlation and phase difference | 8 |
| LOGRATIOS | 3 log-ratios | 12 |
| BACKSCATTER | 3 backscatter coefficients | 12 |
| POLRATIOS | Copol. and depol. ratios | 8 |
| HALPHA | Entropy H and α | 8 |
| TOUZIDISC | $R_{0\,max}$, $NDR_0$, $p_{max}$, $\Delta p$ | 12 |
| POLSIG | Copol and crosspol signatures | 1444 |
| xy-coordinates | Imagelet index | 2 |
IV. SELF-ORGANIZING MAPS FOR CONTENT-BASED IMAGE RETRIEVAL CHANGE DETECTION

A. SOM training

The Self-Organizing Map is an unsupervised learning technique which forms a non-linear mapping of a high-dimensional input space (here the space of imagelets) onto a typically two-dimensional grid of artificial neurons (or units). In PicSOM, a separate SOM is trained for each feature type. Through this mapping, feature vectors that reside near each other in the input space are mapped onto nearby units on the map (called best matching units, or BMUs). Consequently, imagelets that are mutually similar with respect to a given feature have BMUs located near each other on the SOM. After the SOMs have been trained, the database can be visually queried: the system presents images and the user marks a subset of them as relevant to the present query. This relevance information is fed back to the system, which is then able to find more similar images and return them in the next query round. The PicSOM system thus provides a semi-automated, interactively supervised analysis of satellite images [3].
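A minimal sketch of training one SOM per feature group is given below. It uses a plain NumPy implementation of the classical online SOM update rule rather than the actual PicSOM software; the map size, learning rate and neighborhood parameters are illustrative assumptions only, while the default of 100 passes over the data mirrors the "100 times in training" figure reported in the Results section.

```python
import numpy as np

def train_som(data, grid=(16, 16), n_iter=100, lr0=0.5, sigma0=4.0, seed=0):
    """Train a 2-D SOM on feature vectors with the classical online rule.

    data: array (n_samples, dim) of feature vectors from one feature group.
    Returns the codebook of shape (grid[0], grid[1], dim).
    """
    rng = np.random.default_rng(seed)
    h, w = grid
    codebook = rng.normal(size=(h, w, data.shape[1]))
    # Grid coordinates of every unit, used by the neighborhood function
    units = np.stack(np.meshgrid(np.arange(h), np.arange(w), indexing="ij"), axis=-1)
    for t in range(n_iter):
        lr = lr0 * (1 - t / n_iter)              # linearly decaying learning rate
        sigma = sigma0 * (1 - t / n_iter) + 1.0  # shrinking neighborhood radius
        for x in rng.permutation(data):
            # Best matching unit: the unit whose codebook vector is closest to x
            d = np.linalg.norm(codebook - x, axis=-1)
            bmu = np.unravel_index(np.argmin(d), (h, w))
            # Gaussian neighborhood around the BMU on the grid
            g = np.exp(-np.sum((units - np.array(bmu)) ** 2, axis=-1) / (2 * sigma ** 2))
            codebook += lr * g[..., None] * (x - codebook)
    return codebook

# One SOM per feature group, e.g. soms['LOGRATIOS'] = train_som(features['LOGRATIOS'])
```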
B. Change detection with PicSOM

The PicSOM system can also be used for change detection, following the same approach. Considering two polarimetric SAR images taken over a given location at different times, each full image is similarly divided into imagelets. All imagelets are then merged to form a database, on which the feature SOMs are trained. The coordinate feature makes it possible to define pairs of imagelets covering the same location (within the image registration accuracy), while the temporal information is stored in the imagelet names. The change magnitude is defined not in the image domain, but on the SOM grids: the distance between the BMUs of the two imagelets in a pair characterizes the change. Imagelets having similar content at the two acquisition dates are expected to have the same or neighboring BMUs, whereas very dissimilar imagelets are represented by BMUs that are far away from each other on the SOMs. The largest between-BMU distances computed over all imagelet pairs point out the locations in the full image where the most significant changes have occurred.
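The following sketch shows how such a BMU-distance change score could be computed and ranked for all imagelet pairs on one trained feature SOM. It reuses the train_som codebook from the previous sketch, and the function and variable names are ours, not PicSOM's.

```python
import numpy as np

def bmu(codebook, x):
    """Grid coordinates of the best matching unit of feature vector x."""
    d = np.linalg.norm(codebook - x, axis=-1)
    return np.array(np.unravel_index(np.argmin(d), d.shape))

def change_scores(codebook, feats_march, feats_july):
    """Change magnitude per imagelet pair on one feature SOM.

    feats_march, feats_july: arrays (n_pairs, dim) of co-located imagelet
    features from the two dates. The score is the Euclidean distance between
    the BMUs of the two imagelets on the SOM grid.
    """
    return np.array([np.linalg.norm(bmu(codebook, a) - bmu(codebook, b))
                     for a, b in zip(feats_march, feats_july)])

# Rank imagelet pairs by decreasing change magnitude and keep e.g. the top 300:
# scores = change_scores(som_logratios, feats_march, feats_july)
# most_changed = np.argsort(scores)[::-1][:300]
```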
Fig. 1. Distribution of ground truth over SOMs (xy-coordinates, POLSIG and TOUZIDISC maps, March and July; classes: mountain, forest, water, ice, shadow).

Fig. 2. 300 imagelets with the most significant changes – TOUZIDISC feature.

V. RESULTS

Training of all the feature SOMs took 1.5 hours, using each feature vector 100 times in training. Fig. 1 shows the distribution of the ground truth over several SOMs. The xy-coordinates SOM looks similar to the ground truth classification [4], whereas the other SOMs have their own coordinate systems, unrelated to geographical coordinates. The TOUZIDISC map shows some fragmented areas, especially in March, where the ground truth classification was challenging due to ice coverage. The classes were less fragmented in July; that was the case for all feature SOMs. The more localized a ground truth class is on a SOM, the better that SOM can retrieve imagelets containing the class. The POLSIG map was the least fragmented of all, even in March. Other maps were much more fragmented than that of the TOUZIDISC feature.

Fig. 2 shows in red the 300 imagelets where the most significant changes occurred, on the TOUZIDISC map. It was the feature that best detected the change corresponding to melting ice in the central part of the lake. Surprisingly, those changes no longer appear in the combination of features in Fig. 3(b), possibly because the other features (mainly POLSIG) dominated over the TOUZIDISC feature.
Fig. 3. 300 imagelets with the most significant changes: (a) LOGRATIOS feature; (b) combination of TOUZIDISC, POLSIG and BACKSCATTER features.

Fig. 3 shows in red the 300 imagelets where the changes with the largest magnitude occurred, for two other maps. The LOGRATIOS feature (Fig. 3(a)) produced an interesting result, although the ground truth was very fragmented over this map. Most "true" changes between March and July (observed visually in the Pauli decomposition images) were detected by this feature, i.e. low vegetation appearing near the river (in the left part of the image), and ice melting on the lake (bottom-left and central parts of the lake). The combination of TOUZIDISC, POLSIG and BACKSCATTER features (Fig. 3(b)) best described the changes occurring around the river in the left part of the image (mainly appearing vegetation).

The presented change detection technique was unsupervised. With a more reliable ground truth it would have been possible to perform supervised or interactive change detection [3]. A major advantage of the imagelet-based approach to change detection is that it is less sensitive to mis-registration than pixel-based approaches – for example those tested in [4] on the same dataset. The type of change can also be determined from the locations of the BMUs on the SOMs.

Apart from combining features into the same SOM, one could directly use the Best-Matching Units (BMUs) to combine information from different SOMs. If a pair of imagelets is considered to contain changes for a given feature (e.g. LOGRATIOS), one can check the BMUs of those two imagelets on another feature SOM (e.g. HALPHA). The BMUs of each imagelet on that SOM would then correspond to values of entropy H and angle α, which could be plotted on an H-α plane for further target and change characterization.

VI. CONCLUSIONS

Our preliminary change detection results on polarimetric SAR data were encouraging, though the method needs more tuning. Imagelet-based structure detection does not provide direct
delineation of objects of interest (contrary to pixel-based methods), but it can highlight in a full scene the possible locations where structures or other content of interest may exist. Such a system could point out locations in the full image where there may be potentially interesting structures, or where features defined visually by the user may have appeared or disappeared. One of the many advantages of PicSOM for polarimetric SAR data analysis is its ability to integrate various features, and to select the relevant ones based on user queries. Further research will investigate other combinations of polarimetric SAR features, and will require a dataset with a more accurate ground truth for method evaluation. The method could also be applied to man-made structure detection in polarimetric images, as was done on optical images in our previous study [3]. It is also expected that the PicSOM system will be tested on a pair of ALOS PALSAR images over Finland.

ACKNOWLEDGMENT

The authors are grateful to Professor Torbjørn Eltoft and Anthony Doulgeris, from the University of Tromsø (Norway), for providing us with the EMISAR data. We also wish to thank Ridha Touzi and Iain Woodhouse for clarifying some points in their respective publications.

REFERENCES

[1] PicSOM Development Group, "PicSOM online demonstration," http://www.cis.hut.fi/picsom, Helsinki University of Technology, 1998–2006.
[2] T. Kohonen, Self-Organizing Maps, 3rd ed., ser. Springer Series in Information Sciences. Springer-Verlag, 2001, vol. 30.
[3] M. Molinier, J. Laaksonen, and T. Häme, "Detecting man-made structures and changes in satellite imagery with a content-based information retrieval system built on self-organizing maps," IEEE Trans. Geoscience and Remote Sensing, vol. 45, pp. 861–874, April 2007.
[4] M. Molinier and Y. Rauste, "Comparison and evaluation of polarimetric change detection techniques in aerial SAR data," in IEEE International Geoscience and Remote Sensing Symposium IGARSS'07, 2007.
[5] E. Pottier, L. Ferro-Famil, S. Allain, S. Cloude, I. Hajnsek, K. Papathanassiou, A. Moreira, M. Williams, T. Pearson, and Y.-L. Desnos, "An overview of the PolSARpro v2.0 software. The educational toolbox for polarimetric and interferometric polarimetric SAR data processing," in POLinSAR07 ESA Workshop, January 2007.
[6] R. Touzi, S. Goze, T. Le Toan, A. Lopes, and E. Mougin, "Polarimetric discriminators for SAR images," IEEE Transactions on Geoscience and Remote Sensing, vol. 30, no. 5, pp. 973–980, September 1992.
[7] I. Woodhouse and D. Turner, "On the visualization of polarimetric response," International Journal of Remote Sensing, vol. 24, no. 6, pp. 1377–1384, March 2003.
[8] J. van Zyl, H. Zebker, and C. Elachi, "Imaging radar polarization signatures: theory and observation," Radio Science, vol. 22, no. 4, pp. 529–543, July–August 1987.
[9] S. Quegan, T. Le Toan, H. Skriver, J. Gomez-Dans, M. C. Gonzalez-Sampedro, and D. H. Hoekman, "Crop classification with multitemporal polarimetric SAR data," in Workshop on POLinSAR – Applications of SAR Polarimetry and Polarimetric Interferometry (ESA SP-529), January 2003.
[10] H. Skriver, W. Dierking, P. Gudmandsen, T. Le Toan, A. Moreira, K. Papathanassiou, and S. Quegan, "Applications of synthetic aperture radar polarimetry," in Workshop on POLinSAR – Applications of SAR Polarimetry and Polarimetric Interferometry (ESA SP-529), January 2003.
[11] W. Dierking, H. Skriver, and P. Gudmandsen, "SAR polarimetry for sea ice classification," in Workshop on POLinSAR – Applications of SAR Polarimetry and Polarimetric Interferometry (ESA SP-529), January 2003.
[12] J. R. Buckley, "Environmental change detection in prairie landscapes with simulated RADARSAT 2 imagery," in IEEE International Geoscience and Remote Sensing Symposium IGARSS'02, 2002, pp. 3255–3257.
[13] S. Cloude and E. Pottier, "An entropy based classification scheme for land applications of polarimetric SAR," IEEE Transactions on Geoscience and Remote Sensing, vol. 35, no. 1, pp. 68–78, January 1997.