Multisource Data Integration in Remote Sensing

NASA Conference Publication 3099

James C. Tilton, Editor
Goddard Space Flight Center
Greenbelt, Maryland

Proceedings of a workshop sponsored by the Institute of Electrical and Electronics Engineers (IEEE) and the NASA Goddard Space Flight Center, held at the University of Maryland, College Park, Maryland, June 14-15, 1990

National Aeronautics and Space Administration
Office of Management
Scientific and Technical Information Division
1991

FOREWORD

On June 14-15, 1990, Technical Committee 7 (TC7) of the International Association for Pattern Recognition (IAPR) held a Workshop on "Multisource Data Integration in Remote Sensing" in College Park, Maryland. The original plan for this workshop was to place it between two major, related conferences: the International Geoscience and Remote Sensing Symposium (IGARSS'90) at the end of May 1990, in College Park, MD, and the International Conference on Pattern Recognition (ICPR'90) in mid-June 1990, in Atlantic City, NJ. Such a configuration of two major conferences in geographically and temporally close range would have offered an excellent opportunity to realize a major goal of TC7: to bring together scientists from remote sensing applications and from the methodology of pattern recognition. Due to the postponement of the ICPR'90, the time between the two conferences became too long to allow participants of both conferences to attend the workshop by a short prolongation of their stay in the area. We therefore moved the workshop close to the ICPR to at least partly reach our goal.

The subject of the workshop, Multisource Data Integration in Remote Sensing, seems to have become a real challenge for the near future. New instruments and new sensors will provide us with a large variety of new views of the "real world." This huge amount of data has to be combined and integrated in a (computer) model of this world. But also, the knowledge of how these data are gathered and what their characteristic properties are is among the useful sources of information that contribute to a meaningful interpretation. Multiple sources may give us complementary views of the world -- consistent observations from different (and independent) data sources support each other and increase their credibility, while contradictions that may be caused by noise, errors during processing, or misinterpretations can be identified as such. As a consequence, data integration can produce results that are very reliable and represent a valid source of information for any geographical information system (GIS).

The workshop consisted of three sessions of three to five individual presentations. All papers were discussed both individually and in the general context of the session. Additionally, all papers were considered under four specific aspects:

1. What are the characteristic properties of the data sources that are explored in the individual approach?
2. In which category does the used integration method fall?
3. What is the result of the integration and what are the improvements realized?
4. What are the major advantages of the proposed methods?

Four participants generously volunteered to consider the papers under one of the above aspects. Started as an experiment to encourage lively discussion, the four summaries (three of which are included herein) prove the success of this approach and also give a useful index to the presented papers. It was an interesting experience for the authors to get feedback on their own presentations.

Last but not least, I would especially like to thank Dr. James C. Tilton of NASA Goddard Space Flight Center for the excellent organization of this workshop. I would like to further acknowledge the sponsors of the workshop: the IAPR, NASA Goddard Space Flight Center, and the Washington/Northern Virginia Chapter of the IEEE Geoscience and Remote Sensing Society. My thanks go also to the authors who contributed papers and to all the active participants who made the workshop more than just a sequence of presentations. I hope that the activities of IAPR-TC7 will continue in the future under the leadership of the new chairman of TC7, Dr. Tilton, and that we can reassemble for another interesting workshop -- probably in 2 years -- on the occasion of the next ICPR, which will be held in The Hague, The Netherlands.

Dr. Walter G. Kropatsch


CONTENTS

Foreword .......... iii

Contents .......... v

PRESENTATIONS

Refinement of Ground Reference Data with Segmented Image Data .......... 3
  Jon W. Robinson, ST Systems Corporation, Lanham, MD, USA
  James C. Tilton, NASA Goddard Space Flight Center, Greenbelt, MD, USA

Near Ground Level Sensing for Spatial Analysis of Vegetation .......... 11
  John Rasure, Tom Sauer, and Charlie Gage, Department of Electrical and Computer Engineering, University of New Mexico, Albuquerque, NM, USA

Integration of SAR and DEM Data - Geometrical Considerations .......... 27
  Walter G. Kropatsch, Institute for Informatics, University of Innsbruck, Innsbruck, Austria

Towards Operational Multisensor Data Registration .......... 39
  Eric Rignot, Ronald Kwok, and John Curlander, Jet Propulsion Laboratory, California Institute of Technology, Pasadena, CA, USA

Combined Fluorescence, Reflectance, and Ground Measurements of a Stressed Norway Spruce Forest for Forest Damage Assessment .......... 59
  C. Banninger, Institute for Image Processing and Computer Graphics, Joanneum Research, Graz, Austria

A Phenomenological Approach to Multisource Data Integration: Analysing Infrared and Visible Data .......... 61
  N. Nandhakumar, Electrical Engineering Department, University of Virginia, Charlottesville, VA, USA

A Method for Classification of Multisource Data Using Interval-Valued Probabilities and Its Application to HIRIS Data .......... 75
  H. Kim and P. H. Swain, School of Electrical Engineering and Laboratory for Applications of Remote Sensing, Purdue University, West Lafayette, IN, USA

Improved Disparity Map Analysis Through the Fusion of Monocular Image Segmentations .......... 83
  Frederic P. Perlant and David M. McKeown, Digital Mapping Laboratory, Carnegie Mellon University, Pittsburgh, PA, USA

Use of Information Fusion to Improve the Detection of Man-Made Structures in Aerial Imagery .......... 94
  Jeffrey Shufelt and David M. McKeown, Digital Mapping Laboratory, Carnegie Mellon University, Pittsburgh, PA, USA

A Computer Vision System for the Recognition of Trees in Aerial Photographs .......... 111
  A. Pinz, Institute of Surveying and Remote Sensing, University of Agriculture, Wien, Austria

Visualizing Characteristics of Ocean Data Collected During the Shuttle Imaging Radar-B Experiment .......... 125
  David G. Tilley, The Johns Hopkins University Applied Physics Laboratory, Laurel, MD, USA

A Proposal to Extend Our Understanding of the Global Economy .......... 135
  Robbin R. Hough, Oakland University, Rochester, MI, USA, and Manfred Ehlers, University of Maine, Orono, ME, USA

SUMMARIES

Summary of Data Sources Utilized in Workshop Papers .......... 143
  James C. Tilton, NASA Goddard Space Flight Center, Greenbelt, MD, USA

Summary of Types of Data Fusion Methods Utilized in Workshop Papers .......... 145
  Sylvia S. Shen, Lockheed Palo Alto Research Lab., Palo Alto, CA, USA

Summary of Types of Data Output Produced by Studies Described in Workshop Papers .......... 151
  Axel J. Pinz, Institute of Surveying and Remote Sensing, University of Agriculture, Wien, Austria

APPENDIX

Participants .......... 155

PRESENTATIONS

N91-15616

Refinement of Ground Reference Data With Segmented Image Data

Jon W. Robinson
Image Analysis Section
ST Systems Corporation
4400 Forbes Boulevard
Lanham, MD 20706
Phone: (301) 794-5200
E-Mail: [email protected]

James C. Tilton
Mail Code 936
Information Systems Development Facility
NASA Goddard Space Flight Center
Greenbelt, MD 20771
Phone: (301) 286-9510
E-Mail: [email protected]

ABSTRACT

There are a variety of ways of determining ground reference data for satellite remote sensing data. One way is to photo-interpret low altitude aerial photographs and then digitize the cover types on a digitizing tablet. These digitized cover types are then registered to 7.5 minute U.S.G.S. maps that have themselves been digitized. The resulting ground reference data can then be registered to the satellite image, or, alternatively, the satellite image can be registered to the ground reference data. Unfortunately, there are many opportunities for error when using a digitizing tablet, and the resolution of the edges in the ground reference data depends on the spacing of the points selected on the digitizing tablet. One consequence is that, when overlaid on the image, errors and missed detail in the ground reference data become evident. This paper discusses an approach for correcting these errors and adding detail to the ground reference data through a highly interactive, visually oriented process. This process involves overlaid visual displays of the satellite image data, the ground reference data, and a segmentation of the satellite image data. Several prototype programs have been implemented on the VAX computer system and the IVAS image display system to examine various methodologies for improving ground reference data. These programs provide a means of taking a segmented image and using the edges from the reference data to mask out those segment edges that are beyond a certain distance from the reference data edges. Then, using the reference data edges as a guide, the remaining segment edges that are judged not to be image versions of the reference edges are manually marked and removed. We describe the prototype programs that were developed and the algorithmic refinements that facilitate execution of this task. Finally, we point out areas for future research.

INTRODUCTION

There are many ways to use multiple data sources in remote sensing. In this paper, we discuss a method for using image data to improve ground reference data sets. Reference data sets are sometimes referred to as "ground truth"; however, since the methods of creating reference data sets provide many opportunities for error, we will use the term reference data set. This terminology allows for the possibility of using multiple data sources not only for determining the contents of the reference data set but also for refining it.

This work was supported by the Office of Aeronautics, Exploration and Technology, NASA Headquarters, Washington, DC.

We have obtained satellite image data from a number of investigators along with reference data sets that they created. By using the image data to create a segmented image, edges consistent with the spectral variation in the image are created. Visual inspection of the reference data edge map overlaid on the raw image or the segmented image reveals many discrepancies. These may be errors, or simply inappropriate attention to detail by the person generating the reference data (particularly if they were digitizing data from a digitizing tablet). An example exhibiting displacement errors and too coarse a digitization scale is given in Figure 1.

While it would be desirable to have an automatic method of using the segmented image data to revise the reference data set, we chose, as a first approximation, to develop prototype methods that allow interactive selection and deletion of the segmented image edges that appear to be too far from the edges generated from the reference data set. The end result is a set of edges that are considered to be the true edges derived from the segmented image data, based on their proximity to the original reference data set edges.

METHODS

A series of programs was developed using IDL (Interactive Data Language) that allowed the user to use either an IIS System 575 terminal or an IIS IVAS terminal attached to a VAX cluster for interactive editing of the edge map generated by an image segmentation algorithm (Tilton, 1989), which runs on the MPP (Massively Parallel Processor) at Goddard Space Flight Center.

The image segmentation algorithm is an iterative process. At each iteration, those regions that are most similar by a particular criterion (e.g., minimum change in image entropy, or minimum rise in image mean squared error) are merged. As the number of iterations increases, the number of image segments decreases. A program called "edge.movie" that runs on the IIS System 575 was developed to allow the user to view various iteration steps and compare these with the original reference data set edges and bands from the image data. This program allows the user to interactively pick an iteration and then immediately compare it with these other sources of information. It is best to select an iteration with more segments and edges than necessary to fit the reference data, so that a crucial segment edge that may match a particular reference edge will not be lost. On the other hand, choosing too low an iteration, and thus an image with too many segments and edges, increases the work load of the analyst significantly.

The next step is to extract the edges of the selected iteration from the combined edge file, which contains the edges from all iterations coded by iteration number. This leaves an image of all of the edges present at iteration n, with the pixel value of the edge indicative of its age. The original reference data is a raster format image divided into regions. An IDL function, GREFEDGE.PRO, was written to extract the edges from this image.
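The exact layout of the combined edge file is not specified here. As a rough illustration, the sketch below assumes each edge pixel stores the last iteration at which it is still an edge, so that thresholding at the chosen iteration recovers the edge map described above while preserving the "age" coding. The function name and the encoding convention are assumptions for illustration, not the authors' IDL code.

```python
import numpy as np

def select_iteration_edges(combined_edges: np.ndarray, n: int) -> np.ndarray:
    """Return the edges still present at iteration n.

    Assumes combined_edges stores, for every edge pixel, the iteration
    number at which that edge last exists (0 = never an edge).  Pixels
    whose coded iteration is >= n are therefore edges that survive to
    iteration n; their coded value is kept as an indication of edge age.
    """
    return np.where(combined_edges >= n, combined_edges, 0)
```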

This segmented edge file is then masked with the edge file from the reference data to eliminate all edges that are beyond a specified distance from the reference set edges. This distance is selected by the analyst and depends on the image set being analyzed. Figure 2 shows an example of this process after this masking is complete.
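The masking step can be pictured with a small sketch: reference edges are pulled out of the raster region map (as GREFEDGE.PRO is described as doing), and a Euclidean distance transform is used to drop segmentation edges farther than the analyst's chosen distance from any reference edge. This is a hedged Python illustration, not the IDL/MPP implementation; the function names are invented, and the 6-pixel threshold in the usage note matches the example shown in Figure 2.

```python
import numpy as np
from scipy.ndimage import distance_transform_edt

def region_edges(label_image: np.ndarray) -> np.ndarray:
    """Mark pixels whose region label differs from a right or lower
    neighbour, i.e. the boundaries of a raster region map."""
    edges = np.zeros(label_image.shape, dtype=bool)
    edges[:, :-1] |= label_image[:, :-1] != label_image[:, 1:]
    edges[:-1, :] |= label_image[:-1, :] != label_image[1:, :]
    return edges

def mask_segment_edges(segment_edges: np.ndarray,
                       reference_edges: np.ndarray,
                       max_distance: float) -> np.ndarray:
    """Keep only segmentation edges within max_distance pixels of a
    reference edge.  distance_transform_edt measures the distance to the
    nearest zero element, so the reference edge mask is inverted first."""
    dist_to_reference = distance_transform_edt(~reference_edges)
    keep = (segment_edges > 0) & (dist_to_reference <= max_distance)
    return np.where(keep, segment_edges, 0)
```

For example, masked = mask_segment_edges(seg_edges, region_edges(ref_labels), max_distance=6) would reproduce the kind of result shown in Figure 2 for the Ridgely data set.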


Because the main algorithm for this procedure can only process images that are 128 x 128, sections of this size or smaller must be extracted from larger images for processing and then recombined to produce a final product. In order for there to be continuity between sections, it is necessary to have a certain amount of overlap between them. The degree of overlap required is being investigated in ongoing studies.

Two programs, CHGVAL5.pro and CHGVAL4.pro (for the IIS System 575 and IIS IVAS, respectively), were written in IDL to provide a prototype interactive environment (listings in Appendix A). The input files (image, segment edge, original reference edge) for either of these programs can either be input one at a time as the program requests, or the input file names can be read from an input file. The CHGVAL5.pro program for the IIS System 575 image display system uses a track ball to move the cursor on the image display and the track ball function buttons to either delete an edge pixel, replace an edge pixel, or go on to the next step in the program. The CHGVAL4.pro program for the IIS IVAS system functions the same way, except that it allows the display to be zoomed and roamed when looking for edges to delete. In addition, the IIS IVAS system uses a three-button mouse instead of a multi-button track ball.

The CHGVAL#.pro programs start by asking whether the analyst wishes to enter the names of the image files individually or whether an input file containing all of the other input items is to be used. Since the program is usually invoked many times to complete the processing of a single section, the use of the input file saves the analyst the trouble of remembering and typing in all of the other file names each time the program is used on a section.

The edges from the original reference image are loaded into the graphics plane of the display device. Two bands of the original image data are loaded into the red and green display memories, and the edges from the masked segmented edge image are loaded into the blue display memory.

In the CHGVAL5.pro program for the IIS System 575, the analyst uses the track ball to move the cursor over parts of the blue edge image that they wish to remove and clicks on the appropriate track ball button, producing "nicks" in the selected edge. For the CHGVAL4.pro program for the IIS IVAS system, the user can first zoom and roam the image with the mouse, so it is easier to see what and where one is deleting edge segments. The terminal monitor prompts the user with the current functions of the track ball buttons or the mouse buttons. An example of this process after nicking several edges is given in Figure 3.

After having nicked a number of edges for removal, the analyst can exit this part of the program, and the nicked edge image is written out to a file for processing by a batch program that runs on the MPP. After writing out the nicked edge image, the CHGVAL#.pro program submits the batch program to the MPP and waits for it to finish. The MPP program writes out the new edge image with the nicked edges completely removed. The CHGVAL#.pro program then reads in the new edge image and writes it out to the red display memory so the analyst can review the results of his work. When the new edge image is read into the red memory bank, those parts of the original edge image that were deleted show up as blue, and those parts that were not deleted show up as magenta (red "new edge image" + blue "old edge image"). The analyst is then given the choice of undoing some of his deletions, continuing with further nicking, or quitting and saving the result. An example of the final result from CHGVAL#.PRO for a single 128 x 128 section of data is given in Figure 4.

There are two types of edge connectivity that can be used: 8 connected and 4 connected. With 8 connected edges, unexpected deletions can chain through edge intersections. Thus it is particularly useful to have a number of ways of backtracking. Our prototype system provides several alternative ways to recover edges. The simplest is that, while nicking lines, the user can undo a nick by placing the cursor over the nicked pixels and pushing the appropriate key on the mouse or the track ball. Alternately, the user can run a test, deleting the nicked edges, and then determine whether he wants to undo some of the nicks. If the user chooses to undo some of the nicks, he can either fill in particular nicks using the cursor control device or pop deleted pixels off of the stack into their former locations. The latter method is most effective if the most questionable deletes are saved until near the end of a nicking cycle.
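The MPP batch program that removes the nicked edges is not listed in this paper. One plausible reading of the "connected components labeling" step mentioned in the caption of Figure 3 is sketched below: every connected chain of edge pixels containing a nick is dropped, and with 8-connectivity such chains can run through diagonal contacts at intersections, which is consistent with the unexpected chained deletions reported above. The function name and interface are illustrative only.

```python
import numpy as np
from scipy.ndimage import label

def delete_nicked_edges(edge_image: np.ndarray,
                        nicks: list[tuple[int, int]],
                        eight_connected: bool = False) -> np.ndarray:
    """Remove every connected chain of edge pixels that contains a nick.

    With eight_connected=True, chains can run through diagonal contacts
    at edge intersections, so a single nick may delete more than intended.
    """
    structure = np.ones((3, 3)) if eight_connected else None  # None -> 4-connectivity
    labels, _ = label(edge_image > 0, structure=structure)
    doomed = {labels[r, c] for (r, c) in nicks if labels[r, c] != 0}
    cleaned = edge_image.copy()
    cleaned[np.isin(labels, list(doomed))] = 0
    return cleaned
```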

These methods of recovering erroneously deleted edges only work within a single nick and try cycle. If one either continues with a new nick and try cycle or saves the last result and exits the CHGVAL#.PRO program, then another method must be used to recover lost edges. In order to repair or recover edges after leaving CHGVAL#.PRO, a program that allows pasting edge pixels from an earlier edge image into the current edge image was developed. This program puts one band of the image into the green display memory, the edge image to be fixed into the blue display memory, and the master or original edge image into the red display memory. The user then uses the cursor control device to put the cursor over the master edges that he wishes to copy to the edge image to be fixed and presses the appropriate button. This puts a copy of the master pixel overlaid by the cursor into the edge image to be fixed. In this manner each pixel of an edge segment in the master image can be copied into the edge image to be fixed. An appropriate change in color takes place for each pixel that is moved into the edge image being repaired, so the user can tell which pixels are present in both images.

The above description is summarized in Figures 5 and 6. Figure 5 illustrates the overall data flow from the original image and ground reference data to the revised reference data edges (producing a revised ground reference file). Figure 6 is a flow chart describing the CHGVAL#.PRO programs.

After several overlapping 128 x 128 edge images have been edited, it is necessary to rejoin them into a single image, to delete nonterminating edges in the overlap region, and to join edges from overlapping segments. Another IDL procedure, COMBCHG4.PRO, that runs on the IIS IVAS display is being developed to accomplish this task. Since its functions are so similar to those of CHGVAL4.PRO, it is being designed to perform these functions also.

Two test images are being used. The first is a 468 x 368 pixel Thematic Mapper (TM) image of the Ridgely Quadrangle on the eastern shore of Maryland. It contains few classes, and most of them are large in area. The main features of the scene are fields and wooded streams. There are small areas of water, ponds, and a single urban area. These areas were digitized from a 7.5 minute United States Geological Survey quadrangle map that had had its reference data boundaries drawn on a plastic overlay from 1977 aerial photographs using a digitizing tablet (Gervin, et al., 1985). In the original study, the ability of AVHRR and MSS data to distinguish Level 1 land cover classes was examined. This involved registering the MSS data to the Ridgely Quadrangle and resampling it to 60 meter pixels, which led to the ground reference data being rasterized to 60 meter pixels. Since the pixel size of the original reference data set was 60 meters and TM data is 30 meters, the disparity between the TM boundaries and the reference data boundaries, and the lack of detail in the reference data, are understandable. Examples shown in Figures 1 through 4 are from a 128 x 128 section in the upper left corner of this data set.

The second image is a portion of the Washington, D.C. metropolitan area. This area was broken down into more classes than the Ridgely quadrangle, and the classes are smaller in size. Tests with this data set were not yet completed as of this writing.

RESULTS & DISCUSSION

In practice, the prototypes worked well, allowing the edge maps to be trimmed by nicking those edges to be deleted. Problems were primarily ones of speed in loading the original reference data into the graphics plane.

The use of 8 connected edges led to unexpected chains of deletions. Because this was so severe, the processing procedures were started over using 4 connected edges.

Joining the segments with overlapping boundaries was not a serious problem with the Ridgely quadrangle. The large areas and concomitant small number of edges contributed to the ease in joining the segments together.

Looking again at Figure 4, note that the refined ground reference boundaries (obtained by selecting boundaries from the image segmentation) follow the visible boundaries in the image data very closely. In particular, note the very coarse, misregistered boundary from the original ground reference file at the top middle of the image. The refined ground reference boundary is perfectly registered and follows the actual variations in the boundary very closely.

We plan to use this and other data sets in comparative studies of various algorithms designed to extract spatial information from imagery. We expect that the refined ground reference data will help to evaluate the behavior of these algorithms more accurately than would the often too coarse and misregistered original ground reference data.

REFERENCES

Gervin, J. C., A. G. Kerber, R. G. Witt, Y. C. Lu and R. Sekhon, 1985, "Comparison of Level I Land Cover Classification Accuracy for MSS and AVHRR Data," International Journal of Remote Sensing, 6(1), pp. 47-57.

Tilton, J. C., 1989, "Image Segmentation by Iterative Parallel Region Growing and Splitting," Proceedings of the 1989 International Geoscience and Remote Sensing Symposium, Vancouver, BC, Canada, July 10-14, 1989, pp. 2420-2423.


Figure 1. Edges (black) from a ground reference file overlaid upon the corresponding Landsat Thematic Mapper (TM) image. Note the displacement errors and coarse digitization scale compared to the Landsat data.

Figure 2. Edges from a ground reference file (white) and edges from an image segmentation (gray) overlaid upon the corresponding Landsat TM image. Image segmentation edges further than 6 pixels from a ground reference edge have been masked out.

Figure 3. Edges from a ground reference file (white) and edges from an image segmentation (gray) overlaid upon the corresponding Landsat TM image. Image segmentation edges that are shown to be "nicked" in this image will be deleted by the next processing step (connected components labeling).

Figure 4. Edges from a ground reference file (white) and the final selection of corresponding edges from an image segmentation (gray) overlaid upon the Landsat TM image. This final selection of edges can now be used to generate a label map that can be used as a substitute for the original ground reference file.


Figure 5. Overall data flow: edges are extracted from the ground reference image (label map) and from a selected iteration of the image segmentation, and the segment edges are then masked against the reference edges (GREFMASK, run on the MPP) to produce the masked segment edges.


Figure 6. Flow chart of the CHGVAL#.PRO interactive editing programs.

58

N91-15620

COMBINED FLUORESCENCE, REFLECTANCE, AND GROUND MEASUREMENTS OF A STRESSED NORWAY SPRUCE FOREST FOR FOREST DAMAGE ASSESSMENT

C. Banninger
Institute for Image Processing and Computer Graphics, Joanneum Research
Wastiangasse 6, A-8010 Graz, Austria
Tel. (316) 8021, Telefax (316) 8021-20
E-Mail: [email protected]

Austria

(316) 8021-20,

[email protected]

The detection and monitoring of stress and damage in forested areas is of utmost importance to forest managers, who require timely and accurate information on the state of health and vitality of this natural resource for planning purposes. The extensive and often difficult-to-assess nature of many of the world's forested regions makes remote sensing the most suitable means to obtain this information. This requires that remote sensing data employed in a forest survey be properly chosen and utilised for their ability to measure canopy spectral features directly related to key tree and canopy properties that are indicators of forest health and vitality.

Plant reflectance in the visible to shortwave infrared regions (400-2500 nm) provides information on its biochemical, biophysical, and morphological make-up, whereas plant fluorescence in the 400-750 nm wavelength region is more indicative of the capacity and functioning of its photosynthetic apparatus. A measure of both these spectral properties can be used to provide an accurate assessment of stress or damage within a forest canopy. What is necessary is to define the specific wavelengths within these spectral regions that provide optimal information on a plant's health and vigor.

Foliar chlorophyll and nitrogen are essential biochemical constituents required for the proper functioning and maintenance of a plant's biological processes. Chlorophyll-a is the prime reactive centre for photosynthesis, by which a plant converts carbon dioxide and water into necessary plant products. Nitrogen forms an important component of the amino acids, enzymes, proteins, alkaloids, and cyanogenic compounds that make up a plant, including its pigments. The measurement of these constituents in a canopy by remote sensing methods would allow the rapid appraisal of a forest's state of health. Both chlorophyll and nitrogen have characteristic absorption features in the visible to shortwave infrared region that furnish information on their content within a canopy. By measuring the wavelength position and depth of these features and the fluorescence response of the foliage, the health and vitality of a canopy can be ascertained.

For a stressed Norway spruce forest in southeastern Austria, foliar chlorophyll-a and nitrogen content and leaf area indices (LAI) were derived, and foliage reflectance and fluorescence measurements obtained, from samples collected at 50-m grid intervals over the test site. The discrete data sets were transformed into continuum data sets in the form of isopleth maps, and resolution cells corresponding to the size of the ground-projected pixels of the Landsat Multispectral Scanner (MSS) and Thematic Mapper (TM), NS001 Thematic Mapper Simulator (TMS), Thermal Infrared Multispectral Scanner (TIMS), Airborne Imaging Spectrometer (AIS-2), and Fluorescence Line Imager/Programmable Multispectral Imager (FLI/PMI) sensor systems were derived for each ground data set. These cellularised data sets served as the reference data for evaluating the capabilities of the various remote sensing data used to differentiate stress or damage in the test site forest, which served as a test model for the development of remote sensing-based algorithms for more universal application. In addition to chlorophyll-a, nitrogen, and LAI data, other canopy biochemical constituents, canopy biogeochemistry, soil geochemistry, and site slope, aspect, and elevation information have been incorporated into the cell data bases. Examples of these will be presented in the paper.
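The transformation of the 50-m ground samples into sensor-sized reference cells is described only in outline above. The sketch below shows one way such cellularised data sets could be produced, by interpolating the samples to a fine raster and block-averaging to the ground-projected pixel size of a given sensor. The function name, grid spacing, and interpolation method are assumptions for illustration, not the procedure actually used in this study.

```python
import numpy as np
from scipy.interpolate import griddata

def cellularise(sample_xy: np.ndarray, sample_values: np.ndarray,
                extent: tuple[float, float, float, float],
                fine_step: float, cell_size: float) -> np.ndarray:
    """Interpolate scattered field samples (e.g. chlorophyll-a on a 50-m
    grid) to a fine raster, then average blocks of that raster to cells
    matching a sensor's ground-projected pixel size."""
    x0, x1, y0, y1 = extent
    gx, gy = np.meshgrid(np.arange(x0, x1, fine_step),
                         np.arange(y0, y1, fine_step))
    fine = griddata(sample_xy, sample_values, (gx, gy), method="cubic")
    block = int(round(cell_size / fine_step))
    ny = (fine.shape[0] // block) * block
    nx = (fine.shape[1] // block) * block
    trimmed = fine[:ny, :nx]
    return trimmed.reshape(ny // block, block, nx // block, block).mean(axis=(1, 3))

# e.g. hypothetical 30-m Thematic Mapper cells from a 10-m interpolated surface:
# tm_cells = cellularise(xy, chl_a, (0, 1000, 0, 1000), fine_step=10, cell_size=30)
```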


N91-15621

A Phenomenological Approach to Multisource Data Integration: Analysing Infrared and Visible Data

N. Nandhakumar
Electrical Engineering Dept., Univ. of Virginia, Charlottesville, VA 22903

Abstract

A new method is described for combining multisensory data for remote sensing applications. The approach uses phenomenological models which allow the specification of discriminatory features that are based on intrinsic physical properties of imaged surfaces. Thermal and visual images of scenes are analyzed to estimate surface heat fluxes. Such analysis makes available a discriminatory feature that is closely related to the thermal capacitance of the imaged objects. This feature provides a method for labelling image regions based on physical properties of imaged objects. This approach is different from existing approaches which use the signal intensities in each channel (or an arbitrary linear or nonlinear combination of signal intensities) as features, which are then classified by a statistical or evidential approach.

or or

Introduction

Multispectral/multisource riety of applications such

data acquired via remote sensing have been shown to be useful for a vaas urban land-cover assessment, rain-rate classification, crop assessment,

geophysical investigation, and surveillance and monitoring for national defence techniques have been developed for combining the information in the different These techniques typically use statistical or evidential rules to achieve the desired The

usual

statistical

approach

consists

of first

forming

a feature

vector

activities. Various sensing modalities. classification.

wherein

each

element

corresponds to the signal value (pixel gray level) from each sensor. This feature vector is then classified by a statistical decision rule. Other features such as the mean intensity level in a beighborhood, contrast, second and higher order moments, entropy measures, etc. have also been used as elements of the fusion

feature vector, of information

e.g. [1]. In such approaches, interpretation of the imaged scene based on the from the different sensors may be said to occur at the lower levels of analysis.

In some techniques, linear and/or nonlinear combinations of signal values from different sensors form a feature, several of which are then fed to a classifier, e.g. [2]. In the latter case, interpretation may be said to occur at higher levels of analysis, after an earlier stage of information fusion which extracts discriminatory

features.

Other

extensions

to the standard

statistical

approach

have

been

reported,

e.g., a fuzzy relaxation labelling approach for image interpretation has been reported [3] wherein a Gaussian maximum likelihood classifier is used to provide initial probability estimates to the relaxation process. Different optimal classification rules have been developed for interpreting multisource data each of a variety of statistical models assumed for the data. The classifiers however do not address problem

of choosing

sufficiently

Such approaches therefore is impossible to guarantee.

sets are warranted for achieving of the imaged objects are being Evidential

approaches

discriminatory

features

from the infinite

number

of available

suffer from the disadvantage that the global optimality Also, training of such classifiers is difficult since very a reasonable error rate. It is also not clear what utilized by the classifier during the discrimination

have also been

developed

61 PRL:=CEDiNGPAGE I_LAhlK NOT FILMED

for combining

information

for the

features.

of the feature set large training data physical process.

properties

from multiple

sensing

i T.erma,,mage I I V sua,,ma0e 1 Surface Temperature _irVe_city

Jf

I

Air_mp.

Radiated Convected Heat HeatFlux Flux '

I Surface Reflectance, Orientation

I

Date, Time_,l

I I Absorbed HeatF,ux I •

i

I Conducted

i

Heat Flux

I

I Temp.

I

R = Conducted/Absorbed I

Figure

1. Combining

thermal

modalities, e.g. [4]- [6]. Such methods contrast measures for each sensor and

Region Labels

and visible

data

I

Refl.

l

for surface

heat

rely on a large set of heuristics compare outputs from different

flux estimation.

rules which examine local sensors to provide varying

degrees of support (certainty values) for a hypothesized class. A non-probabilistic framework is used for updating these uncertainties to reach a final classification. Interpretation in such systems is attempted at multiple levels of analysis. The rules, however, are based on manifestations of the differences in the intrinsic physical properties of objects rather than on direct measures of the physical properties themselves. fusion.

in multisensor

data

Due to these reasons, it is desirable to first combine information from the different sensors on a physical model of the scene with the objective of evaluating intrinsic physical properties

based of the

imaged features higher

Such approaches

therefore

do not fully exploit

objects. Such an a_alysis allows for specification which may then be used for scene interpretation

the synergy

available

of physically meaningful and discriminatory by a probabilistic or evidential classifier at

levels of analysis.

This paper discusses the development fusion of information from infrared (IR) which principles of heat transfer sinks and sources in the scene.

and use of phenomenological scene-sensor and visible data. A computational model

models for the is established in

are used along with computer vision techniques to derive a map of heat The approach uses infrared imagery sensed in the 8pro - 12#m band,

monochrome visual imagery, and knowledge of ambient conditions at the imaged surface to estimate surface heat fluxes in the scene. A feature which quantifies the surface's ability to sink/source heat radiation is derived and is shown to be useful in discriminating between different types of material classes

such as vegetation

It is assumed algorithm processed

that

and pavement.

the thermal

image

is segmented

into dosed

regions

by a suitable

segmentation

(e.g., [7]) and that the thermal and visual images are registered. The thermal image to yield estimates of object surface temperature. This process requires the formulation

an appropriate model which relates scene radiosity to surface temperature, the thermal camera to scene radiosity. Several object and scene parameters 62

is of

and received irradiation at such as surface reflectivity,

emissivity,reflectedsceneradiosityareincorporatedin the model.The visualimage,whichis spatially registeredwith the thermalimage,yieldsinformationregardingthe relativesurfaceorientationof the imagedobject. The Lambertianreflectancemodelis usedalongwith the shape-from-shading principle for this purpose.The aboveinformationalongwith informationregardingambienttemperature,wind speed,and the date and time of image acquisitionis usedin a computationalmodel that allows estimationof surfaceheatfluxesin the scene.The estimatedsurfaceheat fluxesareusedto evaluate a featurethat is closelyrelated to the lumpedthermal capacitanceof the object. This featureis shownto bea meaningfulanddiscriminatoryfeaturefor sceneinterpretation.A blockdiagramof the approachis shownin figure 1. Multisensoryimages,andin particular - thermal andvisual images,havebeenusedin the past for evaluatinga rough estimateof thermalinertia for variousremotesensingapplications[8] - [10]. The previouslydevelopedmethodsusevery simplemodelsof the sceneand of the energyexchange phenomena ocurringat the imagedscene.In contrastto thesepastapproaches, the techniquedeveloped in this paperis basedon an explicit andmoredetailedphysicalmodelof the energyexchangein the sceneand providesmoremeaningfulanddiscriminatoryfeaturesfor classification. The remainderof this paperis organizedasfollows.Section2 describesan approachfor extracting accuratesurfacetemperatureestimatesfrom infrared imagery.Section3 discussesthe computation of relative surfaceorientationfrom visual imagery.Section4 describes the estimation of surface heat fluxes at the imaged scene. Section 5 discusses two different ways of using the surface heat flux estimates for scene interpretation. Section 6 presents experimental results using real data, and section 7 contains

2

a summary

of the ideas presented

Estimating

A quantitative

Temperature model

in this paper.

from

has been derived

Thermal

for estimating

Images

the surface

temperature

of a viewed

object

using

the thermal image. Details of the derivation may be found in reference [11]. The salient points of this model are presented below. The model is based on observations that are unique to the situation where outdoor scenes are illuminated by solar radiation. The derivation of the model rests on the following observations and results: 1. Most surfaces

found

in outdoor

scenes may be considered

to be diffuse emitters

in the 8#m-12#m

band. Furthermore, they possess high emissivities in this band - in the range of 0.82 to 0.96. Hence, a constant value of 0.9 may be assumed for the IR emissivity of all imaged surfaces in outdoor scenes. 2. The radiosity radiation, contribute

of an object's

suface

in a natural

scene comprises

and reflection of radiation that emanates to the total irradiation at the IR detector.

the 8#m - 12Ftrn band. the other hand, a large

Foc between

involves

reflected

solar

lie typically between 250K and 350K. Since IR emissivities are at the IR detector is dominated by emission from the surface of

the imaged object. The components other objects may be safely ignored.

and typically

emission,

Furthermore, the surface reflectivities to IR radiation are very low. On percentage of emission from scene objects lies in the 8#m - 12#m band

since their surface temperatures also high the scene irradiation

3. The view factor

surface

from other surfaces. These components Only 0.1% of the total solar energy lies in

due to reflected

the camera

the evaluation

and

imaged

of complex

53

solar radiation

surface

integrals

and reflected

depends

emissions

on the viewing

[12]. A reasonable

from

geometry

approximation

to

d

Ao

Figure 2. View factor between camera and imaged surface. Ac is the circular viewing surface of the detector, _ac is the solid angle subtended by the detector, Ao is projection of A_ onto the imaged surface, d is the distance between the camera and the imaged surface, and 0¢ is the angle between the surface

normal

and

the viewing

direction.

Foc is arrived at by making the following observations. Since the solid angle detector w_ is usually very small (on the order of 2 mrad) we can approximate

subtended by the the projection of

the detector's

patch,

Ao in figure

viewing

surface

onto the imaged

surface

to be a planar

circular

denoted

by

2.

The approximations indicated above allow for the derivation of a simple model that relates the surface temperature of the imaged object to the digitized value of the IR sensor's signal due to irradiation

at the sensor

[11]. The

0.9

resulting

model

is expressed

1 )_s(exp(C-_*_T)

as:

- 1) dA = K_,Lt + Kb

where, C1 and C2 are constants in Planck's equation and and C2 = 1.439 × 104 #inK. T is the surface temperature

(1)

have values: C1 = 3.742 × 10 s W#m/m 2 of the imaged object, )_ is the wavelength

of energy, )q = 8#ra and )_2 = 12#m. Lt is the pixel gray level value of the digitized thermal Ka and Kb are constants for a particular imaging setup and are obtained by appropriate calibration as described below. The

model

established

above

provides

image. camera

a simple

algorithm for surface temperature estimation. At el is created for different values of T; _5(_xp(V_/_T,)_l)d)t

the outset a table of values of F(Ti) = 0.9 f_/ via numerical integration of this expression. A scene

containing

two objects

at two different

known

temperatures is imaged. The corresponding gray level values are used in equation (1) to solve for the constants Ka and Kb. Thereafter, the temperature of other surfaces in other scenes can be determined by first evaluating the right-hand side of equation (1) using the corresponding gray level. Then the table

of values

index

Ti now provides

of F(Ti)

created the surface

above is looked temperature

up for a matching estimate.

54

value.

The value

of the associated

In general,

an exact

match

will not be found

acquire a reasonably accurate value Lt in the thermal image side of equation values of F(T_) F(Tm)

(1) evaluate assume that

< G < F(Tn).

The

has been

desired

value of surface

2. The distance between 3. The

spheric

3

the surfaces

distance

hundred If neither

between

as follows

to

gray level right-hand

for a match in the table of F(Tm) and F(Tn) such that computed

as:

the effect of atmospheric

imaging

appear

in the same

and the thermal

and

the

attenuation

situations:

camera

are to be estimated

surfaces

(2)

thermal

and

camera

scene

that

was used

is the same as the distance

the camera. is on the

order

of only a few

[13].

of the above attenuation

estimation,

surfaces

temperatures

imaged

is then

are to be estimated

the calibration

between

meters

is performed

Tn -Tm (G - F(Tm)) F(T.)- F(T )

for the following

temperatures

whose

temperature

for temperature

This is justifiable

1. The surfaces whose for calibration.

interpolation

to G, i.e., G = K_Lt + Kb. On searching G is found to lie between adjacent entries

the above approach

ignored.

a linear

estimate of the surface temperature. Consider a particular which corresponds to a surface temperature of Ts. Let the

T, = Tm + In deriving

and

conditions

apply,

appropriate

models

need to be applied

to account

for atmo-

loss [13]-[15].

Inferring

Surface

Reflectivity

and

Relative

Orientation

In order to estimate heat fluxes it is necessary above but also surface reflectance to visible

to estimate not only surface temperature as described radiation and also surface orientation relative to the

incident (solar) In the following

of the scene provides clues to both these quantities. the infrared and visual images of a scene are spatially

registered,

radiation. discussion

and that

The visual image it is assumed that

the images

are segmented

a priori into regions

by a method

such as that

described

addressed

by several

in [7]. The

use of shading

information

researchers, e.g. [16]-[20]. satisfied. The bi-directional Image

resolution

to recover

the shape

of an object

has been

These techniques, however, can be applied only if certain conditions are reflectance distribution function of the surface must be known a priori.

must be high enough

to allow the rendition

of several

surface

patches

near the occlud-

ing boundary, or there need to exist background patches of known surface orientation surrounding the region of unknown surface orientation. These conditions are difficult to satisfy when imaging objects in a natural scene. We also note that while the aforesaid efforts attempt the problem of determining the (x, y, z) direction cosines of the surface normals of the imaged surface, our problem is a much simpler one, i.e., to arrive at an accurate estimate of cosOi, where Oi is the angle between the surface normal and the direction of incident radiation. A simpler method may be used for this purpose as described below. Real

surfaces

in outdoor

scenes

exhibit

a combination

diffuse component has been found to dominate reasonable to assume that the imaged surfaces

of diffuse

in commonly are Lambertian 65

and

occurring reflectors.

specular

refiectivities.

The

surfaces [21]. Hence, it is If L_ represents the gray

levelof

a pixel in the visual

to that pixel is related

image,

the relative

to the brightness

surface

orientation

of the surface

patch

corresponding

value by: L,, = Kp cosOi

+ Cv

(3)

where, Kp = p K_, p is the surface reflectance, K_, C,, are constants of the visual imaging system and are determined via calibration. The calibration process simply consists of imaging two different surfaces at known orientations to solar radiation and of known reflectivities, whence the constants Kv and Cv are easily

computed.

It is possible to obtain via stereoscopic image analysis, laser radar imagery, or from registered digital terrain data, the orientation of one elemental surface patch in the entire surface that is represented by a given image region. This orientation is best acquired for the elemental area that provides estimate, e.g., one that lies within a large planar patch. The region reflectivity p is then using equation

(3).

Knowing

p, the value of cosOi is easily

computed

at each pixel in that

a reliable computed

region

using

equation (3). The surface is assumed to be opaque, hence, the absorptivity is computed as a8 = 1 - p. The above procedure is applied to each region in the image to provide estimates of p and cosOi at each pixel

in the entire

image.

The assumption presence polished

that

viewed

of transparent objects surface). It is assumed

that

presented

4

Estimating

surfaces (e.g. that

are opaque

and are Lambertian

is sometimes

violated

by the

glass windows, lakes), or regions of specular reflection (e.g. a such regions in the imagery are detected by means other than

in this paper.

Surface

Heat

Fluxes

In this section, the various heat fluxes at the surface of the object are identified and the relationship between them is specified. A method for estimating these heat fluxes is then presented. This method uses values of surface temperature deduced from the thermal image, and surface reflectivity and relative orientation deduced from the visual image. Section 5 describes methods for interpreting imaged scenes using these

heat

Consider

an elemental

flow, the heat solar radiation, temperature denotes the

flux estimates. area on the surface

of the imaged

object.

Assuming

one-dimensional

heat

exchange at the surface of the object is represented by figure 3. Wi is the incident 0i is the angle between the direction of irradiation and the surface normal, the surface

is To, and W_b_ is that portion of the irradiation that is absorbed heat convected from the surface to the air which has temperature

by the surface. We. Tamb and velocity

V, heat

W,.,_d is the heat lost by the surface to the environment via radiation and Wcd denotes the conducted from the surface into the interior of the object. Irradiation at the object surface also from other scene components. As discussed in section 2, the magnitude of includes that emanating this absorbed irradiation is small when compared to total solar irradiation absorbed in visible and IR bands. The contribution of the former may therefore be ignored. At any given instant, applying the law of conservation of energy to the heat surface of the object and those flowing out from the surface, we have,

W.b,

= Wed + We, + W_ad

where,Wrod= 0.9 o (T: - T2mb), W_bo = Wi cosOi ao , 66

fluxes

flowing

into the

(4)

\1/

Wrad

V s Air

ov

Tamb

Figure

a denotes

the

convected

heat

3. Exchange

Stefan-Boltzman transfer

of heat

fluxes

constant,

is given

and

h is the

average

at the surface

_s denotes

convected

heat

=

h(T,

transfer

In order

image

to estimate

as discussed the

the solar

heat

magnitude of the incident radiation and the reflectivity of the imaged

in section

flux absorbed

-

object.

absorptivity

of the

surface.

T,,,_b)

coefficient,

surrounding air (e.g. velocity, viscosity, temperature, object's surface. We note that Wrad is immediately from the thermal

of the imaged

The

by Wcv

where,

;woo

(6) and

depends

on the

properties

of the

etc.), and on the geometry and the nature of the available when T_mb is known since T_ is deduced

2. by the

surface,

it is first neccesary

on a horizontal surface and then compensate surface. One approach is to directly measure

radiation using a pyrheliometer. Alternately, as was done in the appropriate analytical model may be used to estimate this quantity.

to determine

the

for the orientation the incident solar

experiments described later, an The variation (with day of the

year and time of day) of the intensity of solar radiation incident on a horizontal surface on the ground has been modelled by Thepchatri, et al. [22] based on the data presented by Strock and Koral [23]. The empirical model accounts for diurnal and seasonal variations. This model is used for specifying W_. Thus, knowing cos_ and c_ as described in section 3, Wab, may be computed using from equation (5). The convective heat flux is obtained by using equation (6). The temperature for the object's surface is obtained from the thermal image as described in the previous section. The ambient temperature Tamb is known. The problem therefore h. A plethora of empirical correlations hydrodynamic is fiat allows

lies in estimating the average convected heat transfer coefficient have been established for computing h for various thermal and

conditions [24]. The simplifying the use of convecion correlations

procedure for estimating the convected air temperature, the Reynolds number

assumption developed

that the portion of the surface being viewed for external flow over fiat plates [24]. The

heat flux is as follows: is computed, where the 57

knowing the characteristic

wind velocity and the length of the object is

Wabs

Figure

assumed

to be 1 meter.

4. Equivalent

The value of the

flow conditions exist. Accordingly, obtained and thence the convected the estimate

of convected

heat

thermal

circuit

Reynolds

of imaged

number

surface.

determines

the appropriate correlation is used. heat transfer coefficient h. Equation

whether

laminar

The Nusselt number is thus (6) is now used to provide

flux.

Having estimated the convected heat flux, the radiated heat flux, and the absorbed described above, the conduction heat flux is then deduced using equation (4).

5

Scene

irradiation

as

Interpretation

The estimated surface heat fluxes may be used to derive physically meaningful scene. Two different methods are discussed. The first approach assumes that of the

or mixed

scene is available,

and it consists

of the thermal

image,

the visual

interpretations only a single

image,

and

values

of the data set of scene

parameters obtained at a particular instant of time. The second method assumes that a sequence of data sets obtained at different time instants is available. The first approach is more suitable for scenes, the contents suitable containing

5.1

of which

for scenes

change

only vegetation,

Analysis

The estimated

frequently,

containing

objects buildings

of a Single surface

heat

e.g., one that that

automobiles.

over the sequence

The second of data

sets,

approach

is

e.g., scenes

and pavements.

Multisensory fluxes

contains

are stationary

may

be used

Data

Set

to evaluate

how well an object

can act

as a heat

source/sink. Thus a highly discriminatory feature may be extracted which is closely related to an intrinsic physical property of the imaged object. Considering a unit area on the surface of the imaged object,

the equivalent

thermal

circuit

for the surface

68

is shown

in figure

4. CT is the lumped

thermal

Object

Thermal Capacitance ( × 10 -6 Joules/Kelvin)

Asphalt Pavement Concrete Wall

1.95 2.03

Brick Wall

1.51

Wood(Oak) Granite

Wall

1.91 2.25

Automobile

Table1: capacitance

of the object

0.18

Normalized

values

of lumped

thermal

capacitance.

and is given by CT = DVc

where, given

D is the density

of the object,

by:

V is the volume,

and c is the specific

1 Roy = -_

heat.

The

resistances

are

1 and

Rr_d = 0.9a(T_

+ T2_mb)(Ts + T_mb)

From figure 4 it is clear that the conduction heat flux Wed estimated on the lumped thermal capacitance CT of the object. A relatively

in the previous section depends high value for CT implies that

the object is able to sink or source relatively large amounts of heat. An estimate of Wcd, therefore, provides us with a relative estimate of the thermal capacitance of the object, albeit a very approximate one. Table 1 lists values of CT of typical objects imaged in outdoor scenes. The values have been normalized Note

for unit that

automobiles

the and

volume thermal

hence

of the object. capacitance

for walls

Wed may be expected

and pavements to be higher

is significantly

for the

former

greater

regions.

than Plants

that

for

absorb

a

significant percentage of the incident solar radiation. The energy absorbed is used for photosynthesis and also for transpiration. Only a small amount of the absorbed radiation is convected into the air. Therefore, the estimate of the Wed will be almost as large (typically 95%) as that of the absorbed heat flux. Thus, Wed is useful in estimating the object's shown to be useful in discriminating between different the feature's the ratio

dependence

on differences

in absorbed

ability classes heat

to sink/source heat of objects. However,

flux, a normalized

feature

radiation, a feature in order to minimize was defined

to be

R = Wcd/Wobs Although imaged object.

the heat

flux ratio,

R

object, it is not discriminatory Other sources of information

=

Wcd/Wabs

,

does capture

a great

deal of information

5.2

results

Analysing

of using this approach

Temporal

If a temporal sequence conditions is available,

the

enough to unambiguously delineate the identity of the imaged are therefore warranted. Hence, information such as the surface

reflectivity, p, of the region which is derived from the visual image, and which is derived from the thermal image are also used to facilitate region experimental

about

Sequence

on real multisensory

of

Multisensor

average region temperature labeling. Section 6 presents

data.

Data

of multisensor data consisting of thermal imagery, visual imagery and scene then it is possible to extract a more reliable estimate of the imaged object's

69

Observe that the relationship heat radiation. capacitance CT of the object is given by:

relative ability to sink/source heat flux Wed and the thermal

between

the conducted

dTs Wcd= CT -d-i A finite

(backward)

difference

approximation

to this equation

Cr = w_ tl and t2 are the time instants

at which

CT as

(t2 - t,) (Ts(t2)

where,

may be used for estimating

(7)

- T,(t,))

the data

were acquired,

Ts(tl)

and T_(t2)

are the cor-

responding surface temperatures, and Wed is the conducted heat flux which is assumed to be constant during the time interval. However, Wcd does vary and an average value of (Wcd(tl) + W_a(t2))/2 is used in equation (7). Section

6 presents

to multisensory

6

experimental

results

obtained

by applying

the above

temporal

analysis

method

data.

Experimental

Results

The methods described in the previous sections outdoor scenes. Calibrated remote sensing data

were applied to real multisensory data acquired were not available along with values of ambient

from scene

parameters such as wind speed and temperature. Hence, thermal and visual imagery were acquired from a ground based imaging setup consisting of an Inframetrics infrared imaging system and a video imaging system. An anemometer was used to measure wind speed and a digital thermometer was used to calibrate the thermal imaging system so as to allow absolute temperature estimates as discussed in section 2. The two methods discussed in the previous section were applied to several sets of data which

were acquired

The results visual image 6 shows the

at different

obtained

times

of the day and during

using one data

set are presented

of a scene containing an automobile, thermal image of the same scene.

of this paper

were

used

to estimate

the

different

in figures

seasons

of the year.

5 through

8. Figure

5 shows

buildings, asphalt pavement and vegetation. The techniques described in the preceeding

surface

heat

fluxes,

whence

the

ratio

the

Figure sections

R = Wcd/Wab_

was

computed at each pixel. A histogram of these values was computed for each region and the mode of the histogram was found. This value was chosen as the representative value of R for the region. Figure 7 shows the values obtained for each region. As predicted by the discussion in section 5, automobiles produce

the lowest

vegetation

produces

value

of this feature,

the highest

pavements

and

buildings

produce

intermediate

In addition to the feature R, the average region temperature, and the surface used in a decision tree classifier to label regions as building, pavement, vegetation classifier used heuristic rules of the form: IF { R e [0.2, 0.9] AND p e [0.35, 1.0] } OR { R e [-.8, The resultant classification is shown in figure 8. The

method

of temporal

values

and

values.

analysis

hours. classes

Table 2 presents of scene objects.

the mean and The estimated

Except

for the concrete

and brick

of the scenes standard values

-.3]

} THEN

was tested

label

on data

reflectivity were also or automobile. The

= bldng

acquired

at intervals

of three

deviation of the value of CT estimated for different compare very favorably with those listed in table 1.

walls, the estimated

7O

values

for each class are of the same

order

of

ORIGINAL

PP_GE IS

OF POOR t_._ALITY

OR!GINAL BLACK

AND

WHITE

Figure

PP,GE' PHOTOGRAFH

5. Visual

image.

.57

Figure

7. Mode

Figure

6. Thermal

Figure

8. Region

image.

-,._

of feature

R.

71

classification.

Object

Average

(xl0 Automobile Concrete Brick

Wall

Wall

Asphalt

-_ J/K) 0.08

2: Values

of lumped

capacitance

-_ J/K) 0.08 0.37

0.37

0.38

1.05

0.46

1.5

2.7

Pavement

thermal

(xl0

Devn.

0.22

Vegetation Table

Std.

CT

estimated

using

the method

described

in section

5.2.

magnitude as listed in table 1, and are ordered in a similar manner. The walls do not compare favorably possibly due to the wide variation in wall thickness that is difficult to account for and also due to the unknown thermal conditions on the interior surface of the walls. In general, a significant offset may be expected in the estimated values due to the many approximations used in the computation heat flux estimates. Inspite of this limitation, it is obvious that the approach described above available a very useful and meaningful that the value of CT is a deterministic particular class of objects. by comparing the estimated interpretation interpretation

7

of the makes

method for the interpretation of multisensory data. Note also value which is completely defined by a physical definition for a

Hence, a deterministic measure of the technique's performance is available and true values of CT. Such a measure is not available in purely statistical

techniques. This is one of the major advantages of a phenomenological when compared to the purely statistical approach.

approach

to scene

Conclusions

A new method

has been described

for interpreting

scenes using multisensory

data.

The phenomenologi-

ca] approach combines information from the different imaging modalities to derive meaningful features. The approach is based on physical models of the energy exchange between the imaged surface and the environment. provide a measure

The thermal and visual images yield estimates of surface of the relative ability of the imaged surface to source/sink

pretation thus relies on a rough estimate of the lumped thermal capacitance been shown to vary widely for different classes of objects in outdoor scenes.

heat fluxes which in turn heat energy. The interof the object The developed

which has approach

was tested on real multisensory data. Due to the unavailability of a calibrated remote sensing data and accompanying values of ambient scene conditions, the testing was performed using terrain based imaging equipment. The approach described above may be easily applied to multisensory imagery acquired

by airborne

or satellite-based

sensors.

References [1] B.G. Lee, R.T. Chin and D.W. Martin, "Automated Rain-Rate Classification of Satellite Images Using Statistical Pattern Recognition", IEEE Trans. Geoscience and Remote Sensing, Vol. GE-23, No. 3, May 1985, pp. 315-324. [2] W.D. Rosenthal, B.J. Blanchard and A.J. Blanchard, "Visible/Infrared/Microwave Agriculture Classification, Biomass and Plant Height Algorithms", IEEE Trans. Geoscience and Remote Sensing, Vol. GE-23, No. 2, March 1985, pp. 84-90 [3] S. Di Zenzo, R. Bernstein, S.D. Degloria and H.G. Kolsky, "Gaussian Maximum Likelihood and Contextual Classification for Multicrop Classification", IEEE Trans. Geoscience and Remole Sensing, Vol. GE-25, No. 72

6, Nov.

1987,

pp. 805-814

[4] S.W. Wharton, "A Spectral Knowledge-Based Approach for Urban Land-Cover Discrimination", Trans. Geoscience and Remote Sensing, Vol. GE-25, No. 3, May 1987, pp. 272-282 [5] T. Lee, J.A. Richards, and P.H. Swain, "Probabilistic and Evidential Analysis", IEEE Trans. Geoscienee and Remote Sensing, Vol. GE-25, [6] D.G. Goodenough, Trans. Geoscience

Approaches No. 3, May

M. Goldberg, G. Plunkett and J. Zelek, "An Expert and Remote Sensing, Vol. GE-25, No. 3, May 1987,

[7] H. Asar, N. Nandhakumar and J.K. Aggarwal, Data", to appear in Pattern Recognition. [8] V.S. Whitehead, W.R. Johnson and J.A. Visible, Near-IR and Thermal-IR AVHRR GE-24, No. 1, Jan. 1986, pp. 107-112.

"Pyramid-Based

Boatright, "Vegetation Data", IEEE Trans.

Segmentation

Assessment Geoscience

[9] P.R. Christensen, "Martian Dust Mantling and Surface Compositions: Properties", Journal of Geophysical Research, Vol. 87, No. B12, 1982, [10] B.M Jakosky and P.R. Christensen, "Global Duricrust on Mars: of Geophysical Research, Vol. 91, No. B3, 1986, pp. 3547-3559. [11] N. Nandhakumar and J.K. terpretation", IEEE Trans. 469-481. [12] R. Siegel and J.R. 1981. [13] C. Ohman, 204- 212.

Howell,

"Practical

Aggarwal, on Pattern

Thermal

Methods

Radiation

[15] J.M. Jarem, J.H. Pierluissi and W.W. SPIE, Vol. 510, 1984, pp 94- 99.

[17] Horn

B.K.P.,

[18] Woodham

"Hill Shading

R.J.,

"Analysing

Thermal

Images

Multisensory

of Vol.

of Thermophysical

Sensing

Data",

Journal

2nd Ed.,

and

of Curved

Map",

Proc.

Surfaces",

Shape

from

Proc.

Empirical

the Reflectance

Reflectance

Mc-Graw

of SPIE,

Atmospheric

for Atmospheric

Map",

Applied

Artificial

Optics,

Vol. 313,

Occluding

1981,

Methane",

Vol.

Proc.

18, No.

of

1, 1979,

47.

Vol. 17, 1981, pp 117Boundaries",

pp

Data",

Vol. 19, No. 1, pp 14-

Intelligence, and

Co., New York,

Propagation

Model

of the IEEE,

Shading

ttill Book

140.

Artificial

In-

and R. Chellappa, "A Method for Enforcing Integrability in Shape from Shading Algorithms", on Pattern Analysis and Machine Intelligence, Vol. 10, No. 4, July 1988, pp. 439-451.

[21] J.M. Norman, Canopies and pp. 659-667.

J.M. Soils",

Welles and E.A. Walter, IEEE Trans. Geoscience

"Contrasts and Remote

Among Sensing,

Bidirectional Vol. GE-23,

[22] Thepchatri T., Johnson C.P., and Matloek H., "Prediction of Temperature by A Numerical Procedure Using Daily Weather Reports", Tech. Report Research, University of Texas at Austin. [23] Strock C., and Koral 2nd Ed., 1965. [24] Incropera 1981.

IEEE

Using a Combination and Remote Sensing,

of Remote

Measurements",

of Modelled

"Calculating

and the

Transfer,

Ng, "A Transmittance

[19] Ikeuchi K., and Horn B.K.P., "Numerical telligence, Vol. 17, 1981, pp 141 - 184. [20] R.T. Frankot IEEE Trans.

Heat

for Improving

R.W.,

Sensing",

Using

Interpretations pp. 9985-9998.

Analysis

Data

"Integrated Analysis of Thermal and Visual Images for Scene InAnalysis and Machine Intelligence, Vol. 10, No. 4, July 1988, pp.

[14] J.R. Schott and J.D. Biegel, "Comparison Proc. of SPIE, Vol. 430, 1983, pp 45 - 52.

[16] Horn B.K.P., and Sjoberg pp 17701779.

for Multisource 1987, pp. 283-293

System for Remote pp. 349-359

Image

IEEE

F.P.,

and

R. L., Handbook

De Witt

D.P.,

of Air

Conditioning,

Fundamentals

of Heat

73

Heating

Transfer,

and

John

Reflectance of Leaves, No. 5, September 1985,

and Stresses 23-1, 1977,

Ventilating,

Wiley

in Highway Bridges Center for Highway

Industrial

& Sons,

Inc.,

Press,

Inc.,

New York,

N91-15622 A Method Interval-Valued

for Classification Probabilities and H. Kim School

Laboratory

of

of Its

Multisource Application

P.H.

Swain

and Electrical and

Data using to HIRIS

Data

Engineering

for

Applications of Remote Sensing Purdue University West Lafayette, IN 47907, U.S.A. Tel: (317)494-0212, FAX: (317)494-0776 E-mail address: [email protected] tent tion."

ABSTRACT

"evidential

informaresearch is framework can be in remote

sensing and geographic information systems. The methodology described in this paper has evolved from "evidential reasoning," where each data source is considered as providing a body of evidence with a certain degree of belief. The degrees of belief based on the body of evidence are represented by "interval-valued (IV) probabilities" rather than by conventional pointvalued probabilities so that uncertainty can be embedded in the measures. There are three fundamental problems in the multisource data analysis based on IV probabilities: (1) how to represent bodies of evidence by IV probabilities, (2) how to combine IV probabilities to give an overall assessment of the combined body of evidence, and (3) how to make decisions based on IV probabilities. The paper describes a formal method of representing statistical evidence by IV probabilities based on the Likelihood Principle. In order to integrate information obtained from individual data sources, the method presented in the paper uses Dempster's rule for combining multiple bodies of evidence. Although IV probabilities together with Dempster's rule provide an innovative means for the representation and combination of evidential information, they make the decision process rather complicated. We need more intelligent strategies for making decisions. This paper also focuses on the development of decision rules over IV probabilities.

1 INTRODUCTION The importance of utilizing multisource data in ground-cover classification lies in the fact that it is generally correct to assume that improvements in terms of classification accuracy can be achieved at the expense of additional independent features provided by separate sensors. However, it should be recognized that information and knowledge from most available data sources in the real world are neither certain nor such a body sometimes

as

The objective of the current to develop a mathematical within which various applications made with multisource data

A method of classifying multisource data in remote sensing is presented. The proposed method considers each data source as an information source providing a body of evidence, represents statistical evidence by interval-valued probabilities, and uses Dempster's rule to integrate information based on multiple data sources. The method is applied to the problems of ground-cover classification of multispectrai data combined with digital terrain data such as elevation, slope, and aspect. Then this method is applied to simulated 201 -band High Resolution Imaging Spectrometer (HIRIS) data by dividing the dimensionally huge data source into smaller and more manageable pieces based on the global statistical correlation information. It produces higher classification accuracy than the Maximum Likelihood (ML) classification method when the Hughes phenomenon is apparent.

complete. We refer to tain, incomplete, and

information

of uncerinconsis-

75

2 AXIOMATIC DEFINITION IV PROBABILITY

OF

when there is absolutely the probability of A. The basic probability

Interval-valued probabilities can be thought as a generalization of ordinary point-valued probabilities. The endpoints of IV probabilities are called the "upper probability" and the "lower probability." There have been various works intro-

fined in Shafer's evidence[6] has the the IV probabilities:

re(A)

satisfying I)

U : I] _ [0, I]

(2.2)

the

U(A)

following

_>C(A) _>0

for any Ae _

(2.4)

III)

U(A)

(2.5)

= 1 for any Ae[3

For any A, BeJ3 and < L(AuB)

R ax/i,_Ts

brightness

distribution

This completes one top-down shot. The two main stages are shown in Fig. 3.3 and Fig. 3.4 (the original image is Fig. 1.2). Fig. 3.3 shows the lowpass filter (in this case a 25x25 window lowpass was selected by VES) together with the local maxima. Fig. 3.4 shows the corresponding circles that survived the "radial brightness distribution" test. Each of these circles is assigned to a scene object

(tree).

Some

of the

trees

are

removed

If the interpretation process is started The parameter variation will produce

by the

final

test

by (START '(TREE)), two more lowpass filters

120

in METH0

(if standing

too

close).

the global rules will be applied. and this will result in new local

maxima, circles and trees. The search for contrary objects (in this test case a road was entered manually) leads to the elimination of trees that would grow in the middle of a road. The final result shown in Fig. 1.3 was obtained after two parameter variations (19x19and 3lx31 lowpasswindow).

Fig. 4 PROCESSING

THE

TEST

3.3

DATA

Local

maxima

Fig. 3.4

Circles

from

Fig.

3.3

SET

We took a small portion of each of the five images Fig. 1.5 a. - e. each showing approximately the same part of the scene. The size of these portions is 512x512 pixels (b. - e.) and 256x256 pixels (a.). All five images were processed with the standard VES tree-search (search for a default crown radius of 2,5m followed by two parameter variations (1,25m and 5m)). The original 512x512

Fig 4.1

5122 portion

of d.

Fig. 4.2

5122 portion

of c.

Fig.

Fig.

1282 portion

of d.

Fig. 4.5

1282 portion

of c.

Fig 4.6

4.4

121 ORIGINAL" BLACK

AND

WHITE

PAGE' PHOTOGRAPH

4.3

Circles

from

A correlation

ORIGINAL OF POOR

Fig.

4.2

result

PAGE

IS

QUALITY

images are shown for d. (Fig. 4.1) and c. (Fig. 4.2). Fig. 4.3 shows after the first top-down process. The final results (trees found) are portions of d. (Fig. 4.4) and c. (Fig. 4.5).

the circles found in Fig. 4.2 shown in detail for 128x128

The results of this experiment were very interesting: While even in the worst case (1:32000, a.) many of the large crowns were detected, there was no "perfect" interpretation in any five cases a. - e.. Of course, the best results were obtained for the larger scales (c. - e.). each result there were several trees missing that were found in another case. The same for erroneous artifacts, which don't show up in more than one result at the same location. conclusion - the desired result of the interpretation of the whole data set (a. - e.) would careful combination of the several results. And, working with "intelligent" vision systems, we favour a robust solution that doesn't require too precise and detailed instructions, similar ability

of a person

to identify

the

same

tree

in two different

image of the But in is true As a be a would to the

images.

As a first step towards this goal we tried the following procedure. We generated dot images of the five results. For each tree a dot mark located at the center of the circle representing the tree was produced. When two different results were overlayed and displayed in different colors, the resulting image was very similar to the dot patterns described by Glass [Gla69] and Stevens [Ste78]. In our case the patterns of one result may be converted to another one by assuming a superimposition of translation, rotation and a small change of scale. The remaining "noise" is caused by the individual height of each tree, and by the different position of the sun and viewing position for each image. In addition, due to the imperfect interpretation, some points are missing or added in the other image. Stevens called his patterns "Glass patterns" and he developed a loca/ algorithm for the correct correlation of associated points. We implemented Steven's algorithm and tested it on the dot images generated from the interpretation results of a. - e.. One result of a correlation between the two images shown in Fig. 4.4 and Fig. 4.5 is shown in Fig. 4.6. The results of this experiment were imperfect but very promising. Taken alone, Steven's algorithm is not effective enough for our patterns. This is due to the noise effects discussed above and due to the occurence of rather large point displacements. The algorithm will have to be adapted for our purposes - there are already several ideas for improvements. When viewed as one component of a larger vision system, even the actual performance of the algorithm is valuable. The correlation results will be processed by VES. Several heuristics may be applied, e.g. the fact that correlated trees should be of similar size. The correlation should also hold for more than two of the results (a. - e.). If there is a component in the system, that is able to determine species [Bis89,Pin90], then correlated trees must have the same species. Current research is addressing these topics.

5 CONCLUDING It has been

shown,

the tree at IVF

REMARKS that

the use

of multisource

data

can improve

the

quality

and

robustness

of the

interpretation result of a computer vision system. While synergic effects of this kind are well known, the proposed approach is also robust from another point of view. We do not need the geometric rectification of our multisource data to compare them. We also don't need complete or very accurate correlation results. The system is able of comparing two objects from two scenes just like a human interpreter looking at the two images. In a way the knowledge of a system like VES may be viewed as an alternate data source itself.

122

Many problems were discussedonly very briefly or not at all. The ideas about the representation of objects, processes and methods in VES are improved in the VS environment. This representation problem is closely coupled with the problem of control of the interpretation process. Methods like the one described above can help in getting a better assessment of the current interpretation result. Dealing with multisource data, the representation problem becomes even more difficult: While there is one individual object, there can be several scenes (several scene objects) and many images (many approach for a vision system in a natural in a man-made environment. reflected in fuzzy and robust

image objects). Furthermore we believe environment should be rather different

Fuzziness in shape models and methods.

and

morphology

of natural

that from

objects

a good the one

has

to be

ACKNOWLEDGEMENTS The author wishes to thank Renate Bartl and Horst in producing the experimental results presented Med.Kybernetik u.Al., Univ.Wien) for making their

Bischof for fruitful discussions and their help in this paper. We thank IMKAI (Inst.f. implementation of FRL available to us. This

research was funded by the following contracts: "Rechnerunterstfitzte interpretation" (Fonds zur F6rderung der wissenschaftlichen Forschung Jubil_iumsfonds d. Osterr. Nationalbank) und "Schwerpunkt Fernerkundung" Fonds zur F6rderung der wissenschaftlichen Forschung)

objektivierte Proj. Nr. (Projekt

Luftbild4489 und $38/2 des

REFERENCES [Bob88]

Bobrow X3J13,

D., 1988

et al.,

[Bis89]

Bischof, H. and Pinz, A., "The Use of Neural Networks for Species in Digital Images", in Pinz, A. (ed.): "Knowledge-based OCG Schriftenreihe 49, Oldenbourg 1989, (in German)

[Bur88]

Burt, P.J., "Attention 1988, pp.977-987

[Gla69]

Glass,

L., "Moir6

"Common

Lisp

Mechanisms

Effect

from

System

for Vision

Random

[Han78b]Hanson Scenes", Orlando,

A.R. and Riseman E.M., "VISIONS: A in Hanson, Riseman (eds.), "Computer 1978

[Hae87]

S., Tr_inkner

[Hav83]

H.

and

(eds.),

Nature,

A.R. and 1978

Aerial Photographs: Oberpfaffenhofen,

E.M.

Dots",

Eckstein

"Computer

W.,

The Path through 1987, (in German)

the

Havens W. and Macksworth A., "Representing Computer, Aug. 1983, pp.90-96

123

Specification",

in a Dynamic

[Han78a]Hanson Orlando,

Haenel

Riseman

Object

the

World",

Vol.223, Vision

Draft

Bottleneck",

Proc. 9.ICPR,

pp.578-580,

Systems",

Knowledge

Detection

to

Recognition of Tree Pattern Recognition",

Rome,

Aug.

Academic

Computer System Vision Systems",

"Automatic

submitted

1969 Press,

for Interpreting Academic Press,

of Tree

Crowns

in 2.DFVLR-Statusseminar,

of the

Visual

World",

IEEE

in

[Mar82]

Marr,

[Mat87]

Matsuyama, T., "Knowledge-Based Aerial Systems for Image Processing", IEEE Trans. 25, No.3, May 1987, pp.305-316

[Mat88]

Matsuyama, T., "Expert Systems for Image Processing of Image Analysis Processes -", Proc. 9.ICPR, Rome,

[Mat90]

Mates J., l__nsky P. and Yakimoff N., "Models for the Perception Random Dot Patterns", Biol. Cybern. 63, pp.71-80, Springer 1990

[Keo85]

McKeown D.M., Wilson A.H. and Imagery", IEEE Trans. on Pattern No.5, Sept. 1985, pp.570-585

[Min75]

Minsky, M., "A framework for representing Psychology of Computer Vision", Mc Graw

knowledge", Hill, 1975

[Nag80]

Nagao Plenum

Analysis

[Naz84]

Nazif IEEE 1984,

[Oht85]

Ohta Y., publishing,

[Pin88]

Pinz, A., "An Image-Understanding Infrared Aerial Photographs",

[Pin89]

Pinz, A., "Final Results of the Vision Expert System Based Pattern Recognition", OCG Schriftenreihe,

[Pin90]

Pinz A. and Bischof H., "Constructing Species of Trees in Aerial Photographs",

[Rob77]

Roberts, Goldstein, MIT, 1977

[Sch89]

Schneider, W., "Using Limits", FBVA-Berichte,

[Ste78]

Stevens, Spiringer

D.,

"VISION",

M. and Press,

Freeman,

Matsuyama New York,

A.M. and Levine Trans. on Pattern pp.555-577

1982

McDermott Analysis

T., "A Structural 1980

Image Understanding Systems and Expert Geoscience and Remote Sensing, Vol. GE-

- Knowledge-Based 1988, pp.125-133

'q'he

Interpretation

FRL

of Orientation

in Winston,

of Complex

of Outdoor

Natural

Aerial

The

FRL

Manual",

for the Atlantic MIT

of Locally

124

Parallel

Structure",

Biol.

"The

Photographs",

Color

Scenes",

VES", in Pinz, A. (ed.): Oldenbourg 1989

a Neural Network Proc. 10.ICPR,

(ed),

An Expert System", PAMI-6, No.5, Sept.

Pitman

in Color-

"Knowledge-

Interpretation City, IEEE

of the 1990

Memos

and

Remote Sensing for Forest-Inventory; Methods, Sonderheft, Wien, 1989, (in German)

K.A., "Computation 1978

P.H.

Expert System for the Recognition of Trees Ph.D. Thesis, TU Wien, Okt.1988, (in German)

Primer,

in

J., "Rule-Based Interpretation of Aerial and Machine Intelligence, Vol. PAMI-7,

M.D., "Low Level Image Segmentation: Analysis and Machine Intelligence, Vol.

"Knowledge-Based London, 1985

Composition

408

Possibilities

Cybern.

409,

and

29, pp.19-28,

N91-15626 Visualizing

Characteristics the

Shuttle

of

Imaging

David The

Johns

Ocean

Data

Radar-B

G.

Physics

Laurel,

Maryland

During

Tilley

Hopkins

Applied

Collected

Experiment

University Laboratory 20723-6099

Abstract

Topographic Contour

Radar

plotted

as

the

track

the

Fourier

SAR

P-3

same

are

spectra,

modulations, Visual

aircraft computed

compared

after

subtraction

yields

topography

perspectives

surface

tracks

fields

on

observed

distinctions considered

within

scattering.

The

the

Keywords:

Doppler

SAR in

space

of

Chile and

of

of

a

ocean

and

time,

in

October

waves,

image

of

wind of

insight

with

height

by

the

SCR. for wave

Threshold

of

as

of

generated SAR

of

rough

to

the

synthetic

processing,

wave

1984.

model

images

although

modulations

is

waves

(SAR)

made

for

(ROWS)

inversion

compared,

statistical

ocean

radar

modeling

are

rotating

Challenger.

Fourier

measurements data

endeavors

of

from

shuttle

aperture

texture

dynamic

these

imaging

radar,

ocean

are along

Spectrometer

space

and

Surface

variance

computed

spectra.

direct

the

experiment

height

Wave

the

noise

to

elevation

context

the

wave

spectral

and

coast

result

governing

SCR

surface the

mechanisms

of

were

synthetic

by

(SIR-B)

wave

Ocean

under

surface

somewhat

off

between

SIR-B

ROWS

observe

Radar

flown

similar

the

differing

it was

collected

Radar

spectra

the

from

to

to

wave

by

as

elevation

Imaging

plots

Ocean

acquired

NASA

surface

Shuttle

surface

aircraft.

spectra

ocean

sea

NASA's

dimensional

a

power

the

of

during

measurements

aboard of

three of

altimeter

measurements (SCR)

data

physical

aperture

computer

are

surface

radar.

graphics

Introduction

Remotely change

global

sensed in our

information

on

forthcoming

in

for

the

empirical

spatial height are

next

temporal slope

viewed

as

scale texture experiment.

The

from

along

and

NASA

Challenger

as

of

9 October

4

days,

spectra Walsh

have et

al.,

been

variety As

ground

surface

to

computed

1985]

and

12

from Radar

the

1984.

coast As

data

acquired

Ocean

Wave

125

a by

a

in wide

result, the

Spectrometer

Chile

ocean

the

Contour

[ROWS,

short(SIR-B)

space

(55°S,

of wave

sensors in

Radar

directional Surface

relation

remote

Imaging

be tool

range

correlations

of of

of

spaceborne

Shuttle

underflights

will

processes

measurements and

monitoring much more

a scientific

over

appreciate

during

southwestern

October

physical

radar

for and

atmospheres as

operating

to

state

and

emerging

airborne

conducted the

of

example, of

potential now exist

oceans is

sensors

track

sea

aircraft

approached

an

the sets

our

models

of

perspectives

long-scale

P-3 it

the

of

computer

a

offer data

Visualization

scales.

shaded

data Large

properties

decade.

theoretical data

and and

spectral

the

investigating

to

earth science environment.

shuttle

80°W)

on

each

surface

wave

Radar

[SCR,

Jackson

et

al.,

1985]

for

comparison

Aperture

Radar

[SAR,

characterized

by

lowest

sea

system

with

state a

a

fresh

occurred

with

field

was

observed

a

height

of

wave

about

set

on

is

of

imaging

SAP,

with

12

may

for

be

have

be

of

radar

been

SAR

[Tilley,

estimate using

image

present.

are

required

including sea

to

variety

of

states

and

coastal

authorities.

dynamics of

improve that The

is

power. remote over

are

A

of

object

126

the

ocean

that

be

time

of

the

for

dynamic is

restore

with

to

tilt

and

height

processing

ocean

wave

height

direction, ship SAR

a

response

wave

spectral of

on

Doppler

applied

surface

oceanographers,

research

restored

required

propagation

well track

based

of

SAR

not

along

partially

and

to in

estimates to

inverse

blurring

model

and

is

SAR

threshold

applied

Advances

interest of

can

power

sensor

Canada an

deterministically

the

stationary

wavelength

in

has

sensor

response

be

motion

theoretical then

remote

Montreal,

theory that

in

estimated

1987]

well

and

a broadband

noise.

image

may

surface

events

rough

later.

apparent

ocean

spectra,

distributions

the

is

based

facets

used

spectral

days

is

floor that can tilted toward

SIR-B be

SAR

derive

homogeneous

near can

modulation

model

empirically

Fourier

few

scattering it

a

Missisquoi

broadband

a

backscattering

[Monaldo, the

the

reduction

the

function

the

and

backscattering

for

wave

about

noise facets

upon

not

filters

domain

Speckle

of

are

approach

restored

learned

spectral surface angle

Baie

Chile

been

with

with

spectra

enhancement

spectral

response

over

of

random

has

imaged

groups

and

distribution

improve

by

SAR

from

modulation to

the to

signal

significance

variance,

surface

power

spectral

images

the

and

the

SAR

response

empirical of

applied

wave

techniques

the data track

of

wave of

domain.

hydrodynamic

limited

an

After

are

velocity

from

across

wave

restoration

23 ° incidence

However,

is

directional

in

image

surface a

distant

what

wavenumber

to

via

with

a

coast

rough

synthesis.

separate

track

apparently

This

and

storms

wave

shuttle an

by

a broadband of transient

collected

the

distribution

functions

this

again

track

when

developed

a significant

the

modulations

reduced

apply

the

stationary

operation off

waveheight

1987]

stochastic

SAR

This

response

[Tilley,

imaging

October

underflights.

individual

function

in

at

data

1984.

at

wavenumber

to

stationary

the

Non-homogeneous understood

1987]

down

spatially

obtained

to

to

developing

statistics

domain

transfer

looks

a

filtering

related

speckle

for determining random influence

7,

data

typical

transient

October

detected

along

estimate

normal

Fourier

elevation

1986]

Fourier

in

Hence,

it

to

October

II

waveheight

However,

methods as the as

[Tilley, on

the

induced

the

surface

Such

used

m

is

a well

12

The

driven

Doppler of

along

also

140

to

SAR

On

aircraft

modeling

set

measuring

propagate

the

wind

and

day

was

m.

propagating

data

diminished

radars

about

to

by

examined

modulation

surface.

m having

responds

applications.

better

empirical subtracted

the

the

in

4.6

northeast

This

following

to

to

non-homogeneous

the

as

m

Synthetic period

growing

the

SAR.

which

of

fields

characterized

developed

estimates

from the

The

transformed

significance. been

actively

northwest.

of in

1.7

380

so

day

it

wave

Fourier

well

ocean-imaging

on be

ocean

fast

might

turned

as

from

the

of

wavelength

SAR

varied

a consensus

from

last

day

influence

direction.

interest

oceanographic

height

and

four

texture.

necessarily and

in

occurred

m

from

The

an

of

backscatter

the

of

Homogeneous SAR

a

particular

and

m

October,

properties

velocity

state

270

computed

data.

appeared

reached

look

when

m

a wavelength

3.5

the

system

northeast

of

that

direction

state

sensors

with

80

spectra

image

October

sea of

power

height

about

sea

wavelength

90 ° from

i0

look

sloped

remote

swell

new

of the

highest

radar

wave on

modulations The

three

about

1986]

al.,

significant

steeply

technique.

wave

wave

et

-30 ° from

hydrodynamic

the

Fourier

Beal

wavelength

approximately of

a

with

is

for

captains to

develop

theoretical

descriptions

parameterizedby can

then

with

be

with

surface

topography,

(e.g., the statistics.

Synergy

of

Aircraft

The Chilean

SAR,

been radar

October

sensors

is

the

are

ROWS

have

data

field,

an

surface

inverse

by

grouping

Fourier

dimensional

On

at

these

12

October

a 380

direction, weaker

m

Both

of

when

only wave

at

modulations the

An

of

image

transform

of

applied

to

is

in

depicted

coverage

On carrying Both

i0 the

remote

I0 SAR

of

ROWS of

the and

spectral

m 2.

Once

of

SAR

the

the

scenes

surface

wave

of

the

topography

ocean

of

sites

are and

by

non-homogeneous

for The

Figure

2b 3

where

wave

SARwere sensors

at

the even

detected

SAR,

the

was

track

direction.

be

computed

tilt

ocean and long

ranging

its

velocity

400

m

wide,

inverse

to

as tilt

SARalso

Fourier function

modulations

of

along well

direction.

transfer

Figure similar

measurement

each

an

ib,

estimate

surface The

m wavelength

in A

as

flight

by

be

swell.

Figure to

via

imaging

380

in

SAR,

across

along

may

travelling

the

system

the

tracks,

also

wave

can

and

applied

that

represented elevation.

direct

shown

A

direction,

dominant

driven

a theoretical of

as are

shuttle

travelling

surface

the

la

flight

heading.

flight limit

for

surprising

wind

interaction

the

observed

space

in

ROWS

Figure

eastern

southerly

the

the

in

its

signal-to-noise

not

variance

aircraft

spacecraft

October

is

depicted along

more

functions

system

both

a

by

driven sea are of the surface

SCR

to

response

weak

after

spectra nearly

detected The

it

the

height

account field.

the

recreate

across

broadening

also

wave

spectrum

power

instrument's

the

occurring

wave

SAR

over as

the

so

swell

surface

low

spectrum.

backscatter

140 m wavelength wind generated visualization depicted

on this

estimates

similar

ROWS

propagating

are

detect

the

propagating

instrument

dominant

the

states

suppression

units

of

over

site, at

of

direction,

able

to

backscattered

near cause

variance

of

observed

to

visualizations

system,

systems

flight is

Chilean

spread

empirical

action

eastern

applied

measurements

sea

height

with

in

three

assessments

spectral

and

the

Therefore,

develop

have

systems

the

wave

SAR.

the

format

SAR

clutter

best

statistics

system,

the

wave

the

ROWS,

the

appears

or

these

is SCR

surface

m wavelength

detected

with

the

variance,

its

this

all

low

Comparisons

to

height

wave

general,

for

detection,

action

with

height

wavelength

apparently

confused

of

In

with

to

at

in

consensus

investigation signal

experiment

sites.

that 140

a

by

SAR

geophysical

ocean future

models.

misrepresented

the

compared

non-Doppler of

SIR-B

designing

wave

reach

transform

three

of

significance.

Comparison

the

sea

terms

measurements

be

spectra

and

using

directional

wind-driven

for

guide

wave

indicates

computer

in to

computing

computing

to

continuing

transformed

Intercomparisons

value

to

direct

can

Fourier

methodology

yield

m 4.

purpose

fresh

inverse

during

to

of

that

data.

Observations

the

able

considered

more

collected

units

height

elevation.

made

is

wave

make

Ocean

somewhat

of

served

SAR

for

applied

of

estimates

the

be

subject

or

processed

the

to

spectra, evaluate

that

modulations

statistical

to

data

were

algorithms

restoration

an

1987]

appeared

scene

the

common

although

various

is

in

potential

remote

encountered,

ROW been

section

Fourier

wave

SCR)

[Beal,

their

cross of

Spacecraft

all

spectra

assessing

and

and

have

reported

ocean

and

SCR

site

variance

ROWS

radar

analysis

compared

ocean

radars surface

of

empirical

swell

of

the

and

the

2a as a computer visualization is surface

topography

simulate

the

same

SAR.

Chilean more

site,

closely an

80

m

the

ROWS

aligned wavelength

127

aircraft along wind

and

the

space

shuttle

eastern

flight

directions.

driven

system

propagating

at

60 °

and

a

direction. longer

200

The

swell

m

for

both

respective

instrument

modulations

of

when

the

wavelength

spectral the

ROWS

cross

assumption of two-scale inverse transformed both function

statistics

of

elevation by

about

20 in

surface sea)

with

of

surface

plot

height

variance

Computing

and

The have

512

of

arrays

about

13

transfers

of

picture

minutes

using

disk

the

to

A

in

to

Comtal

i

DMA

was

also

unit x

can

i0"

be

The WAVE 2.2

of

for

using

Version a QUEN-16

and

to

80

correlation as

is

a

shaded

proportional

modulation.

in

the

in

Fourier

512 in

coordinate

data

computer

and

to

filtering

to to

the

over

transformed

to

prior

required

a PDPat

intensity

16-bit

been

Initial with

Corporation

developed

database

typically

have

workstations

facilities.

fast

was

and

accomplished

algorithm of

code

1984 graphics

32-bits be

partition

was

image of

the

system

was

512

x

apply inverse

restore

large

Fourier

wave

x

512

equipped

interfaced 8 bit

with

to

pixel

the

scenes

height

2 Mbytes

PDP-II/70 over

image

mm

R,G,B

roll

outputs

film

found

herein

system

which

now

surface

plots

developed

and

or

from

the

format

were

stands

color

from

i

alone

and

ocean Model

25

receiving

This

images

with

on

this

its

an

scenes 4007,

monitor.

to

photographed

in

a UNIBUS

memory

a pipeline delivering up to 30 color monitor. A Matrix camera, to

35

Figures

processor

was

interfaced

SCR

running

on

texture

from 1.0,

cm

scene.

expose

package

SAR

23 the

8"

image

inputs

from

tape.

PV-WAVE

visualized

x

display

SAR

software

The

Equipment

storage

program

controlling 512 x 24 bit

to

magnetic

(i.e.,

visualized

was

could

spectral

One/20

and

scaled

memory

hour

ocean

film. and

9-track

also

texture

(i.e.,

spectrum

computer

Digital

mass

Fortran

This

used

sheet

processing

word

i

SAR

second.

acquired

set

and

waves

collected and

(pixels)

optimized

transfers

LSI-II microprocessor a second to a 512

images

complex

Vision

allowing about

a

are

algorithms from

elements

About

significance

data

time.

action

surface spatially

wavelengths

departmental

filtering

Ocean

an

were

processing

purchased

64K

the

transformation.

This

The

SCR

height

bunching

different

Fourier

peripherals.

algorithms

to

differing

generated

wave

velocity

herein image

several

SAR decade.

between

magnetic

1981

using

system

the

and

presented in

the

minicomputer

beginning x

data

displayed

of

tilt

significance.

hours.

texture

the

spectrum was ocean imaging

compared

integration

and that

imaging

stationary

Technology

evolved

development 11/70

Displa7

and

assuming

wind

scene

elevation

4,

without

radar

processed that

surface

2

resonant

longer

the

height

surface

radar

by

to

about

and

the

their

velocity

ocean

image the

of

segments

compare

of

SAR)

km 2 ocean

and

SAR

of

for

tilt

SAR

are

flight

that

ergodic

Hence, the application wave

their

correction

describe the

images

by to

comparable

Figure

6

temporally

L-band

to

restore SAR

over

modulation

the

the in

3

opportunity

periods

properties to

for

to

after

violate the

from

dominated

surface

suffice

models. after

used

Figure

hydrodynamic

sensors

seas

130 ° sea

The

not

filtered

and

unique

waves

may

at

driven

remote

transient

Fourier in

wind

functions.

scattering before and

kilometers

a

acting m

two

the

SAR

section

traditionally

the

statistics

presents

and

and

propagating

of

response

radar

non-homogeneous

transfer

swell

amplitude

is

also

of

to

elevation

installed array

the

DECstation

information

a number

wavefront

a

in

on

processor.

figures

supported 3100 shade and a

by

128

computed

surface

rotation

desktop

offers plot

angle 3500

using

Visuals,

workstation a

VAXstation This

were

Precision

of

the ocean

interfaced

via is

being

PV-

Version

algorithms wave

perspectives.

processor

the

Inc.

height PV-WAVE,

a

Q-bus developed

to

jointly

by

Applied

Physics

that

a

512

about

20

and

Interstate x

SAR

new

on

x

32

bit

up

future

to

system.

x

8192

development

of

microwave

x

in

hydrodynamic

radars

64

bits

of

imaging

like

the

pixel Such

models

SCR,

SAR,

that

and

are

now

being

in

programmed

will

from

be

data,

SAR

could

a capability

will

shown

computes

computed

experimentation

scenes,

complex

have

transform

allowing

ocean

University

QUEN-16

perspectives

hours,

workstation.

Hopkins

the

Fourier

height

than Larger

this

Johns

with

algorithms

Wave

rather

algorithms.

8192

The

fast

filtering

in minutes,

improvements

and

experimentation

dimensional

spectral

filtering

producing

two

QUEN/VAX

data

Corporation

Initial

Basic

the

image

Fourier

with of

512

seconds.

tested

from

Electronic

Laboratory.

be

addressed

will

improve

with

processors

our

accelerate

understanding

ROWS.

Summary

Oceanographic of

NASA's

but

over

SAR

the

image

The SAR

to

October

needed

to

that

to

implement

a

quickly

remote

larger

sensors.

facilitate evaluation.

the

hurricane

of

of

data,

reduced

distribution

speed

and

networks, 1989]

are

Laboratory.

global

change at

assimilated

being Computer

in

at

visualization

129

advanced

is

simulations

emerging

time

will

in

their

and

computing University a

include

require

High

as

data

prediction

will

Hopkins

towards

science and

to

applied

assist

earth

in

or being

spaceborne

flight

models.

Johns

now

with

and

scales

and The

desktop

altimeter

environment

physical

workstations, developed

our

different into

a

directed

spills

the

workstation

imaging

for

is

prototyping

are

and

oil

contain

experiments

hypotheses

of

i0 SCR

developing

in

research

technology

may

rapid

3500

scanning

surveillance

collected

desktop now

with

theoretical

sensor

erosion,

a

filtering

visualization

on SAR,

modulation

is

as

ocean

of

theories.

VAXstation

of

SCR.

backscatter

imagery

algorithms

accelerate

graphics

of

a

1990]

comparisons

Monitoring

sensed

reviewed,

will

SAR

to

the

observed

visualization

Fourier

[Tilley,

remote

coastal

tracks.

remotely

Physics

fields

communication

Applications documentation

by

the

(generally

backscatter

serve

processing

of

study

Laboratory

to

purpose

hosted

model High

Physics

SAR

sea

modulation

1989]

interface.

simultaneous

tilt

Applied

unit user

ocean

and

general

of

obtained

surfaces

scale

driven

of

SIR-B

segment

information

synergistic

texture

those

that

short

wind the

theory

[Dolecek,

to

QUEN-16 driven

the

speckled

University

with

texture

Fourier

the

response

to

dimensional

part

texture.

variance

correlate

hydrodynamic

to

applied

as

Chile,

bunching

applied

a

with

the a

processor

a hydrodynamic

combined

[Jenkins,

that

of

similar

three

of

instrument

and

their

For

coast

environment.

menu

over

developing SAR

be

with

SAR

was

were

model

elevation

height

visually

height.

that

Hopkins

computing

transferred

to

velocity

array

can

personal

wave

and

Johns

shaded

southwest

supplement

wavefront

tool

the

also

The

as

resolution,

images

linear

spectra

plotted

spacecraft

comparable SAR

a

threshold

wave

and

at

spacecraft

surface

of

data

using

Fourier

were

noise)

indicates

information

The

were

long

off

data

data

speckle

The

power of

aircraft

function.

yield

estimate

plots

useful

QUEN

SAR

with 1984

ROWS

to

spectral

representation

their

as

size.

transfer

A

from surface

information

modulation

filtered

modulations

and

to

surface

yielded

different

parameterized

compare

referred

of

instrument. data

operated

have

topographic

and

were ROWS

visually The

obtain

response

functions by

regions

to

sensors

experiment

ocean

filtered system

remote

SIR-B

of that

space, speed

be data

technology Applied

combination

of

three-dimensional as

a

and

tool

empirical

microwave from be

graphic

scientific

data.

as

is

from

oceanographic

the

two-dimenslonal

to

surfaces

sensors.

physical

image

relationships

planned

rough

remote to

with

investigating It

scattering

insight

concepts

for

apply

that

The

these

can

be

result

processes

processing

between

of

resources

to

investigated this

governing

wind

models models

with

exemplary

the

methods

theoretical

of

radar

data

endeavor

generation

will

of

ocean

waves.

Acknowledgement Paul research Heeres work

Hazan

and have

has

Kenneth

for

advice

funded

Physics

Potocki

resources

contributed

been

Applied

and

identify

by

and

the

have

its

helped

define

development.

assistance

in

Independent

the

Quentin applying

Research

new

and

goals

Dolecek

of

and

this

Kenneth

technologies.

Development

This

Fund

of

the

Laboratory.

References

Beal,

R.

C.,

Jackson,

D.

Lyzenga

and

Spectra

with

F. W.

R.

Space",

L.

232,

"Spectrasat: Hopkins

Q.

E.,

Dig,, F.

Height,

Wind

Chile",

1985

No.

C.,

W.

Jenkins,

R. Dig.,

D.

No.

M.,

"A

Monaldo,

F.

SIR-B",

Johns

Tilley,

D.

Doppler

Imaging

G.,

D.

Techniques",

"Use

at

184-185,

APL

of

G.,

and

and

Moves

i,

Gonzalez,

Walsh,

Directional

F.

C.

D.

R.

Ocean

Wave

Ocean

Wave

Model

to Monitor

Ocean

Waves

from

Hopkins

APL

107-115,

1987.

Array

Processor",

Johns

C.

Peng,

Estimates

Y.

Remote

Into

"ROWS

for

SIR-B

Sensing

the

for

Ocean APL

for

Dig.,

Optical

Hopkins

J.

I.

Spectral

Global

Spectra

Methodology

Speckle

"SIR-B

Johns

SIR-B

E.

Symposium

Nineties",

of

Wave

Underflights

of

Digest,

Johns

Hopkins

2,

APL

1989.

Tech.

Radars",

No

Wave

APL

Practical

Hopkins

Hines,

Geoscience

I0,

3,

8,

Wavefront

Directional

"Computing

of and

F.

1989.

and

E.,

Irvine, Swift,

ROWS/SARApproach

APL

International

1985.

Tilley,

The

198-207,

E. N.

1986.

Dig.,

Glazier,

Speed,

804-811,

Spectra

A Hybrid

3,

R.

Comparison

Radar

Tech.

D.

Hines,

1531-1535,

APL

"QUEN:

i0,

Tilley,

E. "A

Scanning

C.,

G.

D.

Zambresky,

Johns

Jackson,

Tech.

F.

D.

III,

Science,

Dolecek, Tech.

Monaldo,

Aircraft

Prediction",

Beal,

M.

Hancock

8,

No.

Estimating i,

Determining

the

Engineering,

Wave

Enhancement Dig..

Tilley, D. G., "Anomalous Synthetic 1990 International Geoscience and

8,

No.

6,

with No.

Aperture Radar Remote Sensing

Spectra

from

the

1987.

Response

25,

Tech.

Wave

82-86,

Characteristics 772-779,

Fast-Fourier

i,

87-93,

of

1986.

Transform

1987.

Modulations for Symposium Digest,

Ocean Scenes", 2, 1083-1086,

1990.

Walsh,

E.

J.,

D.

W.

Hancock,

Measurements

in

Support

International

Geoscience

D. of and

E. SIR-B

Remote

Hines,

R.

Using Sensing

130

N. the

Swift,

and

Surface

Symposium

J.

F.

Scott,

Contour Digest,

2,

"Spectral

Radar",

1985

798-803,

1985.

1:::I r/l I1/

0

4.11-)

00J U

•IJ 0

U

0

:3 c_

0

r.-.I

,--40,1..I 0

0

,._

0

U

,,r..I

•1-1 ._

_

m

,--t

•I..10

_0 O0

r--t

k,I

.r-I

131

OR,G!,_'_,R.L _, ...,. OF POOff

QUALITY

_ }-I

(a)

1875

m

square

(b) 1200 square

Figure

2

SAR

three

dimensional

meter

(a)

swell

and

the

swaths

meters

wide

ment,

although

SCR

graphs

across

so

and

a

140 of

that at

(b)

meter the

sea,

remote

3 data a

surface

depicting

different

sets

elevation

wave

data

height

respectively, sensors.

The

approximate place

and

132

plotted for

propagating SCR the

time.

are

variance swath

contiguous

was

a

as 380

along

and

only

400

SAR

seg-

m

Figure depicted

3

A as

and

velocity

the

Fourier

compared tograms

200x200 a

bunching

segment

to

the

SCR

of

pixel

of

simulate topography intensity

the

Fourier

distribution

modulations

filter

with (d)

pixel

two-dimensional

of the

the

wave (c)

counts

of SAR height

along are

5

filtered the

wave

cross

SAR

section

separate

can

the

be

(b),

aircraft for

(a)

is

intensity.

distribution

computed

image

action

which tracks.

three

Tilt

included

data

in

can

be

Hissets.

133 ORIGINAL BLACK

AND

WHITE

PAGE PHOTOGRAPH

Figure 4 Long scale and short evident in a three-dimensional intensity that has been shaded from the unfiltered SAR image.

scale ocean wave correlations are surface plot of the SAR wave action with speckled data values derived

ORIGINAL 134

BLACK

AND

WHITE

PAGE PHOTOGIRAPH

N91-15627 A

PROPOSAL

TO THE

EXTEND OF GLOBAL

OUR

UNDERSTRNDING

ECONOMY

by Robbin Oakland Rochester,

R. Hough University Michigan

Manfred University Orono,

48063

Ehlers of Maine ME 04469

ABSTRACT To date, identified setting specific identified

global models have been just that. They have problems common to the peoples of the globe, without forth a basis for the specific actions carried out by peoples that could step toward a resolution of the problems. There is another way.

Satellites acquire information on a global and repetitive basis. They are thus ideal tools for use when global scale and analysis over time is required. Data from satellites comes in digital format which means that it is ideally suited for incorporation digital databases and that it can be evaluated using automated techniques.

in

The paper proposes the development of a global multi-source data set which integrates digital information regarding some 15000 major industrial sites worldwide with remotely sensed images of the sites. The resulting data set would provide the basis for a wide

variety

of

studies

The preliminary class of global helpful to local

results policy policy

central thesis be identified of

satellite

The

Problem

of and

this their

of

the

global

economy.

obtained to date give promise model which is far more detailed makers than its predecessors. proposal that utilization

major can be

industrial tracked

with

of

a new and It is the

sites the

can aid

images.

The focus of this paper is the role of policy, resources and the environment in economic and social development. The world view which it attempts to sketch for the reader will be alien to many who perceive economics as a policy science increasingly concerned with abstract "macro" entities. To be of use to humankind, policies for humans must be on a human scale. Moreover, it is of central importance that peoples be moved to perceive the objectives of the policies as important to them. To date, global models have been just that. They have identified problems common to the peoples of the globe, without setting forth a basis for the specific actions carried out by specific peoples that could step toward a resolution of the identified problems. There is

135

another way. Humans must have stories, images, habits of thumb to llve in complex environments and they Just have them as they relate to our world as a whole. Simply an atlas or even a world globe is an exercise in looking information than an individual can handle at once.

and rules don't browsing at more The

importance of manufacturing activity in the total market economy makes it imperative that manufacturing images and understandings be developed along with those which have to do with agricultural lands, forest lands, air quality, water quality, etc. The purpose of this proposal is to seek support for a project which will contribute to manufacturing images and understandings. Satellites acquire information on a global and repetitive basis. They are thus ideal tools for use when global scale and analysis over time is required. Data from satellites comes in digital format which means that it is ideally suited for incorporation in digital databases and that it can be evaluated using automated techniques. Background In order to approach the problems discussed above, a number of preliminary efforts have been undertaken. The first effort was a study of the relationship between economic activity in the 494 Rand-McNally basic trading areas and the sites of major manufacturing facilities. The 494 basic trading areas have 1361 sites of major economic activity. The 378 trading areas with manufacturing sites have a total employment in manufacturing of 20529027. The 116 trading areas without manufacturing sites have a total manufacturing employment of 1385727. The observation that 80 percent of the trading areas have 95 percent of the manufacturing employment indicates that an appropriate definition of "major" was utilized in the identification of the manufacturing sites. A simple linear regression of sites against manufacturing employment suggests that there are 12804.5 employees per site. The results are statistically significant at the one percent level. The scale of the coefficient (1/4 of the total reglonal mfg. employment) again points to an appropriate definition of "major". Moreover, it was possible by further work to account for nearly 85% of manufacturing employment on an industry by industry basis. The second effort undertaken was to construct a global database of regions which include as their centers, 738 cities chosen for their scale, their size in relation to their neighbors or their role as largest city in a nation bounded by geographic barriers or other regions. Based on the size given for the cities used as regional centers: Average Number

size of of cities

the largest per country

136

city

959173 3.67052

A profile of manufacturing employment for the regions centering on each city was computed based on the U.S. estimates. That is, each of approximately 13000 plant sites was assumed to provide the same level of employment provided by its counterpart in the United States. Assuming that a local labor-force is trained to work in manufacturing (L) and that fuel (F) and machines (K) are imported a production function would appear as: Output When estimates result is obtained for available: National R'2 F

= =

=

are 50

income

f(L,F,K)

aggregated countries

=

.45

by country the following on which World Bank data is

*

L'.54

*

F'.24

*

K'.I8

.912 158.915

In essence, over 90 percent of the variation in national appears to be accounted for by the site based employment estimates, fuel imports and machine imports. The close plausible magnitude of the coefficients are taken as preliminary evidence that the U.S. based estimates will reasonable employment A

third

substitutes for in the regions. effort

sought

to

the

actual

understand

levels

the

of

scale

of

the basis of the economic activity basis, commercial farms, trading facilities discussed above can

determine

of

scale

the

cities

supported

by

fit

and

serve

as

manufacturing

around the globe on sustain. On a global and the manufacturing the

income

a

cities which they facilities be shown to

given

region.

A fourth effort has been directed at the development of a multi-level hierarchical flat-file manipulation language which allows relational operations on digital geographic information systems as well as the storage of links to image collections. Language M is the result of that effort. It has been in commercial use for nearly two years. In a fifth project, DIRIGO - an image processing remote sensing data has been developed. The DIRIGO strictly to the Macintosh interface guidelines and users to become quickly familiar with the system. supports four file formats (i) image data, (2) ASCII of control classified intelligent

point coordinates or training images and (4) training areas interface insures that only

appropriate tasks. At

format present

can be DIRIGO

opened supports

for

area for files

system, for system adheres hence enables The system text files

statistics, classification. with the

(3)

the selected application (i) point operations such

An

as

linear and Gaussian contrast stretch and histogram equalization; (2) spatial filtering; (3) geometric correction such as image-tomap rectification and image-to-image registration using firstdegree polynomials and standard re-sampling techniques; and (4)

137

classification techniques such as parallelpiped, minimum distance, and maximum likelihood. Execution times for a 512 by 512 image with three bands range from seconds to approximately five minutes for a maximum likelihood classification with six classes. The results reported above give promise of a global policy model which is far more detailed local policy makers than its predecessors. It thesis of this proposal that major industrial identified and their utilization can be tracked satellite gives both are the

images.

rise simple already scale

The

to employment to grasp highly of the

concept and

that

a

major

new class of and helpful to is the central sites can be with the aid of

industrial

and income which cannot open to verification.

sensitive to plants studied

plant openings here. Indeed

facility

be ignored is Policy makers and closings these "local"

on

matters are matters of focus for legislatures, the press, trade unions, chambers of commerce and others. By global interdependence is meant that those actors must become aware of the kinds of changes occurring elsewhere on the globe that will influence the plants with which they are concerned. The paper industry in Green Bay should be vitally interested in changes in Finland. Objective

Objective of the Project

The objective of the project being proposed is to produce a remotely sensed image for each of the 13,000 major industrial sites worldwide. A latitude and longitude as well as an industrial classification can be identified for each site. Given this information and a scene which centers on the site, it should be

should be possible to identify the major site and a variety of the key facilities which are physically associated with it by interpretation. It would be expected that the working image so identified would be but a small fraction of the total image from which it would be extracted. That is, the industrial plant image would usually occupy less than one tenth of one percent of the total image. Thus each plant image would be expected to require only about 36k of storage. The complete global collection of images covering all sites of economic interest would be reduced to an easily manageable 1/2 gigabyte. A full working system for global analysis and modelling including detailed digital demographics for each of the regions and a variety of digital maps could be handled in less than 600 megabytes. Products

of
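A quick check of the storage figure above, using the stated 36k per plant image and 13,000 sites:

    sites = 13_000
    bytes_per_plant_image = 36 * 1024          # about 36 KB per extracted plant image
    total = sites * bytes_per_plant_image
    print(total / 1e9, "GB")                   # roughly 0.48 GB, i.e. the quoted 1/2 gigabyte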

Products of the Project

With the images identified by site, a variety of analyses are enabled. In principle, automated interpretation could be supplemented by analysis of the images. A few of the many possibilities are:

• A study of the requirements of the industry specific to the sites, based on analysis of the images of the site. Buildings, parking lots, water sources, retention basins, storage space, special facilities, etc. would be open to examination.

• A comparison of image features to known characteristics of the plants from other sources. Such analysis would, for example, make it possible to distinguish integrated manufacturing facilities from assembly plants, consider power requirements and siting, the relationship to common public facilities, residential spaces, transport networks and the like.

• A third form of analysis would proceed with the tracking of these sites over time and the relating of the tracked images to the known economic and operating characteristics of the plants.

• The most straightforward product of the proposed effort is a digitized industrial atlas of the world.

The resulting product would have an exceedingly wide range of potential uses. It is possible to compare potential industry sites based on the characteristic parameters of other existing industrial sites. If criteria can be established combining image information with the non-image information of the global database, it would be possible to rate potential sites based on comparative evaluations. Earlier work in this area by the principals includes the development of associative retrieval systems. In this project it will be possible to include other (future and current) satellite images with other spectral information (e.g., thermal, mid-infrared, microwave, etc.) and tie this together with SPOT's high spatial resolution. This would allow multispectral, multitemporal, and multispatial site assessments.
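The proposal does not specify a rating formula; purely as an illustration of combining image-derived measurements with database attributes, a weighted score might look like the following, where every feature name and weight is hypothetical:

    def rate_site(features, weights):
        """Hypothetical linear scoring of a candidate site.

        features : dict of measurements, e.g. image-derived (paved_area_m2,
                   water_distance_m) and database attributes (regional_income).
        weights  : dict mapping the same keys to importance weights.
        """
        return sum(weights[k] * features[k] for k in weights)

    candidate = {"paved_area_m2": 1.2e5, "water_distance_m": 800.0, "regional_income": 2.4e9}
    weights   = {"paved_area_m2": 1e-5,  "water_distance_m": -1e-3, "regional_income": 1e-9}
    score = rate_site(candidate, weights)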


SUMMARIES


Summary of Types of Data Fusion Methods Utilized in Workshop Papers

Sylvia S. Shen
Lockheed Palo Alto Research Lab.
0/96-30, B/251
3251 Hanover St., Palo Alto, CA 94304
Phone: (415) 354-5019
E-Mail: [email protected]

ABSTRACT

This paper gives a brief overview of the three types of multisource data integration methods, namely pixel, feature, and decision-level data fusion techniques. The paper also gives a summary of all the papers presented at the IAPR Workshop on Multisource Data Integration in Remote Sensing, categorizing the type of data fusion methods used and detailing the utilization of multisource data sets in earth science studies.

1. OVERVIEW OF MULTISOURCE DATA FUSION TECHNIQUES

With the recent advances in sensor technology, the number of different types of sensors has increased tremendously. Increasingly sophisticated imaging and geophysical sensors, carried as payloads on spaceborne and airborne platforms, cover different portions of the electromagnetic spectrum at different spatial resolutions, furnishing an enormous amount of useful information for terrestrial, oceanic, meteorological, geophysical, and planetary studies. In addition to these data, a broad range of other data types, such as aerial surveillance and reconnaissance photographs, topographical data, and ground reference data, is also available for researchers. These data sources are heterogeneous in their spectral, spatial, temporal, and radiometric properties, in their characteristics and sampling, and in their analog or numerical format. To fully exploit this enormous amount of heterogeneous multisource data, conversion of the data to a common digital format and sophisticated, advanced analytical data fusion techniques must be developed.

Data fusion techniques can be categorized into three types according to the stage at which the fusion takes place, namely pixel, feature, and decision-level data fusion.

Pixel-level fusion is typically performed with images of a scene obtained from different sensors or taken at different times. It is accomplished by spatially registering the images, correcting for relative translational and rotational shifts, intensity magnitude differences, and geometrical distortions of each image. 'New' pixels common to all the data sources involved are formed, and measurements for each of the 'new' pixels are then accumulated from the original pixels of each data source, for example by pixel-by-pixel comparison and resampling with respect to a pre-selected reference image. Pixel-level fusion allows different sensory or ground reference information to be accumulated for each 'new' pixel.
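A minimal sketch of the stacking step described above, assuming the registration has already been reduced to a known translational offset (real co-registration would also have to handle rotation and geometric distortion); all arrays here are synthetic:

    import numpy as np
    from scipy.ndimage import shift

    rng = np.random.default_rng(1)
    image_a = rng.random((256, 256))
    image_b = rng.random((256, 256))

    # Assume image_b is offset from image_a by a known translation (dy, dx).
    known_offset = (3.5, -2.0)
    image_b_registered = shift(image_b, shift=known_offset, order=1, mode="nearest")

    # Stack the co-registered bands: each 'new' pixel now holds one value per source.
    fused = np.stack([image_a, image_b_registered], axis=-1)   # shape (256, 256, 2)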

Feature-level fusion techniques generally start with applying image analysis techniques to extract some features from each data source independently. Certain measurements of the extracted features are derived from each data source and these feature measurements are then integrated. For instance, when both visible and thermal infrared images are available, regions of nearly homogeneous intensity can be extracted using image segmentation techniques. Average intensity value and average surface temperature of each region can be calculated from the visible and thermal infrared images, respectively. The intensity and temperature information are fused at the region level in this example.
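A minimal sketch of the region-level example above: given a segmentation label image, the mean visible intensity and mean thermal-IR value of each region are collected as fused features (all arrays are synthetic):

    import numpy as np

    rng = np.random.default_rng(2)
    labels  = rng.integers(0, 5, size=(128, 128))     # segmentation into 5 regions (hypothetical)
    visible = rng.random((128, 128))                  # visible-band intensity
    thermal = 280 + 20 * rng.random((128, 128))       # thermal-IR brightness temperature (K)

    region_features = {}
    for region_id in np.unique(labels):
        mask = labels == region_id
        region_features[region_id] = (visible[mask].mean(), thermal[mask].mean())
    # Each region now carries a fused (mean intensity, mean temperature) feature pair.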

Decision-level or interpretation-level fusion deals with integrating the information of interpretation obtained from the different data sources to arrive at a consensus interpretation. For example, in the case of image classification, the interpretation or classification results can be represented in terms of probability assignments. Bayes rule and Dempster-Shafer's rule of combination are often used as a means to reinforce common interpretations and to resolve differences in order to arrive at a more complete and accurate interpretation. Examples of decision-level fusion are given in several of the workshop papers.

and accurate

The paper Robinson

entitled

and Tilton

"Refinement discusses

data through

the use of satellite

TM images,

finds the edges

data

to mask

ence

data

Workshop

out those

edge

the ground

OF WORKSHOP

of Ground

Reference

to refining

image

More

from

data.

This system ing process,

Integration

in

Data

and data fusion

methods

used in

edge

points

"Near Ground

describes

detail

the approach

and uses the edges

that are beyond

is categorized

level

Data"

segments distance

by

reference the Landsat

from the ground

a certain

as a feature

Image

to the ground

reference

from the refer-

integration

method

where

data and Landsat TM image data are fused. and detailed edge information be added to

Level

a workstation

Sensing

between level image

changes

"Integration

an approach

used to process

biomass.

detected

of SAR to fusing

near ground

No multisource

and DEM information

software

level image

data fusion

from Landsat

data will be studied

processing

of Vegetation" system

by Rasure, clusterlabeling

has been

image

analysis

consists of a low level segmentation process, a supervised and unsupervised an object feature extraction process, a classification process, and a region non-living

based

for Spatial

Chorus.

near ground paper

with Segmented

called

The system from

Data

and adding

data.

er, correlation

describes

ex-

PAPERS

specifically,

the segmentation,

This approach

entitled

and Gage

The

on Multisource

The data types

an approach

segmented

points.

reference

The paper

biomass

are typical

are categorized into either pixel, feature, or decisiondata sets in earth science applications by each paper is

the edge information extracted from ground reference This fusion allows reference edge error be corrected

process.

They

described.

2. SUMMARY

Sauer

interpretation.

techniques.

data

is discussed

TM image

data

to distinguish

in this paper. and those

living Howev-

observed

from

in the future. Data from

146

-- Geometrical SAR

images

Considerations" and

Digital

by Kropatsch Elevation

Model

(DEM) data, based on establishing some geometrical relations between the data sets. Layover and shadow regions are detected in SAR images. Such regions are also found independently using DEM data. Information from both data sets is then accumulated through geometrical matching of these layover and shadow regions. This fusion technique is considered to be at the feature level, where the features are the layover and shadow regions.

The paper entitled "Towards Operational Multisensor Data Registration" by Rignot, Kwok and Curlander addresses the problem of automated precision registration of multisource image data acquired from a number of different sensors. Two types of registration techniques are presented. The first technique extracts common features, such as edges, from the different images and then matches the extracted features for transformation parameter estimation. The second technique replaces the manual selection of tie points with a more automated matching procedure. Image data from sensors such as Landsat TM, SPOT and Seasat are registered and resampled at a pre-selected common spatial resolution. This registration allows information from the different sensors to be accumulated for each pixel, a typical example of data fusion at the pixel level.

The paper by Banninger combines reflectance, fluorescence and other remote sensing data for forest damage assessment, addressing the problem of using remote sensing to monitor stress or damage in Norway Spruce forested areas. Ground data sets, such as foliar chlorophyll-a and nitrogen content, leaf area indices, and fluorescence measurements from samples collected over the test sites, together with laboratory reflectance data, are integrated with remotely sensed data from sensors such as AIS-2, TMS, TIMS, NS001, TM and FLS/PMI. The data sets are resampled at 50-m intervals to facilitate comparison, so this integration of detailed ground data with the image data is performed at the pixel level.

The paper entitled "A Phenomenological Approach to Multisource Data Integration: Analyzing Infrared and Visible Data" by Nandhakumar describes a region classification approach based on integrated analysis of thermal IR and visible image data of remotely sensed scenes. In his approach, the visible image is first segmented into regions of nearly homogeneous intensity. The thermal IR image and a phenomenological model describing the exchange of energy between the imaged surface and the environment are used to estimate the surface heat flux for each region. Visible intensity, surface temperature and surface heat flux values are then used as discriminants to classify and differentiate between class categories in a rule-based classification scheme. The data fusion is considered to be at the feature level, where the features are the segmented regions.

The paper entitled "A Method for Classification of Multisource Data Using Interval-Valued Probabilities and Its Application to HIRIS Data" by Kim and Swain discusses a method of classifying multisource data in remote sensing, such as Landsat MSS data and Digital Elevation Model (DEM) data. The authors define what is called the interval-valued probability, which is a generalization of the point-valued probability, designed to represent the possibility of small deviations from the unknown true probability. Their approach treats each data source as a body of evidence, represents the evidence provided by each data source in terms of interval-valued probabilities, and uses the Dempster-Shafer's rule of combination to integrate the interval-valued probabilities from different data sources. This integration is therefore categorized to be at the decision or interpretation level. This method can combine parametric as well as nonparametric information.

The paper "Improved Disparity Map Analysis Through the Fusion of Monocular Image Segmentation" by Perlant and McKeown first addresses the issue of using the segmentation of the monocular imagery to improve the estimates of three-dimensional scene structure, namely the scene disparity map. Their approach generates the initial disparity map from a stereo pair of images using area-based and/or feature-based matching techniques. Then the surface illumination information provided by a monocular image is used to segment the image into regions of nearly homogeneous intensity. Based on the assumption that such regions correspond closely to physical surfaces in the scene, the region information is used to remove mismatches. Here, the disparity information extracted from the stereo pair is fused with segmentation results obtained from the monocular image at the feature level, where the features are the segmented regions. The authors later address the problem of using a number of different region segmentation methods and stereo matching techniques to improve building extraction accuracy. Each pair of region segmentation and stereo matching techniques produces a version of building extraction interpretation. The various versions of interpretation are then used in a voting process to arrive at a consensus interpretation. The information integration used here to achieve a more complete and accurate building extraction is considered to be at the decision or interpretation level.

The paper entitled "Use of Information Fusion to Improve the Detection of Man-Made Structure in Aerial Imagery" by Shufelt and McKeown addresses the problem of building information fusion in aerial imagery. Since it is assumed that no single detection method will accurately delineate buildings in every scene, a cooperative method is proposed. Various building extraction techniques, such as the ones based on edge analysis, shadow analysis, and stereo analysis, are used to produce building delineations. The different delineation results are superimposed to generate a more accurate building interpretation. Quantitative measures such as building percentage, background percentage and branch factor are utilized as a means to evaluate building extraction results. In this study, only one type of sensory data, namely the aerial image, is used. Therefore, fusion is not performed using different sensory data, but rather using the different delineation results on the same image data. The information fused is the edge information. It is therefore categorized as a feature-level integration method.

The paper "A Computer Vision System for the Recognition of Trees in Aerial Photographs" by Pinz describes the concept of a computer vision system capable of finding trees in infrared aerial photographs. A generic description of a type of object is integrated with the image object, where an object is, in this case, a tree. The tree recognition results from aerial photographs taken at different resolution scales and under various conditions are also integrated at the feature, namely tree, level.

The paper "Visualizing Characteristics of Ocean Data Collected During the Shuttle Imaging Radar-B Experiment" by Tilley discusses the computation of ocean wave spectra (power spectra of the Fourier transform) from SIR-B Surface Contour Radar (SCR) data and the comparison of these clusters of energy with those computed from the Radar Ocean Wave Spectrometer (ROWS) data and the SIR-B Synthetic Aperture Radar (SAR) data, in order to parametrize models of ocean imaging. No data fusion of these three types of sensory data is used in this study. Fusion will, however, be considered in the future.

The paper "A Proposal to Extend Our Understanding of the Global Economy" by Hough and Ehlers proposes the development of a global multisource data set which integrates digital information, including remotely sensed images, regarding 15000 major industrial sites worldwide, to track their utilization as a basis for a variety of global economic studies. No specific data types or fusion techniques are mentioned in this conceptual paper.
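As a small illustration of the wave-spectrum computation described for the Tilley paper above (a two-dimensional Fourier power spectrum of a surface field; the data here are synthetic, not SCR measurements):

    import numpy as np

    rng = np.random.default_rng(4)
    surface = rng.standard_normal((256, 256))          # synthetic surface-height field

    spectrum = np.fft.fftshift(np.abs(np.fft.fft2(surface)) ** 2)
    # 'spectrum' is a directional wave-number power spectrum; clusters of energy in
    # such spectra are what the SCR, ROWS, and SAR estimates are compared on.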

Summary of Types of Data Output Produced by Studies Described in Workshop Papers

Axel J. Pinz
Institute of Surveying and Remote Sensing
University of Agriculture
Peter Jordan Str. 82
A-1190 Wien, Austria
Phone: (43-222) 342500-546

There were a total of 12 presentations at the IAPR TC7 Workshop. Most of them dealt with the description of systems. While the two previous summaries discussed the input and processing aspects of these systems, this summary discusses the output aspect.

Many contributions were concerned with an improvement or a kind of refinement of results by use of multisource data. The improvements are either in overall accuracy or in robustness. The output may be categorized as either a better segmentation, an improved symbolic description of the scene, or a more robust classification. These "improvement-papers" are:

• Jon Robinson: The output is a refined data set of edges (which can be translated into a refined ground cover label map).

• Tom Sauer: The analysis process deals with classification, region analysis and spatial information. In its current state of development the system is separating vegetation from background. The final output is the percentage of ground coverage by specific vegetation classes.

• N. Nandhakumar: In this combination of thermal and visual information, the thermal capacity is calculated and used to improve a segmentation, and to label various classes like pavement, vegetation, car, etc.

• H. Kim and P. H. Swain (presented by J. C. Tilton): The integration of multispectral and DEM data leads to a more robust classification.

• Frederick Perlant and Jeffrey Shufelt: These two papers from David McKeown's group at Carnegie Mellon University discuss information fusion. There are several different processes run on the same image or on different images (stereo pair). The output of these processes is then fused, yielding an improved result (see the voting sketch after this list). In Perlant's work the output is a segmentation. Shufelt ends up with a set of boundaries delineating the objects of interest (e.g., houses).

• Axel Pinz: Having several images of the same scene, the first step of processing leads to several symbolic descriptions of this scene. The integration (fusion) of these results yields an improved and more robust symbolic description of the scene objects (in this case, trees).
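A minimal illustration of the voting idea behind the two McKeown-group papers above, fusing several hypothetical building masks of the same scene by pixel-wise majority:

    import numpy as np

    rng = np.random.default_rng(5)
    # Three hypothetical binary building masks from different detection processes.
    masks = [rng.random((100, 100)) > 0.5 for _ in range(3)]

    votes = np.sum(masks, axis=0)
    consensus = votes >= 2        # keep pixels supported by a majority of the processes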


The following contributions cannot be categorized so easily (as "improvement-papers"). It seems that the desired output of these systems wouldn't be possible without multisource data integration:

• Walter Kropatsch: The integration of SAR, DEM and knowledge results in a geocoded SAR image. Furthermore, Dr. Kropatsch can map additional information (e.g., slope) into the original SAR image, providing means for a better interpretation of the original image.

• Eric Rignot: This is going to be a very huge system. The output will be a pack of registered and resampled data at uniform scale.

• C. Banninger: The integration of a few Landsat pixels with a very detailed ground data set is used for an optimal positioning of the pixel grid. The outputs of this process are correlation results and models.

• David Tilley: Since the original radar images of the ocean appear very uniform to a human interpreter, the first task to solve is the proper visualization of the data. Furthermore, Dr. Tilley hopes to gain more insight into imaging mechanisms of SAR, and to uncover information in the speckles.

• Robbin Hough: A database system of this kind would be able to produce manifold outputs depending on the query. It could be used as a tool for economic forecasting, planning and analysis.


APPENDIX

Workshop Participants

PARTICIPANTS

Richard Arens
GTE Government Systems
1700 Research Blvd
Rockville, MD 20850
Tel: (301) 294-8430

Cliff Banninger
Institute for Image Proc. & Computer Graphics
Joanneum Research
Wastiangasse 6, A-8010 Graz, AUSTRIA
Tel: 011-43-316-8021-0
Fax: 011-43-316-8021-20
E-Mail: banninger%[email protected]

David W. Case
Applied Research Corp.
8201 Corporate Drive
Landover, MD 20706
Tel: (301) 805-0379 or (301) 459-8442

Long S. Chiu
General Sciences Corporation
6100 Chevy Chase Drive
Laurel, MD 20707
Tel: (301) 953-2700

Edward A. Dunlop
Center for Remote Sensing
College of Marine Studies
University of Delaware
Newark, DE 19716
Tel: (215) 430-6549
E-Mail: [email protected]

Marcus E. Glenn
MITRE Corp.
7525 Coleshire Dr.
McLean, VA 22102
Tel: (703) 883-5386
E-Mail: [email protected]

Dr. Robbin R. Hough
Oakland University
Rochester, MI 48063
Tel: (313) 651-0820
Fax: (313) 651-4071
E-Mail: [email protected]

Walter G. Kropatsch
Institute for Technical Informatics
Technical University of Vienna
Treitlstr. 3, A-1040 Vienna, AUSTRIA
Tel: 011-43-222-58801-8181
Fax: 011-43-222-569149
E-Mail: [email protected]

Robert Losa
USDA/NASS
Rm 4168 - South Bldg
Washington, D. C. 20250
Tel: (202) 447-9295

David M. McKeown
Digital Mapping Laboratory
School of Computer Science
Carnegie Mellon University
Pittsburgh, PA 15213 U.S.A.
Tel: (412) 268-2626
E-Mail: [email protected]

N. Nandhakumar
Dept. of Electrical Engineering
Thornton Hall, Rm E-210
University of Virginia
Charlottesville, VA 22903-2242
Tel: (804) 924-6108
Fax: (804) 924-8818
E-Mail: [email protected]

Misha Pavel
Bldg 420, Stanford University
Stanford, CA 94305
Tel: (415) 725-2430
E-Mail: [email protected]

Frederic P. Perlant
Digital Mapping Laboratory
School of Computer Science
Carnegie Mellon University
Pittsburgh, PA 15213-3890
Tel: (412) 268-7552
E-Mail: [email protected]

Axel J. Pinz
Inst. of Geodesy and Remote Sensing
Univ. of Agriculture
Peter Jordan Str. 82
A-1190 Wien, AUSTRIA
Tel: (43-222) 342500-546

Eric Rignot
Jet Propulsion Laboratory
California Institute of Technology
4800 Oak Grove Drive
Pasadena, CA 91109
Tel: (818) 354-1640
Telex: 675-429
Fax: (818) 354-3437
E-Mail: [email protected]

Jon Robinson
ST Systems Corporation
4400 Forbes Boulevard
Lanham, MD 20706 U.S.A.
Tel: (301) 794-5200
E-Mail: [email protected]

Tom Sauer
University of New Mexico
Dept. of Electrical and Computer Engineering
Albuquerque, NM 87131
Tel: (505) 277-6563
Fax: (505) 277-1439
E-Mail: [email protected]

Andrew Segal
University of Illinois
Dept. of EECS, M/C 154, Box 4348
Chicago, IL 60680 U.S.A.
Tel: (312) 996-5486
E-Mail: [email protected]

Robert Series
RSRE
St. Andrews Road, Malvern
Worcs WR14 3PS UNITED KINGDOM
Tel: (+44) 684-5784
E-Mail: [email protected]

Sylvia Shen
Lockheed Palo Alto Research Lab.
0/96-30, B/251
3251 Hanover Street
Palo Alto, CA 94304
Tel: (415) 354-5019
E-Mail: [email protected]

Jeffrey Shufelt
Digital Mapping Laboratory
School of Computer Science
Carnegie Mellon University
Pittsburgh, PA 15213
Tel: (412) 268-3884
E-Mail: [email protected]

Stan Stopyra
GTE Government Systems
1700 Research Blvd
Rockville, MD 20850
Tel: (301) 294-8430

David G. Tilley
The Johns Hopkins University
Applied Physics Laboratory
MS 23-346, Johns Hopkins Road
Laurel, MD 20723-6099
Tel: (301) 953-5000 x8644
Fax: (301) 953-1093
Telex: 511303
E-Mail: [email protected]

James C. Tilton
Mail Code 936, NASA GSFC
Greenbelt, MD 20771
Tel: (301) 286-9510
E-Mail: [email protected]
        [email protected]

E. D. Zeisler
8242 West Buckspark Lane
Potomac, MD 20854
Tel: (703) 883-5768
