Real-Time Imaging VIII (EI13) SPIE Electronic Imaging, 18–22 January 2004, San Jose, California

Real-time Visualization Using a 2D/3D Imaging MMWave Radar

Barnabás Takács*a, Lev Sadovnika, Vladimir Manassona, Mitch Wadea, Lawrence A. Kleina, Douglas Wongb, Bernadette Kissc, Balazs Benedekc, Gabor Szijartoc

a WaveBand Corp., 1752 Armstrong Ave., Irvine, CA 92614
b NASA Langley, Hampton, VA 23681-2199, Bldg. 1268A, R1172
c VerAnim, 1118 Budapest, Otthon u., Hungary

* [email protected]; phone: +1 310 312-0747; fax: +1 310 312-1974

ABSTRACT

This article describes a novel approach to the real-time visualization of 3D imagery obtained from a 3D millimeter wave radar and the associated sensor fusion modules. The MMW radar system uses two scanning beams to provide all-weather 3D distance measurements of objects appearing on the ground. This information is displayed using our high-end 3D visualization engine, capable of rendering models of up to 100,000 polygons at 30 frames per second. The resulting 3D models can then be viewed from any angle and subsequently processed to match them against 3D model data stored in a synthetic database. Such systems can be installed in aerial vehicles to process and display information merged from multiple image sources, including high-resolution 2D and 3D MMW radar images, a stored terrain and 3D airport database, and near-IR sensors. The resulting system provides safe all-time/all-visibility navigation in terrain-challenging areas and, in particular, real-time object detection during landing. This paper focuses on the real-time imaging and display aspects of our solution and discusses technical details of the radar design in the context of a practical application.

Keywords: real-time processing, 3D rendering, 3D image acquisition, millimeter wave imaging, radar systems, synthetic vision, DEM

1. INTRODUCTION

Synthetic Vision Systems (SVS) combine advanced sensor capabilities with real-time rendering of airports and 3D models of the environment to help pilots navigate and land in low visibility conditions or at night. SVS works by rendering 3D synthetic scenery based on GPS data and the on-board sensors of the aircraft. Large Digital Elevation Model (DEM) data sets, as well as 3D models of airports, can be used to simulate a daylight approach under ideal visibility conditions. Existing on-board radar systems, however, provide only 2D displays of the objects they detect. Yet the phase information in the radar return contains 3D cues, and advanced designs using dual scanning beams may also be used to create true 3D images. Once processed and reconstructed, these 3D surfaces provide meaningful advantages over existing solutions and augment the pilots' or operators' situational awareness. In the following sections we discuss the main modules of our 2D/3D high resolution MMW radar imaging system and its real-time rendering and image processing capabilities.

2. HIGH PERFORMANCE MILLIMETER WAVE IMAGING

Recent advances in radar antenna technology and personal computing platforms have created a novel opportunity to design all-weather imaging radars that run on low-cost configurations and can be readily installed on aircraft to provide live images during landing. The 94 GHz millimeter wave (MMW) imaging radar developed by WaveBand consists of a low power FMCW homodyne transceiver, a radar signal processor, and an image processor that performs the processing and coordinate conversion necessary for display on a heads-up display, achieving a 30 fps scanning rate for the continuous linear scan.

The radar produces angular distance measurements of objects in the field, from which we first create a top-down view of the scene (Figure 1, left) and subsequently produce a perspectively correct "out of cockpit" view for the pilot (Figure 1, right); a minimal sketch of this scan conversion follows.
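To make the display pipeline concrete, the sketch below shows one way a polar FMCW scan could be resampled into a top-down image. This is an illustration, not WaveBand's implementation: the range relation R = c·f_b·T/(2B) is the standard FMCW formula, but every function name, the grid extent, and the image size are assumptions.

```python
import numpy as np

def beat_to_range(f_beat_hz, sweep_bw_hz, sweep_time_s):
    """Standard FMCW relation: range R = c * f_b * T / (2 * B)."""
    c = 3.0e8  # speed of light, m/s
    return c * f_beat_hz * sweep_time_s / (2.0 * sweep_bw_hz)

def scan_to_topdown(returns, azimuths_rad, ranges_m, extent_m=500.0, px=256):
    """Resample a polar scan (n_ranges x n_azimuths intensity matrix)
    into a top-down Cartesian image like Figure 1 (left)."""
    img = np.zeros((px, px), dtype=np.float32)
    scale = px / (2.0 * extent_m)
    for j, az in enumerate(azimuths_rad):
        for i, r in enumerate(ranges_m):
            # Forward-looking geometry: x across track, y along track.
            x = int(px / 2 + r * np.sin(az) * scale)
            y = int(px - 1 - r * np.cos(az) * scale)
            if 0 <= x < px and 0 <= y < px:
                img[y, x] = max(img[y, x], returns[i, j])
    return img
```

The perspective "out of cockpit" view would then be produced by projecting this grid through the aircraft's camera model; that step is omitted here for brevity.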

Fig. 1. Top-down (left) and perspective (right) views of a runway obtained with a high resolution millimeter wave imaging device.

3. BUILDING 3D RADAR SURFACES

The 3D surface reconstruction algorithm operates on the stack of images obtained during multiple scans from the antennas. The data set is a volumetric image that can be represented as voxels or as a volumetric texture. Subsequent 3D image processing and morphological operations allow our radar solution to compute a height field in a data format compatible with the DEM model rendering module.

The 3D processing module of the SVS radar system creates high resolution height field data. The captured surface comprises up to 100,000 polygons with real-time, dynamic texture maps applied to it. The system runs on commodity hardware, specifically a personal computer configuration costing under $2000. To achieve this high performance on low-cost hardware we devised a predictive rendering scheme that measures and takes into account the relative speeds of the CPU, the graphics card, and the memory. The predictive rendering algorithm uses this information to schedule optimal sub-tasks within the render pipeline and thereby ensure maximum visual throughput on any configuration.

Figure 2 shows different examples of radar cross section processing. The height field is created from horizontal radar cross section images such as the one at the top. Color codes indicate the strength of the radar return signal. Each such image is processed to remove noise and detect critical image features (middle). Finally, the results obtained from each layer are integrated to form a height field surface mesh (bottom), as in the sketch below. Figure 3 shows the same 3D surface from different viewing angles as displayed by the interactive real-time visualization system.
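The following minimal sketch shows how a height field could be collapsed out of the stack of cross sections. It is written under assumptions: the real pipeline applies morphological noise removal and feature detection to each layer, for which a simple noise-floor threshold only stands in, and all names and constants are illustrative.

```python
import numpy as np

def height_field_from_slices(slices, layer_heights_m, noise_floor=0.2):
    """Collapse a stack of horizontal radar cross sections (Figure 2, top)
    into a DEM-style height field (Figure 2, bottom).

    slices: (n_layers, rows, cols) array of return strengths in [0, 1]
    layer_heights_m: height of each slice above the reference plane
    """
    slices = np.asarray(slices, dtype=np.float32)

    # Per-layer cleanup: a threshold standing in for the noise removal
    # and feature detection applied to each cross section image.
    occupied = slices > noise_floor

    # For each grid cell, keep the highest layer with a valid return.
    field = np.zeros(slices.shape[1:], dtype=np.float32)
    for z, layer in zip(layer_heights_m, occupied):
        field = np.where(layer, np.maximum(field, z), field)
    return field  # rows x cols height field, ready for meshing
```

The resulting grid is already in the DEM-compatible format the rendering module expects, so it can be textured and meshed like any other terrain tile.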

Fig. 2. Building 3D surfaces using radar cross section images.

Fig. 3. Example of a 3D radar surface built using WaveBand's system.

4. REAL-TIME SENSOR FUSION AND INTEGRATION

We developed a high performance visualization module that allows the SVS to operate with 3D radar surfaces generated by the radar, man-made objects such as buildings and airfields, or video imagery. This module is capable of displaying live video feeds from visual or IR sensors as well as 3D model views derived from Digital Elevation Model (DEM) data. The position information for the 3D rendition is obtained from the GPS-based sensors of the aircraft. This data flow controls the relative position of a virtual camera traveling "above" the digital airport model and the terrain surrounding it.

Since the raw MMW image data, the 3D synthetic views, and the optional video/IR camera have different resolutions, they are combined and further processed by a registration module: a real-time image processing module that creates a properly registered output image. To create the output image, we consider the registration problem in the 2D image domain, where binary image features obtained from sensor image pairs are aligned using an iterative matching algorithm. The sensor image pairs that feed into the registration module are the perspective radar image and a 2D image, the latter obtained either from a rendered view of a 3D synthetic model or from processed images of the optional image sensors. Both input images first pass through a feature detection module, which uses a feature extraction algorithm that exploits a model of neural attention mechanisms in the human visual system1. It automatically locates a subset of binary features that can be used for optimal feature matching and tracking. Subsequently, the algorithm matches these detected binary features by minimizing a cost function based on a measure derived from the Hausdorff distance2 to find the best matching image transformation parameters; a sketch of this step is given below. Figure 4 demonstrates the operation of the matching process with matching feature points on an MMW image and a photo, respectively. The bottom image in Figure 5 is the algorithm's final output view shown to the pilot, obtained by combining the runway image (top) and the corresponding MMW sensor image (middle).

The final registered and integrated image of the sensor fusion model can be used to track and recognize objects of interest, and symbology can be overlaid to highlight those objects and features. To recognize objects of interest, this sensor-integrated SVS system incorporates an adaptive neural network architecture, called Visual Filters, which has been successfully used to identify 3D objects from multiple viewpoints by maximizing a neural discriminability function with respect to the reference object class3,4.
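The sketch below illustrates the Hausdorff-style matching step on binary feature sets. The paper searches over general image transformation parameters; for brevity this illustration restricts the search to integer pixel translations, and all names and the search radius are assumptions rather than details of the actual system.

```python
import numpy as np

def directed_hausdorff(a, b):
    """Directed Hausdorff distance h(A, B) = max over a of min over b
    of ||a - b||, for two (N, 2) arrays of feature coordinates."""
    d = np.linalg.norm(a[:, None, :] - b[None, :, :], axis=2)
    return d.min(axis=1).max()

def best_translation(radar_pts, ref_pts, search_px=8):
    """Brute-force search over integer pixel shifts for the translation
    minimizing the symmetric Hausdorff cost between feature sets."""
    best_cost, best_shift = np.inf, (0, 0)
    for dy in range(-search_px, search_px + 1):
        for dx in range(-search_px, search_px + 1):
            shifted = radar_pts + np.array([dy, dx])
            cost = max(directed_hausdorff(shifted, ref_pts),
                       directed_hausdorff(ref_pts, shifted))
            if cost < best_cost:
                best_cost, best_shift = cost, (dy, dx)
    return best_shift, best_cost
```

In practice a robust (partial) variant of the Hausdorff measure, as in reference 2, would be preferred, since it tolerates the outlier features that inevitably survive detection in noisy radar imagery.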

Fig. 4. Example of image registration using unsupervised feature detection (see text).

Fig. 5. Final fused output view (bottom) obtained by combining the runway image (top) with the corresponding MMW sensor image (middle).

5. ACKNOWLEDGMENT

This work was partly supported by the Boeing Corporation and NASA under contract #NAS1-02074.

6. REFERENCES

1. Takács, B. and H. Wechsler (1997), "A Dynamic, Multiresolution Model of Visual Attention and its Application to Facial Landmark Detection", CVGIP: Image Understanding, 71(1).
2. Takács, B. (1998), "Comparing Face Images Using the Modified Hausdorff Distance", Pattern Recognition, 30(10), 1623-36.
3. Takács, B. and L. Sadovnik (1998), "3-D Target Recognition and Tracking Using Neural Networks Trained on Optimal Views", J. of Optical Engineering, Special Issue on Automatic Target Recognition, 37(3), 819-828.
4. Takács, B. and L. Sadovnik (1998), "3D Target Recognition Using Quasi-optimal Visual Filters", SPIE Aerospace/Defense Sensing, Simulation and Controls, April 1998, Orlando, Florida, USA.
