Tracking Bolides, 3D Visualization and Data

Thomas G. Kaye
Foundation for Scientific Advancement
7023 Alhambra Dr., Sierra Vista, AZ 85650
[email protected]

Robert Crawford, Tucson, AZ
Mark Bowling, Tucson, AZ
John Kalas, Tucson, AZ
Abstract

For the past decade, progress has been made in using consumer-grade low-light video cameras to watch the night skies for meteor trails. Sophisticated software can now monitor the video stream in real time and parse the events of interest out of a continuous recording. Sandia Laboratories has funded over 100 such cameras spread throughout the country. Their primary purpose has been to record large fireballs; the fireball data are used mainly for comparison with space-based observatories that watch for nuclear explosions on Earth. The Desert Fireball Array in southern Arizona has deployed three of these cameras with overlapping visual fields. Mutual detection of events allows three-dimensional reconstruction of the entry angle, speed, and brightness, along with spectroscopy. The stated goal of the project is to use the tracking information to locate and recover a meteorite that was tracked through the atmosphere. The acquired data would allow direct comparison of atmospheric ionization characteristics with ground-based laboratory analysis using simulated ablation of the same meteorite. This paper describes the high-precision calculations and final analysis, using 3D projections in Google Earth, required to achieve this goal.
1. Introduction

Meteors are the least expensive “sample return missions” available on Earth. Known meteor showers have been the subject of intensive study, especially in Europe (Berezhnoy, 2010; Borovička, 1994). Sporadic meteors, however, present a challenge for research due to the random nature of their arrival and the difficulty of recovery. While many large random bolides have been captured on home video, comparatively few have been recovered, and almost none with reliable reentry data. Most data available from previous studies consist of velocity and brightness information (Borovička, 2007). Spectroscopic data are available for showers (Madiedo, 2011) but almost non-existent for sporadics. The upper atmosphere, where bolides begin to ionize, is molecularly extremely thin and is at a higher vacuum than can be achieved on Earth. This leads to unusual ionization species that normally recombine instantly in a lower-vacuum plasma environment but, in the upper atmosphere, survive long enough to display “forbidden lines” in their spectra.
Reentry velocity, while consistent within meteor showers, varies widely among sporadics and has not been well characterized. Velocity and entry angle combine to influence the potential for a meteor to survive the burn phase. Certain orbital trajectories closely match Earth’s orbital velocity and are very favorable for meteor survival; these bolides exhibit slow apparent velocity and deep penetration before dark flight. For the past three years, the group known as the Desert Fireball Array has been running three cameras on a triangular 100-kilometer baseline that covers the southeastern corner of Arizona.
2. Materials and Methods

The initial Sandia Labs camera is pictured in Figure 1. The housing uses standard PVC fittings with an acrylic dome. Self-contained in the housing are the camera, power supply, heating unit, and temperature-control module. It is powered by 120 VAC and feeds an analog video signal via coax cable to a commercial computer video card at video frame rates.
The software is proprietary to Sandia; it scans each series of 5-10 frames looking for brightness changes above a set threshold. It records continuously, holding approximately the last 30 seconds in a memory buffer at all times. Once an event is triggered, the buffered recording allows the entire event to be saved with no clipping due to detection lag. The resulting data file displays a subframe of the video with the target centered in the field. A keystroke produces a light curve for the duration of the burn.
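The detection scheme can be sketched as follows. This is an illustrative Python reconstruction, not the Sandia code: the frame rate, buffer length, and trigger threshold are assumed values, and the trigger test is a simple frame-difference statistic standing in for whatever the proprietary software actually computes.

```python
# Minimal sketch of threshold-based event detection with a pre-trigger
# ring buffer, analogous to the behavior described above. All numeric
# parameters are illustrative assumptions.
from collections import deque

import numpy as np

FPS = 30                     # analog video frame rate (NTSC, assumed)
BUFFER_SECONDS = 30          # "last 30 seconds" held in memory
TRIGGER_SIGMA = 5.0          # brightness-change threshold (assumed)

ring = deque(maxlen=FPS * BUFFER_SECONDS)  # rolling pre-trigger buffer

def frame_triggers(prev: np.ndarray, curr: np.ndarray) -> bool:
    """True when the frame-to-frame brightness change exceeds a
    multiple of the recent noise level (simple difference test)."""
    diff = curr.astype(np.float32) - prev.astype(np.float32)
    return diff.max() > TRIGGER_SIGMA * diff.std()

def monitor(frames):
    """Consume a frame iterator; yield a saved clip (pre-trigger
    buffer plus the triggering frame) whenever an event fires."""
    prev = None
    for frame in frames:
        if prev is not None and frame_triggers(prev, frame):
            # The buffered recording means the start of the event
            # is never clipped by detection lag.
            yield list(ring) + [frame]
        ring.append(frame)
        prev = frame
```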
Fig. 1. Sandia video camera housing. The system incorporates PVC fittings for waterproofing, with an internal heater and power supply.

Figure 2 shows the second-generation housing developed by the Fireball Array team. It is based on an inexpensive waterproof electrical enclosure that has been modified with an optically flat side window. The heater and onboard power supply have been dispensed with to save space, and the camera is now powered by an external supply. The original camera utilized a 360-degree fisheye lens for horizon-to-horizon coverage. In practice, however, bolides close to the horizon are typically many hundreds of kilometers away and do not produce useful data, and the horizon in view also creates multiple false triggers from cars, etc. Experimentation showed that three cameras using lenses with a 120-degree field of view, in a triangular deployment, provided a higher-resolution image. In this arrangement, each pair of cameras looks “over the shoulder” of the third camera to maintain a 360-degree mutual detection zone, as shown in Figure 3.

Fig. 2. Second-generation camera housing with 100-degree field of view.

Fig. 3. The three cameras overlap fields of view within the 100-kilometer baseline. Dual-camera detections are possible out to several hundred kilometers.
Fig. 4. Automatically generated masking in UFO Capture, shown in dark blue. This reduced the number of false hits by approximately 95%. Light blue is a captured meteor trail.
The Sandia software has now been replaced with UFO Capture (Sonotaco.com), developed in Japan. The new software features a self-generating mask that blocks areas of twinkling light, such as stars or streetlights, that continually vary (Fig. 4). This very powerful feature eclipses other software and drastically reduces the number of false detections, which can easily fill a hard drive in one night. Astrometric calibrations are handled by matching a star field to the mask. A least-squares calibration between the field and the mask determines the RA and Dec of each pixel. In Japan, UFO Capture is used primarily with narrow-field camera lenses (25-40 degree field of view). It remains to be determined whether the camera model used in the astrometric calibration can represent the greater distortions present in fisheye lenses.
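The least-squares step can be illustrated as follows. This Python sketch assumes a simple affine (linear plate-constant) model and a gnomonic tangent-plane projection; it is a minimal stand-in for UFO Capture's internal camera model, which is not published here, and it would not capture strong fisheye distortion without higher-order terms.

```python
# Sketch of least-squares plate calibration: fit linear plate
# constants mapping pixel (x, y) to standard coordinates, then
# convert to RA/Dec about an assumed tangent point.
import numpy as np

def fit_plate_constants(xy_pix, xi_eta):
    """Solve [x, y, 1] @ C = [xi, eta] in the least-squares sense.
    xy_pix: (N, 2) matched star positions in pixels.
    xi_eta: (N, 2) their tangent-plane coordinates in radians."""
    n = len(xy_pix)
    A = np.hstack([xy_pix, np.ones((n, 1))])        # design matrix
    C, *_ = np.linalg.lstsq(A, xi_eta, rcond=None)  # (3, 2) constants
    return C

def pixel_to_radec(x, y, C, ra0, dec0):
    """Map one pixel to (RA, Dec) via the inverse gnomonic
    projection about tangent point (ra0, dec0), all in radians."""
    xi, eta = np.array([x, y, 1.0]) @ C
    d = np.cos(dec0) - eta * np.sin(dec0)
    ra = ra0 + np.arctan2(xi, d)
    dec = np.arctan((np.sin(dec0) + eta * np.cos(dec0)) / np.hypot(xi, d))
    return ra, dec
```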
Fig. 5. Reconstructed path through the atmosphere in Google Earth. The position data were extracted for each frame.
Combining the astrometric position information from two or more cameras allows a three-dimensional reconstruction of the bolide’s path through the atmosphere. To determine a potential impact zone, Google Earth data files were created from the multi-camera reentry data (Fig. 5). Rotating the 3D view in Google Earth makes it possible to “sight down the barrel” of the bolide’s trajectory and define a spot on the ground as the maximum, or farthest, point of impact for the search ellipse (Fig. 6). Further rotation allows the end of the burn phase to be viewed normal to the ground, thereby determining the minimum, or nearest, position for the ellipse, discounting wind. The light curve for the bolide is derived from each frame of the video stream and superimposed three-dimensionally on top of the trajectory, again using Google Earth. Penetration into the atmosphere can then be visually correlated with brightness (Fig. 7).
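Generating the Google Earth overlay amounts to writing the per-frame positions into a KML path with absolute altitudes. The sketch below shows the idea; the file name, coordinates, and structure are hypothetical placeholders rather than the project's actual output format.

```python
# Sketch of writing a per-frame bolide trajectory as a Google Earth
# KML LineString. Coordinates below are hypothetical examples.

KML_TEMPLATE = """<?xml version="1.0" encoding="UTF-8"?>
<kml xmlns="http://www.opengis.net/kml/2.2">
  <Placemark>
    <name>Bolide trajectory</name>
    <LineString>
      <altitudeMode>absolute</altitudeMode>
      <coordinates>{coords}</coordinates>
    </LineString>
  </Placemark>
</kml>"""

def trajectory_kml(points):
    """points: iterable of (longitude_deg, latitude_deg, altitude_m),
    one per video frame. Returns a KML document string."""
    coords = " ".join(f"{lon:.6f},{lat:.6f},{alt:.1f}"
                      for lon, lat, alt in points)
    return KML_TEMPLATE.format(coords=coords)

# Example: three hypothetical frames of a descending bolide path.
with open("bolide_path.kml", "w") as f:
    f.write(trajectory_kml([(-110.30, 31.60, 80000.0),
                            (-110.28, 31.58, 60000.0),
                            (-110.26, 31.56, 40000.0)]))
```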
Fig. 6. Using Google Earth to calculate the search area for a fall. The upper image is “looking down the barrel” of the bolide path to determine the farthest extent of the strewn field. The lower image is a side view of the trajectory, used to determine the minimum starting point for the ellipse of the strewn field.
Fig. 7. White vertical lines above the red reentry path indicate the relative brightness during the burn phase. In this configuration the light curve can be visually correlated with the penetration into the atmosphere.
Proprietary software has been created in Matlab to triangulate a bolide’s position in three-dimensional space on a frame-by-frame basis. For events seen by two or more stations, the video recordings are aligned in time to sub-frame accuracy (
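While the Matlab code itself is proprietary, the core triangulation step is standard geometry: each station's astrometric solution defines a ray, and the bolide position for a given frame can be estimated as the point closest to both rays. A minimal Python sketch of that geometry, with hypothetical inputs, follows; it is not the project's actual implementation.

```python
# Illustrative two-station triangulation: the midpoint of the
# shortest segment between two (generally skew) observation rays.
import numpy as np

def triangulate(p1, d1, p2, d2):
    """p1, p2: station positions (3-vectors, e.g. ECEF meters).
    d1, d2: unit direction vectors from each station toward the
    bolide at the same instant. Returns the least-squares point."""
    # Minimize |(p1 + t1*d1) - (p2 + t2*d2)| over t1, t2.
    r = p2 - p1
    a, b, c = d1 @ d1, d1 @ d2, d2 @ d2
    denom = a * c - b * b          # approaches 0 for parallel rays
    t1 = (c * (d1 @ r) - b * (d2 @ r)) / denom
    t2 = (b * (d1 @ r) - a * (d2 @ r)) / denom
    # Midpoint of the closest-approach segment between the rays.
    return 0.5 * ((p1 + t1 * d1) + (p2 + t2 * d2))
```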