Bioinspired Polarization Navigation Sensor for Autonomous Munitions Systems
G. C. Giakos1,2, IEEE Fellow, T. Quang2, T. Farrahi1, A. Deshpande2, C. Narayan1, S. Shrestha1, Y. Li1, M. Agarwal1
[1] Dept. of Electrical and Computer Engineering, [2] Dept. of Biomedical Engineering, The University of Akron, Akron, Ohio 44325, USA

ABSTRACT

Small unmanned aerial vehicles (SUAVs), micro air vehicles (MAVs), Automated Target Recognition (ATR), and munitions guidance require extreme operational agility and robustness, which can be partially offset by efficient bioinspired imaging sensor designs capable of providing enhanced guidance, navigation and control (GNC) capabilities. Bioinspired imaging technology can prove useful either for long-distance surveillance of targets in a cluttered environment, or at close distances limited by the surrounding space and obstructions. The purpose of this study is to explore the phenomenology of image formation by different insect eye architectures, which would directly benefit the areas of defense and security in the following four distinct areas: a) fabrication of the bioinspired sensor, b) optical architecture, c) topology, and d) artificial intelligence. The outcome of this study indicates that bioinspired imaging can significantly impact the areas of defense and security through dedicated designs fitting different combat scenarios and applications.

Key words: Bioinspired sensors, imaging design parameters, munitions guidance, Automated Target Recognition (ATR), space surveillance, target tracking, artificial intelligence, machine learning algorithms
1. INTRODUCTION

Deployment of future highly efficient, low-cost autonomous munition systems, small UAVs (SUAVs), and micro air vehicles (MAVs) in challenging engagement environments is limited by the capabilities of current guidance system technologies as well as by their competing design requirements1,2. Current UAVs tend to fly in open sky, far from any obstacles, and rely on external beacons for localization and navigation. As a result, they are unable to operate autonomously at low altitude, in cluttered or confined environments2. Similarly, munitions guidance faces tremendous technical challenges: processing information from distributed heterogeneous sensor networks operating under different physical and engineering principles, navigation and control in mixed hostile environments, and guidance toward deceptive objectives3. Automatic Target Recognition (ATR), on the other hand, aims to identify incoming objects at long range, often in a cluttered environment with low SNR, and the system must be capable of tracking such targets long enough to identify whether the object is friendly or not, under operating scenarios with low-contrast targets only a few pixels in size. The above surveillance paradigms require extreme operational agility and robustness, which can be partially offset by efficient bioinspired vision sensor designs capable of providing enhanced guidance, navigation and control (GNC) capabilities. Bioinspired imaging technology can prove useful either for long-distance surveillance of targets in a cluttered environment, or at close distances limited by the surrounding space and obstructions. Contrary to human vision, animal vision relies on the polarization of light, in addition to light intensity and color, as an
additional source of visual information. It is also well known that polarimetric sensing and imaging offer unique advantages for a wide range of detection and classification problems, due to the intrinsic potential for high contrast in the different polarization components of backscattered light. Indeed, polarized imaging can yield high-specificity images under high-dynamic-range and extreme-condition scenarios, in scattering media, or in cluttered environments, offering at the same time information related to the object's material composition and surface characteristics4-9. Recently, a pattern recognition formalism based on the integration of polarimetric principles with wavelet-fractal analysis has been introduced, leading to a revolutionary approach for enhancing the detection and discrimination of objects and biological structures10. The light from the sun is unpolarized; however, as it interacts with the atmosphere through Rayleigh scattering, it becomes partially polarized. Similarly, sunlight interacting with the sea surface becomes partially polarized. Both effects are depicted in Figure 1. Skylight is partially plane-polarized, i.e., in any direction of the sky a particular orientation of the electric field (e-vector) of skylight dominates. The pattern of e-vector orientation is also known as the polarization pattern. Polarization can be described in terms of the orientation of the plane in which the e-vector of skylight vibrates (direction of polarization or e-vector orientation), as well as in terms of the degree of polarization (DOP), which can be expressed in terms of the Stokes parameters S0, S1, S2, S3 as:

\mathrm{DOP} = \frac{\left(S_1^2 + S_2^2 + S_3^2\right)^{1/2}}{S_0} \quad (1)
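As a quick numerical illustration of Equation (1), the following minimal Python sketch evaluates the DOP from a Stokes vector; the Stokes values used here are hypothetical, chosen only to exercise the formula.

```python
import numpy as np

def degree_of_polarization(S):
    """Degree of polarization from a Stokes vector S = (S0, S1, S2, S3), Eq. (1)."""
    S0, S1, S2, S3 = S
    return np.sqrt(S1**2 + S2**2 + S3**2) / S0

# Hypothetical Stokes vector for partially polarized skylight
print(degree_of_polarization((1.0, 0.3, 0.2, 0.05)))  # ~0.364
```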
Figure 1. Unpolarized light and Rayleigh scattering
Compound eyes consist of a large number of small optical systems arranged around the outside of a convex surface. There are three different types of compound eye11: apposition, superposition, and neural superposition. Simple (camera) and insect eye architectures are shown in Figure 211,13. Superposition eyes have a key advantage over apposition eyes because of their superb light-collection efficiency: the convergence of the gathered light rays allows them to deliver an image in low-light conditions. In the apposition compound eye11, the most common type, each photoreceptor element, the so-called ommatidium, operates independently of the others; each ommatidium intercepts light arising from a tiny portion of the image, and the outputs are combined by the brain into a single picture. Each ommatidium consists of a thin, tapered tube capped with a cornea, underneath which is a transparent crystalline cone through which rays converge to an image at the tip of its receptive structure, referred to as the rhabdom. Each
ommatidium is hexagonal in shape and lined with pigment cells, which act as a light barrier between adjacent tubes. Because of this, each ommatidium receives only a very narrow cone of incoming light from directly in front of it, each contributing a small part of the overall picture.
Figure 2. Simple (camera) eye (A) and insect vision architectures (B,C,D) A is the diameter of the aperture, f is the focal length, cl is the corneal facet lens, cc is the crystalline cones, co is the cornea, p is the screening pigment, rh is the rhabdom, ph is the photoreceptor, cz is the clear zone, and l is the lens.
In contrast, in superposition compound eyes the individual optical systems operate synergistically to collect light and produce a single image. Unlike apposition compound eyes, where the ommatidia form small isolated images to be pieced together by the brain, the optical elements in superposition eyes superimpose all the light rays received across the ommatidia to form one image that is sent to the brain for processing. A detailed description of animal eye structure and physiology, and of the physical and functional characteristics related to the vision process, can be found in the pioneering studies of M. F. Land11,12. The advantages of insect compound eyes include their superb wide viewing angle, fast movement tracking due to the large number of photoreceptor units that do not have to move individually to track, and the ability to detect polarized light11-17. On the other hand, spatial resolution and contrast resolution are severely limited by diffraction effects, because of the tiny size of the individual photoreceptors, as well as by low quantum efficiency and detection efficiency. The purpose of this study is to explore the phenomenology of image formation by different insect eye architectures, which would directly benefit the area of defense and security, depending upon the degree of innovation and knowledge advancement in the following four distinct areas: a) fabrication of the bioinspired sensor, b) optical architecture, c) sensor topology, and d) artificial intelligence.
2. IMAGING DESIGN PARAMETERS

The performance of an insect eye is principally affected by the following structural and optical parameters: (a) the angular spacing of the receptors, which determines the spatial resolution; (b) the quality of the optics, which conditions the degree of blur in the image; (c) the diameter of the photoreceptors, which conditions the spatial resolution, the detection efficiency, and the contrast; (d) the signal-to-noise ratio; (e) motion-blur artifacts, especially important in highly maneuverable animals such as insects; (f) the polarimetric arrangement of the eye structure; (g) induced polarization and degradation effects originating within the rhabdom. Most of these terms have been adequately addressed in Ref. 11. The acuity is defined as the spatial frequency, expressed in cycles/degree or cycles/radian, as

\nu_s = \frac{1}{2\,\Delta\varphi} \quad \text{[cycles/degree]} \quad (2)

where Δφ is the angle between two adjacent receptor elements, measured center to center. This is the sampling frequency, νs, of a bar pattern or grating, and it expresses the ability of the eye to resolve two objects very close together. In this sense, it is analogous to the spatial resolution, typically expressed as line pairs/mm. In a diffraction-limited compound eye, the cutoff frequency νco = D/λ is the maximum (highest) spatial frequency at which a detail can be resolved. A plot of the spatial frequency versus Δφ indicates a deterioration of the spatial resolution with increasing angle between two adjacent receptor elements.
Figure 3. Spatial frequency versus Δφ
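A minimal sketch of Equation (2), evaluating the sampling frequency for a few interommatidial angles taken from Table I:

```python
# Sampling frequency (acuity) versus interommatidial angle, Eq. (2)
for dphi_deg in (0.9, 1.25, 1.9):        # Δφ values from Table I [degrees]
    nu_s = 1.0 / (2.0 * dphi_deg)        # cycles/degree
    print(f"Δφ = {dphi_deg}°  ->  ν_s = {nu_s:.3f} cycles/degree")
```

As expected, the finer the receptor spacing, the higher the spatial frequency the eye can sample.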
For a regular (simple) eye,

\Delta\varphi = \frac{s}{f} \quad (3)

where s is the receptor separation, center to center, which expresses the pitch resolution, and f is the focal length. For an apposition eye, the equivalent angle between the optical axes of adjacent ommatidia is

\Delta\varphi = \frac{D}{R} \quad (4)

Therefore, the numerical aperture (NA) is:

\mathrm{NA} = n\,\frac{D}{2f} = n\,\frac{D}{R} \quad (5)

where R = 2f, D is the diameter of the facet lens, R is the local radius of curvature of the eye, and n is the index of refraction of the medium. The acceptance angle of an ommatidium, Δρ, is the half-width of the rhabdom's angular sensitivity, expressed as:

\Delta\rho = \sqrt{\left(\frac{\lambda}{D}\right)^2 + \left(\frac{d}{f}\right)^2} \quad (6)

where λ is the optical wavelength, D is the diameter of the compound eye facet, d is the rhabdom diameter, and f is the focal length of the ommatidial optics.
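The sketch below evaluates the acceptance angle of Equation (6); the ommatidial parameters are illustrative assumptions, not measured data for any particular species.

```python
import numpy as np

# Acceptance angle of an ommatidium, Eq. (6): Δρ = sqrt((λ/D)² + (d/f)²)
lam = 0.5e-6   # optical wavelength λ [m]              (assumed)
D   = 25e-6    # facet lens diameter [m]               (assumed)
d   = 2e-6     # rhabdom diameter [m]                  (assumed)
f   = 60e-6    # focal length of ommatidial optics [m] (assumed)

diffraction_term = lam / D   # radians
geometric_term   = d / f     # radians
drho = np.sqrt(diffraction_term**2 + geometric_term**2)
print(f"Δρ = {np.degrees(drho):.2f}°")   # ~2.23° for these assumptions
```

Note that shrinking the facet diameter D increases the diffraction term λ/D, which is exactly the diffraction limit on compound-eye resolution discussed above.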
In fact, this is depicted clearly in Fig. 4 [11], indicating that the effective acceptance angle combines the optical input aperture and the acceptance angle of the rhabdom (right). Therefore, an effective cut-off frequency of the optical system including the rhabdom, νopt, can be defined:

\nu_{opt} = \frac{1}{\Delta\rho} \quad (7)
Figure 4. Optical input aperture of an ommatidium (left); optical aperture of the ommatidium including the rhabdom aperture (right)
The receptive field of the output channel, which is the total acceptance function of the optical system, can be approximated by a Gaussian with half-width ΔρT. It is modeled as the combination of three input-channel receptive fields: the half-width of the receptive field function of the input channel, Δρ, defined earlier, and the spatial and temporal summation functions, modeled as Gaussians with half-widths Δρp and vΔt, respectively. It can be expressed as13

\Delta\rho_T = \sqrt{\Delta\rho^2 + \Delta\rho_p^2 + (v\,\Delta t)^2} \quad (8)
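A small numerical sketch of Equation (8); Δρ is taken from Table I for Hemicordulia tau, while the summation half-width, image velocity, and integration time are hypothetical values for illustration.

```python
import numpy as np

# Total receptive-field half-width, Eq. (8): ΔρT = sqrt(Δρ² + Δρp² + (vΔt)²)
drho   = 1.4     # optical half-width Δρ [deg] (Hemicordulia tau, Table I)
drho_p = 1.0     # spatial-summation half-width Δρp [deg]   (assumed)
v      = 100.0   # angular image velocity [deg/s]           (assumed)
dt     = 0.02    # integration time Δt [s]                  (assumed)

drho_T = np.sqrt(drho**2 + drho_p**2 + (v * dt)**2)
print(f"ΔρT = {drho_T:.2f}°")   # ~2.64°: motion blur dominates at this speed
```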
3. EXPERIMENTS AND RESULTS

3.1. MTF and Image Contrast

By approximating the total acceptance function as a quasi-Gaussian with half-width Δρ, and neglecting the blur and spatial summation terms, the modulation transfer function (MTF) of the receptors is given by taking the Fourier transform, as11,13:

M(\nu) = \exp\!\left[-3.56\,(\Delta\rho\,\nu)^2\right] \quad (9)

where ν is the spatial frequency. In Figure 5, the modulation transfer function (MTF) versus spatial frequency is plotted for different insects by means of Equation (9). Simulation parameters were obtained from Appendix I. The results of Figure 5 indicate a decrease of the high-frequency components with increasing acceptance angle of the ommatidium, therefore increasing the amount of unsharpness (blur). The number of photons available to the receptors under different geometrical and optical parameters is defined as:

(10)

where D is given by the expression:

(11)
and R is defined according to:

(12)

In order to detect a structure with contrast C, the average number of photons per receptor element must satisfy:

\bar{N} \geq \left(\frac{\mathrm{SNR}}{C}\right)^2 \quad (13)
Figure 5. MTF versus normalized spatial frequency for different insect visual systems
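The MTF curves of Figure 5 can be reproduced in shape with the sketch below, which plots Equation (9) for the Δρ values of Table I; the frequency range is an assumption.

```python
import numpy as np
import matplotlib.pyplot as plt

# MTF of the receptors, Eq. (9): M(ν) = exp(-3.56 (Δρ ν)²), Δρ in degrees
species = {"Hemicordulia tau": 1.4, "Melanitis leda": 1.5,
           "Phalaenoides tristifica": 1.58, "Heteronympha merope": 2.0,
           "Anoplognathus pallidicollis": 3.0}   # Δρ values from Table I
nu = np.linspace(0.0, 1.0, 200)                  # spatial frequency [cycles/degree]
for name, drho in species.items():
    plt.plot(nu, np.exp(-3.56 * (drho * nu)**2), label=name)
plt.xlabel("spatial frequency [cycles/degree]")
plt.ylabel("MTF")
plt.legend()
plt.show()
```

Larger acceptance angles suppress the high-frequency components more strongly, which is the blur effect quantified in Section 3.2.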
The data of Figure 6 indicate that the detection of low-contrast objects improves with increasing signal-to-noise ratio and longer integration times.
Figure 6. Logarithmic contrast versus SNR of intercepted photons at different day times and integration times
3.2 Insect eyes treated as Gaussian blur filters

This study consists of four steps:
1. Simulation and plot of a laser footprint
2. Simulation and plot of the Gaussian blur filter masks for different insect eye optical systems
3. Effects of the Gaussian blur filter masks on the laser footprint
4. Quantification of the processed images in terms of blur effect using unsupervised machine learning algorithms
1. Simulation and plot of a laser footprint
The beam divergence of the laser is about 0.1 mrad in the x–z plane and 0.2 mrad in the y–z plane. With a 4.5 mm collimating lens, this leads to a beam of about 4.5 mm (x) by 1.8 mm (y), assumed at the laser aperture. The goal is to model the footprint, i.e., the spatial distribution of the laser beam, at a given distance between the source and the image plane. Let:

I(x, y) = I_0 \exp\!\left[-2\left(\frac{x}{x_0}\right)^{2P_x}\right] \exp\!\left[-2\left(\frac{y}{y_0}\right)^{2P_y}\right] \quad (14)
The angle θx is defined as:

\theta_x = \arctan\!\left(\frac{x}{R}\right) \quad (15)

For R ≫ x,

\theta_x \approx \frac{x}{R} \quad (16)

Similarly, θy is defined as:

\theta_y = \arctan\!\left(\frac{y}{R}\right) \quad (17)
and for R ≫ y,

\theta_y \approx \frac{y}{R} \quad (18)

Px and Py are the "super-Gaussian" factors for the x- and y-directions, respectively. Assuming R ≫ x, y, and substituting (16) and (18) into (14), we obtain:

I(\theta_x, \theta_y) = I_0 \exp\!\left[-2\left(\frac{R\,\theta_x}{x_0}\right)^{2P_x}\right] \exp\!\left[-2\left(\frac{R\,\theta_y}{y_0}\right)^{2P_y}\right] \quad (19)
Defining the divergences as:

\theta_{dx} = 0.1\ \text{mrad} \quad (20)

\theta_{dy} = 0.2\ \text{mrad} \quad (21)

where x0 = 4.5 mm and y0 = 1.8 mm, both at R = 0 (the laser aperture), the footprint of the laser can be estimated as:

x(R) = x_0 + R\,\theta_{dx} \quad (22)

y(R) = y_0 + R\,\theta_{dy} \quad (23)
The applied geometry, with the estimated footprint on an image plane located 1 m from the source, is shown in Fig. 7.
Figure 7. Geometry and estimated laser footprint
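Under the reconstructed linear-growth model of Equations (22)–(23), the footprint at the 1 m image plane of Figure 7 can be estimated with the short sketch below; treat it as an illustration of the stated beam parameters rather than the authors' exact simulation.

```python
# Laser footprint versus range, Eqs. (22)-(23): x(R) = x0 + R*θdx, y(R) = y0 + R*θdy
x0, y0 = 4.5e-3, 1.8e-3          # beam size at the aperture [m]
th_dx, th_dy = 0.1e-3, 0.2e-3    # beam divergences [rad]

R = 1.0                          # range to the image plane [m]
x, y = x0 + R * th_dx, y0 + R * th_dy
print(f"footprint at {R} m: {x*1e3:.2f} mm (x) by {y*1e3:.2f} mm (y)")
# -> 4.60 mm (x) by 2.00 mm (y)
```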
2. Simulation and plot of the Gaussian blur filter masks for different insect eye optical systems
The optical structure of insect eyes introduces different degrees of unsharpness. In order to assess their impact on the detection of targets, we designed Gaussian smoothing filters of size (2M+1)×(2M+1), according to:

g(i, j) = \exp\!\left(-\frac{i^2 + j^2}{2\sigma^2}\right), \quad -M \le i, j \le M \quad (24)

Let us define a small value ε that represents the final value of the filter after its smooth convergence to zero. The parameter σ can therefore be optimized according to:

\exp\!\left(-\frac{M^2}{2\sigma^2}\right) = \varepsilon \quad (25)

and finally

M = \sigma\,\sqrt{2\ln\frac{1}{\varepsilon}} \quad (26)

where σ is related to the half-width Δρ (values provided in Table I) through:

\Delta\rho = 2\sqrt{2\ln 2}\;\sigma \quad (27)

and

\sigma = \frac{\Delta\rho}{2\sqrt{2\ln 2}} \quad (28)
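A minimal sketch of the filter construction of Equations (24)–(28): σ follows from Δρ via the FWHM relation (28), the mask half-size M follows from the cutoff ε via (26), and the kernel itself is Equation (24). The pixels-per-degree sampling scale is an assumption, since the image scale is not stated here.

```python
import numpy as np

def insect_blur_mask(drho_deg, eps=0.01, pixels_per_degree=10.0):
    """Gaussian blur mask of size (2M+1)x(2M+1) for an insect eye
    with acceptance angle drho_deg, per Eqs. (24)-(28).
    pixels_per_degree is an assumed image-sampling scale."""
    drho_px = drho_deg * pixels_per_degree
    sigma = drho_px / (2.0 * np.sqrt(2.0 * np.log(2.0)))             # Eq. (28)
    M = int(np.ceil(sigma * np.sqrt(2.0 * np.log(1.0 / eps))))       # Eq. (26)
    i = np.arange(-M, M + 1)
    g = np.exp(-(i[:, None]**2 + i[None, :]**2) / (2.0 * sigma**2))  # Eq. (24)
    return g / g.sum()   # normalize so the mask preserves mean intensity

mask = insect_blur_mask(1.4)   # e.g. Hemicordulia tau, Δρ = 1.4°
print(mask.shape)              # (2M+1, 2M+1)
```

Convolving the simulated footprint image with such a mask (e.g., via scipy.signal.convolve2d) yields blurred footprints analogous to those of Figure 9.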
3. Effects of the Gaussian blur filter masks on the laser footprint
By solving Equation (26) under the assumption that ε = 0.01, different values of M are obtained. The resulting Gaussian blur filters, related to the vision characteristics of the different insects, are shown in Fig. 8. In Figure 9, the footprint images as perceived by different insect eyes, after applying the Gaussian blur filters matching the optical characteristics of each insect, are shown.
Figure 8. Gaussian optical blur filters for different insects: (a) Hemicordulia tau, Δρ = 1.4°; (b) Melanitis leda, Δρ = 1.5°; (c) Phalaenoides tristifica, Δρ = 1.58°; (d) Heteronympha merope, Δρ = 2.0°; (e) Anoplognathus pallidicollis, Δρ = 3.0°
Figure 9. Footprint images as perceived by different insect eyes: (a) original footprint; (b) Hemicordulia tau, Δρ = 1.4°; (c) Melanitis leda, Δρ = 1.5°; (d) Phalaenoides tristifica, Δρ = 1.58°; (e) Heteronympha merope, Δρ = 2.0°; (f) Anoplognathus pallidicollis, Δρ = 3.0°

4. Quantification of the processed images in terms of blur effect using unsupervised machine learning algorithms
Consider that the above images have L grey levels in total, with a normalized histogram such that, for each grey value x, px represents the frequency with which that value occurs. Assume that we are dealing with a bright object on a dark background. For a threshold t, the fraction of pixels classified as background is:

w_B(t) = \sum_{x=0}^{t} p_x \quad (29)

The fraction of pixels classified as object pixels is:

w_O(t) = \sum_{x=t+1}^{L-1} p_x \quad (30)

where x is the grey value and px is the frequency with which that grey value occurs. Then, the total variance of the distribution of the pixels in the image is:

\sigma^2 = \sigma_W^2 + \sigma_B^2 \quad (31)

where the first term expresses the variance within each class (subscript W), while the second term expresses the variance between the two classes (interclass, subscript B). Our goal is to keep σW² as small as possible (i.e., the smallest possible variation, or error, within each class), while maximizing σB², the variance between the two classes. After algebraic manipulations:
\sigma_B^2(t) = w_B(t)\,w_O(t)\left[\mu_B(t) - \mu_O(t)\right]^2 \quad (32)

The images of Figure 9 were processed using the above algorithm (Otsu's method)18, and the between-class variances for the different insect-eye optical characteristics were estimated and then plotted against the insect optics' half-width Δρ. The outcome of Figure 10 indicates enhanced discrimination among the classes.
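The between-class variance criterion of Equations (29)–(32) is Otsu's method; the sketch below is a minimal reimplementation that sweeps all thresholds t and reports the maximizing one, not the authors' exact code. The test image is synthetic.

```python
import numpy as np

def otsu_between_class_variance(image, levels=256):
    """Between-class variance σB²(t) for every threshold t, Eq. (32),
    plus the Otsu threshold that maximizes it."""
    hist, _ = np.histogram(image, bins=levels, range=(0, levels))
    p = hist / hist.sum()              # normalized histogram p_x
    x = np.arange(levels)
    w_b = np.cumsum(p)                 # background fraction, Eq. (29)
    w_o = 1.0 - w_b                    # object fraction,     Eq. (30)
    mu_cum = np.cumsum(p * x)          # cumulative mean
    with np.errstate(divide="ignore", invalid="ignore"):
        mu_b = mu_cum / w_b                     # background mean
        mu_o = (mu_cum[-1] - mu_cum) / w_o      # object mean
    sigma_b2 = np.nan_to_num(w_b * w_o * (mu_b - mu_o)**2)  # Eq. (32)
    return sigma_b2, int(np.argmax(sigma_b2))

# Synthetic bright object on a dark background
img = np.full((64, 64), 50, dtype=np.uint8)
img[24:40, 24:40] = 200
sigma_b2, t = otsu_between_class_variance(img)
print(t, sigma_b2[t])   # -> 50 1318.36 (flat maximum between the two modes)
```

Applied to the blurred footprints of Figure 9, the per-image maxima of σB² can then be plotted against Δρ, as in Figure 10.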
Figure 10. Variance between classes versus half-width Δρ
4. CONCLUSION
The purpose of this study was to explore the phenomenology of image formation by different insect eye architectures, which would directly benefit the areas of defense and security. The outcome of this study indicates that bioinspired imaging can significantly impact the areas of defense and security through dedicated designs fitting different combat scenarios and applications.

5. ACKNOWLEDGEMENTS
This study was supported in part by the Air Force Research Laboratory (AFRL).

6. APPENDIX
Table I. Comparison of interommatidial (Δϕ) and acceptance (Δρ) angles (data obtained from Ref. 11)

Species | Δϕ | Δρ | Δρ/Δϕ
Hemicordulia tau | 0.9° | 1.4° | 1.56
Anoplognathus pallidicollis | 1.5° | 3.0° | 2–3.3
Heteronympha merope | 1.25° | 1.5°–2.0° | 1.2–1.6
Melanitis leda | 1.44° | 1.5° | 1.04–1.88
Phalaenoides tristifica | 1.9° | 1.58° | 0.83
7. REFERENCES
[1] J.H. Evers, "Biological Inspiration for Agile Autonomous Air Vehicles", Platform Innovations and System Integration for Unmanned Air, Land and Sea Vehicles (AVT-SCI Joint Symposium), Meeting Proceedings RTO-MP-AVT-146, Neuilly-sur-Seine, France, pp. 15-1 – 15-14, 2007.
[2] M.F. Wehling, "Status of the AFRL/RW Bio-Sensors Lab", AFRL-RW-EG-TR-2012-037, 28 March 2012.
[3] J.H. Evers and M.F. Wehling, "Biologically inspired guidance for tactical munitions", Air Force Research Laboratory, Munitions Directorate, pp. 1-10, 2000.
[4] D. Goldstein and E. Collett, Polarized Light, CRC Press, 2003.
[5] R.A. Chipman, "Polarimetry", Ch. 22 of Handbook of Optics, 2nd ed., vol. 2, McGraw-Hill, New York, 1994.
[6] G.C. Giakos, R.H. Picard, P.D. Dao, P.N. Crabtree, P.J. McNicholl, "Polarimetric Wavelet Phenomenology of Space Materials", IEEE International Conference on Imaging Systems and Techniques, Batu Ferringhi, Malaysia, IEEE Xplore, pp. 1-6, 17-18 May 2011.
[7] H. Goldstein and D.B. Chenault, "Near Infrared Imaging Polarimetry", Proc. SPIE 4481, pp. 30-31, 2001.
[8] G. Giakos, R. Picard, and P. Dao, "Superresolution multispectral imaging polarimetric space surveillance LADAR sensor design architectures", Proc. SPIE 7107, 71070B, 2008.
[9] G.C. Giakos, "Advanced Detection, Surveillance, and Reconnaissance Principles", Proc. IEEE International Workshop on Measurement Systems for Homeland Security, Orlando, FL, pp. 6-10, 2005.
[10] G.C. Giakos, R.H. Picard, P.D. Dao, P.N. Crabtree, P.J. McNicholl, J. Petermann, S. Shrestha, C. Narayan, S. Marotta, "Polarimetric wavelet fractal remote sensing principles for space materials", Proc. SPIE 8364, Polarization: Measurement, Analysis, and Remote Sensing X, 836405, June 8, 2012.
[11] M.F. Land, "Visual Acuity in Insects", Annual Review of Entomology, vol. 42, pp. 147-177, 1997.
[12] M.F. Land and D.E. Nilsson, Animal Eyes, Oxford University Press, 2006.
[13] E.J. Warrant, "Seeing better at night: life style, eye design and the optimum strategy of spatial and temporal summation", Vision Research, vol. 39, pp. 1611-1630, 1999.
[14] K. Zhao, J. Chu, T. Wang, Q. Zhang, "A Novel Angle Algorithm of Polarization Sensor for Navigation", IEEE Transactions on Instrumentation and Measurement, vol. 58, pp. 2791-2796, 2009.
[15] M. Mappes and U. Homberg, "Behavioral analysis of polarization vision in tethered flying locusts", J. Comp. Physiol., vol. 190, pp. 61-68, 2004.
[16] A. von Philipsborn and T. Labhart, "Behavioral study of polarization vision in the fly, Musca domestica", J. Comp. Physiol., vol. 167, pp. 737-743, 1990.
[17] D. Lambrinos, R. Möller, T. Labhart, R. Pfeifer, R. Wehner, "A mobile robot employing insect strategies for navigation", Robotics and Autonomous Systems, vol. 30, pp. 39-64, 2000.
[18] M. Petrou and C. Petrou, Image Processing: The Fundamentals, John Wiley & Sons, United Kingdom, 2010.