Sub-Nyquist Spatial Sampling Using Arrays of Directional Microphones

Vladimir Tourbabin and Boaz Rafaely
Department of Electrical and Computer Engineering, Ben-Gurion University of the Negev

Preprint: Joint Workshop on Hands-free Speech Communication and Microphone Arrays (HSCMA 2011), pp. 76-80, Edinburgh, United Kingdom, May 2011.
Abstract—Microphone arrays are commonly used for spatial filtering or beamforming. A variety of beamforming techniques offering high spatial resolution have been developed and are described in the literature. However, improved resolution may require an increase in the number of sensors and channels, resulting in complicated and expensive systems. In the current paper the use of arrays composed of directional sensors is proposed, and the consequent benefit in terms of a reduction in the number of sensors is analyzed. It is shown that a reduction in the number of sensors can be achieved at the expense of reduced steering capabilities.

Index Terms—beamforming, microphone arrays, acoustic field, array signal processing.
I. INTRODUCTION

Sensor arrays of various geometries, including linear, planar and spherical, are described in the literature [1]–[5]. One of the major applications of these arrays is spatial filtering, or beamforming [6]. There are numerous approaches to beamforming, including data-independent, statistically optimum and adaptive techniques [7], [8]. Generally, in order to achieve high spatial resolution and improved performance, relatively large spatial apertures are needed. As a consequence, and in order to avoid spatial aliasing, a large number of sensors is usually required. This entails simultaneous processing of numerous channels, resulting in relatively expensive and sophisticated processor structures.

Several ways to simplify the beamformer implementation have been studied and reported in the literature. Havelock [9], [10] described a processor which samples all the channels sequentially, one at a time. The data is aggregated in a single channel in a way that statistically approximates the delay-and-sum beamforming technique. Another reported approach to simplifying the beamformer structure uses real weights without the need for phase adjustments, which allows a narrowband beamformer to be implemented using simple amplifiers [11]–[13]. Continuous-time wideband processing, where the desired frequency range is divided into subbands that are processed independently, has also been reported [14].

In this paper we analyze the possibility of constructing an array using directional sensors and the resulting advantages in simplifying the beamformer. We show that the number of sensors can be effectively reduced without affecting the array spatial
characteristics, or that the frequency range can be broadened using the same number of sensors without suffering from extensive spatial aliasing. However, the steering capabilities will be limited in this case, depending on the amount of channel reduction. Any directional sensors, including cardioid variations or bidirectional beampatterns, can be used to construct an array of directional sensors. The discussion here is also concerned with an implementation using Electro-Mechanical Film (EMFi) [15]–[18]. Using these sensors, one of the electrodes can be divided into sub-electrodes, thus effectively forming an array [19]. Each sub-electrode forms a continuous aperture having a directional response. In addition, each sub-electrode can be accessed independently, allowing beamforming procedures to be applied.

The paper starts with a brief background, where the basic concepts of beamforming and the overall beampattern of an array consisting of directional sensors are introduced. We proceed with an analysis of the characteristics of the total beampattern and its advantages as compared to a conventional array formed by omni-directional sensors. Conclusions and suggested future work follow.

II. ARRAYS OF DIRECTIONAL SENSORS
Consider the uniform linear array illustrated in Fig. 1. The array consists of N sensors, with d indicating the distance between adjacent ones. All the sensors have an identical spatial response denoted by B_c(ψ), where ψ = (ω/c) d cos θ denotes the beampattern dependence on the frequency ω and the arrival direction θ, with c = 343 m/s denoting the speed of sound. Assuming plane waves propagating in free field, the responses of the sensors are described by the array manifold vector:

v(ψ) = [B_c(ψ), B_c(ψ)e^{jψ}, ..., B_c(ψ)e^{j(N−1)ψ}]^T.   (1)
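As a quick illustration of (1), the following Python sketch (ours, not part of the paper) evaluates the manifold vector of a uniform linear array; the function name and the default omni-directional element response are illustrative choices only.

```python
# Illustrative sketch of Eq. (1) (not from the paper); names are our own.
import numpy as np

C = 343.0  # speed of sound [m/s], as in the text

def manifold(f_hz, theta, N, d, Bc=lambda psi: np.ones_like(psi)):
    """Array manifold v(psi) of Eq. (1); Bc is the common element beampattern
    (omni-directional by default)."""
    psi = 2.0 * np.pi * f_hz / C * d * np.cos(theta)   # psi = (omega/c) d cos(theta)
    n = np.arange(N)
    return Bc(psi) * np.exp(1j * psi * n)

# Example: f = 1 kHz, arrival direction 60 deg, N = 5 sensors, d = 14 cm
print(manifold(f_hz=1000.0, theta=np.deg2rad(60.0), N=5, d=0.14))
```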
The beampattern of the above array is given by the dot product of the manifold vector and the weight vector w = [w_0 w_1 ... w_{N−1}]^T, which is, in general, a function of frequency (the dependence on frequency is omitted for notational simplicity):

B_T(ψ) = w^T v = Σ_{n=0}^{N−1} w_n e^{jψn} B_c(ψ) = B_s(ψ)B_c(ψ),   (2)

Fig. 1. Schematic illustration of a uniform linear array consisting of N sensors with spacing d.

Fig. 2. Beampattern of a linear array of 10 omni-directional sensors when the sampling is performed in accordance with the Nyquist criterion, λ = 2d.
where B_s(ψ) is the beampattern of an array consisting of omni-directional sensors. It should be emphasized here that the overall beampattern of an array consisting of directional sensors is simply the product of the beampattern of the omni-directional sensor array and the beampattern of each directional sensor. By using the well-known delay-and-sum technique, which suggests weights of the following form:
w_n = e^{−jψ_l n},   (3)

it can be shown that [1]

B_s(ψ) = (1/N) · sin((ψ − ψ_l)N/2) / sin((ψ − ψ_l)/2),   (4)
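As a sanity check (ours, not from the paper), the closed form in (4) can be compared numerically with the direct evaluation of (2) using the delay-and-sum weights of (3), normalized here by 1/N; the magnitudes agree, since (4) absorbs a linear phase term.

```python
# Numerical check (ours): Eq. (4) vs. the direct sum of Eq. (2) with the
# delay-and-sum weights of Eq. (3), both normalized by 1/N.
import numpy as np

def Bs_closed_form(psi, psi_l, N):
    """Eq. (4): (1/N) sin((psi - psi_l) N / 2) / sin((psi - psi_l) / 2)."""
    x = np.atleast_1d(np.asarray(psi, dtype=float) - psi_l)
    num, den = np.sin(N * x / 2.0), N * np.sin(x / 2.0)
    out = np.ones_like(x)              # limit value at psi = psi_l
    nz = np.abs(den) > 1e-12
    out[nz] = num[nz] / den[nz]
    return out

def Bs_direct(psi, psi_l, N):
    """(1/N) sum_n w_n e^{j psi n} with w_n = e^{-j psi_l n} (Eqs. (2), (3))."""
    n = np.arange(N)
    return np.sum(np.exp(1j * (psi - psi_l) * n)) / N

N, psi_l = 10, np.pi / 4
for psi in [0.0, 0.5, 1.7, float(psi_l)]:
    a = abs(Bs_closed_form(psi, psi_l, N)[0])
    b = abs(Bs_direct(psi, psi_l, N))
    print(f"psi = {psi:4.2f}:  |Eq.(4)| = {a:.6f},  |direct sum| = {b:.6f}")
```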
where ψ_l = (ω/c) d cos θ_l, with θ_l indicating the array look direction. As noted above, any directional microphones can be used to construct an array. In the next section we will be concerned with the following beampattern as an example:

B_c(ψ) = sinc(ψ/(2π)).   (5)

This is appropriate for a continuous sensor such as EMFi. We will analyze the overall beampattern and its steering capabilities from the perspective of spatial aliasing.
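A short numerical sketch (ours) of the element pattern in (5): using the normalized sinc convention of np.sinc, B_c has nulls exactly at ψ = ±2π, which is where the first grating lobes of the unsteered B_s(ψ) appear; this property is exploited in the next section.

```python
# Element pattern of Eq. (5) (illustrative, ours): Bc(psi) = sinc(psi/(2*pi)),
# with np.sinc implementing the normalized sinc sin(pi x)/(pi x).
import numpy as np

def Bc_sinc(psi):
    return np.sinc(psi / (2.0 * np.pi))

for psi in [0.0, np.pi, 2.0 * np.pi, 3.0 * np.pi]:
    print(f"psi = {psi:5.2f} rad  ->  Bc = {float(Bc_sinc(psi)):+.3f}")
# Note the nulls at psi = +/- 2*pi, coinciding with the first grating lobes
# of the unsteered Bs(psi).
```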
Fig. 3. Beampattern of a linear array of 10 omni-directional sensors for sub-Nyquist sampling, λ = 2d/3.

III. SUB-NYQUIST SAMPLING

The beampattern of an array consisting of omni-directional sensors is given by (4). Its visible range, which is the range in which it is defined at any given frequency ω, is −ωd/c ≤ ψ ≤ ωd/c, determined by the range of arrival directions 0 ≤ θ ≤ π. The beampatterns for two different visible ranges are illustrated in Figs. 2 and 3. It can be observed that the function B_s(ψ) is periodic with a period of 2π (see Fig. 3). When sampling exactly in accordance with the Nyquist criterion, only one period will be included in the visible range (Fig. 2). In this case the steering can be performed almost to endfire (θ_l = 0, π) without causing aliasing.
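A small numeric illustration (ours) of the visible-range statement above; the spacing d = 14 cm anticipates the example of Section IV, and the number of visible periods exceeds one above the spatial Nyquist frequency c/(2d).

```python
# Visible range of psi (illustrative, ours): psi spans [-omega*d/c, +omega*d/c],
# so more than one 2*pi period of Bs(psi) becomes visible above f = c/(2d).
import numpy as np

C = 343.0   # speed of sound [m/s]

def visible_range(f_hz, d):
    half = 2.0 * np.pi * f_hz * d / C          # omega * d / c
    return -half, half

d = 0.14                                        # 14 cm, as in Sec. IV
print(f"spatial Nyquist frequency: {C / (2 * d):.0f} Hz")
for f in [1000.0, 1225.0, 3675.0]:
    lo, hi = visible_range(f, d)
    print(f"f = {f:6.0f} Hz  visible psi range = [{lo:+.2f}, {hi:+.2f}] rad,"
          f"  periods visible = {(hi - lo) / (2 * np.pi):.2f}")
```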
On the other hand, for frequencies higher than the Nyquist frequency (or wavelengths λ < 2d), the visible range will include more than one period, as indicated in Fig. 3. As a consequence, particularly when the array is steered to some desired look direction, the grating lobes will enter the visible range, causing spatial aliasing. By using directional sensors, the beampattern of the omni-directional sensor array is multiplied by the beampattern of the individual sensors, as discussed above. Thus, the grating lobes are attenuated according to the beampattern of the directional sensor. This effect is illustrated in Fig. 4 for the case of sub-Nyquist sampling with λ = 2d/3, using directional sensors with the beampattern described by (5). It is clear that the grating lobes are strongly attenuated and, when steered, will cause much lower aliasing.

Fig. 4. Beampattern of a linear array consisting of 10 directional sensors for λ = 2d/3.

Fig. 5. Allowed steering range as a function of the under-sampling depth ξ, using an equal-grating-lobes criterion.
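The grating-lobe attenuation visible in Fig. 4 can be reproduced numerically. The sketch below (ours, not the authors' code) evaluates B_s(ψ) of (4) and B_c(ψ) of (5) over the visible range for λ = 2d/3 with N = 10 and a broadside look direction, and compares the lobe levels around ψ = ±2π.

```python
# Grating-lobe attenuation for lambda = 2d/3 (illustrative, ours): compares the
# omni-array pattern Bs (Eq. (4)) with the product Bs*Bc (Eqs. (2), (5)).
import numpy as np

N = 10
psi = np.linspace(-3.0 * np.pi, 3.0 * np.pi, 20001)   # visible range for lambda = 2d/3

def Bs(psi, psi_l=0.0):                                # Eq. (4), broadside look
    x = psi - psi_l
    num, den = np.sin(N * x / 2.0), N * np.sin(x / 2.0)
    out = np.ones_like(x)
    nz = np.abs(den) > 1e-12
    out[nz] = num[nz] / den[nz]
    return out

Bc = np.sinc(psi / (2.0 * np.pi))                      # Eq. (5)
omni = np.abs(Bs(psi))
total = np.abs(Bs(psi) * Bc)

grating = np.abs(np.abs(psi) - 2.0 * np.pi) < 2.0 * np.pi / N   # around psi = +/- 2*pi
print(f"grating-lobe peak, omni array        : {omni[grating].max():.3f}")   # close to 1
print(f"grating-lobe peak, directional array : {total[grating].max():.3f}")  # strongly attenuated
```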
Note that steering will also attenuate the main lobe in accordance with the directivity of the sensors. Thus, the steering is limited to one period of B_s(ψ) and to the angles covered by the main lobe of the sensors. In order to calculate the allowed degree of steering, or the steering span in the θ domain, we define ξ as a measure of under-sampling:

ξ = λ/(2d),   (6)

where λ represents the wavelength. It is easy to see that for ξ > 1 there is no under-sampling, while for ξ ∈ [0, 1] there is under-sampling, with its depth described by the value of ξ. Now, using the relation between ψ, θ and ξ, we can write

ψ = (π/ξ) cos θ,   (7)

and the resulting overall steering range is given by

Δθ = π − 2 cos^{−1} ξ.   (8)
The steering range Δθ is presented in Fig. 5 as a function of the under-sampling depth ξ. It can be seen, for example, that when we sample the space using d = λ, the span of allowed steering is reduced to Δθ = 60°.

The above results show that using directional microphones can be beneficial in avoiding spatial aliasing, which means that the array can be used to receive fields containing higher frequencies than would be allowed on the basis of the Nyquist criterion, albeit with reduced steering abilities. The same result can be viewed from another perspective. If the full steering range is not needed for some particular application, the spacing between the microphones can be increased while preserving the same spatial aperture and thus the same spatial performance. This reduces the number of microphones needed and effectively reduces the overall beamformer complexity.
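For reference, the steering span of (8) is easy to tabulate; the following sketch (ours) reproduces a few points of Fig. 5, including the d = λ case quoted above.

```python
# Allowed steering range of Eq. (8) as a function of the under-sampling
# depth xi = lambda / (2d) (illustrative, ours; cf. Fig. 5).
import numpy as np

def steering_range_deg(xi):
    """Delta_theta = pi - 2 arccos(xi), in degrees, for 0 <= xi <= 1."""
    xi = np.clip(float(xi), 0.0, 1.0)
    return np.degrees(np.pi - 2.0 * np.arccos(xi))

for xi in [1.0, 0.75, 0.5, 0.25]:
    print(f"xi = {xi:4.2f}  ->  steering span = {steering_range_deg(xi):5.1f} deg")
# xi = 0.5 corresponds to d = lambda and yields 60 deg, as stated in the text.
```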
IV. SIMULATION AND FURTHER DISCUSSION
In order to provide an illustrative example, a linear array consisting of 5 directional microphones (with the beampattern described by (5)), spaced 14 cm apart, was considered. Its beampattern for a frequency range of 0–5 kHz was calculated and is presented in Fig. 6. The array is steered to θ_l = 75° at all frequencies. The beampattern of an omni-directional microphone array having the same parameters is presented in Fig. 7 for comparison. Note that in this case the wavelength at the maximum frequency (λ_min ≈ 7 cm) is four times smaller than that allowed by the Nyquist criterion.

From Fig. 7 it is clear that the beampattern of the array consisting of omni-directional sensors has grating lobes starting from 1225 Hz, which would normally result in extensive aliasing. However, for the array with directional microphones, the grating lobes are much lower (see Fig. 6). In particular, it can be observed that although the grating lobe enters the visible range at the same frequency as for the omni-directional sensor array, i.e., 1225 Hz, its amplitude is much lower than that of the main lobe.
The grating lobe grows continuously as the frequency increases, and its amplitude becomes equal to that of the main lobe only at 5 kHz. Note that here the steering angle θ_l is constant for all frequencies, as mentioned above, which means that the steering amount in the ψ domain grows with frequency, potentially increasing the grating-lobe height (and decreasing the main-lobe height) in accordance with the directivity of the sensors.

The above simulation is summarized in Table I, where two examples are presented in order to illustrate the relation between the different array parameters and the trade-offs involved. The first two rows show that, when using an array of directional sensors, the same array configuration (number of microphones and spatial aperture) can operate over a frequency range four times wider than that of an array consisting of omni-directional sensors, although with a reduced steering range. The next two rows of Table I demonstrate that, by using an array of directional sensors, the spatial aperture can be increased at the expense of a reduced steering range, while maintaining the same number of sensors and the same frequency range.

TABLE I
PERFORMANCE COMPARISON FOR ARRAYS CONSISTING OF 5 DIRECTIONAL AND OMNI-DIRECTIONAL MICROPHONES.

microphones   d [cm]   freq. range [Hz]   Δθ [°]   aperture [cm]
omni          14       0-1225             180      56
directional   14       0-4900             30       56
omni          3.5      0-4900             180      14
directional   7        0-4900             60       28

Fig. 6. Beampattern of a linear array consisting of directional microphones, N = 5, d = 14 cm.

Fig. 7. Beampattern of a linear array consisting of omni-directional microphones, N = 5, d = 14 cm.
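As a rough reproduction of the above simulation (ours, not the authors' code), the sketch below evaluates the total beampattern of (2) with the weights of (3) and the element pattern of (5) for N = 5, d = 14 cm and a look direction of 75°, reporting the level at the look direction and the global peak at a few frequencies; the trends of Figs. 6 and 7 follow qualitatively.

```python
# Rough reproduction of the Sec. IV example (illustrative, ours): total
# beampattern of Eqs. (2)-(5) for N = 5 directional sensors, d = 14 cm,
# look direction 75 deg, evaluated at a few frequencies.
import numpy as np

C, N, D, THETA_L = 343.0, 5, 0.14, np.deg2rad(75.0)

def beampattern(f_hz, theta):
    """|B_T(theta)| at frequency f_hz for the steered directional array."""
    k = 2.0 * np.pi * f_hz / C
    psi = k * D * np.cos(np.atleast_1d(theta))
    psi_l = k * D * np.cos(THETA_L)
    n = np.arange(N)
    w = np.exp(-1j * psi_l * n) / N                    # Eq. (3), 1/N normalized
    Bs = np.exp(1j * np.outer(psi, n)) @ w             # omni part of Eq. (2)
    Bc = np.sinc(psi / (2.0 * np.pi))                  # Eq. (5)
    return np.abs(Bc * Bs)

theta = np.linspace(0.0, np.pi, 1801)
for f in [1225.0, 3000.0, 5000.0]:
    b = beampattern(f, theta)
    at_look = float(beampattern(f, THETA_L)[0])
    i = int(np.argmax(b))
    print(f"f = {f:6.0f} Hz  |B_T| at 75 deg = {at_look:.2f},"
          f"  global peak = {b[i]:.2f} at theta = {np.degrees(theta[i]):5.1f} deg")
```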
V. CONCLUSION

In this paper an array consisting of directional microphones was analyzed to examine the potential reduction in the number of sensors that can be achieved without affecting the spatial performance. It was shown that the number of sensors can be reduced, resulting in a reduction in the number of channels and a simplification of the beamformer structure. It was observed that the reduction is achievable at the expense of reduced steering capabilities, making arrays of directional microphones particularly useful in applications where only a limited steering range is required. Future work will include experimental evaluation of the proposed method and its performance using common beamforming algorithms.

REFERENCES
[1] H. L. Van Trees, Optimum Array Processing (Detection, Estimation and Modulation Theory, Part IV). New York: Wiley Interscience, 2002.
[2] H. Krim and M. Viberg, “Two decades of array signal processing research,” IEEE Signal Processing Magazine, vol. 13, no. 4, pp. 67–94, July 1996.
[3] E. G. Williams, Fourier Acoustics: Sound Radiation and Nearfield Acoustical Holography. New York: Academic, 1999.
[4] B. Rafaely, “Analysis and design of spherical microphone array,” IEEE Transactions on Speech and Audio Processing, vol. 13, no. 1, January 2005.
[5] J. Meyer and G. W. Elko, “A spherical microphone array for spatial sound recordings,” J. Acoust. Soc. Am., vol. 111, no. 5.2, pp. 2346–2346, 2002.
[6] B. D. Van Veen and K. M. Buckley, “Beamforming: A versatile approach to spatial filtering,” IEEE ASSP Magazine, April 1988.
[7] L. Griffiths and C. Jim, “An alternative approach to linearly constrained adaptive beamforming,” IEEE Transactions on Antennas and Propagation, vol. 30, no. 1, pp. 27–34, Jan. 1982.
[8] L. Griffiths, “Adaptive array processing - a tutorial,” Communications, Radar and Signal Processing, vol. 130, no. 1, pp. 3–10, Feb. 1983.
[9] D. I. Havelock, “Sensor array beamforming using random channel sampling: The aggregate beamformer,” Journal of the Acoustical Society of America, vol. 114, pp. 1997–2006, Oct. 2003.
[10] D. Havelock, “Residual noise in the aggregate beamformer,” in IEEE Workshop on Applications of Signal Processing to Audio and Acoustics, 2003, pp. 45–48.
[11] M. Agmon, B. Rafaely, and J. Tabrikian, “Maximum directivity beamformer for spherical-aperture microphones,” in IEEE Workshop on Applications of Signal Processing to Audio and Acoustics (WASPAA ’09), 2009, pp. 153–156.
[12] M. Ghavami, “Wideband beamforming using rectangular arrays without phase shifting,” European Transactions on Telecommunications, vol. 14, pp. 449–456, 2003.
[13] Y.-H. Choi, “Simple adaptive beamforming with only real weights based on covariance differencing,” Electronics Letters, vol. 43, no. 10, pp. 552–553, 2007.
[14] V. Tourbabin and B. Rafaely, “Electronic steering for a microphone array by sub-band phase compensation,” in IEEE 26th Convention of Electrical and Electronics Engineers in Israel (IEEEI), 2010, pp. 585–589.
[15] M. Paajanen, J. Lekkala, and K. Kirjavainen, “Electromechanical film (EMFi) – a new multipurpose electret material,” Sensors and Actuators A: Physical, vol. 84, no. 1-2, pp. 95–102, 2000.
[16] J. Lekkala and M. Paajanen, “EMFi – new electret material for sensors and actuators,” in 10th International Symposium on Electrets, 1999.
[17] J. L. Ealo, F. Seco, and A. R. Jimenes, “Broadband EMFi-based transducers for ultrasonic air applications,” IEEE Transactions on Ultrasonics, Ferroelectrics, and Frequency Control, vol. 55, no. 4, April 2008.
[18] J. L. Ealo, F. Seco, C. Prieto, A. R. Jimenes, J. Roa, A. Koutsou, and J. Guevara, “Customizable field airborne ultrasonic transducers based on electromechanical film,” in IEEE International Ultrasonics Symposium Proceedings, 2008.
[19] J. L. Ealo, J. J. Camacho, and C. Fritsch, “Airborne ultrasonic phased array using ferroelectrets: A new fabrication approach,” IEEE Transactions on Ultrasonics, Ferroelectrics, and Frequency Control, vol. 56, no. 4, April 2009.