New Bionic Navigation Algorithm Based on the Visual Navigation Mechanism of Bees

Yufeng HUANG, Yi LIU, Jianguo LIU

National Key Laboratory of Science and Technology on Multi-spectral Information Processing, School of Automation, Huazhong University of Science and Technology, Wuhan, Hubei Province 430074, P.R. China
ABSTRACT Based on research into the visual navigation mechanisms of flying insects, especially honeybees, a novel navigation algorithm integrating entropy flow with a Kalman filter is introduced in this paper. The concepts of entropy image and entropy flow, which characterize topographic features and measure changes between images respectively, are also introduced. To better characterize the texture features and spatial distribution of an image, a new concept, the contrast entropy image, is presented. Applying the contrast entropy image to the navigation algorithm and comparing its performance with simulation results obtained using the intensity entropy image, we conclude that the contrast entropy image yields more accurate and more robust navigation.
Keywords: bionic navigation, entropy image, contrast entropy image, entropy flow, motion estimation

1 INTRODUCTION
Flying insects represent an excellent example of design at the microscopic scale. They achieve efficient and robust flight control, navigation, landing, and obstacle avoidance despite their diminutive size and simply structured brain, which carries fewer than 0.01% as many neurons as ours does. The nervous system of flying insects provides accurate direction in the search for food, shelter, or mates, and even enables the insects to remember the way back to their nests. These creatures are a fascinating source of inspiration for engineers aiming to create micro Unmanned Aerial Vehicles (UAVs). Tremendous research effort has been devoted to flying insects and much recent progress has been made [1, 2], but neither the workings of flying insects nor the design of micro flying robots is fully understood. Recent studies have shown that freely flying insects use optical flow to avoid collisions, to fly toward food sources [3, 4], and to land [5]. Navigation systems are now widely applied in many kinds of platforms, such as automobiles, airplanes, guided missiles, and unmanned planes. However, there is still no efficient navigation system design that consumes extremely little energy and relies mainly on vision. Integrated GPS/INS systems are widely used in flight navigation, but their precision is insufficient for a micro UAV navigating indoors or over short distances. Researchers have therefore turned to flying insects for inspiration, simulating the principles of insect vision for micro UAV navigation. By studying a model of the visual nervous system of insects, Reichardt proposed a correlation model and applied it as a motion detector [6]. Inspired by the visual navigation mechanism of flies, Franceschini proposed several vision-based navigation algorithms and applied them to the automatic visual guidance of robots [7]. In this paper, we design a vision-based aerial navigation system that is inspired by the visual navigation mechanisms of
honeybees. Flying insects such as honeybees can perceive the various patterns of movement induced by their own motion, which can be considered a kind of "visual flow". To measure this visual flow, the concepts of entropy image and entropy flow have been proposed [8]. To better quantify the texture features and spatial distribution of an image, the contrast entropy image (CEI) is presented here on the basis of the original entropy image. Finally, the CEI is applied to a bionic visual navigation algorithm based on entropy flow and the Kalman filter [8], and the results show that the CEI helps obtain quicker, more accurate, and more robust navigation. The remainder of this paper is organized as follows: the concept of the contrast entropy image is introduced in Section 2, followed by the navigation algorithm based on entropy flow and the Kalman filter in Section 3. Simulation experiment results are presented in Section 4, and conclusions are drawn in the last section.
2 CONTRAST ENTROPY IMAGE
Honeybees can navigate quickly and accurately in complicated natural conditions, and much research has been done to uncover the mechanism of their navigation system. Experiments have revealed that honeybees determine distance and flying attitude from the flow of the environment's images moving across their eyes [9, 10]. From this analysis, we can conclude that honeybees' visual navigation is based on the amount of image information flowing through their retinas. Drawing on the concept of entropy in information theory, an intensity entropy image (IEI) has been presented [8] to quantify the intensity information of an image; however, this kind of entropy image cannot describe the texture information and spatial distribution of an image well. As is well known, contrast is a basic texture property, so the concept of the CEI is naturally presented here with the expectation of a better description of texture information and spatial distribution. In this paper, we introduce a local entropic operator to calculate the entropy of an image block:

$$H = -\frac{1}{\log(n)} \sum_{i=1}^{M} \sum_{j=1}^{N} p(i,j)\,\log\bigl(p(i,j)\bigr) \qquad (1)$$

with

$$p(i,j) = \frac{f(i,j)}{\sum_{i=-M/2+1}^{M/2} \sum_{j=-N/2+1}^{N/2} f(i,j)} \qquad (2)$$
In the equations above, the size of the block is M × N, f(i,j) is the intensity of the central point of the block, and n is the number of pixels in the block. The contrast of an image can be calculated as follows:

$$C_\sigma = \sigma_{I_{w\times h}} = \sqrt{\frac{1}{w\times h}\sum_{I_{w\times h}}\bigl(I(x,y)-\mu_{I_{w\times h}}\bigr)^2} \qquad (3)$$

where $\mu_{I_{w\times h}} = \frac{1}{w\times h}\sum_{I_{w\times h}} I(x,y)$ is the average intensity of the image, $w \times h$ is the size of the image, and $I(x,y)$ is the intensity of the pixel at (x, y). For an M × N image, its CEI can be constructed as follows:
1) Partition the image into equal-sized rectangular sub-blocks (of size m × n), making adjacent sub-blocks share a certain percentage of overlap.
2) Use equation (3) to calculate the contrast $C_\sigma(i,j)$ of each sub-block, then replace the intensity I(i,j) of the central point of the sub-block with $C_\sigma(i,j)$; this yields a contrast image.
3) Partition the contrast image obtained in step (2) as in step (1), and calculate the probability distribution of each sub-block's contrast according to equation (2).
4) Calculate the entropy $H_{ij}$ of each sub-block with equation (1); all these entropy values form an entropy matrix.
5) Normalize the entropy matrix obtained in step (4) and set values below a threshold to zero; the result is an image with range [0, 1], which is the contrast entropy image.
The images below show the results of the contrast entropy algorithm:
Figure 1. Original image (left) and its CEI (right)
From the images above, we can clearly see that pixels with high contrast entropy values are mainly located at edges and in regions of highly detailed texture. Moreover, the data matrix of a contrast entropy image is typically sparse, which can speed up subsequent computation.
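To make the construction above concrete, the following is a minimal Python sketch of the five steps. The block size m × n, stride, and threshold are illustrative choices; the paper does not specify the overlap percentage or the threshold value:

```python
import numpy as np

def local_entropy(p):
    """Normalized entropy (eq. 1) of a block of probabilities p summing to 1."""
    nz = p[p > 0]
    return -np.sum(nz * np.log(nz)) / np.log(p.size)

def block_contrast(block):
    """Contrast of a block as its standard deviation (eq. 3)."""
    return float(np.sqrt(np.mean((block - block.mean()) ** 2)))

def contrast_entropy_image(img, m=8, n=8, step=4, threshold=0.1):
    """Build a CEI, steps (1)-(5); step < m, n gives overlapping sub-blocks."""
    h, w = img.shape
    contrast = np.zeros((h, w))
    # Steps (1)-(2): contrast value written at each sub-block's central pixel.
    for i in range(0, h - m + 1, step):
        for j in range(0, w - n + 1, step):
            contrast[i + m // 2, j + n // 2] = block_contrast(
                img[i:i + m, j:j + n].astype(float))
    # Steps (3)-(4): entropy of each sub-block's contrast distribution (eq. 2).
    ent = np.zeros((h, w))
    for i in range(0, h - m + 1, step):
        for j in range(0, w - n + 1, step):
            block = contrast[i:i + m, j:j + n]
            if block.sum() > 0:
                ent[i + m // 2, j + n // 2] = local_entropy(block / block.sum())
    # Step (5): normalize to [0, 1] and zero out values below the threshold.
    if ent.max() > 0:
        ent /= ent.max()
    ent[ent < threshold] = 0.0
    return ent
```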
3 BIONIC NAVIGATION ALGORITHM

3.1 Entropy flow
The computation of optical flow has been a research hotspot for many years, and many algorithms have been presented; the most classical are those of Horn and Schunck [11] and of Lucas and Kanade [12]. Optical flow describes changes in the brightness pattern. When the image is translated into an entropy image, changes in the brightness pattern correspondingly become changes in the entropy pattern, which can be described by entropy flow. To analyze motion in an image sequence, temporal constancy of certain image features, such as intensity or gradient, has to be imposed. Just as in the derivation of the optical flow constraint (OFC) [13], the entropy flow constraint (EFC) follows by requiring $dH/dt = 0$ along the image motion and expanding by the chain rule [8]:

$$H_x u + H_y v + H_t = 0 \qquad (4)$$
where subscripts denote partial derivatives and (u, v) is the entropy flow field. To avoid the aperture problem, another constraint needs to be imposed [13]: within a small spatial neighborhood Ω, the entropy flow is estimated by minimizing [8]

$$\varepsilon_H = \sum_{(x,y)\in\Omega} W^2(x)\,\bigl(H_x u + H_y v + H_t\bigr)^2 \qquad (5)$$
where W(x) is a window function. Using the least-squares method, the solution for u and v is obtained as

$$\mathbf{U}(u,v) = (A^T W^2 A)^{-1} A^T W^2 B \qquad (6)$$

where, at time t, $x_i \in \Omega$, $i = 1, 2, \dots, n$, and

$$\begin{cases} A = [\nabla H(x_1)\ \ \nabla H(x_2)\ \ \dots\ \ \nabla H(x_n)]^T \\ W = \mathrm{diag}[W(x_1)\ \ W(x_2)\ \ \dots\ \ W(x_n)] \\ B = -[H_t(x_1)\ \ H_t(x_2)\ \ \dots\ \ H_t(x_n)]^T \end{cases} \qquad (7)$$
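As an illustration, here is a minimal sketch of the weighted least-squares estimate (6) at a single interior pixel, assuming two consecutive entropy images and a Gaussian window; the neighborhood radius and σ are illustrative choices not given in the paper:

```python
import numpy as np

def entropy_flow_at(H1, H2, cx, cy, r=7, sigma=3.0):
    """Estimate entropy flow (u, v) at interior pixel (cx, cy) via eq. (6).
    H1, H2: consecutive entropy images; r: neighborhood radius."""
    Hy, Hx = np.gradient(H1)          # np.gradient returns d/drow, d/dcol
    Ht = H2 - H1                      # temporal derivative
    ys, xs = np.mgrid[cy - r:cy + r + 1, cx - r:cx + r + 1]
    # Gaussian window weights W(x) over the neighborhood.
    w = np.exp(-((xs - cx) ** 2 + (ys - cy) ** 2) / (2 * sigma ** 2)).ravel()
    A = np.stack([Hx[ys, xs].ravel(), Hy[ys, xs].ravel()], axis=1)  # n x 2
    b = -Ht[ys, xs].ravel()           # B of eq. (7)
    W2 = np.diag(w ** 2)
    # U = (A^T W^2 A)^{-1} A^T W^2 B
    return np.linalg.solve(A.T @ W2 @ A, A.T @ W2 @ b)  # (u, v)
```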
3.2 Global motion estimation algorithm
Different relations between the camera and the environment induce different motion patterns. Given the practical application, this paper studies only the global motion pattern induced by a camera moving through a static environment. We use entropy flow to estimate the motion parameters with a six-parameter model based on orthogonal projection [14]:

$$\begin{cases} u(x,y) = a_1 x + a_2 y + a_3 \\ v(x,y) = a_4 x + a_5 y + a_6 \end{cases} \qquad (8)$$

where (u(x,y), v(x,y)) is the entropy flow. If we let $\alpha_u^T = (a_1\ a_2\ a_3)$, $\alpha_v^T = (a_4\ a_5\ a_6)$, and $X = (x\ \ y\ \ 1)$, then

$$\begin{cases} u(x,y) = X\alpha_u \\ v(x,y) = X\alpha_v \end{cases} \qquad (9)$$
Using a least-squares estimation algorithm to solve equation (8), we obtain the parameters as

$$[\alpha_u\ \ \alpha_v] = \Bigl[\sum_{(x,y)\in\Omega_H} X^T X\Bigr]^{-1} \sum_{(x,y)\in\Omega_H} X^T\,[u(x,y)\ \ v(x,y)] \qquad (10)$$
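A minimal sketch of this global fit (10), assuming a dense entropy-flow field (u, v) sampled at every pixel; the outlier-rejection threshold of [8], discussed next, is omitted here:

```python
import numpy as np

def global_motion_params(u, v):
    """Fit the six-parameter model (8) to a dense entropy-flow field
    via the least-squares solution (10). u, v: H x W flow components."""
    h, w = u.shape
    ys, xs = np.mgrid[0:h, 0:w]
    # Design matrix: one row X = (x, y, 1) per pixel.
    X = np.stack([xs.ravel(), ys.ravel(), np.ones(u.size)], axis=1)
    # Right-hand side: stacked flow components [u v], one column each.
    UV = np.stack([u.ravel(), v.ravel()], axis=1)
    # [alpha_u alpha_v] = (sum X^T X)^{-1} sum X^T [u v]
    alphas = np.linalg.solve(X.T @ X, X.T @ UV)
    return alphas[:, 0], alphas[:, 1]   # (a1,a2,a3), (a4,a5,a6)
```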
Because the least-squares estimate is sensitive to noise, an auto-selecting assessment-threshold algorithm [8] can be used to suppress it, and a Kalman filter is also used to improve the accuracy of the algorithm.

3.3 The framework of the navigation algorithm
Assume that a reference trajectory is provided in advance to guide the flight, and that the reference point moves along with the vehicle. Entropy flow is computed between the real-time images and their corresponding reference images, and the motion parameters are then obtained as in the last section. The framework of the navigation algorithm based on entropy flow and the Kalman filter can be described as follows (a sketch of the loop is given after this list):
(1) Translate the initial image of the reference image sequence and the real-time image into contrast entropy images, and take these two entropy images as the first and second frames of the entropy image sequence;
(2) Calculate the entropy flow according to Section 3.1;
(3) Calculate the motion parameters with the six-parameter model introduced in Section 3.2;
(4) Feed the parameters obtained in step (3) into the Kalman filter to get the optimized global motion parameters;
(5) Correct the instantaneous motion parameters of the carrier, such as velocity and direction, with the motion parameters obtained in step (4);
(6) Move to the next reference point and repeat the process.
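The following sketch outlines one pass of this loop, reusing the helpers from the earlier sketches. The Kalman filter here uses identity dynamics over the six motion parameters, and `dense_entropy_flow` is a hypothetical wrapper applying the per-pixel estimate of Section 3.1 over the whole grid; the paper specifies neither, so both are illustrative assumptions:

```python
import numpy as np

class SimpleKalman:
    """Kalman filter over the 6 motion parameters with identity dynamics
    (an assumed state model; the paper does not give its filter equations)."""
    def __init__(self, dim=6, q=1e-3, r=1e-1):
        self.x = np.zeros(dim)       # state estimate
        self.P = np.eye(dim)         # state covariance
        self.Q = q * np.eye(dim)     # process noise covariance
        self.R = r * np.eye(dim)     # measurement noise covariance

    def update(self, z):
        P = self.P + self.Q                     # predict (identity transition)
        K = P @ np.linalg.inv(P + self.R)       # Kalman gain
        self.x = self.x + K @ (z - self.x)      # correct with measurement z
        self.P = (np.eye(self.x.size) - K) @ P
        return self.x

def navigation_step(ref_img, live_img, kf, dense_entropy_flow):
    """One pass of steps (1)-(4), using the earlier CEI and motion sketches."""
    H_ref = contrast_entropy_image(ref_img)             # step (1)
    H_live = contrast_entropy_image(live_img)
    u, v = dense_entropy_flow(H_ref, H_live)            # step (2)
    alpha_u, alpha_v = global_motion_params(u, v)       # step (3), eq. (10)
    return kf.update(np.concatenate([alpha_u, alpha_v]))  # step (4)
```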
4 EXPERIMENTS AND RESULTS

4.1 Simulation
In the simulation experiments, it is assumed that the camera is fixed on the flying object with its optical axis perpendicular to the ground, so that the image plane is always parallel to the ground. Only translation along the X, Y, and Z axes is considered; rotations and changes of the object's attitude are not. The test maps were captured from Google Earth. To simulate changes in height, seven images were captured
at each location, from a height of 1850 m to 2150 m at 50 m intervals, and these images were used directly, while images at other heights could be generated through an interpolation algorithm (a sketch is given after this paragraph). The random error that is inevitable in a practical flight was also added to the X, Y, and Z coordinates. To reveal the characteristics of CEI clearly and persuasively, we compare the results of the navigation algorithm using CEI and IEI [8]. The images below show one set of simulation results. The simulated start point is at (160, 160, 1.86), and the planned start point is at (140, 180, 2.00). One pixel of the maps below represents a real distance of 2.8 m. The number of iterations is set to 75.
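The paper does not specify its interpolation algorithm; as an illustrative sketch, an image at an intermediate height can be approximated by linearly blending the two nearest captured layers:

```python
import numpy as np

def image_at_height(layers, heights, h):
    """Cross-dissolve the two captured layers nearest to height h.
    layers: list of images snapped at the sorted heights in `heights`."""
    heights = np.asarray(list(heights), dtype=float)
    i = int(np.clip(np.searchsorted(heights, h), 1, len(heights) - 1))
    h0, h1 = heights[i - 1], heights[i]
    t = (h - h0) / (h1 - h0)          # blend weight in [0, 1] for in-range h
    return (1 - t) * layers[i - 1] + t * layers[i]

# Example: layers snapped from 1850 m to 2150 m at 50 m intervals.
# img = image_at_height(layers, range(1850, 2151, 50), 1860.0)
```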
Figure 2. Results of the navigation algorithm using CEI (left) and IEI (right)
From the images above, as the iterations proceed, the trace of the simulated points (red dots) approaches the trace of the accurate points planned beforehand (yellow "*"), and the convergence of the navigation algorithm using CEI is much better than that using IEI. The error analysis results are also shown below:

Figure 3. Error analysis of the navigation algorithm using CEI (left) and IEI (right)
From the images above, we can see that the error of the navigation algorithm using CEI is smaller than that using IEI in all three coordinates. After 75 iterations, the error distances in all three coordinates, converted by the proportional scale, are shown in the following table:

Table 1. Distances of error

Algorithm               X (m)     Y (m)     Z (km)
Navigation using CEI    8.097     0.174     0.008
Navigation using IEI    18.104    60.438    0.574
From the table above, we can conclude that the navigation algorithm using CEI is much more precise than that using IEI. After many experiments, we have found that its precision is better than 10 m in all three directions, and in some directions the distance error is well below 1 m, without any loss of real-time performance.
4.2 Results and conclusion
In this paper, we have presented a novel concept, the contrast entropy image, which characterizes an image's texture features and spatial distribution well, and combined it with the bionic navigation algorithm based on entropy flow and the Kalman filter. The navigation algorithm using the contrast entropy image is shown to achieve better navigation results with less navigation error. However, the algorithm does not yet take into account rotations about the X, Y, and Z axes or changes in the attitude of the flying object, which are the focus of further studies. Work also remains to be done toward a high-performance motion estimation model and navigation framework.
ACKNOWLEDGMENTS This project is supported in part by the National Science Fund of China under Grant 61071136 and the Research Fund for the Doctoral Program of Higher Education of China under Grant 20110142110069.
REFERENCES
[1] Lambrinos, D., Möller, R., Labhart, T., Pfeifer, R., and Wehner, R., "A mobile robot employing insect strategies for navigation," Robot. Auton. Syst. 30, 39-64 (2000).
[2] Iida, F., "Goal-directed navigation of an autonomous flying robot using biologically inspired cheap vision," in International Symposium on Robotics, 21 (2001).
[3] Franceschini, N., "Small Brains, Smart Machines: From Fly Vision to Robot Vision and Back Again," Proc. IEEE 102, 751-781 (2014).
[4] Humbert, J. S., Hyslop, A. M., and Chinn, M., "Experimental validation of wide-field integration methods for autonomous navigation," in International Conference on Intelligent Robots and Systems (2007).
[5] Srinivasan, M. V., Zhang, S. W., Chahl, J. S., Barth, E., and Venkatesh, S., "How honeybees make grazing landings on flat surfaces," Biol. Cybern. 83, 171-183 (2000).
[6] Borst, A., "Correlation versus gradient type motion detectors: the pros and cons," Phil. Trans. R. Soc. B 362, 369-374 (2007).
[7] Franceschini, N., "Towards automatic visual guidance of aerospace vehicles: from insects to robots," Acta Futura 3, 15-34 (2009).
[8] Deng, H., Pan, C., Wen, T., and Liu, J., "Entropy Flow-Aided Navigation," J. Navigation 64(01), 109-125 (2011).
[9] Barron, A. B., Zhu, H., Robinson, G. E., and Srinivasan, M. V., "Influence of flight time and flight environment on distance communication by dancing honey bees," Insect. Soc. 52, 402-407 (2005).
[10] Barron, A. B., and Srinivasan, M. V., "Visual regulation of ground speed and headwind compensation in freely flying honey bees (Apis mellifera L.)," J. Exp. Biol. 209, 978-984 (2006).
[11] Horn, B., and Schunck, B., "Determining optical flow," Artif. Intell. 17, 185-203 (1981).
[12] Lucas, B., and Kanade, T., "An iterative image registration technique with an application to stereo vision," in Proceedings of the Seventh International Joint Conference on Artificial Intelligence, 674-679 (1981).
[13] Mitiche, A., and Aggarwal, J. K., [Computer Vision Analysis of Image Motion by Variational Methods], Springer (2014).
[14] He, Y. W., Feng, B., Yang, S. Q., and Zhong, Y. C., "Fast global motion estimation for global motion compensation coding," in Proceedings of the IEEE International Symposium on Circuits and Systems 2, 233-236 (2001).