A Biomimetic Neuronal Network-Based Controller for Guided Helicopter Flight

Anthony Westphal, Daniel Blustein, and Joseph Ayers

Depts. of Marine and Environmental Sciences, Biology, and Marine Science Center, Northeastern University, Nahant MA 01908, USA
[email protected]
http://www.neurotechnology.neu.edu/

Abstract. As part of the Robobee project, we have modified a coaxial helicopter to operate using a discrete time map-based neuronal network for the control of heading, altitude, yaw, and odometry. Two concepts are presented: (1) a model for the integration of sensory data into the neural network; (2) a function for transferring the instantaneous spike frequency of motor neurons to the pulse width modulated signal required to drive motors and other types of actuators. The helicopter is provided with a flight vector and distance to emulate the information conveyed by the honeybee’s waggle dance. This platform allows for the testing of proposed networks for adaptive navigation in an effort to simulate honeybee foraging on a flying robot.

Keywords: robotic flight, honeybee, neuronal control, helicopter.

1 Introduction

Autonomous and adaptive flight as performed by insects has long served as an inspiration for the development of micro air vehicles (MAVs). Efforts to date have focused on mimicking different biological characteristics of flying insects, including sensors, behaviors, and biomechanics. Here we present a holistic biomimetic platform that combines and advances robotic implementation of neuromorphic sensors, electronic nervous system control, and motor neuron activation of effectors.

The control rules for foraging insect flight are well established [33]. What is less understood is how these rules are realized by neuronal networks in the central nervous system [34]. Given the methodological difficulties of recording from neurons in behaving animals, we have taken an alternative approach to circuit tracing: embodied simulation through robotics [2,39,40]. Comparative physiology and biomimetics provide numerous examples of how many aspects of behavior are mediated by neuronal networks [28,37]. The command neuron, coordinating neuron, central pattern generator model appears generalizable for the control of innate behavior throughout the animal kingdom and has withstood critical review for over 30 years [20,21,26].

Supported by NSF ITR Expeditions Grant 0925751.

N.F. Lepora et al. (Eds.): Living Machines 2013, LNAI 8064, pp. 299–310, 2013. c Springer-Verlag Berlin Heidelberg 2013 

300

A. Westphal, D. Blustein, and J. Ayers

Some of the neural components underlying arthropod behavior have been identified, including optical flow sensitive neurons [in Drosophila [25], in honeybees [17], in locusts [4], in crayfish [44]]. Although recent efforts have begun to investigate the neural pathways involved in optical flow responses [18], the neural circuitry underlying visually mediated flight remains a mystery. Insect responses to optical flow, the movement of the world across the visual field, have been shown to control and stabilize flight [36] and have been mimicked on various robot platforms. Other bioinspired sensory systems well studied in animals and mimicked on robots include magnetic compasses in animals [7] and robots [42], inertial sensors in animals [16] and robots [22], and air velocity sensors in animals [46] and robots [31]. Previous efforts to engineer autonomous flight have adopted such biologically inspired sensors but neglected neuronal control principles [8,9,45].

As the details of the dynamics of the mechanical system of Robobee emerge [15,24], we have been addressing the possibility of adapting neuronal network control [2] to the control of the steering system [11] for directed foraging. Hybrid analog/digital systems provide efficiencies essential to micromechanical systems [32]. Advances in nonlinear dynamics [29,30] have yielded technologies that allow for the control of robots by synaptic networks in real time [1,23]. Controlling robots with neuronal and synaptic networks provides a powerful heuristic framework to compare and evaluate different hypothetical network topologies and candidate mechanisms for the control of behavior [2,43]. However, this approach requires the development of an autonomous flying biomimetic platform that can interface with sensors and actuators. Recent advances as part of the Robobee project have led to the engineering of a flapping wing MAV demonstrating passive stability [38] and optical flow sensing [12].
Robobee flight currently relies on an off-board motion tracking system for control [24] but current efforts are focused on achieving autonomy using onboard sensors, processors, and power. The helicopter-based model we describe here serves as a platform to establish and verify basic neuronal network functionality as an alternative to traditional algorithmic control prior to miniaturization. The coaxial helicopter platform mediates lift and steering with a reasonable approximation to that of the flight system of asynchronous insects [10,19]. Here we use it to explore neuronal network models. Interfacing electrical components with an electronic nervous system simulation requires the translation of digital data to and from the language of neurons, i.e. action and synaptic potentials. To accomplish this we present an approach to convert sensory information into the pulse-interval code of biological networks and to convert neuronal activity into a pulse width-modulated (PWM) signal for the control of motors and servos.

2 Helicopter Platform

A custom electronics and software package implemented on a Blade MX2 coaxial helicopter serves as the foundation for the biomimetic platform. The electronics package is divided into three parts: sensory integration, neural processing,


and motor output. These parts are modular and can be expanded and customized based on the robot’s requirements. A programmed phenomenological neural model [30] allows for rapid prototyping and simulation of neural network hypotheses in LabVIEW™ and their subsequent implementation in C on an 8-bit microprocessor. Our studies have shown [1,43] that numerical simulations of the networks needed to control a biomimetic robot can be implemented in real time using standard microprocessors. The hardware layout for the helicopter control system is shown in Figure 1. There are two ATMega 2560 microprocessors at the heart of the helicopter control board. The sensory microprocessor (SP) is responsible for reading data from the gyroscope, ultrasonic range finder, and optical flow sensors. It also sends data to a computer logger for analysis via a Bluetooth link, sends commands to the rotors, and controls the pitch and roll servomotors. User control override via a 5-channel RC receiver is also enabled. The nervous system microprocessor (NS) runs the discrete time map-based (DTM) neurons and synapses [30] that are the building blocks of the nervous system simulation. The sensory information passes from the first processor to the second, where it modulates neuronal network activity. Motor neuron activity is then passed back to the first processor to drive the motors and servos.

Fig. 1. Helicopter nervous system processor hardware. The ATMega 2560 sensory microprocessor (SP) is used to gather sensory information, transmit data to an off-board logging device and send motor signals to the main rotor motors and to the roll and pitch servomotors. The nervous system microprocessor (NS) is passed filtered sensory data that modulates the neural networks which produce motor neuron outputs that are passed back to control the motors and servos. The operator can resume control of the motor output via the 5ch RC receiver.


A custom PC board 30 mm wide and 50 mm long was built. The top of the PC board holds the NS microprocessor (Fig. 2A) while the SP processor is located on the bottom of the board (Fig. 2B).

Fig. 2. Top and bottom views of the nervous system microprocessor board. Two ATMega 2560 microprocessors on a custom DSP board operate the DTM nervous system on the helicopter. A. The NS IC runs the DTM neurons and synapses. The connectors are outlined by boxes: [1] 5ch RC receiver; [2] output to rotors and servos; [3] two UART lines to processor SP; [4] SPI programming jack for processor SP; [5] programming jack for processor NS; [6] I2C bus for SP processor/UART to NS processor. B. The SP IC is responsible for motor control and sensor processing.

3 Sensory System

A variety of analog sensors are mounted to the sensor array board (Fig. 3) and connected via a series of wires between the individual sensors and the helicopter SP microprocessor. It is well established that bees use a sun compass for navigation [14]. To simulate this we have integrated an ITG-3200 triple-axis digital gyro capable of 2000 deg/sec measurements. It is used to calculate the current heading since vibration and magnetic interference generate inaccurate readings from a standard magnetic compass module. The ultrasonic range finder is a Devantech SRF08 capable of cm resolution up to a height of 6 m. The optical flow sensor is a Centeye Ardueye Aphid Vision Sensor (v1.2), which produces optical flow readings in both the x and y axes. A 4 mm gel pad is used to isolate the sensor mount from helicopter vibrations. Additional vibration dampening in the form of antistatic foam was used to encase the ultrasonic sensor. The sensor mount, printed on a 3D printer (MakerBot Replicator 2) using PLA, clips onto the helicopter rails via 4 mounting tabs. The fully assembled helicopter with all of its components is shown in Figure 4.
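As an illustration of how a rate gyro can stand in for a compass, heading is estimated by integrating successive yaw-rate samples. The Python sketch below is a hedged illustration only: the on-board code is written in C, and the sample interval and axis assignment here are assumed values, not ones taken from the helicopter firmware.

```python
# Hypothetical sketch: heading estimation by integrating a rate gyro,
# as a sun-compass stand-in. Sample rate and gyro values are illustrative.

def integrate_heading(heading_deg, rate_dps, dt_s):
    """Advance the heading estimate by one gyro sample.

    heading_deg: current heading estimate in degrees
    rate_dps:    yaw rate reported by the gyro, deg/sec
    dt_s:        sample interval in seconds
    """
    return (heading_deg + rate_dps * dt_s) % 360.0

heading = 0.0
# A constant 90 deg/sec yaw sustained for one second (100 samples at 10 ms)
for _ in range(100):
    heading = integrate_heading(heading, 90.0, 0.01)
# heading is now approximately 90 degrees
```

Integrating rate rather than reading a magnetometer sidesteps the vibration and magnetic interference problems noted above, at the cost of slow drift in the heading estimate.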


Fig. 3. Sensor array board. The 3D-printed mounting board holds the 3 axis digital gyroscope, ultrasonic range finder, and optical flow sensor.

4 Neuron Network Development

Based on known rules for optical flow modulation of flight [35] and sun compass operation [14], we have developed a hypothetical network that integrates a heading deviation sense along with optical flow modulation of yaw, and optical flow-mediated odometry to control the search phase of foraging (Fig. 5). To initiate a search behavior, a target heading and distance are communicated over the Bluetooth module as an analogue to the search vector communicated by the honeybee’s waggle dance [13]. The target heading is conveyed as a set point for the compass network and the target distance as the strength of synaptic connections between optical flow sensory neurons and the odometer neuron. The compass sensory neurons mediate yaw towards the target heading while the helicopter climbs to the desired altitude based on the ultrasonic sensor’s set point [2,42]. Once heading error has been eliminated, inhibition on the Forward Translation command is released and the helicopter pitches forward resulting in forward flight. Translation proceeds with yaw compensation and odometry mediated by optical flow [2,6]. Once the desired distance has been traversed, the odometer neuron fires resulting in inhibition of Forward Translation.
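The causal logic of this search phase can be paraphrased in conventional control flow. The sketch below is only an algorithmic analogy to the spiking network of Fig. 5, not its implementation; the error tolerance, command values, and function name are invented placeholders.

```python
# Illustrative paraphrase of the Fig. 5 search-phase logic as plain
# control flow. Thresholds and command magnitudes are made-up placeholders.

def search_step(heading_err_deg, optic_flow_sum, target_flow_sum,
                err_tolerance_deg=5.0):
    """Return (yaw_cmd, forward_cmd) for one control step."""
    if abs(heading_err_deg) > err_tolerance_deg:
        # Compass neurons drive yaw toward the target heading and
        # inhibit the Forward Translation command.
        return (-heading_err_deg, 0.0)
    if optic_flow_sum >= target_flow_sum:
        # Odometer neuron fires: target distance traversed, stop.
        return (0.0, 0.0)
    # Heading error eliminated: inhibition released, pitch forward.
    return (0.0, 1.0)
```

In the actual network these transitions emerge from synaptic inhibition and accumulated optical flow drive rather than explicit branches.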

5 Neuron Network Implementation

The implementation of a nervous system simulation on the helicopter platform is a three-step process that has been developed to allow for rapid prototyping and implementation in C on a variety of processors. The DTM model is capable of modeling both spiking and bursting activity [30]. Here we simplified the model to achieve robust integration of synaptic input without higher-order bursting. A nonlinear function (1) is responsible for spiking behavior and generates a new current value $x_{n+1}$ when passed the present current $x_n$, the previous current $x_{n-1}$, and a synaptic current input $u$. The variables $\alpha$ and $\sigma$ determine a baseline spiking or bursting activity profile.

$$x_{n+1} = f_n(x_n, x_{n-1}, u) = \begin{cases} \alpha/(1-x_n) + u, & x_n \le 0 \\ \alpha + u, & 0 < x_n < \alpha + u \text{ and } x_{n-1} \le 0 \\ -1, & x_n \ge \alpha + u \text{ or } x_{n-1} > 0 \end{cases} \quad (1)$$
Function (2) sums $\sigma$ with the excitatory synaptic currents $cIe$, the inhibitory currents $cIi$, and the exogenous injected current $Idc$, and scales the cell’s transient response to them by the factors $\beta_E$, $\beta_I$, and $\beta_{DC}$, respectively.

$$u = \beta_E(cIe) + \beta_I(cIi) + \beta_{DC}(Idc) + \sigma \quad (2)$$

The synaptic current $I$ is calculated by:

$$I_{n+1} = \gamma I_n - \gamma_{Inh}\left(\mathrm{spike}\,(x^{post}_n - x_{rp})\right) \quad (3)$$

where $\gamma$ is the synaptic strength, $\gamma_{Inh}$ the relaxation rate, $x^{post}_n$ the postsynaptic current, and $x_{rp}$ the reversal potential. The variable $\mathrm{spike}$ signifies when an action potential has occurred; it is either 1 or 0 based on the state of equation (1).

Based on the causal network of Figure 5, the parameters of the network are tuned using LabVIEW™ software. Virtual instruments (VIs) for neurons (equations 1 and 2) and synapses (equation 3) are constructed. Neuron VIs are connected together with synapse VIs based on the proposed network layout. Neuron behaviors are set by varying $\alpha$ and $\sigma$. Synapses are characterized as excitatory


Fig. 5. Helicopter sensory network layout. Circles represent modeled neurons and lines represent synapse topology: filled triangles represent excitatory synapses, filled circles represent inhibitory. Black circles are sensory neurons, white circles are interneurons and grey circles are motor neurons. Compass Right and Compass Left neurons respond to heading error to either the right or left, respectively. Optical flow Right and Optical flow Left respond to translational optical flow from separate sensors facing laterally outwards. The Ultrasonic neuron responds to the error between the target and current altitude.

or inhibitory by setting $x_{rp}$ to a positive or negative number, respectively. Once the network settings have been tuned for optimal performance in software simulation, the network hypothesis is instantiated in C code for robot operation. A function for each type of neuron and synapse is created in a structure with the same variables to mimic the LabVIEW™ architecture.
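For illustration, the simplified DTM neuron (equations 1 and 2) and synapse (equation 3) can be sketched as one function per element, mirroring the structure described for the C implementation. This is a hedged Python sketch, not the firmware itself; the driving current is an example value, and $\alpha$ and $\sigma$ are taken from the spiking regime quoted in Fig. 6.

```python
# Minimal sketch of the simplified DTM model, eqs. (1)-(3).

def neuron_step(x, x_prev, u, alpha):
    """One iteration of the map in equation (1); returns the new x."""
    if x <= 0:
        return alpha / (1.0 - x) + u
    if x < alpha + u and x_prev <= 0:
        return alpha + u
    return -1.0  # reset after the spike peak

def synaptic_input(cIe, cIi, Idc, sigma, bE=1.0, bI=1.0, bDC=1.0):
    """Equation (2): total input u from synaptic and injected currents."""
    return bE * cIe + bI * cIi + bDC * Idc + sigma

def synapse_step(I, spiked, x_post, gamma, gamma_inh, x_rp):
    """Equation (3): update the synaptic current I (spiked is 1 or 0)."""
    return gamma * I - gamma_inh * (spiked * (x_post - x_rp))

# Drive a single neuron with a constant injected current and count spikes.
alpha, sigma = 4.05, -3.10   # spiking-regime values from Fig. 6
x, x_prev = -1.0, -1.0
spikes = 0
for _ in range(1000):
    u = synaptic_input(0.0, 0.0, 1.0, sigma)   # Idc = 1.0 is illustrative
    x_new = neuron_step(x, x_prev, u, alpha)
    if x_new == -1.0 and x > 0:   # falling reset marks a spike
        spikes += 1
    x_prev, x = x, x_new
```

With these parameters the map settles into a regular limit cycle, spiking once every few iterations; increasing Idc raises the spike frequency, which is the handle the sensory mappings below exploit.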

6 Transforming Sensor Data to Neuronal Activity

Sensory information can be transformed into a scaled injected current that drives sensory neurons. The sensor outputs are converted to neuronal spike trains by a variety of mechanisms such as raw value scaling, differential error scaling, and range fractionation [43]. For example, the optical flow sensor generates a scalar representing instantaneous optical flow. This scalar serves as an input to an optical flow interneuron that encodes optical flow as a spiking neuronal output (Fig. 6). As instantaneous optical flow increases, the raw sensor value increases leading to an increase in spike frequency in the corresponding sensory neuron.
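Two of these mechanisms, raw value scaling and range fractionation, can be sketched as follows. The ranges, scale factors, and function names are placeholders for illustration, not values from the helicopter firmware.

```python
# Hedged illustration of sensor-to-neuron mappings: raw value scaling
# (one neuron, graded drive) and range fractionation (several neurons,
# each covering one band of the sensor's range).

def scale_to_current(raw, raw_min, raw_max, i_min, i_max):
    """Linearly map a raw sensor reading to an injected current Idc."""
    frac = (raw - raw_min) / float(raw_max - raw_min)
    frac = min(max(frac, 0.0), 1.0)        # clamp out-of-range readings
    return i_min + frac * (i_max - i_min)

def range_fractionate(raw, band_edges):
    """Return a per-neuron drive list: each sensory neuron owns one band."""
    return [1.0 if lo <= raw < hi else 0.0
            for lo, hi in zip(band_edges, band_edges[1:])]
```

Under raw value scaling, a rising optical flow reading raises Idc, and hence the sensory neuron's spike frequency, as in Fig. 6; under range fractionation, different neurons fire for different portions of the sensor's range.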


Fig. 6. Optical flow sensory input is transformed to a neuronal pulse code. An optical flow sensor was presented a moving visual stimulus consisting of a sinusoidal black/white contrast pattern with a period of 0.5 in [27]. The top trace shows the speed of the optical flow stimulus profile (1.42 in/sec from 1.5 to 4.0 sec, 3.70 in/sec from 4.0 to 6.5 sec, and 6.94 in/sec from 6.5 to 9.0 sec). The middle trace shows the reported instantaneous optical flow values from the sensor. The bottom trace shows the response of a sensory neuron coding this sensory input (α = 4.05, σ = -3.10). The sensor was mounted 5 inches away from a computer monitor displaying the stimulus.

The control parameters α and σ can be adjusted to modify the neuron’s response to sensory input.

7 Controlling Motors with Neuronal Activity

Translating neuronal activity to motor actuation is the final step necessary to establish a complete robotic implementation of a nervous system simulation. While generally not an issue for robots with shape memory alloy actuators such as RoboLamprey [43], since neuronal spikes can directly activate such effectors, a method to transform spikes to a PWM signal is needed for the helicopter platform. The transformation from spike frequency to pulse width modulated duty cycle is governed by the following formula:

$$PWM_{DC} = \begin{cases} -100\left(\dfrac{\Delta_{iterations}}{Max_{iterations}} - 1\right), & \Delta_{iterations} < Max_{iterations} \\ 0, & \Delta_{iterations} \ge Max_{iterations} \end{cases} \quad (4)$$

where $PWM_{DC}$ is the duty cycle used to drive the motors, $\Delta_{iterations}$ is the number of iterations since the last spike, and $Max_{iterations}$ is the maximum number of iterations between spikes before $PWM_{DC}$ is considered 0.


$PWM_{DC}$ is calculated when the motor neuron fires and represents the motor neuron spike frequency. $Max_{iterations}$ is set to the maximum number of iterations of the code that can elapse before the motor neuron being transformed to a PWM signal is considered silent. $\Delta_{iterations}$ is incremented by 1 on every iteration of the code. The ratio of $\Delta_{iterations}$ to $Max_{iterations}$ determines $PWM_{DC}$. As spike frequency increases across its range of values, the output PWM signal increases proportionally, allowing for control of motors and servos (Fig. 7).
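Equation (4) can be sketched directly. The silence threshold below is an assumed placeholder, not a value from the firmware; a threshold near 5000 iterations approximately reproduces the duty cycles reported in Fig. 7 (3754 iterations ≈ 25%, 2521 ≈ 50%, 1288 ≈ 75%).

```python
# Sketch of the spike-to-PWM transform of equation (4).
# MAX_ITER is an illustrative placeholder for the silence threshold.

MAX_ITER = 5000

def pwm_duty(delta_iterations, max_iterations=MAX_ITER):
    """Duty cycle (%) from the iteration count since the last spike."""
    if delta_iterations >= max_iterations:
        return 0.0  # neuron considered silent: motor off
    return -100.0 * (delta_iterations / float(max_iterations) - 1.0)
```

Shorter inter-spike intervals (higher spike frequency) thus yield proportionally higher duty cycles, and a silent neuron drives the motor to zero.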

Fig. 7. Transformation of spike frequency to duty cycle. As the spike frequency of a motor neuron increases, the PWM output to the associated motor increases. Top panel: When a spike occurs every 3754 iterations, the injected current (Idc) equals 0.2836, which corresponds to a 25% duty cycle. Middle panel: 2521 iterations per spike, Idc = 1.2852, 50% duty cycle. Bottom panel: 1288 iterations per spike, Idc = 1.2932, 75% duty cycle.

8 Discussion

Here we have shown the development of a flying biomimetic robot platform that can be used to test embodied simulations of neuronal network function for the control of flight behavior. The custom electronics platform presented allows for the use of a variety of sensors, including an ultrasonic range finder, optical flow sensors, and a three-axis gyroscope. Using a discrete time map-based neuron and synapse model, neural network hypotheses can be simulated and tested on the flying robot. At this stage in the development of a neuronal network-based controller for helicopter flight, there exist two main challenges in approaching biomimetic control. The first challenge is to transform sensory information into the pulse-interval code of biological networks. We have shown that by using biologically relevant sensors, sensory data can serve as input to interneurons and motor neurons to drive reflexive behaviors. The second challenge is to translate motor neuron spikes into PWM signals to drive DC motors and servos.


By using an iteratively updating equation to convert spike frequency to PWM, the activity of a modeled neuron can drive a motor as shown in Figure 7. This complete biomimetic platform allows for the testing of nervous system hypotheses related to network connectivity and neuronal principles in general, such as varied basal neuron activity levels and the role of efference copy in multimodal sensory fusion [41]. Forthcoming work will use this platform to test a neuronal compass [42] and a neuron-based optical flow odometer [6] in the field. Testing of sensor fusion hypotheses demonstrating complex behaviors will follow.

While this platform is a powerful tool for the testing of nervous system hypotheses related to insect flight control, the general framework presented can be adapted for a wide range of experimentation. The technological capability to run real-time neural network simulations on board autonomous robots has remarkable implications for the advancement of synthetic neuroscience research. The approach to software modeling of neuronal activity presented here can be implemented on almost any microprocessor, including Arduino boards and the LEGO™ NXT brick [5]. Neuroscientists can use these robotic tools to test their own network hypotheses stemming from neurophysiological and neuroethological investigations.

We have previously developed controllers for underwater walking [3] and swimming [43] robots. The work presented here extends this implementation to a simulation of asynchronous flight and confirms that the command neuron, coordinating neuron, central pattern generator model can be implemented on a range of robotic platforms with varied sensors and actuators. We plan to adapt and miniaturize this technology for the control of Robobee. Biomimetic embodied nervous system simulations show great promise for investigating neuron operation in vivo.
Comparative testing between animal and robot simulation can serve to elucidate deficiencies in hypotheses and inform further study. The development of a biomimetic framework adaptable to a flying platform shows that the approach can be used to address research questions in a broad range of biological systems and offer new control techniques for autonomous robotics.

References

1. Ayers, J., Rulkov, N., Knudsen, D., Kim, Y.-B., Volkovskii, A., Selverston, A.: Controlling Underwater Robots with Electronic Nervous Systems. Applied Bionics and Biomimetics 7, 57–67 (2010)
2. Ayers, J., Blustein, D., Westphal, A.: A Conserved Biomimetic Control Architecture for Walking, Swimming and Flying Robots. In: Prescott, T.J., Lepora, N.F., Mura, A., Verschure, P.F.M.J. (eds.) Living Machines 2012. LNCS, vol. 7375, pp. 1–12. Springer, Heidelberg (2012)
3. Ayers, J., Witting, J.: Biomimetic Approaches to the Control of Underwater Walking Machines. Phil. Trans. R. Soc. Lond. A 365, 273–295 (2007)
4. Baader, A., Schäfer, M.: The perception of the visual flow field by flying locusts: A behavioural and neuronal analysis. J. Exp. Biol. 165, 137–160 (1992)
5. Blustein, D., Rosenthal, N., Ayers, J.: Designing and implementing nervous system simulations on LEGO robots. J. of Visualized Experiments (in press, 2013)


6. Blustein, D., Westphal, A., Ayers, J.: Optical flow mediates biomimetic odometry on an autonomous helicopter (in preparation, 2013)
7. Boles, L.C., Lohmann, K.J.: True navigation and magnetic maps in spiny lobsters. Nature 421(6918), 60–63 (2003)
8. Chahl, J., Rosser, K., Mizutani, A.: Bioinspired optical sensors for unmanned aerial systems. In: Proceedings of SPIE: Bioinspiration, Biomimetics, and Bioreplication, vol. 7975, pp. 0301–0311 (2011)
9. Conroy, J., Gremillion, G., Ranganathan, B., Humbert, J.S.: Implementation of wide-field integration of optic flow for autonomous quadrotor navigation. Auton. Robot. 27(3), 189–198 (2009)
10. Dantu, K., Kate, B., Waterman, J., Bailis, P., Welsh, M.: Programming micro-aerial vehicle swarms with Karma. In: Proceedings of the 9th ACM Conference on Embedded Networked Sensor Systems. ACM (2011)
11. Dickinson, M.H., Tu, M.S.: The function of dipteran flight muscle. Comparative Biochemistry and Physiology Part A: Physiology 116(3), 223–238 (1997)
12. Duhamel, P.-E.J., Perez-Arancibia, N.O., Barrows, G.L., Wood, R.J.: Biologically Inspired Optical-Flow Sensing for Altitude Control of Flapping-Wing Microrobots. IEEE/ASME Trans. Mechatron. 18(2), 556–568 (2013)
13. Dyer, F.C.: The biology of the dance language. Annual Review of Entomology 47(1), 917–949 (2002)
14. Dyer, F.C., Dickinson, J.A.: Sun-compass learning in insects: Representation in a simple mind. Current Directions in Psychological Science 5(3), 67–72 (1996)
15. Finio, B.M., Wood, R.J.: Open-loop roll, pitch and yaw torques for a robotic bee. In: IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS) (2012)
16. Fraser, P.J.: Statocysts in Crabs: Short-Term Control of Locomotion and Long-Term Monitoring of Hydrostatic Pressure. Biol. Bull. 200(2), 155–159 (2001)
17. Ibbotson, M.: Wide-field motion-sensitive neurons tuned to horizontal movement in the honeybee, Apis mellifera. J. Comp. Physiol. A: Neuroethology, Sensory, Neural, and Behavioral Physiology 168(1), 91–102 (1991)
18. Joesch, M., Weber, F., Eichner, H., Borst, A.: Functional Specialization of Parallel Motion Detection Circuits in the Fly. J. Neuroscience 33(3), 902–905 (2013)
19. Kate, B., Waterman, J., Dantu, K., Welsh, M.: Simbeeotic: A simulator and testbed for micro-aerial vehicle swarm experiments. In: Proceedings of the 11th International Conference on Information Processing in Sensor Networks. ACM (2012)
20. Kennedy, D., Davis, W.J.: The organization of invertebrate motor systems. In: Geiger, S.R., Kandel, E.R., Brookhart, J.M., Mountcastle, V.B. (eds.) Handbook of Physiology, sec. I, vol. I, part 2, pp. 1023–1087. Amer. Physiol. Soc., Bethesda (1977)
21. Kiehn, O.: Development and functional organization of spinal locomotor circuits. Current Opinion in Neurobiology 21(1), 100–109 (2011)
22. Lobo, J., Ferreira, J.F., Dias, J.: Bioinspired visuo-vestibular artificial perception system for independent motion segmentation. In: Second International Cognitive Vision Workshop, ECCV 9th European Conference on Computer Vision, Graz, Austria (2006)
23. Lu, J., Yang, J., Kim, Y.B., Ayers, J.: Low Power, High PVT Variation Tolerant Central Pattern Generator Design for a Bio-hybrid Micro Robot. In: IEEE International Midwest Symposium on Circuits and Systems, vol. 55, pp. 782–785 (2012)
24. Ma, K.Y., Chirarattananon, P., Fuller, S.B., Wood, R.J.: Controlled Flight of a Biologically Inspired, Insect-Scale Robot. Science 340(6132), 603–607 (2013)


25. Paulk, A., Millard, S.S., van Swinderen, B.: Vision in Drosophila: Seeing the World Through a Model’s Eyes. Annual Review of Entomology 58, 313–332 (2013)
26. Pearson, K.G.: Common principles of motor control in vertebrates and invertebrates. Annu. Rev. Neurosci. 16, 265–297 (1993)
27. Peirce, J.: PsychoPy – Psychophysics software in Python. J. Neurosci. Methods 162, 8–13 (2007)
28. Prescott, T.J., Lepora, N.F., Mura, A., Verschure, P.F.M.J. (eds.): Living Machines 2012. LNCS, vol. 7375. Springer, Heidelberg (2012)
29. Rabinovich, M.I., Selverston, A., Abarbanel, H.D.I.: Dynamical principles in neuroscience. Reviews of Modern Physics 78(4), 1213–1265 (2006)
30. Rulkov, N.F.: Modeling of spiking-bursting neural behavior using two-dimensional map. Phys. Rev. E 65, 041922 (2002)
31. Rutkowski, A.J., Miller, M.M., Quinn, R.D., Willis, M.A.: Egomotion estimation with optic flow and air velocity sensors. Biol. Cybern. 104(6), 351–367 (2011)
32. Sarpeshkar, R.: Analog versus digital: extrapolating from electronics to neurobiology. Neural Computation 10(7), 1601–1638 (1998)
33. Srinivasan, M.V.: Honey bees as a model for vision, perception, and cognition. Annual Review of Entomology 55, 267–284 (2010)
34. Srinivasan, M.V.: Honeybees as a model for the study of visually guided flight, navigation, and biologically inspired robotics. Physiol. Reviews 91(2), 413–460 (2011)
35. Srinivasan, M.V.: Visual control of navigation in insects and its relevance for robotics. Current Opinion in Neurobiology 21(4), 535–543 (2011)
36. Srinivasan, M., Zhang, S., Lehrer, M., Collett, T.: Honeybee navigation en route to the goal: visual flight control and odometry. J. Exp. Biol. 199, 237–244 (1996)
37. Stein, P.S.G., Grillner, S., Selverston, A.I., Stuart, D.: Neurons, Networks and Motor Behavior. MIT Press, Cambridge (1997)
38. Teoh, Z.E., Fuller, S.B., Chirarattananon, P.: A hovering flapping-wing microrobot with altitude control and passive upright stability. In: 2012 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), pp. 3209–3216 (2012)
39. Webb, B.: Can robots make good models of biological behaviour? Behav. Brain Sci. 24(6), 1033–1050 (2001)
40. Webb, B.: Robots in invertebrate neuroscience. Nature 417(6886), 359–363 (2002)
41. Webb, B., Reeve, R.: Reafferent or redundant: integration of phonotaxis and optomotor behavior in crickets and robots. Adaptive Behavior 11(3), 137–158 (2003)
42. Westphal, A., Ayers, J.: A neuronal compass for autonomous biomimetic robots (in preparation, 2013)
43. Westphal, A., Rulkov, N., Ayers, J., Brady, D., Hunt, M.: Controlling a lamprey-based robot with an electronic nervous system. Smart Struct. Sys. 8(1), 37–54 (2011)
44. Wiersma, C.A., Yamaguchi, T.: Integration of visual stimuli by the crayfish central nervous system. J. Exp. Biol. 47(3), 409–431 (1967)
45. Wood, R.J., Avadhanula, S., Steltz, E., Seeman, M., Entwistle, J., Bachrach, A., Barrows, G., Sanders, S.: An autonomous palm-sized gliding micro air vehicle. IEEE Robotics and Automation Magazine 14(2), 82–91 (2007)
46. Yorozu, S., Wong, A., Fischer, B., Dankert, H., Kernan, M., Kamikouchi, A., Ito, K., Anderson, D.: Distinct sensory representations of wind and near-field sound in the Drosophila brain. Nature 458, 201–205 (2009)
