Annu. Rev. Neurosci. 2004. 27:679–96. doi: 10.1146/annurev.neuro.27.070203.144343. Copyright © 2004 by Annual Reviews. All rights reserved.
VISUAL MOTOR COMPUTATIONS IN INSECTS

Mandyam V. Srinivasan and Shaowu Zhang
Center for Visual Science, Research School of Biological Sciences, Australian National University, P.O. Box 475, Canberra, A.C.T. 2601, Australia; email: [email protected], [email protected]
Key Words: fly, bee, vision, navigation, behavior

Abstract: With their relatively simple nervous systems and purpose-designed behaviors and reflexes, insects are excellent organisms in which to investigate how visual information is acquired and processed to guide locomotion and navigation. Flies maintain a straight course and monitor their motion through the environment by sensing the patterns of optic flow induced in the eyes. Bees negotiate narrow gaps by balancing the speeds of the images in their two eyes, and they control flight speed by holding the average image velocity constant. Bees achieve a smooth landing on a horizontal surface by holding the image velocity of the surface constant during the approach, thus ensuring that flight speed is automatically close to zero at touchdown. Foraging bees estimate the distance that they have traveled to reach a food source by integrating the optic flow experienced en route; this integration gives them a visually driven "odometer." Insects have also evolved sophisticated visuomotor mechanisms for pursuing prey or mates, and possibly for concealing their own motion while shadowing objects of interest.
INTRODUCTION

A glance at a fly evading a rapidly descending hand or orchestrating a flawless landing on the rim of a teacup would convince even the most sceptical observer that this insect possesses exquisite visuomotor control, despite its small brain and relatively simple nervous system. Most insects have compound eyes consisting of a large number of facets, or ommatidia. Each ommatidium includes a lens that focuses light onto a small group of six to nine photoreceptors, and each lens has a visual field that is typically a few degrees in width. The visual axes of neighboring ommatidia are separated by a few degrees, so that the two compound eyes (each containing a few thousand ommatidia in a fly or honeybee, for example) together provide a near-panoramic view of the environment (Wehner 1981). The optics of insect eyes are quite different from those of our own so-called simple eyes, each of which has a single lens and a retina containing approximately one hundred million photoreceptors. In contrast to the composition of insects' eyes, this arrangement endows humans with restricted
peripheral vision, but the strong overlap between the visual fields of the two eyes enables good quality stereo vision. The visual systems of insects differ from those of humans in ways that have more profound consequences for vision and behavior. Unlike vertebrates, insects have immobile eyes with fixed-focus optics. They cannot infer the distance of an object from the extent to which the directions of gaze must converge to view the object, nor can they judge distance by monitoring the refractive power that is required to bring the object’s image into focus on the retina. Compared with human eyes, the eyes of insects are positioned much closer together, and they possess inferior spatial acuity. Even if an insect had the neural apparatus required for binocular stereopsis, such a mechanism would be relatively imprecise and restricted to ranges of a few centimeters (Collett & Harkness 1982, Horridge 1987, Rossell 1983, Srinivasan 1993). Not surprisingly, therefore, insects have evolved alternative visual strategies for guiding behavior and locomotion in their three-dimensional world. Many of these strategies use cues derived from the image motion that insects experience as they move in their environment. Vision in insects is a very active process in which perception and action are tightly coupled. This review outlines a few examples of visuomotor control in insects and highlights the underlying principles. It is by no means exhaustive, but it does provide references to additional topics and more complete accounts.
STABILIZING FLIGHT

For insects, vision provides an important sensory input for flight stabilization. If an insect flying along a straight line is blown to the left by a gust of wind, the image on its frontal retina moves to the right. This causes the flight motor system to generate a counteractive yaw torque that brings the insect back on course (Reichardt 1969). Similar control mechanisms act to stabilize pitch and roll (e.g., Srinivasan 1977). This optomotor response (Reichardt 1969) is an excellent experimental paradigm with which to probe the neural mechanisms underlying motion detection. Largely through studies of the optomotor response, we now know that flies sense the direction of image movement by correlating the intensity variations registered by neighboring ommatidia in the compound eye (Reichardt 1969). Thus, the front end of the movement-detecting pathway consists of an array of elementary movement detectors (EMDs) that perform these correlations. Different sets of EMDs are used to detect motion in various directions by correlating signals from ommatidia that are appropriately positioned relative to each other. During the past 30 years, researchers have discovered several motion-sensitive neurons with large visual fields, each responding preferentially to motion in a specific direction (Hausen 1993, Hausen & Egelhaaf 1989) or to the fly's rotation around a specific axis (Krapp 2000, Krapp & Hengstenberg 1996). These neurons derive their sensitivity and selectivity by pooling signals from EMDs that have the appropriate directional selectivity in different regions of the compound eye. They are likely to play an important role in stabilizing flight and providing the insect with a visually kinesthetic sense. The properties of these motion-sensitive neurons have been reviewed
extensively (e.g., Egelhaaf & Borst 1993, Hausen 1993) and we do not elaborate here. It is generally supposed that motion-sensitive neurons play an important role in the insect’s “autopilot” mechanism by detecting deviations from the intended course and generating corrective turning commands. However, the precise means by which course stabilization is achieved remains elusive. Motion-sensitive neurons possess large visual fields, each typically covering most of one eye. Therefore, steering a straight course can only be achieved by balancing the responses of two neurons, each sensitive to front-to-back motion in one eye. Such a scheme works well only when the insect is flying in a symmetrically structured environment. It does not work when the insect flies along a cliff, for example, because the eye that faces the cliff experiences substantially greater image motion than does the contralateral eye. The only way to steer a straight course in an asymmetrical world (which is more often the rule than the exception) is to sense and compensate for image motion in only a small patch of the visual field that faces the direction along which the insect wishes to fly—the frontal visual field, for example, if the objective is to fly straight ahead. Behavioral evidence suggests that hoverflies adopt just such a strategy (Collett 1980). When flying straight ahead, they minimize image motion within a small visual field that looks in the forward direction. When flying obliquely (as hoverflies often do), they minimize image motion within a small visual field that looks in the appropriate, oblique direction (Collett 1980). However, the neural basis of such steering, which requires visuomotor control via an array of motion-sensitive neurons with small visual fields, remains undiscovered. 
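The correlation scheme at the front end of this pathway can be sketched in a few lines of code. The toy model below (a sketch of the Reichardt correlation principle, not any published implementation; the sinusoidal input and the one-sample delay are illustrative assumptions) correlates each receptor's delayed signal with its neighbor's current signal and subtracts the mirror-image product, yielding a direction-selective output:

```python
import math

def reichardt_emd(a, b, delay=1):
    """Correlation-type elementary movement detector (after Reichardt 1969).

    a, b: equally sampled intensity signals from two neighboring
    photoreceptors. Returns the time-averaged, direction-selective
    response: positive for motion from a toward b, negative for the
    opposite direction. The delay (in samples) stands in for the
    low-pass filter of one detector arm.
    """
    n = len(a)
    out = 0.0
    for t in range(delay, n):
        out += a[t - delay] * b[t] - b[t - delay] * a[t]
    return out / (n - delay)

# A pattern drifting from receptor a toward receptor b: b sees the
# same signal one sample later, so the a-delayed arm correlates well.
signal = [math.sin(0.2 * t) for t in range(200)]
a = signal[1:]   # receptor a sees the pattern first
b = signal[:-1]  # receptor b sees it one sample later
assert reichardt_emd(a, b) > 0   # motion a -> b: positive response
assert reichardt_emd(b, a) < 0   # reversed motion: negative response
```

Arrays of such detectors, oriented along different ommatidial axes, give the directionally selective subunits that the wide-field neurons described above are thought to pool.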
In flies, course control and stabilization are also aided by the halteres, small hind-wings that oscillate in antiphase with the main wings and act like miniature gyroscopes to provide information on the body's rotation (Dickinson 1999, Nalbach 1993, Nalbach & Hengstenberg 1994). The halteres sense and compensate for rapid rotations, whereas the optomotor reflexes deal with the slower turns (Hengstenberg 1993, Sherman & Dickinson 2003). Long-term course control is aided by a celestial compass (Wehner 1997) and by the use of prominent landmarks or beacons in the environment (Collett & Zeil 1998). Visual stabilization of roll and pitch is aided by the ocelli, three single-lens eyes situated on top of the head. Each ocellus has a relatively large visual field, more than 40° in width. The two laterally directed ocelli stabilize roll by monitoring the position of the horizon on either side. The medial ocellus stabilizes pitch by monitoring the elevation of the horizon in the frontal field (Stange 1981, Stange & Howard 1979, Stange et al. 2002, Wilson 1978). The neural pathways mediating these reflexes remain to be investigated.
NEGOTIATING NARROW GAPS

When a bee flies through a hole in a window, it tends to fly through the center, balancing the distances to the left and right boundaries of the opening. How does it gauge and balance the distances to the two rims, given that it does not possess stereo vision?
One possibility is that the bee does not measure distances at all, but simply balances the speeds of image motion experienced by its eyes while flying through the opening. To investigate this possibility, Kirchner & Srinivasan (1989) trained bees to enter an apparatus that offered a reward of sugar solution at the end of a tunnel. Each side wall had a pattern consisting of a vertical black-and-white grating (Figure 1). The grating on one wall could be moved horizontally either toward or away from the reward at any speed. After the bees had received several rewards with the gratings stationary, they were filmed from above as they flew along the tunnel. When both gratings were stationary, the bees tended to fly along the midline of the tunnel, i.e., equidistant from the two walls (Figure 1a). But when one of the gratings was moved at a constant speed in the direction of the bees' flight—thereby reducing the speed of retinal image motion on one eye relative to the other—the bees' trajectories shifted toward the side of the moving grating (Figure 1b). When the grating moved in a direction opposite to that of the bees' flight—thereby increasing the speed of retinal image motion on one eye relative to the other—the bees' trajectories shifted away from the side of the moving grating (Figure 1c). When the walls were stationary, the bees maintained equidistance by balancing the speeds of the retinal images experienced by their two eyes. A lower image speed on one eye caused the bee to move closer to the wall seen by that eye. A higher image speed, on the other hand, had the opposite effect.
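The balancing rule suggested by these experiments can be captured in a toy simulation. In the sketch below, the tunnel width matches the 12-cm tunnel of Figure 1, but the flight speed, steering gain, and update rule are illustrative assumptions rather than measured bee parameters:

```python
def image_speed(v_bee, v_wall, dist):
    # Angular velocity (rad/s) of a wall's image: relative speed / distance.
    return (v_bee - v_wall) / dist

def fly_tunnel(width, v_bee, v_left_wall=0.0, v_right_wall=0.0,
               gain=0.005, steps=2000):
    """Toy centering controller (after Kirchner & Srinivasan 1989):
    drift toward the eye that experiences the slower image motion.
    Returns the equilibrium distance y (m) from the left wall."""
    y = 0.25 * width  # start off-center
    for _ in range(steps):
        err = (image_speed(v_bee, v_left_wall, y)
               - image_speed(v_bee, v_right_wall, width - y))
        # Left image faster -> too close to left wall -> move right.
        y += gain * err * y
    return y

# Stationary walls: the bee settles on the midline (6 cm from each wall).
assert abs(fly_tunnel(0.12, 2.0) - 0.06) < 1e-4
# Left wall moving with the bee (slower image): bee shifts toward it.
assert fly_tunnel(0.12, 2.0, v_left_wall=1.0) < 0.05
# Left wall moving against the bee (faster image): bee shifts away.
assert fly_tunnel(0.12, 2.0, v_left_wall=-1.0) > 0.07
```

The equilibrium of this rule reproduces all three panels of Figure 1 qualitatively: equal image speeds put the bee on the midline, and a wall moving with or against the flight direction shifts the equilibrium toward or away from that wall.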
Experiments in which the contrasts and the periods of the gratings on the two sides were varied revealed that this centering response is rather robust to variations in these parameters: Bees continued to fly through the middle of the tunnel even when the contrasts of the gratings on the two sides were substantially different or when their periods varied by a factor of as much as four (Srinivasan et al. 1991). These findings suggest that the bee’s visual system is capable of computing the speed of the image of a grating independently of its contrast and spatial-frequency content (Srinivasan et al. 1991). The neural basis of this capacity remains to be discovered, although there is now some evidence that certain neurons in the visual pathways of the fly (Dror et al. 2001) and honeybee (Ibbotson 2001) encode image speed independently of the spatial texture of the image. The movement-detecting subsystem that mediates the centering response appears to be different, qualitatively as well as quantitatively, from the subsystem that mediates the optomotor response (Srinivasan et al. 1993, Srinivasan & Zhang 1997).
CONTROLLING FLIGHT SPEED

Do insects control the speed of their flight, and if so, how? Work by David (1982) and by Srinivasan et al. (1996) suggests that flight speed is controlled by monitoring the velocity of the image of the environment. David (1982) observed fruit flies flying upstream in a wind tunnel, lured by the scent of fermenting banana. The inside wall of the cylindrical wind tunnel was lined with a helical black-and-white striped pattern so that rotation of the cylinder
Figure 1 Experiment investigating the centering response of bees. Bees are trained to fly through a tunnel 40 cm long, 12 cm wide, and 20 cm high to collect a reward placed at the far end. The flanking walls of the tunnel are lined with vertical black-and-white gratings of period 5 cm. The flight trajectories of bees, as recorded by a video camera positioned above the tunnel, are shown (a–c). In each panel, the shaded area represents the mean and standard deviation of the positions of the flight trajectories, analyzed from recordings of several hundred flights. The dark bars represent the black stripes of the patterns on the walls. The small arrow indicates the direction of bee flight, and the large arrow represents the direction of pattern movement, if any. When the patterns on the walls are stationary, bees tend to fly close to the midline of the tunnel (a). When the pattern on one of the walls is in motion, however, bees tend to fly closer to that wall if the pattern moves in the same direction as the bee (b) and farther away from that wall if the pattern moves in the opposite direction (c). These results indicate that bees balance the distances to the walls of the tunnel by balancing the speeds of image motion that are experienced by their eyes. Adapted from Srinivasan et al. 1991.
around its axis produced apparent movement of the stripes toward the front or the back. The rotational speed of the cylinder (and hence the speed of the backward motion of the pattern) could then be adjusted such that the fly was stationary (i.e., it did not move along the axis of the tunnel). The apparent backward speed of the pattern then revealed the ground speed that the fly was maintaining, as well as the angular velocity of the pattern's image on the fly's eyes. In this setup, fruit flies tended to hold the angular velocity of the image constant. Increasing or decreasing the speed of the pattern caused the fly to move backward or forward, respectively, along the tunnel at a rate such that the angular velocity of the image on the eye stayed at a fixed value. The flies also compensated for headwind in the tunnel, increasing or decreasing their thrust to maintain the same ground speed (as indicated by the angular velocity of image motion on the eye). Experiments in which the angular period of the stripes was varied revealed that the flies were measuring (and holding constant) the angular velocity of the image on the eye, irrespective of the spatial structure of the image. Bees appear to use a similar strategy to regulate flight speed (Srinivasan et al. 1996). When a bee flies through a tapered tunnel, it decreases its flight speed as the tunnel narrows to keep the angular velocity of the image of the walls, as seen by the eye, constant at approximately 320°/s (Figure 2). This suggests that the bee controls flight speed by monitoring and regulating the angular velocity of the environment's image as represented on the eye; that is, if the tunnel's width is doubled, the bee flies twice as fast. On the other hand, a bee flying through a tunnel of uniform width does not change its speed when the spatial period of the stripes lining the walls is abruptly changed (Srinivasan et al. 1996).
This indicates that flight speed is regulated by a visual motion-detecting mechanism that measures the angular velocity of the image independently of its spatial structure. In this respect, the speed-regulating system is similar to the system that mediates the centering response described above. Controlling flight speed by regulating image speed allows the insect to automatically slow down to a safer speed when negotiating a narrow passage or a cluttered environment.
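Under this scheme, the commanded flight speed is simply the image-velocity set point times the distance to the surrounding surfaces. A minimal sketch (the 320°/s set point comes from the tunnel experiments; treating the walls as viewed exactly abeam, so that image angular velocity equals v/d, is a simplifying assumption):

```python
import math

def cruise_speed(wall_distance_m, set_point_deg_s=320.0):
    """Speed that keeps the lateral image angular velocity at the
    set point: for a wall at distance d viewed abeam, omega = v / d,
    so the commanded speed is v = omega * d."""
    return math.radians(set_point_deg_s) * wall_distance_m

# Walls 6 cm away (a 12-cm tunnel): the bee cruises at ~0.34 m/s.
# Doubling the tunnel width doubles the speed, and a narrowing
# passage slows the bee down automatically.
v_narrow = cruise_speed(0.06)
v_wide = cruise_speed(0.12)
assert abs(v_wide / v_narrow - 2.0) < 1e-12
```

No measurement of ground speed or wall distance is needed; holding the image velocity at its set point yields this speed-distance proportionality for free.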
COLLISION AVOIDANCE

When an object or surface is approached head-on, its image expands in the eye of the observer (Figure 3a). This visual cue is used by cruising insects to avoid imminent collisions with surfaces or obstacles. Tammero & Dickinson (2002) filmed and analyzed the trajectories of fruit flies flying in a cylindrical arena lined with a random visual texture. The analysis revealed that the insect consistently turned away from whichever eye experienced greater image expansion, when this expansion exceeded a certain threshold. This reaction ensures that the insect turns away from objects that are dangerously close. Thus, certain neural mechanisms in the insect visual pathway are tuned to detect local image expansion. Such neurons have been found in flies (Borst 1991) and locusts (Gabbiani et al. 2001, Gray
et al. 2001, Judge & Rind 1997), and they may be involved in visual sensing to control landing as well as avoid obstacles. However, at least in the fruit fly, the two behaviors may involve partly different neural pathways (Tammero & Dickinson 2002) as one proceeds downstream from the sensory neurons toward the motor command neurons.
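The turning rule inferred from those trajectories is simple enough to state directly in code (a sketch; the threshold value and the way each eye's expansion is summarized as a single number are illustrative assumptions, not the fly's actual circuitry):

```python
def saccade_direction(left_expansion, right_expansion, threshold=1.0):
    """Expansion-triggered collision avoidance (after Tammero &
    Dickinson 2002): if the image expansion experienced by either
    eye exceeds a threshold, turn away from the side with the
    stronger expansion. Returns 'left', 'right', or None."""
    if max(left_expansion, right_expansion) < threshold:
        return None          # no looming object: hold course
    if left_expansion > right_expansion:
        return 'right'       # obstacle looming on the left: turn right
    return 'left'

assert saccade_direction(2.0, 0.4) == 'right'
assert saccade_direction(0.4, 2.0) == 'left'
assert saccade_direction(0.2, 0.3) is None
```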
ORCHESTRATING SMOOTH LANDINGS

When a flying insect approaches an object to land on it, the insect needs to monitor the expanding image precisely (Figure 3a) so that flight speed may be reduced and the forelegs extended in time. Mistakes could lead to unpleasant consequences. Researchers have studied the landing response, using rotating spirals or moving gratings to simulate visual expansion and measuring the strength of the landing response as the probability of foreleg extension (e.g., Braitenberg & Taddei-Ferretti 1966, Borst & Bahde 1988, Eckert & Hamdorf 1980, Tammero & Dickinson 2002). These studies suggest that the strength of the landing response depends on the spatial-frequency content and contrast of the pattern as well as the speed and duration of the pattern's expansion. According to the model proposed by Borst & Bahde (1988; also supported by recent evidence from Tammero & Dickinson 2002), the landing response is triggered when the time-accumulated output of an expansion-detecting system, based on the correlation model (Reichardt 1969), exceeds a preset threshold. Flies may determine when to initiate a landing by computing the time required to contact the object or surface on which they are about to land (Wagner 1982). When an insect approaches a planar surface in a direction normal to the plane (Figure 3b), the projected time to contact the surface is given by τ = θ/θ̇, if the insect continues to fly toward the surface at a constant velocity (Lee 1976). Here θ is the direction of a visual feature X on the surface, relative to the direction of flight (Figure 3b), and θ̇ is the rate of change (increase) of this angle. Computing time to contact in this way is advantageous because it does not require knowledge (or measurement) of the animal's speed or of its distance from the surface: It only requires measurement of θ and θ̇. Flies commence deceleration approximately 90 ms prior to contact (Wagner
1982), suggesting that their nervous systems compute the projected time to contact. These two strategies provide useful information on when a landing process should commence but not on what the actual landing strategy should be. For example, how rapidly should the insect decelerate, once the landing process is initiated? This question was addressed by Srinivasan et al. (2000b), who filmed bees as they landed on a horizontal surface. On horizontal surfaces, bees usually perform grazing landings in which trajectories are inclined to the surface at an angle that is considerably smaller than 45°. A perpendicular surface approach generates strong looming (image expansion) cues that could be used to control the deceleration of flight. But looming cues are weak when a bee performs a grazing
landing, where the motion of the image of the surface is dominated by a strong, front-to-back translatory component in the ventral visual field of the eye. To investigate how bees execute grazing landings, Srinivasan et al. (1996, 2000b) trained bees to collect a reward of sugar water on a visually textured, horizontal surface. The reward was then removed, and the landings that the bees made on the surface in search of the food were filmed in three dimensions. Analysis of the landing trajectories revealed that the forward speed of the bee decreases steadily as the bee approaches the surface (Figure 4). In fact, the speed of flight is approximately proportional to the height above the surface, indicating that the bee is holding the angular velocity of the surface's image approximately constant as the surface is approached. Analysis of 26 landing trajectories revealed an average image angular velocity of 500°/s ± 270°/s, where the variation reflects the fact that each animal maintained a different (but constant) image velocity (Srinivasan et al. 2000b). Holding the image velocity of the ground constant during landing may be a simple way of decreasing the flight speed progressively (and automatically) and ensuring that its value is close to zero at touchdown. The advantage of such a strategy is that the control is achieved by a simple process and without explicit knowledge of flight speed or distance from the surface. Detailed analysis of the landing trajectories revealed two characteristic properties (Srinivasan et al. 2000b). First, the instantaneous horizontal flight speed is proportional to the instantaneous height above the surface (as described above). Thus the angular velocity of the image is being held constant as the ground is approached. Second, the instantaneous speed of descent is proportional to the instantaneous horizontal speed.
This shows that both the horizontal and the vertical components of flight speed decrease as the ground is approached, and they reach zero simultaneously at touchdown, thus ensuring a safe landing. A mathematical model of the landing process incorporating the above two properties predicts that during landing (a) the height should decrease exponentially as a function of time and (b) the horizontal distance traveled should also be an exponential function of time. Measurement of the time courses of height and horizontal travel distance in actual landings shows that the data are in excellent agreement with the predictions of the model (Figure 5). The value of the image angular velocity that is maintained during landing (~500°/s ± 270°/s) is not significantly different from that observed during cruising flight (320°/s) (see Controlling Flight Speed, above). It is possible, therefore, that cruising and landing are controlled by the same, or very similar, visuomotor mechanisms. The only difference between cruising and landing is that during landing the approach is directed toward a surface, causing the bee to slow down automatically as the surface is approached. In principle, the strategy discussed here could also be used to control landing in a head-on approach toward a surface, if the landing insect holds the rate of image expansion constant as the surface is approached.
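The two properties close into a simple dynamical model: with forward speed held at ω·h (constant image velocity) and descent speed a fixed fraction r of forward speed, dh/dt = −ωr·h, which yields the exponential time courses the model predicts. A numerical sketch (ω = 500°/s is the reported average; the descent ratio r = 0.3 is an illustrative assumption, since the paper reports only that each bee keeps its own ratio constant):

```python
import math

def landing_trajectory(h0, omega_deg_s=500.0, descent_ratio=0.3,
                       dt=0.001, t_end=2.0):
    """Integrate the constant-image-velocity landing model
    (after Srinivasan et al. 2000b). Forward speed is held at
    omega * h and descent speed at descent_ratio * forward speed,
    so height decays as h(t) = h0 * exp(-omega * descent_ratio * t).
    Returns final height and horizontal distance traveled."""
    omega = math.radians(omega_deg_s)
    h, d = h0, 0.0
    for _ in range(int(round(t_end / dt))):
        v_forward = omega * h            # image velocity stays constant
        v_down = descent_ratio * v_forward
        h -= v_down * dt
        d += v_forward * dt
    return h, d

# Starting 0.5 m up, height follows the predicted exponential decay,
# and forward speed (omega * h) reaches zero together with height.
h_final, d_final = landing_trajectory(0.5)
k = math.radians(500.0) * 0.3
h_predicted = 0.5 * math.exp(-k * 2.0)
assert abs(h_final - h_predicted) / h_predicted < 0.02
```

Horizontal travel in this model saturates at h0/r, so both time courses are exponential, as observed in Figure 5.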
Figure 4 Experiment investigating how bees make a grazing landing on a horizontal surface. The graphs show the approximately linear relationship between horizontal flight speed (Vf) and height (h) for two bees (a, b). The landing bee holds the angular velocity of the image of the ground constant at either 241°/s (a) or 655°/s (b), as calculated from the slopes of the linear regression lines. Also shown are the values of the correlation coefficient (r). Holding the angular velocity of the image of the ground constant during the approach ensures that the landing speed is zero at touchdown. From Srinivasan et al. 2000b.
Figure 5 Analysis of honeybee landing trajectories. Variation of height (h) versus time and variation of horizontal distance traveled (d) versus time for two bees (a, b). In each panel, the circles represent experimental measurements, and the curve indicates the theoretical prediction. From Srinivasan et al. 2000b.
ESTIMATING DISTANCE FLOWN

Insects are not merely moment-to-moment navigators concerned only with avoiding collisions or making smooth landings. Honeybees, for example, can navigate accurately and repeatedly to a food source several kilometers away from their nest. Indeed, bees communicate to their nestmates the distance and direction in which to fly to reach the food, through the famous waggle dance (von Frisch 1993). But the cues bees utilize to gauge the distance flown to reach a goal have been a subject of controversy. Early studies of the waggle dance suggested that distance traveled is measured in terms of the total energy expended during flight (reviewed in von Frisch 1993). However, researchers have recently questioned this energy hypothesis, suggesting that an important cue is the extent to which the image of the environment moves in the bee's eye as it flies to the target (Esch & Burns 1995, 1996; Esch et al. 2001; Schöne 1996; Si et al. 2003; Srinivasan et al. 1996, 1997, 2000a). In other words, the honeybee's "odometer" is driven by a visual, rather than an energy-based, signal. Here we describe recent work that led to this new insight. A few years ago, Esch & Burns (1995, 1996) investigated distance measurement by enticing honeybees to find food at a feeder placed 70 meters away from a hive in an open field. They recorded the distance as signaled by the bees when they danced to recruit other nestmates to visit the feeder. When the feeder was 70 meters away, the bees signaled the correct distance. But when the feeder was raised above the ground by attaching it to a helium balloon, the bees signaled a progressively shorter distance as the height of the balloon was increased, despite the fact that the balloon was now farther away from the hive. Esch and Burns explained this finding by proposing that the bees were gauging distance flown in terms of image motion in relation to the ground, rather than through the energy required to reach the feeder.
The higher the balloon was, the lower the total amount of image motion that was experienced by the bee en route to the feeder. This hypothesis was examined by Srinivasan et al. (1996, 1997), who investigated under controlled laboratory conditions the cues by which bees estimate and learn distances flown. Bees were trained to enter a 3.2-m-long tunnel and collect a reward of sugar solution at a feeder placed in the tunnel at a fixed distance from the entrance. The walls and floor of the tunnel were lined with black-and-white gratings perpendicular to the tunnel’s axis (Figure 6a). During training, the position and orientation of the tunnel were changed frequently to prevent the bees from using any external landmarks to gauge their location relative to the tunnel entrance. The bees were then tested by recording their searching behavior in an identical fresh tunnel that carried no reward and was devoid of any scent cues. In the tests, the bees showed a clear ability to search for the reward at the correct distance (see Figure 6b). How did the bees gauge the distance they had flown in the tunnel? Tests were carried out to examine the participation of a variety of potential cues, including energy consumption, time of flight, airspeed integration, and inertial navigation (Srinivasan et al. 1997). Results show that the bees estimated distance flown by
integrating over time the image motion of the walls as registered by their eyes while they flew through the tunnel. In one experiment (Srinivasan et al. 1997) bees were trained and tested in conditions where image motion was eliminated or reduced by using axially oriented stripes on the walls and floor of the tunnel. The bees showed no ability to gauge distance traveled: In these tests, they searched uniformly over the entire length of the tunnel, showing no tendency to stop or turn at the former location of the reward. Trained bees tended to search for the feeder at the same location in the tunnel, even if the period of the gratings lining the walls and floor was varied in the tests. These findings reveal that the odometer integrates image motion robustly and reliably over a fourfold variation in the spatial period of the grating (see Figure 6b). How far do bees “think” they have flown when they return from one of these tunnels? Srinivasan et al. (2000a) and Esch et al. (2001) trained bees to fly directly from a hive into a short, narrow tunnel that was placed close to the hive entrance. The tunnel was 6 m long and 11 cm wide, and its walls and floor were lined with a random visual texture. A feeder was placed 6 m into the tunnel. The dances of bees returning from this feeder were filmed and analyzed. These bees signaled a flight distance of approximately 200 m, despite having flown only a small fraction of this distance. Thus the bees overestimated the distance they had flown in the tunnel because the proximity of the walls and floor of the tunnel greatly magnified the optic flow that they experienced in comparison with what normally occurs when foraging outdoors. This experiment reinforces the conclusion that image motion is the dominant cue that bees use to gauge how far they have traveled. 
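Because the odometer integrates image angular velocity over the flight, its reading for a path flown at a constant distance d from the surrounding surfaces is simply path length divided by d, which makes the tunnel result easy to reproduce on paper. A back-of-envelope sketch (the effective outdoor surround distance used for the dance calibration is an assumption chosen for illustration; the papers do not report a single such value):

```python
def odometer_reading(path_length_m, surface_distance_m):
    """Accumulated image motion (radians) for a flight past surfaces
    at a constant distance: the integral of (v / d) dt is path / d."""
    return path_length_m / surface_distance_m

# The tunnel of Srinivasan et al. (2000a) was 6 m long and 11 cm
# wide, so the walls were about 5.5 cm from the flight path:
tunnel_reading = odometer_reading(6.0, 0.055)    # ~109 radians

# A bee whose dance is calibrated outdoors converts that reading back
# to distance using the much larger distance of outdoor surfaces. An
# assumed effective surround distance of ~1.8 m yields:
signaled = tunnel_reading * 1.8                  # ~196 m
assert 190 < signaled < 200   # consistent with the ~200 m the bees danced
```

The same arithmetic explains the balloon experiment: raising the feeder increases the distance to the ground, reducing the accumulated image motion and hence the signaled distance.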
The motion-detecting system that underlies distance estimation, as measured by the waggle dance, seems to be rather robust to variations in the spatial texture and contrast of the environment (Si et al. 2003).

In another experiment, Ugolini (1987) transported wasps from their nests to various sites, then released them, and observed their homing trajectories. He found that the wasps headed accurately toward their homes when they had been taken to the release site in a transparent container—and could thus observe their passage through the environment—but not when they were transported in an opaque container. These findings suggest that wasps, like bees, infer the direction and distance of their travel by observing the apparent motion of the surrounding visual panorama.

What are the consequences of monitoring travel by using visual, rather than energy-based, cues? Unlike an energy-based mechanism, a visually driven odometer is not affected by wind or by the load of nectar carried. Furthermore, it would provide a distance reading that is independent of the speed at which the bee flies to its destination, because the reading would depend only on the total amount of image motion that is registered by the eye. But, as discussed above, a visual odometer works accurately only if the bee follows the same route each time it flies to its destination (or if a follower bee adheres to the same route described by the dancing scout bee). This is because the total amount of image motion that is experienced during the trip depends not only on the distance flown, but also on how visually “tight” or “open” the environment is. Indeed, the dances of bees from
a given colony exhibit substantially different distance-calibration curves when the bees are made to forage in different environments (Esch et al. 2001). The strong waggle dances of bees returning from a short, narrow tunnel illustrate this point even more dramatically. However, the unavoidable dependence of the dance on the environment may not be a problem in many natural situations because bees flying repeatedly to an attractive food source tend to remain faithful to the route they have discovered (e.g., Collett 1996). Because the dance indicates the direction of the food source as well as its distance, there is a reasonably good chance that the new recruits, which fly in the same direction as the scout that initially discovered the source, will experience the same environment and therefore fly the same distance.

At present, it is not clear whether bees use optic flow information from the ventral as well as the lateral fields of view for odometry (the data on this are presently equivocal: see Si et al. 2003, Srinivasan et al. 1997). If ventral flow is important, bees need to fly at a consistent height, or to account for the height of flight in the computation, to estimate distances reproducibly. Whether bees use either of these strategies remains to be investigated.

What are the neural mechanisms by which the distance signal is computed? Where in the insect’s brain is the odometer located? Currently, we have absolutely no idea. This is an intriguing area for future research.
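The integration at the heart of the visual odometer can be sketched numerically. The toy model below is our own illustration, not the authors' implementation: the flight speed, wall distances, and uniform geometry are all assumptions. The reading is the time-integral of the lateral image angular velocity, which for a wall at distance d seen by a bee flying at speed v is approximately v/d:

```python
# Toy visual odometer: integrate the angular velocity of the wall's image
# over the flight. Speed, wall distances, and duration are assumptions.

def odometer_reading(speed, wall_distance, duration, dt=0.01):
    """Accumulated image motion (radians) over a flight of `duration` s."""
    total, t = 0.0, 0.0
    while t < duration:
        omega = speed / wall_distance  # lateral image angular velocity, rad/s
        total += omega * dt
        t += dt
    return total

# The same 6 m of flight, with walls 5.5 cm away (narrow tunnel) versus
# roughly 2 m away (an assumed open-field clutter distance):
tunnel = odometer_reading(speed=1.0, wall_distance=0.055, duration=6.0)
field = odometer_reading(speed=1.0, wall_distance=2.0, duration=6.0)
ratio = tunnel / field  # ~36: the tunnel hugely inflates the reading
```

Under these assumed geometries, a 6-m tunnel flight accumulates roughly 36 times the image motion of an equal open-field flight, qualitatively consistent with bees signaling a flight of approximately 200 m after flying only 6 m.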
CHASING BEHAVIOR In addition to navigating safely in their world, insects need to interact with other living creatures to survive and propagate their genes. They must capture prey, chase after potential mates, ward off territorial intruders, and escape from predators. Although much remains to be learned about these behaviors, it is clear that vision plays a significant role in all of them.

Male houseflies, for example, will chase conspecifics with dazzling agility: Females are chased for the purpose of mating, and males are driven away in bouts of territoriality. In a pioneering study, Land & Collett (1974) filmed and analyzed these chases to formulate a model that describes the underlying visual guidance system. The combination of target and pursuer is modeled as a feedback control servomechanism in which the pursuer attempts to keep the target in his frontal (straight-ahead) field of view (see Figure 7). The dynamics of chasing behavior in the male housefly can be characterized accurately by a servomechanism that employs a proportional-derivative controller. Land & Collett (1974) found that with certain values of model parameters, the trajectories of real chases were well reproduced. Recently, it has been shown that an additional control loop, driven by the visual angle that the target subtends in the pursuer’s eye, controls the forward speed of the pursuer and ensures that he does not fall too far behind the target (Boeddeker & Egelhaaf 2003, Boeddeker et al. 2003).

The dynamics of the system that mediates chasing behavior are quite different from those of the system that mediates the optomotor response (Srinivasan &
Figure 7 Visual parameters involved in the control of chasing. θ, the bearing of the target relative to the pursuer, is the error angle that the pursuer seeks to reduce to zero to achieve perfect tracking. The pursuer’s turning rate, ω, is proportional to two visual parameters: the error angle, θ, and the rate of change of this error angle, θ̇. In other words, ω(t + d) = k1θ(t) + k2θ̇(t), where k1 and k2 are constants of proportionality and d is a time delay that accounts for the delay between the visual stimulus and the motor response (the turning rate). With values of k1 = 20 s⁻¹, k2 = 0.7, and d = 0.02 s, the model reproduces well the trajectories of real chases (Land & Collett 1974).
Bernard 1977). Male flies carry “chasing” neurons that respond selectively to small, rapidly moving targets (Gilbert & Strausfeld 1991, Hausen & Strausfeld 1980). These neurons are distinct, both anatomically and physiologically, from the large-field optomotor neurons discussed above, and they are small or absent in female flies. Researchers will likely find chasing neurons of this kind in the males of many species of flying insects that rely on vision, rather than olfaction, to track down their mates.
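The servomechanism of Figure 7 can be simulated in a few lines. The following is a sketch, not the authors' implementation: the discrete time step, the stationary target, and the constant pursuer speed are our assumptions, while the gains k1 = 20 s⁻¹, k2 = 0.7, and delay d = 0.02 s are those reported by Land & Collett (1974).

```python
import math

# Discrete-time sketch of the Land & Collett (1974) pursuit controller:
# turning rate omega(t + d) = k1*theta(t) + k2*theta_dot(t).
K1, K2, DELAY, DT = 20.0, 0.7, 0.02, 0.01
SPEED = 1.0  # pursuer forward speed, m/s (assumed constant here)

def pursue(target=(2.0, 1.0), steps=100):
    """Return the final error angle theta after `steps` of pursuit."""
    px, py, heading = 0.0, 0.0, 0.0          # pursuer state
    tx, ty = target                          # illustrative target position
    pending = [0.0] * round(DELAY / DT)      # turn commands awaiting delay d
    prev_theta, theta = None, 0.0
    for _ in range(steps):
        bearing = math.atan2(ty - py, tx - px)
        # error angle, wrapped into (-pi, pi]
        theta = math.atan2(math.sin(bearing - heading),
                           math.cos(bearing - heading))
        theta_dot = 0.0 if prev_theta is None else (theta - prev_theta) / DT
        prev_theta = theta
        pending.append(K1 * theta + K2 * theta_dot)
        heading += pending.pop(0) * DT       # apply the delayed turn command
        px += SPEED * math.cos(heading) * DT
        py += SPEED * math.sin(heading) * DT
    return theta

final_error = pursue()  # starting ~27 degrees off target
```

With these parameters the error angle decays toward zero within a second of simulated flight; the derivative term damps the turn so the pursuer does not overshoot the target's bearing.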
MOTION CAMOUFLAGE Observation of chasing behavior in insects provides compelling evidence that these creatures are very adept at detecting other moving objects in the environment. Given this exquisite sensitivity to movement, are there instances in which an insect may try to conceal its own movement to “sneak up” on another insect? Examples of such behavior have been documented in two insect species to date. Hoverflies (Srinivasan & Davey 1995) and dragonflies (Mizutani et al. 2003) occasionally seem to shadow conspecifics by moving in such a way that they appear to be stationary. Figure 8 shows an example of a dragonfly shadowing another dragonfly. Exactly how the shadower computes and executes these stealthy trajectories remains a mystery, although some possibilities are suggested by Srinivasan & Davey (1995).
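The geometry behind such stealthy trajectories can be illustrated with a short sketch. This is our own minimal construction, not the shadower's actual algorithm (which, as noted, remains unknown): if the shadower keeps itself anywhere on the line joining the shadowee to a fixed point O, its bearing as seen from the shadowee is always exactly that of O, so it generates the retinal trajectory of a stationary object.

```python
import math

# Motion-camouflage constraint (our construction): stay on the line from
# the shadowee to a fixed point O. The path and the fraction along the
# line are illustrative assumptions.

def camouflage_position(shadowee, O, fraction):
    """Point a given fraction of the way from the shadowee toward O."""
    sx, sy = shadowee
    ox, oy = O
    return (sx + fraction * (ox - sx), sy + fraction * (oy - sy))

O = (0.0, 0.0)
path = [(5.0, float(t)) for t in range(5)]  # shadowee flying along the y-axis
for p in path:
    q = camouflage_position(p, O, fraction=0.5)
    bearing_of_shadower = math.atan2(q[1] - p[1], q[0] - p[0])
    bearing_of_O = math.atan2(O[1] - p[1], O[0] - p[0])
    # From the shadowee's viewpoint, the shadower never changes bearing:
    assert abs(bearing_of_shadower - bearing_of_O) < 1e-9
```

Any choice of the fraction satisfies the bearing constraint; how a real dragonfly selects and updates its position along the line is exactly the open question raised above.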
CONCLUDING REMARKS Insects are prime subjects in which to study how visual information is exploited to guide a variety of important behavioral tasks. These tasks range from short-term, moment-to-moment guidance, such as that used to stabilize attitude or avoid collisions, to long-term guidance, such as that used to navigate to a food source several kilometers away and return home safely. Flying insects exploit cues derived from image motion to stabilize flight, regulate flight speed, negotiate narrow gaps, infer the ranges to objects, avoid obstacles, orchestrate smooth landings, and monitor distances traveled.

Several motion-sensitive pathways exist in the insect visual system, each with a distinct set of properties and geared to a specific visual function. It seems unlikely, however, that all these systems (and other as yet undiscovered ones) operate continuously. The optomotor system, for instance, has to be switched off, or its corrective commands ignored, when the insect makes a voluntary turn or chases a target (Heisenberg & Wolf 1993, Kirschfeld 1997, Srinivasan & Bernard 1977). It is also impossible to land on a surface without first disabling the collision-avoidance system. Major current challenges are to discover the conditions under which individual subsystems are called into play or ignored, to understand the ways in which these subsystems interact to coordinate flight, and to uncover the neural mechanisms that underlie these visual capacities. Another thrust that many laboratories are now pursuing is to translate these elegant principles into novel algorithms for the guidance of autonomously navigating vehicles (Ayers et al. 2002, Srinivasan & Venkatesh 1997, Srinivasan et al. 1999).

ACKNOWLEDGMENTS Some work described in this review was supported by the Australia-Germany Collaborative Research Scheme, DP0208683 from the Australian Research
Council, RG 84/97 from the Human Frontiers in Science Program, N00014-99-1-0506 from the U.S. Defense Advanced Research Projects Agency and the Office of Naval Research, and the Deutsche Forschungsgemeinschaft (SFB 554 and GK 200). The Annual Review of Neuroscience is online at http://neuro.annualreviews.org
LITERATURE CITED

Ayers A, Davis JL, Rudolph A. 2002. Neurotechnology for Biomimetic Robots. Cambridge, MA: MIT Press. 636 pp.
Boeddeker N, Egelhaaf M. 2003. Steering a virtual blowfly: simulation of visual pursuit. Proc. R. Soc. London Ser. B 270:1971–78
Boeddeker N, Kern R, Egelhaaf M. 2003. Chasing a dummy target: smooth pursuit and velocity control in male blowflies. Proc. R. Soc. London Ser. B 270:393–99
Borst A. 1991. Fly visual interneurons responsive to image expansion. Zoologische Jahrbücher (Physiologie) 95:305–13
Borst A, Bahde S. 1988. Visual information processing in the fly’s landing system. J. Comp. Physiol. A 163:167–73
Braitenberg V, Taddei-Ferretti C. 1966. Landing reaction of Musca domestica. Naturwissen 53:155–56
Collett TS. 1980. Some operating rules for the optomotor system of a hoverfly during voluntary flight. J. Comp. Physiol. 138:271–82
Collett TS. 1996. Insect navigation en route to the goal: multiple strategies for the use of landmarks. J. Exp. Biol. 199:227–35
Collett TS, Harkness LIK. 1982. Depth vision in animals. In Analysis of Visual Behavior, ed. DJ Ingle, MA Goodale, RJW Mansfield, pp. 111–76. Cambridge, MA: MIT Press
Collett TS, Zeil J. 1998. Places and landmarks: an arthropod perspective. In Spatial Representation in Animals, ed. S Healy, pp. 18–53. Oxford: Oxford Univ. Press. 188 pp.
David CT. 1982. Compensation for height in the control of groundspeed by Drosophila in a new, “Barber’s Pole” wind tunnel. J. Comp. Physiol. 147:485–93
Dickinson MH. 1999. Haltere-mediated equilibrium reflexes of the fruit fly, Drosophila melanogaster. Philos. Trans. R. Soc. London Ser. B 353:903–16
Dror RO, O’Carroll DC, Laughlin SB. 2001. Accuracy of velocity estimation by Reichardt correlators. J. Opt. Soc. Am. A 18:241–52
Eckert H, Hamdorf K. 1980. Excitatory and inhibitory response components in the landing response of the blowfly, Calliphora erythrocephala. J. Comp. Physiol. 138:253–64
Egelhaaf M, Borst A. 1993. Movement detection in arthropods. See Miles & Wallman 1993, pp. 53–77
Esch H, Burns JE. 1995. Honeybees use optic flow to measure the distance of a food source. Naturwissen 82:38–40
Esch H, Burns JE. 1996. Distance estimation by foraging honeybees. J. Exp. Biol. 199:155–62
Esch H, Zhang SW, Srinivasan MV, Tautz J. 2001. Honeybee dances communicate distances measured by optic flow. Nature 411:581–83
Gabbiani F, Mo C, Laurent G. 2001. Invariance of angular threshold computation in a wide-field, looming-sensitive neuron. J. Neurosci. 21:314–29
Gilbert C, Strausfeld NJ. 1991. The functional organization of male-specific visual neurons in flies. J. Comp. Physiol. A 169:395–411
Gray JR, Lee JK, Robertson RM. 2001. Activity of descending contralateral movement detector neurons and collision avoidance behavior in response to head-on visual stimuli in locusts. J. Comp. Physiol. A 187:115–29
Hausen K. 1993. The decoding of retinal image flow in insects. See Miles & Wallman 1993, pp. 203–35
Hausen K, Egelhaaf M. 1989. Neural mechanisms of visual course control in insects. In Facets of Vision, ed. DG Stavenga, RC Hardie, pp. 391–424. Berlin/Heidelberg: Springer-Verlag
Hausen K, Strausfeld NJ. 1980. Sexually dimorphic interneuron arrangements in the fly visual system. Proc. R. Soc. Lond. B 208:57–71
Heisenberg M, Wolf R. 1993. The sensory-motor link in motion-dependent flight control of flies. See Miles & Wallman 1993, pp. 265–83
Hengstenberg R. 1993. Multisensory control in insect oculomotor systems. See Miles & Wallman 1993, pp. 285–98
Horridge GA. 1987. The evolution of visual processing and the construction of seeing systems. Proc. R. Soc. London Ser. B 230:279–92
Ibbotson MR. 2001. Evidence for velocity-tuned motion-sensitive descending neurons in the honeybee. Proc. R. Soc. London Ser. B 268:2195–201
Judge SJ, Rind FC. 1997. The locust DCMD, a movement-detecting neurone tightly tuned to collision trajectories. J. Exp. Biol. 200:2209–16
Kirchner WH, Srinivasan MV. 1989. Freely flying honeybees use image motion to estimate object distance. Naturwissen 76:281–82
Kirschfeld K. 1997. Course control and tracking: orientation through image stabilization. See Lehrer 1997, pp. 67–93
Krapp HG. 2000. Neuronal matched filters for optic flow processing in the visual system of flying insects. In Neuronal Processing of Optic Flow, ed. M Lappe, pp. 93–120. San Diego: Academic. 321 pp.
Krapp HG, Hengstenberg R. 1996. Estimation of self-motion by optic flow processing in single visual interneurons. Nature 384:463–66
Land MF, Collett TS. 1974. Chasing behaviour of houseflies (Fannia canicularis). J. Comp. Physiol. 89:331–57
Lee DN. 1976. A theory of visual control based
on information about time-to-collision. Perception 5:437–59
Lehrer M, ed. 1997. Orientation and Communication in Arthropods. Basel: Birkhäuser Verlag. 395 pp.
Miles FA, Wallman J, eds. 1993. Visual Motion and its Role in the Stabilization of Gaze. Amsterdam: Elsevier. 417 pp.
Mizutani A, Chahl JS, Srinivasan MV. 2003. Motion camouflage in dragonflies. Nature 423:604
Nalbach G. 1993. The halteres of the blowfly Calliphora. 1. Kinematics and dynamics. J. Comp. Physiol. A 163:293–300
Nalbach G, Hengstenberg R. 1994. The halteres of the blowfly Calliphora. 2. Three-dimensional organization of compensatory reactions to real and simulated rotations. J. Comp. Physiol. A 175:695–708
Reichardt W. 1969. Movement perception in insects. In Processing of Optical Data by Organisms and by Machines, ed. W Reichardt, pp. 465–93. New York: Academic. 614 pp.
Rossell S. 1983. Binocular stereopsis in an insect. Nature 302:821–22
Schöne H. 1996. Optokinetic speed control and estimation of travel distance in walking honeybees. J. Comp. Physiol. A 179:587–92
Sherman A, Dickinson MH. 2003. A comparison of visual and haltere-mediated equilibrium reflexes in the fruit fly Drosophila melanogaster. J. Exp. Biol. 206:295–302
Si A, Srinivasan MV, Zhang SW. 2003. Honeybee navigation: properties of the visually driven ‘odometer.’ J. Exp. Biol. 206:1265–73
Srinivasan MV. 1977. A visually-evoked roll response in the housefly: open-loop and closed-loop studies. J. Comp. Physiol. 119:1–14
Srinivasan MV. 1993. How insects infer range from visual motion. See Miles & Wallman 1993, pp. 139–56
Srinivasan MV, Bernard GD. 1977. The pursuit response of the housefly and its interaction with the optomotor response. J. Comp. Physiol. 115:101–17
Srinivasan MV, Chahl JS, Weber K, Venkatesh
S, Nagle MG, Zhang SW. 1999. Robot navigation inspired by principles of insect vision. Robot. Auton. Syst. 26:203–16
Srinivasan MV, Davey M. 1995. Strategies for active camouflage of motion. Proc. R. Soc. Lond. B 259:19–25
Srinivasan MV, Lehrer M, Kirchner W, Zhang SW. 1991. Range perception through apparent image speed in freely-flying honeybees. Vis. Neurosci. 6:519–35
Srinivasan MV, Venkatesh S, eds. 1997. From Living Eyes to Seeing Machines. Oxford: Oxford Univ. Press. 271 pp.
Srinivasan MV, Zhang SW. 1997. Visual control of honeybee flight. See Lehrer 1997, pp. 95–113
Srinivasan MV, Zhang SW, Altwein M, Tautz J. 2000a. Honeybee navigation: nature and calibration of the ‘odometer’. Science 287:851–53
Srinivasan MV, Zhang SW, Bidwell N. 1997. Visually mediated odometry in honeybees. J. Exp. Biol. 200:2513–22
Srinivasan MV, Zhang SW, Chahl JS, Barth E, Venkatesh S. 2000b. How honeybees make grazing landings on flat surfaces. Biol. Cybern. 83:171–83
Srinivasan MV, Zhang SW, Chandrashekara K. 1993. Evidence for two distinct movement-detecting mechanisms in insect vision. Naturwissen 80:38–41
Srinivasan MV, Zhang SW, Lehrer M, Collett TS. 1996. Honeybee navigation en route to the goal: visual flight control and odometry. J. Exp. Biol. 199:237–44
Stange G. 1981. The ocellar component of flight equilibrium control in dragonflies. J. Comp. Physiol. 141:335–47
Stange G, Howard J. 1979. An ocellar dorsal light response in a dragonfly. J. Exp. Biol. 83:351–55
Stange G, Stowe S, Chahl J, Massaro A. 2002. Anisotropic imaging in the dragonfly median ocellus: a matched filter for horizon detection. J. Comp. Physiol. A 188:455–67
Tammero LF, Dickinson MH. 2002. Collision-avoidance and landing responses are mediated by separate pathways in the fruit fly, Drosophila melanogaster. J. Exp. Biol. 205:2785–98
Ugolini A. 1987. Visual information acquired during displacement and initial orientation in Polistes gallicus. Anim. Behav. 35:590–95
von Frisch K. 1993. The Dance Language and Orientation of Bees. Cambridge, MA: Harvard Univ. Press. 566 pp.
Wagner H. 1982. Flow-field variables trigger landing in flies. Nature 297:147–48
Wehner R. 1981. Spatial vision in insects. In Handbook of Sensory Physiology, ed. H Autrum, 7/6C:287–616. Berlin/Heidelberg: Springer-Verlag
Wehner R. 1997. The ant’s celestial compass system: spectral and polarization channels. See Lehrer 1997, pp. 145–85
Wilson M. 1978. The functional organization of locust ocelli. J. Comp. Physiol. 124:297–316
Figure 2 Experiment investigating visual control of flight speed. (a) Bees are trained to fly through a tapered tunnel to collect a reward placed at the far end. The walls of the tunnel are lined with vertical black-and-white gratings of period 6 cm. (b) A typical flight trajectory, as filmed from above by a video camera, where the bee’s position and orientation are shown every 50 msec. (c) Mean and standard deviation of flight speeds measured at various locations along the tunnel (data from 18 flights). The dashed line represents the theoretically expected flight speed profile if the bees hold the angular velocity of the images of the walls constant at 320°/s as they fly through the tunnel. The data indicate that bees control flight speed by holding constant the angular velocity of the image of the environment. Adapted from Srinivasan et al. 1996.
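The control rule implied by these data can be written in a couple of lines. The sketch below is our own illustration (the tunnel half-widths are arbitrary example values; only the 320°/s set point comes from the caption): the lateral image of a wall at distance d moves at roughly v/d for flight speed v, so holding the image angular velocity ω constant means flying at v = ω·d, a speed that falls in direct proportion to the tunnel's width.

```python
import math

# Expected flight speed if the bee holds the walls' image angular
# velocity constant at 320 deg/s (half-width values are illustrative).
OMEGA = math.radians(320.0)  # constant image angular velocity, rad/s

def expected_speed(half_width_m):
    """Speed that keeps the wall image moving at OMEGA at this half-width."""
    # Laterally viewed wall: omega ~ v / d, hence v = OMEGA * d.
    return OMEGA * half_width_m

for d in (0.12, 0.06, 0.03):  # tunnel narrowing toward the constriction
    print(f"half-width {d * 100:4.1f} cm -> speed {expected_speed(d):.2f} m/s")
```

Halving the tunnel width halves the predicted speed, which is the dashed theoretical profile that the measured flight speeds in the figure follow.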
Figure 3 (a) An approach perpendicular to a surface produces expansion of the image. (b) The time to contact the surface, if the insect were to continue to fly toward the surface at the same speed, is given by the ratio θ/θ̇, where θ is the direction of a feature X on the surface relative to the direction of flight and θ̇ is the rate of increase of this angle.
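The time-to-contact relation can be checked numerically. The values below (speed, distance, feature offset) are illustrative choices of our own; note that θ/θ̇ approximates the true time to contact closely only for features near the flight axis.

```python
import math

# Numerical check of tau = theta / theta_dot for a constant-speed
# approach toward a surface (all values are illustrative assumptions).
v, z0, x = 1.0, 2.0, 0.15   # speed (m/s), initial distance (m), offset of X (m)

def theta(t):
    """Direction of feature X relative to the direction of flight."""
    return math.atan2(x, z0 - v * t)

t, dt = 0.5, 1e-6
theta_dot = (theta(t + dt) - theta(t)) / dt  # rate of increase of theta
tau = theta(t) / theta_dot                   # visually available estimate
remaining = (z0 - v * t) / v                 # true time to contact: 1.5 s
```

At this instant the estimate θ/θ̇ comes out within about one percent of the true 1.5 s remaining, without requiring the insect to know either its speed or its distance to the surface.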
Figure 6 (a) Experiment investigating how honeybees gauge distance flown to a food source. Bees were trained to find a food reward placed at a distance of 1.7 m from the entrance of a 3.2-m-long tunnel of width 22 cm and height 20 cm. The tunnel was lined with vertical black-and-white gratings of period 4 cm. (b) When the trained bees were tested in a fresh tunnel with the reward absent, they searched at the former location of the feeder, as shown by the bell-shaped search distributions. This is true irrespective of whether the period of the grating was 4 cm (as in the training) (blue squares), 8 cm (red triangles), or 2 cm (black diamonds). The inverted triangle shows the former location of the reward, and the symbols below it depict the mean values of the search distributions in each case. Bees lose their ability to estimate the distance of the feeder when image-motion cues are removed by lining the tunnel with axial (rather than vertical) stripes (circles). These experiments and others (Srinivasan et al. 1997) demonstrate that distance flown is estimated visually, by integrating over time the image velocity that is experienced during the flight, and the honeybee’s odometer measures image velocity independently of image structure. Adapted from Srinivasan et al. 1997.
Figure 8 Illustration of motion camouflage in dragonflies. The dots show the locations of the shadower (blue dots) and the “shadowee” (red dots) in successive video frames. The shadower moves so that it always lies on the line joining the shadowee to a fixed point O, where the lines connecting corresponding positions intersect. It therefore produces the same trajectory on the retina of the shadowee as would a stationary object positioned at O. By camouflaging its motion in this way, the shadower is able to track and approach the shadowee without giving itself away as a live, moving entity. From Mizutani et al. 2003.