Exp Brain Res (2010) 204:457–464 DOI 10.1007/s00221-010-2306-3
RESEARCH ARTICLE
Unitary haptic perception: integrating moving tactile inputs from anatomically adjacent and non-adjacent digits

Marius V. Peelen · Jack Rogers · Alan M. Wing · Paul E. Downing · R. Martyn Bracewell
Received: 25 April 2009 / Accepted: 17 May 2010 / Published online: 8 June 2010
© Springer-Verlag 2010
Abstract How do we achieve unitary perception of an object when it touches two parts of the sensory epithelium that are not contiguous? We investigated this problem with a simple psychophysical task, which we then used in an fMRI experiment. Two wooden rods were moved over two digits positioned to be spatially adjacent. The digits were either from one foot (or hand) or one digit was from either foot (or hand). When the rods were moving in phase, one object was reliably perceived. By contrast, when the rods were moving out of phase, two objects were reliably perceived. fMRI revealed four cortical areas where activity was higher when the moving rods were perceived as one object relative to when they were perceived as two separate objects. Areas in the right inferior parietal lobule, the left inferior temporal
sulcus and the left middle frontal gyrus were activated for this contrast regardless of the anatomical configuration of the stimulated sensory epithelia. By contrast, the left intraparietal sulcus was activated specifically when integration across the midline was required, irrespective of whether the stimulation was applied to the hands or feet. These results reveal a network of brain areas involved in generating a unified percept of the presence of an object that comes into contact with different parts of the body surface.

Keywords Tactile integration · fMRI · Haptic perception · Binding · Intraparietal sulcus
M. V. Peelen (corresponding author)
Center for Mind/Brain Sciences (CIMeC), University of Trento, Via delle Regole 101, 38100 Mattarello, TN, Italy
e-mail: [email protected]

M. V. Peelen
Department of Psychology, Harvard University, Cambridge, MA, USA

M. V. Peelen · J. Rogers · P. E. Downing · R. M. Bracewell
School of Psychology, Bangor University, Gwynedd LL57 2AS, UK
e-mail: [email protected]

J. Rogers
MRC Cognition and Brain Sciences Unit, Cambridge, UK

A. M. Wing · R. M. Bracewell
Behavioural Brain Sciences Centre, School of Psychology, University of Birmingham, Birmingham, UK

R. M. Bracewell
School of Medical Sciences, Bangor University, Wales, UK

Introduction
In order to recognize objects, the brain typically integrates information from multiple sensory sources. For example, when manually exploring an object, spatio-temporally consistent tactile information from multiple fingers informs us that this information, although coming from different fingers, is indeed coming from the same object. This is reminiscent of the Gestalt 'common fate' principle in the visual modality: dots moving with the same velocity will be reliably perceived as belonging to the same object, whereas dots moving independently will not cohere into a whole (Braddick 1980). Indeed, one might consider these to be examples of what Helmholtz termed 'unconscious inference', the notion that our perceptions are based on both knowledge of the world and the data our sensory organs provide.

A direct demonstration of tactile integration across digits was provided by Kitada et al. (2003), who showed that two wooden rods moving in phase on the volar surface of the right second and third fingers are reliably perceived as a single object. Furthermore, a subsequent fMRI study
showed that left intraparietal and left inferior parietal cortex were specifically involved in this tactile integration, as these areas responded more strongly to the in-phase compared to the anti-phase stimulation of two adjacent digits of the right hand (Kitada et al. 2003).

These findings raise several interesting questions: (1) Is left posterior parietal cortex similarly involved in the integration of tactile stimulation applied to the left (ipsilateral) side of the body? (2) To what extent is activity in intraparietal and inferior parietal cortex specific to the integration of stimulation applied to the hands? And (3) which areas are involved in the tactile integration of stimulation applied to digits that are not represented by contiguous areas of the somatosensory cortex?

The present study addressed these questions by studying tactile integration both within and between limbs, and in both the hands and the feet. Two wooden rods were moved over two digits positioned to be spatially adjacent. The digits were either on the left hand (or foot) or one digit was from either hand (or foot). We first established that participants did indeed perceive a single object when the two rods moved in phase for all stimulation conditions. We then performed an fMRI study to reveal brain areas involved in unitary tactile perception for these conditions. Based on previous findings (Kitada et al. 2003), we expected to find stronger activity in left parietal cortex when one object was perceived relative to when two objects were perceived. Of interest was whether these regions were similarly involved in integrating tactile inputs across the midline.

In the conditions in which stimulation was delivered to both hands or both feet, it is important to realize that we were not merely seeking evidence for bilateral representations (i.e., neural structures responsive to stimulation applied to both sides of the body) but for integration resulting in a unitary percept. Multiple studies have shown single units in the monkey with bilateral cutaneous receptive fields (Whitsel et al. 1969; Iwamura et al. 1994). There is also a functional imaging literature revealing areas in the human brain that respond during bilateral cutaneous stimulation. For example, Disbrow et al. (2001) demonstrated increased activity in human S2 when tactile stimulation using monofilament stimulation was applied to both hands when compared to one hand. Importantly, however, these studies did not investigate the integration of separate tactile stimuli into a unitary percept, as was done here.
Methods

Behavioural experiment

Participants

Twelve participants (five male) were recruited from the University of Wales student population. Each gave written
informed consent. The project was approved by the local ethics committee.

Experimental paradigm

Tactile stimulation of digits was administered by the experimenter with two smooth wooden dowels 0.5 cm in diameter moved across the glabrous surface of two adjacent digits. The dowels were moved in the same direction (in-phase) or in opposite directions (anti-phase), at about 1 cm/s. In the one-hand conditions, the ring and the middle fingers were held extended and stimulated (the other fingers were held flexed). In the one-foot conditions, the first toe (hallux) and second toe were stimulated. In the both-hands conditions, the little fingers of the supinated hands were placed together and stimulated. In the both-feet conditions, the feet were placed together such that the two halluces were adjacent and could be stimulated. An adjustable Velcro strap was used to comfortably bind digits in some conditions. Four participants were tested in the left hand and foot conditions (in-phase and anti-phase), four in the right hand and foot conditions, and four in the both hands and feet conditions. There were four trials for each condition (e.g., left hand in-phase); thus, each participant experienced 16 trials. For each trial, the participant responded 'one' or 'two' according to whether he/she perceived one or two objects. Data were combined across participants.

fMRI experiment

Participants

Eight new volunteers (three female) were recruited from the University of Wales student community to participate in the fMRI experiment. All participants were right handed. Participants satisfied all requirements in volunteer screening and gave informed consent. Ethics approval was obtained from the School of Psychology at the University of Wales, Bangor, and the North West Wales National Health Service Trust.

Experimental paradigm

Whilst the participant lay in the scanner, a trained experimenter applied the stimuli as described above to two digits. All participants lay supine with arms extended over the torso, supinated so that the volar surfaces were superior. The feet were together (heel to heel and hallux to hallux). There were four combinations of digits (left hand, both hands, left foot, both feet; the right-hand-only and right-foot-only conditions were not used), which could be stimulated either in-phase or anti-phase. This resulted in a total of eight conditions. For practical reasons,
Exp Brain Res (2010) 204:457–464
459
the hand and foot conditions were run in separate experimental runs. Each condition was repeated four times in total, twice in each of two runs; participants thus performed four runs in total. Each run started and ended with a 21 s rest period. In addition, there was a 21 s rest period in the middle of the run. The tactile stimuli were applied for 15 s, followed by a 6 s window in which the participant indicated his or her response. Participants were instructed not to respond during the 15 s stimulation phase but to respond during the subsequent response phase. Timing was relayed to the experimenter over headphones by another experimenter in the control room (timing was dictated by a Matlab program). For stimuli applied to the hands, a simple plantar flexion movement of the right or left foot indicated "one object felt" or "two objects felt", in response to the question "how many objects did you feel?" Participants were instructed about the task prior to the experiment and were not asked this question during the experiment. For stimuli applied to the feet, the participants lifted one or two fingers of the right hand. The order of conditions was counterbalanced.
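For concreteness, these timing parameters can be checked against the number of volumes reported below under Data acquisition. The sketch assumes that the eight trials of a run were split evenly around the middle rest period; the exact trial layout within a run is not stated.

```python
# Minimal consistency check of the run timing described above.
# Assumption: 4 trials, the mid-run rest, then 4 more trials per run.
TR = 3            # s, repetition time (see Data acquisition)
rest = 21         # s, rest blocks at the start, middle and end of each run
trial = 15 + 6    # s, 15 s stimulation followed by a 6 s response window
n_trials = 8      # 4 conditions x 2 repetitions per run

run_duration = 3 * rest + n_trials * trial
print(run_duration, run_duration // TR)   # 231 s, i.e. 77 volumes at TR = 3 s
```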
Functional imaging and analysis

Data acquisition

Data were collected on a 1.5T Philips MRI scanner with a SENSE parallel head coil. For functional imaging, an EPI sequence was used (TR = 3000 ms, TE = 50 ms, flip angle 90°, FOV = 240 mm, 30 axial slices, 64 × 64 in-plane matrix, 5 mm slice thickness). The scanned area covered both cerebral hemispheres and most of the cerebellum. Seventy-seven volumes were collected per run (231 s).

Data analysis

Pre-processing and statistical analysis of MRI data were performed using BrainVoyager (Brain Innovation, Maastricht, the Netherlands). The first three volumes of each run were discarded in order to avoid differences in T1 saturation. Functional data were corrected for head motion, spatially smoothed with a Gaussian kernel (FWHM 8 mm), and low-frequency drifts were removed with a temporal high-pass filter (0.006 Hz). Functional data were manually co-registered with 3D anatomical T1 scans (1 × 1 × 1.3 mm resolution) and then resampled to isometric 1 × 1 × 1 mm voxels with trilinear interpolation. The 3D scans were transformed into Talairach space, and the parameters for this transformation were subsequently applied to the co-registered functional data. In order to generate predictors for the multiple regression analysis, the event time series for each condition (the 15 s period in which stimulation was applied) were convolved with a delayed gamma function (delta = 2.5 s; tau = 1.25 s) in order to model the haemodynamic response. Voxel time series were z-normalized for each run.
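As an illustration of the predictor construction just described, the following sketch builds one regressor for a single 15 s stimulation block. The delay (delta = 2.5 s) and dispersion (tau = 1.25 s) are taken from the text; the gamma exponent (n = 3, Boynton-style) and the example block onset are assumptions, since they are not specified here.

```python
import math
import numpy as np

TR, n_vols = 3.0, 77
delta, tau, n = 2.5, 1.25, 3                       # delay and dispersion from the text; n assumed

t = np.arange(n_vols) * TR                         # volume acquisition times (s)
boxcar = ((t >= 21) & (t < 36)).astype(float)      # hypothetical 15 s stimulation block

s = np.clip(t - delta, 0.0, None) / tau            # shifted, scaled time
hrf = s ** (n - 1) * np.exp(-s) / (tau * math.factorial(n - 1))  # delayed gamma response

predictor = np.convolve(boxcar, hrf)[:n_vols]      # regressor for the multiple regression
```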
Statistical analyses

To identify brain regions that were more activated during the in-phase condition compared to the anti-phase condition, we performed a fixed-effects whole-brain group analysis contrasting in-phase versus anti-phase across all functional runs and conditions. The resulting activation map was thresholded at p < 0.05 (Bonferroni corrected), corresponding to t > 4.93 (df = 2,072). No spatial extent threshold was applied. The response profiles of regions activated by this contrast were then further investigated. Regions of interest (ROIs) were defined as sets of contiguously activated voxels. For each ROI and participant, we extracted the activation during each of the eight experimental conditions. These activation values were then tested, for each ROI separately, with a 2 × 2 × 2 repeated-measures ANOVA with Phase (in-phase or anti-phase stimulation), HandFoot (hand(s) or foot/feet stimulated), and BothLeft (both sides or just left stimulated) as factors.
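The relation between the corrected threshold and the reported t cut-off can be sketched as follows. The number of voxels entering the correction is not given; 60,000 is a purely hypothetical figure used to illustrate the order of magnitude.

```python
from scipy import stats

n_voxels = 60_000                 # hypothetical size of the analysis mask (not stated in the text)
alpha = 0.05
alpha_per_voxel = alpha / n_voxels                     # Bonferroni-corrected per-voxel alpha

t_threshold = stats.t.ppf(1 - alpha_per_voxel / 2, df=2072)  # two-tailed, df as reported
print(round(t_threshold, 2))                           # ~4.9, close to the reported t > 4.93
```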
Results

Behavioural experiment
Participants performed in line with our expectation (i.e., in-phase stimulation was perceived as one object and anti-phase stimulation as two). Participants always perceived one object with in-phase stimulation, and two objects with anti-phase stimulation, of the right hand. On only one occasion did a participant perceive two objects during in-phase stimulation of the left hand or of both hands. On only one occasion did a participant perceive one object during anti-phase stimulation of both hands. When the stimuli were applied to one digit from either hand, performance was almost 'perfect', i.e., even though the stimuli were applied to anatomically separate parts of the sensory epithelium, unitary perception was experienced. When stimuli were applied to the toes, responses were more variable, but in general in-phase stimulation was perceived as a single object and anti-phase as two. As shown in Table 1, significantly more "two objects" responses were recorded during anti-phase than in-phase stimulation for all conditions.

fMRI experiment

Behavioural results

Seven participants performed the task as expected, in that they always indicated 'one' for in-phase movement and 'two' for anti-phase movement. One participant gave inconsistent and sometimes early responses (such that he indicated his decision during the tactile stimulation); his data were not included in the analyses.
Table 1 Frequencies of "two objects" responses relative to the total number of responses obtained from both in- and anti-phase conditions, for both hands and feet

              Hands                      Feet                       Total
              Left     Right    Both     Left     Right    Both
In-phase      0/16     0/16     1/16     5/16     3/16     2/16     11/96
Anti-phase    16/16    16/16    15/16    13/16    13/16    16/16    89/96
p-value       <0.001   <0.001   <0.001   <0.05    <0.005   <0.001   <0.001

p-values come from chi-square contingency tests, and indicate that performance was significantly different in the in-phase compared to the anti-phase conditions
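As an illustration, one of the contingency tests summarised in Table 1 can be reconstructed directly from the counts; the example below uses the left-hand column (0/16 "two objects" responses in-phase versus 16/16 anti-phase). Whether the original tests applied Yates' continuity correction is not stated, so the exact chi-square value here is illustrative.

```python
from scipy.stats import chi2_contingency

# Left-hand condition from Table 1: rows = phase, columns = ("one object", "two objects")
table = [[16, 0],    # in-phase:  16 "one object", 0 "two objects"
         [0, 16]]    # anti-phase: 0 "one object", 16 "two objects"

chi2, p, dof, expected = chi2_contingency(table)
print(round(chi2, 1), p)   # p well below 0.001, matching the first column of Table 1
```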
fMRI results

Whole-brain analyses

As expected, strong activation was observed in multiple brain regions, including somatosensory cortices, when comparing all tactile stimulation conditions with baseline. This confirms that our conditions activated brain regions involved in tactile processing. Our questions and hypotheses focused on regions showing a main effect of phase (Table 2; Fig. 1): regions that were more activated for the in-phase than the anti-phase stimulation, averaged across limb conditions (left hand, both hands, left foot and both feet). Responses in these regions were further analysed and are described in more detail below.

A whole-brain conjunction analysis, testing for regions where activity was higher (at p < 0.05, uncorrected) for in-phase than anti-phase stimulation for all four limb conditions separately, revealed activity in a subset of the regions activated by the main effect of phase, namely: right inferior parietal lobule (peak coordinates (x, y, z): 54, -58, 31), left middle frontal gyrus (peak coordinates (x, y, z): -33, 26, 40) and left inferior temporal sulcus (peak coordinates (x, y, z): -39, -61, -8).
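The conjunction described above amounts to requiring that every voxel pass the in-phase > anti-phase contrast in each limb condition separately, which can be sketched as a logical AND over thresholded maps. The array of t-values and the degrees of freedom below are placeholders, not the study's actual data.

```python
import numpy as np
from scipy import stats

n_voxels = 1000
t_maps = np.random.default_rng(1).normal(size=(4, n_voxels))   # placeholder per-condition t-maps

t_crit = stats.t.ppf(0.95, df=2072)             # one-tailed p < 0.05, uncorrected (df illustrative)
conjunction = np.all(t_maps > t_crit, axis=0)   # True only where all four conditions pass
print(int(conjunction.sum()), "voxels survive")  # with random placeholder data, typically zero
```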
Table 2 Areas activated for the main effect of phase (in-phase > anti-phase), at t > 4.93 (p < 0.05, Bonferroni corrected for multiple comparisons)

                          Talairach coordinates
Area                      X      Y      Z        mm3    Max t
Phase (in-phase > anti-phase)
Right IPL                 54     -49    28       714    6.0
Left MFG                  -33    29     34       269    5.6
Left ITS                  -42    -61    -11      213    5.5
Left IPS                  -30    -46    31       21     5.2
Left cerebellar tonsil    -33    -58    -32      29     5.0

Listed coordinates are for the most significantly activated voxel in each region. Also listed are the extent of the activations (in mm3) and the t-value of the most significant voxel (max t). IPL inferior parietal lobule, MFG middle frontal gyrus, ITS inferior temporal sulcus and IPS intraparietal sulcus
Region of interest analysis

Activation values (parameter estimates) within ROIs, defined by the contrast in-phase > anti-phase movements (averaged across limb conditions), were extracted for all eight conditions and tested using a 2 (in-phase, anti-phase) × 2 (hand, foot) × 2 (both, left) repeated-measures ANOVA for each ROI. Given the ROI selection criteria, all ROIs showed a highly significant main effect of Phase in the repeated-measures ANOVAs, with higher responses to the in-phase conditions than the anti-phase conditions. Of interest was whether in some of the ROIs the effect of Phase was specific to a subset of the conditions tested (e.g., when tactile inputs are integrated across the midline) or whether this effect was more general and observed for all conditions.

Left intraparietal sulcus (IPS)

The contrast in-phase > anti-phase revealed significant activation in the left intraparietal sulcus (Fig. 1). Interestingly, the effect of phase in this area was strongly present for the conditions in which digits of both hands or both feet were stimulated, but was absent when digits of the same hand or same foot were stimulated: both hands (t6 = 3.4; p < 0.05), both feet (t6 = 2.8; p < 0.05), left hand (t6 = 1.2; p = 0.3) and left foot (t6 = 1.0; p = 0.3). This dissociation was confirmed by a significant interaction of Phase and BothLeft (F1,6 = 23.1; p < 0.005).

Right inferior parietal lobule (rIPL)

A large cluster of activation was observed in the right inferior parietal lobule (Fig. 1). Activity in this region could be observed in each individual participant (Fig. 2).
Fig. 1 Areas activated (at p < 0.05, corrected for multiple comparisons) in a whole-brain group analysis contrasting in-phase versus anti-phase tactile stimulation across all functional runs and conditions, overlaid on the anatomical brain image of one participant. For Talairach coordinates and spatial extent of the activations, see Table 2. The bar graphs give the activation values (parameter estimates) for each ROI in the different conditions. Dark grey bars correspond to the in-phase conditions, and light grey bars correspond to the anti-phase conditions. Error bars indicate within-subject SEM. Brain images are displayed in radiological convention (left of image is right of participant). a left intraparietal sulcus (IPS), b right inferior parietal lobule (IPL), c left middle frontal gyrus (MFG), d left inferior temporal sulcus (ITS) and e left cerebellar tonsil
Fig. 2 Right IPL was activated by the in-phase versus anti-phase contrast in 7/7 individual participants. Shown here are sagittal activation maps, thresholded at p < 0.01 (uncorrected) for participants 1–6 and p < 0.05 (uncorrected) for participant 7
In contrast to the cluster in left IPS, activity in rIPL was higher for in-phase than anti-phase stimulation regardless of whether a single hand/foot or both hands/feet were stimulated (no interaction between Phase and BothLeft: F1,6 = 0.1; p = 0.9). There was a main effect of HandFoot (F1,6 = 22.9; p < 0.005), with overall stronger responses to the hand conditions than the foot conditions.

Left middle frontal gyrus

Another region to show a main effect of Phase was in the left prefrontal cortex (Fig. 1), centred in the middle frontal gyrus, Brodmann area 9. No other main effects or interactions were significant in this area, indicating a generally higher response for in-phase compared to anti-phase stimulation.

Left inferior temporal sulcus

A ventral temporal region in left inferior temporal sulcus (Fig. 1) showed a main effect of Phase. No other main effects or interactions were significant in this area, indicating a generally higher response for in-phase compared to anti-phase stimulation.

Left cerebellar tonsil

An area in the left cerebellar tonsil showed a trend towards an interaction between Phase and BothLeft (F1,6 = 5.2; p = 0.06) and towards an interaction between Phase and HandFoot (F1,6 = 4.2; p = 0.09). The effect of Phase in this area was significant for the both-hands condition (t6 = 3.4; p < 0.05) and approached significance for the left-hand condition (t6 = 2.0; p = 0.09). No significant effect of Phase was found for the foot conditions: both feet (t6 = 1.6; p = 0.2) and left foot (t6 = -0.2; p = 0.9).
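The ROI statistics reported above all come from the same 2 × 2 × 2 repeated-measures design applied to the extracted parameter estimates. A minimal sketch of such an analysis is given below; the data frame is filled with random placeholder values, since the actual per-participant parameter estimates are not reproduced here.

```python
import numpy as np
import pandas as pd
from statsmodels.stats.anova import AnovaRM

rng = np.random.default_rng(0)
rows = [
    {"subject": s, "phase": ph, "limb": limb, "side": side,
     "beta": rng.normal()}                    # placeholder parameter estimate for one condition
    for s in range(1, 8)                      # seven participants entered the analysis
    for ph in ("in", "anti")
    for limb in ("hand", "foot")
    for side in ("both", "left")
]
df = pd.DataFrame(rows)

res = AnovaRM(df, depvar="beta", subject="subject",
              within=["phase", "limb", "side"]).fit()
print(res.anova_table)                        # F and p for the main effects and interactions
```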
Discussion We established behaviourally that in-phase stimulation of two digits in general produces a perception of a single moving object, whereas anti-phase movement is usually perceived as two objects, confirming and extending
previous findings (Kitada et al. 2003). This effect is most reliable for stimulation of the fingers but also occurs for stimulation of the toes. The effect is also seen for the stimulation of one digit each from left and right, i.e., when tactile input has to be integrated across the midline. fMRI revealed that several brain regions were reliably more active during in-phase than anti-phase tactile stimulation. We discuss each region in turn.

Posterior parietal cortex

The largest cluster of activity was in the right IPL, which showed a reliably greater activation in the perception of one object compared to two (i.e., in-phase vs. anti-phase), both in the group analysis and in individual participants (Fig. 2). Right IPL responded significantly more strongly to the hand than to the foot conditions, with responses to some of the foot conditions being below baseline. Nonetheless, rIPL was reliably activated when one object was perceived relative to when two objects were perceived for all stimulation conditions, as also shown by activity in this region in the whole-brain conjunction analysis. This suggests that responses were modulated by the phase of the moving rods relatively independently of overall activity levels (see Morcom and Fletcher 2007 for a discussion on the difficulty of interpreting fMRI activity levels relative to baseline). The IPL area we found to be active in the right hemisphere [peak Talairach coordinates: x = 54, y = -49, z = 28 for the main effect of Phase and x = 54, y = -58, z = 31 in the conjunction analysis] closely corresponds to the left IPL area reported for the same contrast by Kitada et al. [x = -54, y = -58, z = 28]. Importantly, Kitada et al. tested right-hand stimulation, whereas our unimanual stimulation was applied to the left hand, indicating that this region of IPL may respond predominantly to tactile integration of input on the contralateral hand. In the present study, bimanual tactile integration also activated the right IPL, possibly reflecting a right-hemisphere-based bilateral representation of the body (Vallar 1997).

An area in left IPS also showed greater activation in the perception of one compared to two objects. This increased activity occurred when both hands or both feet were stimulated but not when only the left hand or foot was stimulated. We therefore suggest that this area in left IPS may be specifically involved in tactile integration across the midline. Several groups have suggested that the IPS may be involved in integrating or "binding" multiple sources of sensory input. For instance, a previous fMRI study (Shafritz et al. 2002) found that regions in the right IPS were more activated in visual feature conjunction tasks than in single feature tasks when multiple objects were presented at the same time but not when they were presented sequentially. Another study (Wardak et al. 2002)
examined the effect of reversibly deactivating the lateral intraparietal area of macaques on visual search tasks. The animals were still able to perform visual search based on simple features but were much slower to perform visual search based on the analysis of feature conjunctions.

The IPL and the IPS are components of the posterior parietal cortex, which have long been considered important for the representation of the body and higher order aspects of haptic perception (Critchley 1953; Hyvarinen 1982). Interestingly, patients with PPC lesions may have a disruption of their body representation (Vallar 1997) and show tactile object agnosia, that is, they fail to recognize objects by touch alone (Reed and Caselli 1994; Reed et al. 1996). Our results raise the possibility that these deficits may in part stem from a disruption to the neural mechanisms that allow integration of sensory input from multiple digits.

Left inferior temporal sulcus

A ventral temporal region in left inferior temporal sulcus also showed a main effect of phase. This area has generally been considered to be part of the ventral visual processing stream concerned with object recognition. However, recent findings provide evidence for a jointly visual and tactile representation of object shape in this region. For example, Amedi et al. (2002) reported that a subregion of the lateral occipital complex [Talairach peak x = -47, y = -62, z = -10], comparable to the region identified here [x = -42, y = -61, z = -11], is activated by both visual and tactile presentation of objects but not by auditory stimuli indicative of objects. Our results indicate that passive stimulation of the hand and foot by even simple objects may activate this region and that it may be sensitive to grouping principles.
Left middle frontal gyrus

The final cortical region to show a reliable effect of phase was in the left middle frontal gyrus. We did not anticipate prefrontal activity related to our manipulation, as the frontal lobes are not traditionally associated with haptic processing. Interestingly, both the ROI analysis and the whole-brain conjunction analysis showed that activity in this region was reliably stronger when one object was perceived relative to when two objects were perceived for all stimulation conditions. We speculate that this may reflect modality-independent perceptual integration and perceptual decision processes (Heekeren et al. 2004).

Cerebellum

The only subcortical region to show a reliable effect of phase was the left cerebellar tonsil. Although the cerebellum has traditionally been viewed as a motor structure, there has been increasing interest of late in its sensory (Gao et al. 1996) and more 'cognitive' functions (Schmahmann 2004). One might speculate that the determination of whether haptic stimulation is in or out of phase depends on precise information about the relative timing of the tactile stimulation of the two digits; the cerebellum appears to be crucial for such aspects of timing (Ivry 1997; Wing and Bracewell 2009).

Conclusion

The present behavioural experiment established that in-phase stimulation of two digits in general produces a perception of a single moving object, whereas anti-phase movement is usually perceived as two objects. This effect is most reliable for stimulation of the fingers but also occurs for stimulation of the toes. The effect is also seen for stimulation of one digit each from left and right, i.e., when tactile input is integrated across the midline. The functional imaging experiment revealed a network of brain regions that were reliably more active during in-phase than anti-phase tactile stimulation. We suggest that this network underlies our ability to form a unified percept of the presence of an object that comes into contact with different parts of the body surface. In particular, we identify a region in the left IPS as playing a key role in the problem of detecting correspondences in stimulation across the midline of the body.

Acknowledgments This work was supported by a European Commission grant to AW and RMB (IST-2001-38040).

References
Amedi A, Jacobson G, Hendler T, Malach R, Zohary E (2002) Convergence of visual and tactile shape processing in the human lateral occipital complex. Cereb Cortex 12:1202–1212
Braddick OJ (1980) Low-level and high-level processes in apparent motion. Philos Trans R Soc Lond B Biol Sci 290:137–151
Critchley M (1953) The parietal lobes. Hafner, New York
Disbrow E, Roberts T, Poeppel D, Krubitzer L (2001) Evidence for interhemispheric processing of inputs from the hands in human S2 and PV. J Neurophysiol 85:2236–2244
Gao JH, Parsons LM, Bower JM, Xiong J, Li J, Fox PT (1996) Cerebellum implicated in sensory acquisition and discrimination rather than motor control. Science 272:545–547
Heekeren HR, Marrett S, Bandettini PA, Ungerleider LG (2004) A general mechanism for perceptual decision making in the human brain. Nature 431:859–862
Hyvarinen J (1982) Posterior parietal lobe of the primate brain. Physiol Rev 62:1060–1129
Ivry R (1997) Cerebellar timing systems. Int Rev Neurobiol 41:555–573
Iwamura Y, Iriki A, Tanaka M (1994) Bilateral hand representation in the postcentral somatosensory cortex. Nature 369:554–556
Kitada R, Kochiyama T, Hashimoto T, Naito E, Matsumura M (2003) Moving tactile stimuli of fingers are integrated in the intraparietal and inferior parietal cortices. Neuroreport 14:719–724
Morcom AM, Fletcher PC (2007) Does the brain have a baseline? Why we should be resisting a rest. Neuroimage 37:1073–1082
Reed CL, Caselli RJ (1994) The nature of tactile agnosia: a case study. Neuropsychologia 32:527–539
Reed CL, Caselli RJ, Farah MJ (1996) Tactile agnosia. Underlying impairment and implications for normal tactile object recognition. Brain 119:875–888
Schmahmann JD (2004) Disorders of the cerebellum: ataxia, dysmetria of thought, and the cerebellar cognitive affective syndrome. J Neuropsychiatry Clin Neurosci 16:367–378
Shafritz KM, Gore JC, Marois R (2002) The role of the parietal cortex in visual feature binding. Proc Natl Acad Sci USA 99:10917–10922
Vallar G (1997) Spatial frames of reference and somatosensory processing: a neuropsychological perspective. Philos Trans R Soc Lond B Biol Sci 352:1401–1409
Wardak C, Olivier E, Duhamel JR (2002) Saccadic target selection deficits after lateral intraparietal area inactivation in monkeys. J Neurosci 22:9877–9884
Whitsel BL, Petrucelli LM, Werner G (1969) Symmetry and connectivity in the map of the body surface in somatosensory area II of primates. J Neurophysiol 32:170–183
Wing AM, Bracewell RM (2009) Motor timing. In: Squire LR (ed) Encyclopedia of neuroscience, vol 5. Academic Press, Oxford, pp 1067–1075