J Neurophysiol. 2011 Feb; 105(2): 910–922.
Published online 2010 Dec 8. https://doi.org/10.1152/jn.00840.2010
PMCID: PMC3059180
PMID: 21148091

Integration of Visual and Tactile Signals From the Hand in the Human Brain: An fMRI Study

Abstract

In the non-human primate brain, a number of multisensory areas have been described where individual neurons respond to visual, tactile and bimodal visuotactile stimulation of the upper limb. It has been shown that such bimodal neurons can integrate sensory inputs in a linear or nonlinear fashion. In humans, activity in a similar set of brain regions has been associated with visuotactile stimulation of the hand. However, little is known about how these areas integrate visual and tactile information. In this functional magnetic resonance imaging experiment, we employed tactile, visual, and visuotactile stimulation of the right hand in an ecologically valid setup where participants were looking directly at their upper limb. We identified brain regions that were activated by both visual and tactile stimuli as well as areas exhibiting greater activity in the visuotactile condition than in both unisensory ones. The posterior and inferior parietal, dorsal, and ventral premotor cortices, as well as the cerebellum, all showed evidence of multisensory linear (additive) responses. Nonlinear, superadditive responses were observed in the cortex lining the left anterior intraparietal sulcus, the insula, dorsal premotor cortex, and, subcortically, the putamen. These results identify a set of candidate frontal, parietal and subcortical regions that integrate visual and tactile information for the multisensory perception of one's own hand.

INTRODUCTION

The human brain receives information through multiple sensory channels, the inputs of which can be optimally combined to form a coherent percept of the world surrounding us for the facilitation of adaptive behavior (Ernst and Bülthoff 2004; Stein and Stanford 2008). Similarly, to maintain a coherent representation of the body to enable interactions to take place with external objects, the brain needs to integrate visual and somatosensory signals (e.g., Graziano and Botvinick 2002). An extensive body of evidence supports the notion that individual neurons in the non-human primate brain discharge when a specific body part is stimulated across somatic and visual modalities. Over the past few decades, such multisensory, visuotactile neurons have been identified in a set of cortical and subcortical regions of the macaque brain. Neurons with visuotactile properties have been found in area 7b in the inferior posterior parietal lobe (Hyvärinen 1981; Hyvärinen and Poranen 1974), the ventral intraparietal area (VIP) in the fundus of the intraparietal sulcus (Avillac et al. 2005; Colby and Duhamel 1991; Colby et al. 1993; Duhamel et al. 1998), the premotor cortex in the frontal lobe (Graziano and Gandhi 2000; Graziano et al. 1997; Rizzolatti et al. 1981), the upper bank of the superior temporal sulcus (Bruce et al. 1981), and, subcortically, the superior colliculus (Stein and Meredith 1993; Wallace and Stein 2001; Wallace et al. 1996) and putamen (Graziano and Gross 1993). A densely interconnected subset of these visuotactile areas, consisting of parietal and premotor regions, as well as the putamen, is thought to play a fundamental role in the multisensory representation of the space in the immediate vicinity of the body (Graziano and Botvinick 2002). The properties of the visuotactile receptive fields of these neurons, centered around different body parts and restricted to the space immediately surrounding them, the so-called “peripersonal space” (Rizzolatti et al. 1981, 1997), make this set of multisensory regions an ideal candidate for hosting the mechanisms necessary to enable one to interact with the external world (Graziano and Botvinick 2002; Rizzolatti et al. 1987), defend the body from potential threats (Cooke and Graziano 2004; Cooke et al. 2003; Graziano et al. 2002), and identify one's own limbs (Ehrsson et al. 2004, 2005, 2007).

Seminal studies on the superior colliculus of the cat (Stein and Meredith 1993; Stein and Stanford 2008) showed that individual cells not only respond to multiple sensory modalities but integrate their signals to produce a combined, multisensory response the magnitude of which reflects a linear or nonlinear computation performed on the unisensory inputs. Avillac and colleagues (2007) were the first to show that individual neurons in an intraparietal node (area VIP) of the macaque multisensory network described above integrate visual and tactile signals in a fashion that obeys the computational principles described in the superior colliculus (Stein and Meredith 1993; Stein and Stanford 2008).

A growing body of evidence from recent neuroimaging studies in healthy humans suggests that a similar set of regions exists in the human brain where visual and tactile signals converge (Banati et al. 2000; Bremmer et al. 2001; Driver and Noesselt 2008; Hadjikhani and Roland 1998; Lloyd et al. 2003; Makin et al. 2007; Pasalar et al. 2010; Sereno and Huang 2006; Tal and Amedi 2009). Visual unisensory stimulation near the hand and tactile unisensory stimulation on the hand activate the anterior part of the intraparietal cortex (Makin et al. 2007). Lloyd and colleagues described how visuotactile stimulation of the hand activates premotor and intraparietal areas, but they did not compare these responses to unisensory conditions (Lloyd et al. 2003). Studies on the rubber hand illusion, a multisensory illusion where participants experience a referral of touch from their hidden real hand to a rubber hand touched in full view, have related activation of premotor and intraparietal areas to this visuotactile perceptual phenomenon (Ehrsson et al. 2004, 2007). However, the rubber hand illusion is also associated with the resolution of multisensory conflict and recalibration of position sense of the hand; in light of this, it cannot be excluded that these processes also contribute to the observed frontoparietal activations. Thus what is missing from the human neuroimaging literature available to date is an experiment that uses natural stimuli to compare unisensory and visuotactile stimulation of the hand directly to identify the hypothesized multisensory responses in the key candidate areas defined in non-human primates.

In this functional magnetic resonance imaging (fMRI) study, we investigated the multisensory integrative properties of regions in the human brain that are sensitive to visual and tactile stimulation of one's own real hand. Specifically, the aim was to classify such regions in terms of the changes measured at the level of the blood-oxygen-level-dependent (BOLD) signal when comparing unisensory and multisensory stimulation of the hand in humans. In particular, we tested our prediction that some of these areas would display additive and, crucially, superadditive multisensory responses during bisensory visuotactile stimulation. Such a finding would support the hypothesis that a set of visuotactile regions exists in the human brain, similar to those previously identified in non-human primates, where visual and tactile signals are integrated to attain a coherent multisensory perception of the body.

METHODS

Participants

Twenty-four healthy right-handed subjects [5 females, 25 ± 4 (SD) yr] without a history of neurological or psychiatric disorders participated in the experiment. Handedness was self-reported by the participants in a questionnaire before the experiment. We only recruited subjects who were ≥175 cm tall to ensure that their hand was accessible for direct tactile stimulation in the constrained space of the scanner environment. All subjects gave their written informed consent before taking part in the study, which was approved by the local ethical committee at the Karolinska Institutet.

Experimental setup

During scanning, the participant lay comfortably in a supine position on the bed of the MR scanner. To ensure that multisensory stimulation of the hand was as natural as possible, we used a setup that gave the subjects direct view of their hand and that employed tactile stimulation conducted with real objects (Fig. 1, left).


Fig. 1. Experimental setup and design. Left: photograph of the experimental setup. The participant's head was tilted to provide direct vision of the right hand, which was placed on an angled support. Inset: the hand as seen from the participant's perspective, with the approximate position of the fixation point above the right hand. Right: diagram of the 2 × 2 factorial design, manipulating the modality of stimulation and the presence thereof. RestEC, rest condition with eyes closed; T, tactile; V, visual; VT, visuotactile.

This was achieved by tilting the participant's head downward and placing his or her right hand on a custom-made support positioned above the waist. The required tilt of the head was achieved by slanting the head coil with the help of a custom-made wooden wedge (through an angle of ~11°). In addition, the participant's head was tilted by another 20° using pillows and foam pads. The participant wore MR-compatible headphones to dampen the scanner noise as well as to receive the appropriate set of instructions throughout the acquisition sessions (see following text). The resting left hand was placed in a comfortable position alongside the participant's body. Instructions were transmitted to the experimenter and the participant via a computer running Presentation 14.1 (Neurobehavioral Systems, Albany, CA) and connected to an MR-compatible sound delivery system (Nordic Neuro Lab, Bergen, Norway).

Experimental design

A 2 × 2 factorial design was used, with the sensory modality of stimulation (visual, tactile) and the presence thereof (present, absent) as the experimental factors (Fig. 1, right). A block design was used to maximize the efficiency of the experimental design (Friston et al. 1999). Each stimulation or baseline epoch lasted 16 s and was followed by a 3-s intertrial interval during which both the experimenter and the participant received a brief audio instruction; this instruction identified the upcoming epoch for the experimenter and cued the participant to keep his or her eyes open or closed. Two baseline conditions were included, one with eyes open and one with eyes closed. This made it possible to use separate baselines for the visual and tactile unisensory conditions, thereby ensuring that the conjunction analysis was based on orthogonal contrasts (Beauchamp et al. 2008; Bremmer et al. 2001; Friston et al. 2005; Nichols et al. 2005; Price and Friston 1997). The participant was instructed to keep his or her eyes closed during purely tactile stimulation and during the corresponding baseline, and to keep them open during all visual conditions, including the corresponding baseline. During all epochs involving visual or visuotactile stimulation, the participant was further instructed to fixate on a point ~2 cm above his or her right hand. The fixation point was marked and visible during all conditions carried out with the participant's eyes open.

All stimuli were delivered using a spherical object made of a soft material attached to a 50-cm-long stick, which allowed the experimenter to hide his hand from the participant's view while still permitting carefully controlled stimulation. We used real physical objects rather than computer-generated stimuli because the former are known to be more potent in activating multisensory neurons in experiments on non-human primates (Graziano and Gandhi 2000). During tactile stimulation, the participant's right index finger was touched by the same object used in the visual condition: a trained experimenter moved the ball on the skin from the distal to the proximal phalanx, yielding a discrete number of events in the 16-s epoch. In total, 10 strokes were applied to the index finger during each tactile or visuotactile stimulation epoch; to ensure that the same number of stimuli was presented during each epoch, the experimenter followed a prerecorded metronome. Visual stimulation close to the hand was delivered by moving the object ~2 cm from the participant's right index finger, following a sequence of events identical to the one applied during the tactile epochs. During bisensory (visuotactile) stimulation, the experimenter repeated the tactile stimulation pattern while the participant fixated just above his or her own right hand.

All participants underwent three functional acquisition sessions, each composed of five blocks in which every epoch (including the baselines) was presented once. The baseline with eyes closed always followed the tactile epoch, whereas the baseline with eyes open always followed the last of the visual epochs, the order of which was randomized across sessions.
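For concreteness, the sketch below lays out the epoch timing of one functional run implied by this description. It is an illustrative reconstruction rather than the authors' stimulus-presentation script: the within-block order is fixed here for simplicity (in the actual experiment the order of the epochs performed with eyes open was randomized), and the initial and final baseline durations are those given in the acquisition section that follows.

```python
# Illustrative timing of one run: 5 blocks x 5 epochs x (16 s + 3 s ITI) = 475 s,
# plus an 18-s initial and a 20-s final baseline = 513 s = 171 volumes at TR 3 s.
EPOCH_S = 16.0          # stimulation/baseline epoch duration (s)
ITI_S = 3.0             # intertrial interval carrying the audio instruction (s)
BLOCKS_PER_RUN = 5

# Assumed within-block order; RestEC followed the tactile epoch and RestEO
# followed the last epoch performed with eyes open (V/VT order was randomized).
BLOCK_ORDER = ["T", "RestEC", "V", "VT", "RestEO"]

def run_onsets(initial_baseline_s=18.0):
    """Return (condition, onset in seconds) pairs for one functional run."""
    onsets, t = [], initial_baseline_s
    for _ in range(BLOCKS_PER_RUN):
        for cond in BLOCK_ORDER:
            onsets.append((cond, t))
            t += EPOCH_S + ITI_S
    return onsets

if __name__ == "__main__":
    for cond, onset in run_onsets()[:6]:
        print(f"{cond:7s} starts at {onset:6.1f} s")
```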

fMRI acquisition

fMRI acquisition was performed using a Siemens TIM Trio 3T scanner equipped with a 12-channel head coil (Siemens, Erlangen, Germany). We acquired gradient echo T2*-weighted echo-planar images with blood-oxygenation-level-dependent (BOLD) contrast as an index of brain activity (Logothetis et al. 2001). A functional image volume comprised 47 continuous near-axial slices of 3-mm thickness (with a 0.3-mm interslice gap), which ensured that the whole brain was within the field of view (58 × 76 matrix, 3.0 × 3.0 mm in-plane resolution, TE = 40 ms). One complete volume was collected every 3 s (TR = 3,000 ms). A total of 513 functional volumes were collected for each participant (171 in each run). In each run, an initial baseline of 18 s and a final baseline of 20 s were included. To facilitate anatomical localization of statistically significant activations, a high-resolution structural image was acquired for each participant at the beginning of the experiment (3D MPRAGE sequence, voxel size 1 × 1 × 1 mm, FOV 250 × 250 mm, 176 slices, TR = 1,900 ms, TE = 2.27 ms, flip angle = 9°).

Data analysis

The fMRI data were analyzed using the Statistical Parametric Mapping software package, version 8 (SPM8; http://www.fil.ion.ucl.ac.uk/spm; Wellcome Department of Cognitive Neurology, London, UK). The first three volumes were discarded from further analysis due to non-steady-state magnetization. The functional images were realigned to correct for head movements and co-registered with each participant's high-resolution structural scan. The anatomical image was then segmented into white matter, gray matter, and cerebrospinal fluid partitions and normalized to the Montreal Neurological Institute (MNI) standard brain. The same transformation was then applied to all functional volumes, which were resliced to a 2.0 × 2.0 × 2.0 mm voxel size. The functional images were then spatially smoothed with an 8-mm full-width-at-half-maximum (FWHM) isotropic Gaussian kernel. For each individual subject's dataset, we fitted a linear regression model (a general linear model) to the data (first-level analysis). For each of the experimental conditions described in the preceding text, we defined a boxcar function time-locked to the onset of the corresponding epoch, which was then convolved with the canonical hemodynamic response function implemented in SPM8. The realignment parameters were included in the model as regressors of no interest to account for residual head motion effects. We then defined linear contrasts within the general linear model (see following text). To accommodate intersubject variability, the contrast images for the comparisons of interest from all participants were entered into a random-effects group analysis (second-level analysis). To account for the problem of multiple comparisons in the statistical analysis of whole-brain data, we only report peaks of activation surviving a significance threshold of P < 0.05, corrected using topological peak false discovery rate (FDR) as implemented in SPM8 (Chumbley et al. 2010). In addition, given the strong a priori hypotheses on the anatomical localization of visuotactile multisensory regions in the human brain (see introduction), a significance level of P < 0.05 was used with small volume corrections centered on relevant coordinates from a previous study (Ehrsson et al. 2004).
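To make the first-level model concrete, the sketch below builds a single condition regressor by convolving a boxcar with a double-gamma hemodynamic response function sampled at the 3-s TR. This is a simplified stand-in for SPM8's canonical HRF (the gamma parameters are the commonly used defaults, assumed here rather than taken from the paper), and the example onsets come from the illustrative timing sketch above.

```python
import numpy as np
from scipy.stats import gamma

TR = 3.0        # repetition time (s)
N_VOLS = 171    # volumes acquired per run (discarded dummy volumes ignored here)

def canonical_hrf(tr, duration=32.0):
    """Double-gamma HRF (peak near 6 s, undershoot near 16 s, ratio 1:6),
    sampled at the TR. Conventional default parameters, used for illustration."""
    t = np.arange(0.0, duration, tr)
    h = gamma.pdf(t, 6) - gamma.pdf(t, 16) / 6.0
    return h / h.sum()

def boxcar_regressor(onsets_s, epoch_s, tr=TR, n_vols=N_VOLS):
    """Boxcar time-locked to the epoch onsets, convolved with the HRF."""
    box = np.zeros(n_vols)
    for onset in onsets_s:
        start = int(round(onset / tr))
        stop = int(round((onset + epoch_s) / tr))
        box[start:stop] = 1.0
    return np.convolve(box, canonical_hrf(tr))[:n_vols]

# Example: regressor for the tactile epochs of one run (onsets from the sketch above).
tactile_reg = boxcar_regressor(onsets_s=[18.0, 113.0, 208.0, 303.0, 398.0], epoch_s=16.0)
```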

Crucially, to obtain a direct comparison between the multisensory stimulation and the unisensory conditions, we employed a 2 × 2 factorial analysis taking the modality of stimulation (tactile or visual) and the presence thereof (present, absent) as factors (Fig. 1, right). In the following, T stands for the tactile-only condition, V for the visual-only condition, VT for the visuotactile condition, and RestEC/RestEO for the baseline conditions with eyes closed and open, respectively. Regions displaying superadditive responses were identified by examining the positive interaction contrast, defined as [(VT − V) − (T − RestEC)], which is equivalent to (VT − RestEC) > [(V − RestEC) + (T − RestEC)]. This contrast reveals the voxels where visuotactile stimulation of the hand produces a response that is significantly greater than the sum of the responses to the two unisensory stimuli presented in isolation. A potential concern with this contrast is that it can also include voxels where the positive interaction is driven by deactivations during the unisensory conditions (a negative BOLD signal with respect to the baseline) (e.g., Kawashima et al. 1995), thereby yielding unwanted false positives (Calvert 2001). Therefore, we excluded from the analysis all voxels that exhibited a significant deactivation (a negative BOLD response relative to the baseline, P < 0.01 uncorrected) for at least one of the sensory modalities of interest. Because the participants had their eyes closed in the tactile condition, we included the rest condition in which the participant's eyes were closed to ensure that the effects of eyes closed/eyes open were matched in the analysis of the factorial design. This choice of rest condition ensured that the key contrast, the interaction term, was well balanced and not biased by a comparison of tactile responses with baseline intervals carried out with the participant's eyes open.
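A minimal sketch of how this interaction contrast and the exclusion of deactivated voxels could be expressed, given parameter estimates for the five conditions. The regressor order, the helper names, and the reduction of the deactivation test to a simple sign check (with each unisensory condition compared with its own baseline) are assumptions made for illustration; they are not the SPM8 implementation used in the study.

```python
import numpy as np

# Assumed regressor order: [T, V, VT, RestEC, RestEO]
T, V, VT, REST_EC, REST_EO = range(5)

def contrast(weights, n=5):
    c = np.zeros(n)
    for idx, w in weights.items():
        c[idx] = w
    return c

# Positive interaction: (VT - V) - (T - RestEC)
c_superadd = contrast({VT: 1, V: -1, T: -1, REST_EC: 1})

# Algebraically identical form: (VT - RestEC) - [(V - RestEC) + (T - RestEC)]
c_alt = contrast({VT: 1, REST_EC: -1}) - (contrast({V: 1, REST_EC: -1})
                                          + contrast({T: 1, REST_EC: -1}))
assert np.allclose(c_superadd, c_alt)

def superadditive_voxels(betas):
    """betas: (n_voxels, 5) parameter estimates. Flags voxels with a positive
    interaction while excluding those deactivated (relative to their baseline)
    in either unisensory condition; significance testing is omitted here."""
    no_deactivation = ((betas[:, T] - betas[:, REST_EC] >= 0) &
                       (betas[:, V] - betas[:, REST_EO] >= 0))
    positive_interaction = betas @ c_superadd > 0
    return positive_interaction & no_deactivation
```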

In the factorial design, one can also examine the main effect of tactile stimulation, defined as [(T + VT) − (V + RestEC)], and the main effect of visual stimulation, corresponding to the contrast [(V + VT) − (T + RestEC)]. A conjunction analysis of the main effects was then carried out to identify brain regions performing multisensory integration in an additive fashion, i.e., regions in which the response to visuotactile stimulation does not exceed the algebraic sum of the responses to unisensory visual and tactile stimulation.
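The two main effects and a minimum-statistic conjunction can be written down in the same style; the minimum-statistic formulation (Nichols et al. 2005) is used here as a simplified, assumed stand-in for SPM's conjunction analysis.

```python
import numpy as np

# Same assumed regressor order as above: [T, V, VT, RestEC, RestEO]
# Main effect of touch:  (T + VT) - (V + RestEC)
c_touch = np.array([1.0, -1.0, 1.0, -1.0, 0.0])
# Main effect of vision: (V + VT) - (T + RestEC)
c_vision = np.array([-1.0, 1.0, 1.0, -1.0, 0.0])

def conjunction_t(t_touch, t_vision):
    """Minimum-statistic conjunction of two t-maps: a voxel survives only if
    both main effects exceed the chosen threshold, so thresholding the
    voxelwise minimum implements the conjunction."""
    return np.minimum(t_touch, t_vision)
```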

Finally, we also tested for possible subadditive effects by applying the negative interaction contrast [(T − RestEC) − (VT − V)], which is equivalent to (VT − RestEC) < [(V − RestEC) + (T − RestEC)].

As a complementary analysis, to identify areas showing evidence of multisensory convergence, we performed a conjunction analysis at the group level (Friston et al. 2005; Nichols et al. 2005; Price and Friston 1997) of the unisensory activation maps (T − RestEC) and (V − RestEO), where RestEC and RestEO stand for the baselines with eyes closed and open, respectively. This analysis identifies brain regions responding significantly to both unisensory visual and tactile stimuli.
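The same minimum-statistic logic applies to the conjunction of the unisensory maps, each evaluated against its own baseline; a brief sketch, again with the regressor order assumed above:

```python
import numpy as np

# Assumed regressor order: [T, V, VT, RestEC, RestEO]
c_tactile_vs_rest = np.array([1.0, 0.0, 0.0, -1.0, 0.0])   # T - RestEC
c_visual_vs_rest = np.array([0.0, 1.0, 0.0, 0.0, -1.0])    # V - RestEO

def convergence_map(t_tactile, t_visual, threshold):
    """A voxel counts as responding to both modalities only if the smaller of
    the two unisensory t-values survives the chosen threshold."""
    return np.minimum(t_tactile, t_visual) > threshold
```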

The effect size of each contrast was visualized by plotting the contrast estimates (the beta parameters of the general linear model) for the visuotactile, visual, and tactile conditions, each expressed relative to the mean across the experimental session. FreeSurfer (Athinoula A. Martinos Center for Biomedical Imaging, Massachusetts General Hospital, Boston, MA) was used to display significant activations on a reconstructed canonical cortical surface. Significant activations were anatomically localized by superimposing the activation maps onto the average normalized high-resolution MRI image and labeled using the nomenclature from a human brain atlas (Duvernoy 1999). To support localization with respect to cytoarchitectonically defined areas, we used the Anatomy toolbox for SPM (Eickhoff et al. 2005).
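As an illustration of how such plots could be produced once the contrast estimates have been extracted at a peak voxel (the extraction step itself is not shown), here is a small matplotlib helper; the function name and argument layout are hypothetical, and no data from the study are included.

```python
import numpy as np
import matplotlib.pyplot as plt

def plot_contrast_estimates(betas_peak, region_label):
    """betas_peak: (n_subjects, 3) array of contrast estimates at one peak
    voxel, columns ordered [T, V, VT], each relative to the session mean.
    Plots the group mean +/- SE, in the style of the bar plots of Figs. 3-4."""
    means = betas_peak.mean(axis=0)
    sems = betas_peak.std(axis=0, ddof=1) / np.sqrt(betas_peak.shape[0])
    plt.bar(["T", "V", "VT"], means, yerr=sems, capsize=4, color="0.6")
    plt.ylabel("Contrast estimate (a.u.)")
    plt.title(region_label)
    plt.tight_layout()
    plt.show()
```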

RESULTS

Multisensory integration: factorial analysis

The 2 × 2 factorial analysis described here allowed us to identify voxels showing evidence of multisensory integration when the bisensory condition was compared with the unisensory conditions.

Table 1 and Fig. 2 contain a summary of the key findings, highlighting areas providing evidence of superadditive and additive multisensory responses.

Table 1. Superadditive and additive BOLD responses

Anatomical location                          x     y     z   Peak t   Peak P    Cluster size
Superadditive responses (VT − V) > (T − RestEC)
    Left anterior intraparietal sulcus      −32   −46    62    4.58    0.005*       108
    Left putamen                            −24     4     8    4.19    0.011*        25
    Left precentral gyrus (PMd)             −34   −16    62    3.50    0.015*        45
    Left insula                             −32   −26    16    3.49    0.015*        51
    Right parietal operculum                 52   −22    18    3.01   <0.001†        36
    Right cerebellum                         16   −62   −46    3.37   <0.001†        87
Additive responses [(T + VT) − (V + RestEC)] ∩ [(V + VT) − (T + RestEC)]
    Left parietal operculum                 −50   −30    20   11.61   <0.001        531
    Left postcentral sulcus                 −38   −38    62   10.12   <0.001        395
    Left supramarginal gyrus                −56   −26    22    9.45   <0.001        531
    Right cerebellum (lobule VIIIb)          20   −60   −46    8.84   <0.001         79
    Left postcentral sulcus                 −56   −26    44    8.55   <0.001        531
    Left supramarginal gyrus                −56   −22    30    7.55   <0.001        531
    Left precentral gyrus (PMd)             −38   −12    64    6.36    0.009        117
    Right cerebellum (lobule VIIIa)          14   −68   −44    6.00    0.015         79
    Right lateral fissure, posterior segment 46   −32    24    5.87    0.017        336
    Left ventrolateral thalamus             −16   −22     4    5.85    0.017         96
    Right parietal operculum                 54   −28    24    5.45    0.033        336
    Left precentral gyrus (PMv)             −58     2    36    4.07    0.041(*)      32

Significant additive responses were obtained from the conjunction analysis of the two main effects, defined as [(T + VT) − (V + RestEC)] ∩ [(V + VT) − (T + RestEC)]; superadditive responses were obtained from the interaction contrast (VT − V) > (T − RestEC), exclusively masked to remove voxels showing negative blood-oxygen-level-dependent (BOLD) responses in one or both unisensory conditions (P < 0.01 uncorrected). The MNI coordinates, t-values, P values, and cluster sizes of the activation maximum in each of the corresponding anatomical regions are reported; peaks sharing the same cluster size are part of the same activation cluster. The t-value threshold corresponding to the whole-brain FDR threshold of 0.05 was 4.63. P values given without a symbol indicate P < 0.05 after correction for multiple comparisons in the whole brain using topological FDR for peak height.

*P < 0.05 after small volume correction based on a priori hypotheses. †P < 0.001 uncorrected.

Fig. 2. Illustration of the superadditive and additive responses. Activation maps corresponding to additive (red) and superadditive (blue) responses to multisensory stimulation, displayed on an inflated average cortical surface of the left hemisphere. For display purposes only, the statistical thresholds were chosen to be P < 0.005 for both contrasts. A black star in the vicinity of a blue color-coded cluster indicates superadditive responses that are significant after correction for multiple comparisons (P < 0.05 corrected); these are located in the insular region, premotor cortex, and anterior intraparietal cortex. All red color-coded clusters contained statistically significant (P < 0.05 corrected) local maxima, reported in Table 1. PMv, ventral premotor cortex; PMd, dorsal premotor cortex; PoCS, postcentral sulcus; aIPS, anterior intraparietal sulcus; SMG, supramarginal gyrus; SII/ins, secondary somatosensory cortex/insular region. The data corresponding to this figure are reported in Table 1.

First, we sought to identify and localize active clusters showing nonlinear superadditive responses by examining the positive interaction contrast, defined as [(VT − V) − (T − RestEC)]. Clusters obtained from this contrast contain voxels where the BOLD signal measured during epochs of visuotactile stimulation is significantly greater than the sum of the responses registered during the two unisensory conditions. We found significant activations (Fig. 3; Table 1) in the left anterior intraparietal sulcus (−32, −46, 62; P = 0.013 corrected), left insular cortex (−32, −26, 16; P = 0.004 corrected), left dorsal premotor cortex (−34, −16, 62; P = 0.015 corrected), and left putamen (−24, 4, 8; P = 0.011 corrected). At a lower threshold, activations were also observed in the right parietal operculum (52, −22, 18; P < 0.001 uncorrected) and right cerebellum (16, −62, −46; P < 0.001 uncorrected).


Fig. 3. Superadditive responses to multisensory stimulation. Parameter estimates (±SE) for uni- and multisensory stimuli in a set of regions where superadditive responses were found (P < 0.05 corrected for multiple comparisons). Each plot is paired with 2 anatomical sections of a standard brain displaying the localization of the superadditive responses. Coordinates are in MNI space. Par op, parietal operculum; Cer, cerebellum.

Next, we examined the additive multisensory responses by performing a conjunction analysis of the two main effects, defined as [(T + VT) − (V + RestEC)] ∩ [(V + VT) − (T + RestEC)] (Fig. 4; Table 1). In the parietal lobe, such responses were located in a cluster spanning the left postcentral sulcus, the superior parietal gyrus, and the anterior intraparietal sulcus (with the peak at −38, −38, 62; P < 0.001 corrected), in the left supramarginal gyrus (−56, −26, 22; P < 0.001 corrected), and in the left parietal operculum/insula (−50, −30, 20; P < 0.001 corrected). Significant clusters were also detected in the right parietal operculum and in the cortex adjacent to the posterior segment of the right lateral fissure (46, −32, 24; P = 0.017 corrected). In the frontal lobe, clusters showing additive responses were located in the dorsal and ventral parts of the premotor cortex (at −38, −12, 64, P = 0.009, and at −58, 2, 36, P = 0.041 corrected, respectively). Subcortically, clusters of active voxels displaying additive multisensory responses were found in the right cerebellum (lobule VIII; 20, −60, −46; P < 0.001 corrected) and in the left thalamus (−16, −22, 4; P = 0.017 corrected).


Fig. 4. Additive responses to multisensory stimulation. Parameter estimates (±SE) for uni- and multisensory stimuli in a set of regions (see text for clarification) where additive responses were found (P < 0.05 corrected for multiple comparisons). Each plot is paired with 2 anatomical sections of a standard brain displaying the localization of the additive responses. All reported coordinates are in MNI space. Supramarg, supramarginal.

Finally, we probed for subadditive responses by examining the negative interaction contrast, defined as [(T − RestEC) − (VT − V)]. No significant effects were found for this contrast, and, therefore, no visuotactile brain regions were classified as subadditive in this study.

Multisensory convergence: conjunction analysis

As a complementary analysis, a conjunction analysis of the two unisensory activation maps was performed to identify areas of convergence, i.e., all the voxels responding significantly to both sensory modalities [(T − RestEC) ∩ (V − RestEO); Fig. 5; Table 2].


Fig. 5. Conjunction analysis of unisensory responses. Conjunction analysis of the 2 unisensory activation maps, showing clusters activated by both tactile and visual stimuli, displayed on an inflated canonical cortical surface. Voxelwise threshold: P < 0.005 uncorrected, for illustration purposes only. SII/ins, secondary somatosensory cortex and insula. See Table 2 for additional information.

Table 2. Conjunction analysis of unisensory responses

Anatomical location                    x     y     z   Peak t   Peak P    Cluster size
Conjunction of unisensory responses (T − RestEC) ∩ (V − RestEO)
    Left postcentral sulcus*          −36   −38    60    6.27    0.002         76
    Left parietal operculum           −48   −34    20    5.53    0.007        340
    Left precentral gyrus (PMd)       −40   −10    62    4.99    0.021        135
    Left precentral gyrus (PMv)       −60     2    32    3.18    0.028†        17
    Left supramarginal gyrus          −56   −26    42    4.15   <0.001†       340
    Left putamen                      −22     2    −4    4.14   <0.001†         9
    Right parietal operculum           48   −28    24    4.25   <0.001        220
    Right cerebellum (lobule VI)       32   −60   −20    3.02   <0.001‡         8

Significant activations from a conjunction analysis performed on the two unisensory activation maps. The MNI coordinates, t-values, P values, and cluster sizes for the maximum activation in each of the corresponding anatomical regions are reported; peaks sharing the same cluster size are part of the same activation cluster. The t-value threshold corresponding to the whole-brain FDR threshold of 0.05 was 4.78. P values given without a symbol indicate P < 0.05 after correction for multiple comparisons in the whole brain using topological FDR for peak height.

*Part of a cluster extending into the anterior section of the intraparietal sulcus (IPS) and superior parietal gyrus. †P < 0.05 after small volume correction based on a priori hypotheses. ‡P < 0.001 uncorrected.

The cortex lining the left postcentral sulcus (x = −36, y = −38, z = 60; P = 0.002, part of a cluster extending into the most superior portion of the anterior intraparietal sulcus; all reported coordinates are in MNI space), the left parietal operculum and insular cortex (SII, −48, −34, 20; P = 0.007) and the left dorsal (−40, −10, 62; P = 0.021) and ventral (−60, 2, 32; P = 0.028) premotor cortices all exhibited significant BOLD responses to both tactile and visual unisensory stimulation. At a lower threshold, this contrast revealed additional active areas in the left supramarginal gyrus (−56, −26, 42; P < 0.001 uncorrected), right parietal operculum (SII, 48, −28, 24; P < 0.001 uncorrected), right cerebellum (lobule VI, 32, −60, −20; P < 0.001 uncorrected) and left putamen (−22, 2, −4; P < 0.001 uncorrected).

To rule out the possibility that the identification of areas showing convergence and revealing integration of hand-related visual and tactile signals was the result of the intersubject averaging performed in group analyses of fMRI data, we carefully checked the key contrasts in our analysis (see preceding text) for all participants individually. This analysis showed a remarkable consistency across participants for the contrasts described in the preceding text, as exemplified in Fig. 6 for four representative subjects.


Fig. 6. Consistent activations across individual participants. Convergence of unisensory tactile and visual responses in 4 individual participants, labeled A–D. Middle: active clusters in the cortex within and around the intraparietal sulcus, outlined by green squares. Top: an enlarged view of the area within each square; for ease of anatomical localization, a segment of the intraparietal sulcus is outlined. Bottom: active clusters in the ventral premotor cortex (inferior part of the precentral gyrus and sulcus), highlighted by green arrows. All activation maps were thresholded at P < 0.001, uncorrected, for display purposes only.

DISCUSSION

This fMRI study has identified candidate brain sites associated with the convergence and integration of visual and somatic signals from the hand in premotor, parietal, and subcortical regions. To our knowledge, this is the first direct comparison of unisensory and multisensory visuotactile stimulation of the hand conducted with the intention of characterizing multisensory integrative effects in analogy with the literature on multisensory integration in non-human primates (Avillac et al. 2007). We found enhanced BOLD responses to multisensory stimuli in the left (contralateral) ventral and dorsal premotor cortex, left anterior part of the intraparietal sulcus, left inferior parietal cortex (supramarginal gyrus), left postcentral sulcus, left insula, and bilateral parietal operculum. Subcortically, multisensory integrative effects were detected in the left putamen, left thalamus, and right cerebellum. These regions are anatomically interconnected and are known to contain neurons that integrate visual and somatic signals in non-human primates (as discussed in greater detail in the following text). Additionally, the sensorimotor properties of these regions are known to be fundamental for the sensory guidance of arm and hand actions (Andersen et al. 1997; Rizzolatti et al. 1998). The fact that the fMRI activations described here were localized specifically to sets of brain regions known to be anatomically interconnected suggests that the perceptual binding of vision and touch arises from the integration of signals across interconnected brain regions. This is consistent with the literature on audiovisual perceptual binding in humans, which has shown the importance of interconnected brain regions for the fusion of auditory and visual cues (Bushara et al. 2003).

Our results are important because they provide robust evidence that these circuits are functionally relevant for multisensory perception of the hand in space. Further, they provide compelling support for the hypothesis that, in man, as in the non-human primate (Graziano and Botvinick 2002), visual, tactile, and proprioceptive signals from the upper limb converge onto multisensory neuronal populations in specific premotor, parietal, and subcortical structures that represent the space near the body in a common hand-centered reference frame.

Additive and superadditive responses

At the level of a single neuron, the firing rates induced by uni- and multisensory stimuli can be directly compared and classified as additive, subadditive, or superadditive (Stein and Stanford 2008; Wallace et al. 1996). All three types of interaction are known to contribute to behavior and perception (Stanford and Stein 2007). In fact, linear, additive responses are the most common mode of response of individual multisensory neurons to stimuli that are not near threshold (Stanford and Stein 2007; Stanford et al. 2005). Therefore, in the present study, we interpret both linear and nonlinear responses in the hypothesized areas as reflecting biologically meaningful multisensory integration.

This approach calls for some further critical discussion when employed with BOLD-based fMRI, as this neuroimaging technique measures signals from a large population of (possibly heterogeneous) neurons within a voxel (Logothetis 2008; Logothetis et al. 2001). In other words, a voxel could contain both uni- and multisensory neuronal populations. This has sparked an important discussion in the field of multisensory brain imaging concerning which criteria should be used to identify multisensory responses. One highly influential study argued that superadditivity should be taken as the "gold standard" (Calvert 2001). More recent reviews have, however, challenged this claim and have suggested that other types of response should also be carefully investigated (Goebel and van Atteveldt 2009; Laurienti et al. 2005; Stein et al. 2009). The central point in this discussion is that fMRI studies cannot directly discriminate between areal and neuronal convergence (Calvert 2001; Goebel and van Atteveldt 2009; Stevenson et al. 2007). In principle, a linear BOLD response could be produced by two separate populations of unisensory cells co-existing within the same voxel. The other side of this coin, though, is that a nonlinear, superadditive response at the voxel level cannot simply be explained away as convergence of sensory inputs onto different, noninteracting neuronal populations (Calvert 2001; Goebel and van Atteveldt 2009; Laurienti et al. 2005; Stevenson et al. 2007). Thus a superadditive BOLD response must reflect a neuronal interaction of some sort, be it expressed at the level of discharge rates or local field potentials (Logothetis et al. 2001). It is important to note that several arguments can be called on to explain why the superadditive criterion may turn out to be too conservative, two such being the tendency of neurons to respond nonlinearly only to weak, near-threshold stimuli (Laurienti et al. 2005; Stanford et al. 2005; Stevenson et al. 2007) and vascular ceiling effects on the BOLD response (Buxton et al. 2004), which might prevent the detection of nonlinear interactions (Calvert 2001; Goebel and van Atteveldt 2009; Stevenson et al. 2007). In light of this, the linear responses we observed in the left ventral premotor cortex and the left supramarginal gyrus, where we had particularly strong a priori anatomo-functional hypotheses (see introduction), most likely reflect biologically relevant evidence of the occurrence of multisensory integration in these areas.

Posterior parietal cortex

The posterior parietal cortex is generally considered to be a key region for the fusion of signals from different sensory modalities. In our study, we found clear evidence for the convergence of visual and somatic signals onto, and their integration in, several key areas in this part of the brain. The anatomical locations of these response patterns in the parietal lobe fit well with earlier electrophysiological studies on non-human primates (Avillac et al. 2005; Colby and Duhamel 1991; Colby et al. 1993; Duhamel et al. 1998; Hyvärinen 1981; Hyvärinen and Poranen 1974), as will be discussed in some detail in this section.

It is particularly important that we were able to detect a zone of convergence of visual and tactile signals, and of their integration, in the anterior part of the intraparietal sulcus. This activation exhibited a superadditive response profile and was located on the medial bank of the anterior segment of the intraparietal sulcus, close to the junction with the postcentral sulcus; the cluster of active voxels extended into the superior parietal gyrus as well. The overlap between the active cluster obtained in the conjunction analysis of the two unisensory conditions and the cluster identified by the positive interaction contrast in the factorial analysis was highly convincing. The location of these activations on the medial bank of the intraparietal sulcus fits very well with the activations reported in this region during the rubber hand illusion (Ehrsson et al. 2004). This illusion is associated with perceptual binding of tactile and visual stimuli, consistent with the evidence presented here that this area is a site of visuotactile convergence and integration. Furthermore, our results confirm and go beyond the data presented by Makin and colleagues (2007), showing evidence not only of multisensory convergence but also of integration of visual and tactile hand-related signals in the anterior intraparietal area.

It is not straightforward to relate the anatomical location of the activated areas directly to specific functional regions defined in and around the intraparietal sulcus in the macaque brain, as homological relations have not been fully established (Culham et al. 2006; Grefkes and Fink 2005). In the monkey brain, the anterior intraparietal area (AIP) lies on the lateral bank of the sulcus, whereas areas VIP (ventral intraparietal area) and MIP (medial intraparietal area) are found on the medial bank (Grefkes and Fink 2005). Thus it is possible that the activation under discussion was driven by active neurons in the latter two areas rather than in the former. Taken together, the results presented here provide compelling support for the notion that the anterior intraparietal cortex contributes to the multisensory representation of the arm in space (Makin et al. 2007, 2008), in addition to its well-established role in the sensory guidance of grasp-related action (Binkofski et al. 1998; Culham et al. 2003, 2006; Ehrsson et al. 2000, 2001).

Anterior to the intraparietal sulcus, we observed multisensory responses in the form of distinct peaks of activation in the cortex lining the postcentral sulcus. The anterior bank of this sulcus corresponds to area 2 of the primary somatosensory cortex, whereas the posterior bank is associated with area 5 of the posterior parietal cortex. Area 2, although traditionally considered part of the primary somatosensory cortex, is thought to be involved in higher-order somatosensory processing (Bodegård et al. 2001; Iwamura 1998; Roland et al. 1998), with an emphasis on proprioceptive signals (Naito et al. 2005). Area 5 is generally considered to be a somatosensory association area (Bodegård et al. 2000; Duffy and Burchfiel 1971; Mountcastle et al. 1975; Sakata et al. 1973; Taoka et al. 2000). Based on research on non-human primates, areas 2 and 5 are not taken to be the primary candidate sites for visuotactile integration in the parietal lobe (the intraparietal and inferior posterior parietal cortex being stronger candidates; see introduction). However, in area 5 of the monkey brain, neurons with both visual and tactile receptive fields have been described (Graziano et al. 2000; Iriki et al. 1996; Sakata et al. 1973). Graziano and colleagues showed how these cells integrate visual and somatic information for the representation of arm position in space (Graziano et al. 2000), although the greatest proportion of cells of this type was observed in the part of area 5 located in the medial bank of the intraparietal sulcus (MIP) (Graziano and Botvinick 2000). Nevertheless, our human data support the existing evidence in non-human primates that the cortex lining the postcentral sulcus is involved in the integration of visual and somatic information from the hand. Anatomically, area 5 receives prominent inputs from the primary somatosensory cortex, and in particular from area 2 (Pandya and Kuypers 1969; Pearson and Powell 1985). Additionally, area 5 is also connected to the primary motor cortex and premotor areas and is involved in reaching movements (Andersen et al. 1997; Johnson et al. 1996; Mountcastle et al. 1975; Scott et al. 1997). Therefore, it is possible that the activations under discussion here, which were observed in passive participants, reflect a multisensory mechanism for the localization of the arm in space, a prerequisite for potential hand-object interactions.

The inferior posterior parietal cortex was also identified as a site of multisensory convergence and visuotactile integration in the present study. The corresponding activation was located in the left supramarginal gyrus. Although the precise homology between cytoarchitectonic areas of the human and monkey inferior parietal lobes is not fully known (Eidelberg and Galaburda 1984), it has been suggested that the inferior parietal cortex in monkeys (area 7b/PF) likely corresponds to the supramarginal gyrus in the human brain (Krams et al. 1998). In monkeys, this region has anatomical connections to areas 2, 4, 5, and 45, the parietal operculum (Hyvärinen 1982; Neal et al. 1987, 1990b), ventral premotor cortex (Ghosh and Gattera 1995; Godschalk et al. 1984; Neal et al. 1990a), and insula (Neal 1990; Neal et al. 1987), all of which are areas that displayed evidence of multisensory integration in the present study. Neurons with bimodal visuotactile responses have been found in this region (Hyvärinen 1981; Hyvärinen and Poranen 1974; Ishida et al. 2010). The inferior parietal cortex is an important node in the fronto-parietal circuits involved in sensory-guided hand actions (Luppino and Rizzolatti 2000); in human neuroimaging studies, it is consistently activated during hand-object interactions (Binkofski et al. 1998; Ehrsson et al. 2000, 2001; Kuhtz-Buschbeck et al. 2001). The supramarginal gyrus is also activated by proprioceptive stimulation inducing sensations of limb movement (Naito et al. 2005), demonstrating that this region receives significant proprioceptive inputs. The data reported here are interesting in this respect as they show that the involvement of this region in the sensory representation of the hand is fundamentally multisensory in nature, continuously combining visual and somatic data from the hand in space.

Parietal operculum and insula

We also found superadditive responses in clusters located in the contralateral (left) insula and in the right parietal operculum; the left parietal operculum exhibited additive multisensory responses. The parietal operculum is part of the secondary somatosensory area (SII) in humans, a relatively early processing node in the somatosensory system (Eickhoff et al. 2006). In human brain imaging studies, it is consistently activated by a wide range of somatic stimuli (Banati et al. 2000; Bodegård et al. 2000; Ledberg et al. 1995; Naito et al. 1999; O'Sullivan et al. 1994; Roland et al. 1998). Furthermore, activations in the parietal operculum are induced by passive movements (Weiller et al. 1996) and illusory arm movements elicited by tendon vibration (Naito et al. 2005). In monkeys, the parietal operculum receives prominent tactile and deep receptor inputs (Robinson and Burton 1980) and is anatomically connected to the inferior posterior parietal cortex (Disbrow et al. 2003) and to the ventral premotor cortex (Cipolloni and Pandya 1999; Disbrow et al. 2003; Ghosh and Gattera 1995; Godschalk et al. 1984; Matelli et al. 1986) and thus probably works in concert with these areas to represent the spatial location of the arm and an object touching it in space. Moreover, a recent study offered further evidence that regions within the human parietal operculum are anatomically connected with key multisensory areas in the frontal and parietal lobes (Eickhoff et al. 2010). In the light of this, the parietal operculum could be an important processing node in the network involved in the multisensory perception of the hand in space as suggested by our results.

In addition, we observed multisensory effects in the left insular cortex. The insula, together with the adjacent claustrum, has been suggested to be an important cortical site for multisensory integration (Bushara et al. 2001a; Hadjikhani and Roland 1998; Naghavi et al. 2007). In particular, the insula/claustrum region has been associated with cross-modal visuotactile processing for recognition (Banati et al. 2000) and discrimination (Hadjikhani and Roland 1998) tasks. The findings we report here provide new evidence that the basic integration of visual and somatic signals from a static hand and an object touching it is performed in this multisensory area.

Frontal responses

The premotor cortex is a site of convergence of visual, tactile, and proprioceptive information (Rizzolatti et al. 1981; Graziano et al. 1997). It is also involved in the control of goal-directed hand actions (Rizzolatti et al. 1998) and defensive movements (Graziano et al. 2002). In the present study, we found clear evidence in the left (contralateral) premotor cortex for the convergence of visual and somatic signals from the hand as well as of integration of these signals within this area. In the ventral premotor cortex, multisensory enhancement was additive; in addition to this, in the dorsal part of the premotor cortex, we found evidence for superadditive effects. These results fit well with electrophysiological studies on nonhuman primates, which have stressed the importance of the premotor cortex in processing visual and tactile sensory signals from the body and its surrounding space.

Bimodal visuotactile neurons have been described in the ventral premotor cortex of the macaque (Graziano and Gandhi 2000; Graziano et al. 1997; Rizzolatti et al. 1981) with overlapping tactile and visual receptive fields anchored to specific body parts. In particular, Graziano and colleagues described bisensory neurons that are sensitive to somatosensory and visual stimulation of the monkey's hand (Graziano and Botvinick 2002; Graziano and Gandhi 2000; Graziano et al. 1997). These responses are also present in anesthetized monkeys (Graziano and Gandhi 2000), which suggests that premotor neurons perform multisensory integration even when the monkey is not planning or performing an action. In the present study, participants held their arm still and did not perform any motor tasks, so the measured premotor responses most likely reflect basic visuotactile integration mechanisms. This interpretation fits with earlier fMRI studies that reported premotor activations during visuotactile stimulation of the arm in passive subjects (Ehrsson et al. 2004; Lloyd et al. 2003).

Besides the ventral premotor cortex, we also found evidence of visuotactile integration in the dorsal premotor cortex. Although this region of the premotor cortex has not been as extensively characterized as its ventral neighbor in terms of its multisensory properties, visuotactile neurons have been described in the caudal part of this area (Fogassi et al. 1999; Graziano and Gandhi 2000). The convergence of proprioceptive, tactile, and visual information, as well as its connectivity pattern with several regions in the posterior parietal lobe, in particular area 5 (Rizzolatti et al. 1998), has led scientists to assign to the dorsal premotor cortex an important role in monitoring and controlling arm and hand movements in space (Rizzolatti et al. 1998; Scott et al. 1997). Integration of visual, tactile, and proprioceptive information in the dorsal premotor cortex might therefore be important for calibrating the representation of limb position (Ehrsson et al. 2004) as part of a circuit devoted to the control of arm and hand movements (Kertzman et al. 1997).

Putamen

The evidence provided here for visuotactile integration in the human putamen was particularly noteworthy. In the macaque putamen, bisensory, visuotactile neurons with body-centered receptive fields, similar to those found in frontal and parietal regions, were identified in a classic paper by Graziano and Gross (1993). In this report, the authors described how neurons with tactile receptive fields organized somatotopically into arm, hand, and face regions were also responsive to visual stimulation, in particular visual stimuli originating from the space close to the body (Graziano and Gross 1993). Given the tight anatomical connections with multisensory frontal and parietal regions (Cavada and Goldman-Rakic 1991; Graziano and Gross 1993; Künzle 1978; Parthasarathy et al. 1992; Weber and Yin 1984), the putamen is considered part of the visuotactile network responsible for motor interactions with environmental objects (Graziano and Gross 1993). To the best of our knowledge, the superadditive BOLD responses we found in the contralateral (left) putamen provide the most conclusive evidence to date that the human putamen performs integration of visual and somatic signals from the hand, possibly in hand-centered reference frames. The previously mentioned fMRI studies, which included unisensory visual and tactile stimulation conditions (Makin et al. 2007) and bisensory stimulation (Ehrsson et al. 2004; Lloyd et al. 2003), failed to identify consistent activation in the putamen. Our stimulation paradigm, which used robust natural bisensory stimulation of the hand, and functional imaging at a higher field strength (3 T rather than the 1.5 T used in those studies) are most likely responsible for the increase in sensitivity that allowed us to describe activations in this small subcortical structure.

Cerebellum

In the study being presented here, the cerebellum displayed evidence of convergence of visual and tactile hand-related signals as well as of multisensory integration of both the additive and superadditive type. All activations were localized in the ipsilateral (right) cerebellum. Although the cerebellum has long been considered to be primarily involved in the control of action, recent developments have suggested that this structure also plays an important role in perceptual and cognitive functions (Gao et al. 1996; Ramnani 2006; Schmahmann and Sherman 1998; Strick et al. 2009). The possibility that there is convergence of signals from different sensory modalities in cerebellar regions is supported by extensive knowledge of the functional connectivity of this structure in nonhuman primates. Visual (Glickstein et al. 1994; Sultan and Glickstein 2007), tactile (Bloedel 1973; Bushara et al. 2001b; Snider and Stowell 1944), and kinesthetic (Murphy et al. 1973; van Kan et al. 1993) information is conveyed to the cerebellum either directly or indirectly, reinforcing the notion that this structure could be an important node in cortico-cerebellar networks for the processing of multisensory information. In humans, neuroimaging studies have demonstrated the involvement of the cerebellum in multisensory processing (Bushara et al. 2001a; Hagura et al. 2009; Kavounoudias et al. 2008; Naumer et al. 2010; Stevenson et al. 2009) as well as cerebellar activation following tactile stimulation of the hand in passive participants (Bushara et al. 2001b; Fox et al. 1985), which is consistent with our results. This accumulating body of evidence suggests that the cerebellum may play a role in the multisensory network devoted to the representation of the limb across sensory modalities, although the present data cannot substantiate any speculation on the functional nature of such involvement.

Multisensory circuit for body ownership

Our results are also relevant for studies of body ownership, where the integration of vision and touch is assumed to be the fundamental mechanism behind the self-attribution of limbs (Ehrsson et al. 2004, 2005, 2007; Makin et al. 2008) and entire bodies (Petkova and Ehrsson 2008). Crucial fMRI evidence for this idea came from the observation that touches sensed on an owned rubber hand were associated with activations in the ventral premotor and intraparietal cortices (Ehrsson et al. 2004). However, in this earlier study and similar ones conducted subsequently, the authors did not include a condition where the real hand was stimulated to demonstrate that these areas are robustly activated when one's own real hand is stimulated naturally in full view (Ehrsson et al. 2004, 2005, 2007). In light of this, the present findings that tactile stimulation of a real hand being watched by its “owner” activates the ventral premotor and intraparietal cortices, regions recognized to be sites of convergence and integration of visual and tactile signals, provide an important validation of the multisensory hypothesis of body ownership (Ehrsson et al. 2004; Makin et al. 2008; Petkova and Ehrsson 2008; Tsakiris 2010).

Conclusions

In this fMRI study, we manipulated the contributions of vision and touch during passive, ecologically valid stimulation of the hand in humans to characterize cortical and subcortical regions in terms of their multisensory, visuotactile responses. Adopting a methodology that originated in electrophysiological studies in animals (Stein and Stanford 2008; Wallace et al. 1996) and was later introduced into the human brain imaging literature (Calvert 2001) to assess multisensory integration, we described linear and nonlinear BOLD responses to visuotactile stimulation in a set of premotor, posterior parietal, and subcortical regions that matches well with the neurophysiological literature on multisensory areas in nonhuman primates and humans. These results provide compelling support for the role of these areas in integrating visual and tactile information to build a cross-modal representation of hand-related sensory signals.
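To make the additive and superadditive criteria concrete, the following minimal sketch illustrates the arithmetic of the two tests on hypothetical per-participant response estimates for one region. It is not the analysis code used in the study: the variable names (beta_v, beta_t, beta_vt) and values are assumed for illustration only.

```python
# Minimal sketch of the multisensory criteria discussed above, applied to
# hypothetical per-participant response estimates (arbitrary units).
# Illustrative only; not the authors' analysis pipeline.
import numpy as np
from scipy import stats

beta_v  = np.array([0.9, 1.1, 0.8, 1.0, 1.2, 0.7])   # visual-only condition
beta_t  = np.array([1.0, 0.9, 1.1, 0.8, 1.0, 0.9])   # tactile-only condition
beta_vt = np.array([2.3, 2.4, 2.1, 2.0, 2.6, 1.9])   # visuotactile condition

# Convergence: the region responds to each unisensory condition on its own
# (conjunction of V > 0 and T > 0), assessed here with one-tailed t-tests.
t_v, p_v = stats.ttest_1samp(beta_v, 0.0)
t_t, p_t = stats.ttest_1samp(beta_t, 0.0)
converges = (t_v > 0 and p_v / 2 < 0.05) and (t_t > 0 and p_t / 2 < 0.05)

# Superadditive (nonlinear) integration: the bimodal response exceeds the
# sum of the two unisensory responses, i.e. VT - (V + T) > 0.
diff = beta_vt - (beta_v + beta_t)
t_sa, p_sa = stats.ttest_1samp(diff, 0.0)
superadditive = t_sa > 0 and p_sa / 2 < 0.05

print(f"convergence of V and T: {converges}")
print(f"superadditivity, VT > V + T: {superadditive} (t = {t_sa:.2f})")
```

In a whole-brain analysis, the same contrasts are evaluated voxel by voxel within the general linear model, with appropriate correction for multiple comparisons; the sketch conveys only the logic of the criteria.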

GRANTS

This study was funded by the European Research Council, the Swedish Foundation for Strategic Research, the Human Frontier Science Program, and the Stockholm Brain Institute. All scans were performed at the MR-Centre at the Karolinska University Hospital in Huddinge, Sweden.

DISCLOSURES

No conflicts of interest, financial or otherwise, are declared by the author(s).

ACKNOWLEDGMENTS

The authors thank U. Keiller, T. Jonsson, and T.-Q. Li for help with the setup. The authors also thank three anonymous reviewers, whose comments helped improve this manuscript.

REFERENCES

  • Andersen RA, Snyder LH, Bradley DC, Xing J. Multimodal representation of space in the posterior parietal cortex and its use in planning movements. Annu Rev Neurosci 20: 303–330, 1997
  • Avillac M, Ben Hamed S, Duhamel JR. Multisensory integration in the ventral intraparietal area of the macaque monkey. J Neurosci 27: 1922–1932, 2007
  • Avillac M, Denève S, Olivier E, Pouget A, Duhamel JR. Reference frames for representing visual and tactile locations in the parietal cortex. Nat Neurosci 8: 941–949, 2005
  • Banati RB, Goerres GW, Tjoa C, Aggleton JP, Grasby P. The functional anatomy of visuo-tactile integration in man: a study using positron emission tomography. Neuropsychologia 38: 115–124, 2000
  • Beauchamp MS, Yasar NE, Frye RE, Ro T. Touch, sound and vision in human superior temporal sulcus. Neuroimage 41: 1011–1020, 2008
  • Binkofski F, Dohle C, Posse S, Stephan KM, Hefter H, Seitz RJ, Freund HJ. Human anterior intraparietal area subserves prehension: a combined lesion and functional MRI activation study. Neurology 50: 1253–1259, 1998
  • Bloedel JR. Cerebellar afferent systems: a review. Prog Neurobiol 2: 3–68, 1973
  • Bodegård A, Geyer S, Grefkes C, Zilles K, Roland PE. Hierarchical processing of tactile shape in the human brain. Neuron 31: 317–328, 2001
  • Bodegård A, Geyer S, Naito E, Zilles K, Roland PE. Somatosensory areas in man activated by moving stimuli: cytoarchitectonic mapping and PET. Neuroreport 11: 187–191, 2000
  • Bremmer F, Schlack A, Shah NJ, Zafiris O, Kubischik M, Hoffmann K, Zilles K, Fink GR. Polymodal motion processing in posterior parietal and premotor cortex: a human fMRI study strongly implies equivalencies between humans and monkeys. Neuron 29: 287–296, 2001
  • Bruce C, Desimone R, Gross CG. Visual properties of neurons in a polysensory area in superior temporal sulcus of the macaque. J Neurophysiol 46: 369–384, 1981
  • Bushara KO, Grafman J, Hallett M. Neural correlates of auditory-visual stimulus onset asynchrony detection. J Neurosci 21: 300–304, 2001a
  • Bushara KO, Hanakawa T, Immisch I, Toma K, Kansaku K, Hallett M. Neural correlates of cross-modal binding. Nat Neurosci 6: 190–195, 2003
  • Bushara KO, Wheat JM, Khan A, Mock BJ, Turski PA, Sorenson J, Brooks BR. Multiple tactile maps in the human cerebellum. Neuroreport 12: 2483–2486, 2001b
  • Buxton RB, Uludağ K, Dubowitz DJ, Liu TT. Modeling the hemodynamic response to brain activation. Neuroimage 23, Suppl 1: S220–S233, 2004
  • Calvert GA. Crossmodal processing in the human brain: insights from functional neuroimaging studies. Cereb Cortex 11: 1110–1123, 2001
  • Cavada C, Goldman-Rakic PS. Topographic segregation of corticostriatal projections from posterior parietal subdivisions in the macaque monkey. Neuroscience 42: 683–696, 1991
  • Chumbley J, Worsley K, Flandin G, Friston K. Topological FDR for neuroimaging. Neuroimage 49: 3057–3064, 2010
  • Cipolloni PB, Pandya DN. Cortical connections of the frontoparietal opercular areas in the rhesus monkey. J Comp Neurol 403: 431–458, 1999
  • Colby CL, Duhamel JR. Heterogeneity of extrastriate visual areas and multiple parietal areas in the macaque monkey. Neuropsychologia 29: 517–537, 1991
  • Colby CL, Duhamel JR, Goldberg ME. Ventral intraparietal area of the macaque: anatomic location and visual response properties. J Neurophysiol 69: 902–914, 1993
  • Cooke DF, Graziano MS. Sensorimotor integration in the precentral gyrus: polysensory neurons and defensive movements. J Neurophysiol 91: 1648–1660, 2004
  • Cooke DF, Taylor CS, Moore T, Graziano MS. Complex movements evoked by microstimulation of the ventral intraparietal area. Proc Natl Acad Sci USA 100: 6163–6168, 2003
  • Culham JC, Cavina-Pratesi C, Singhal A. The role of parietal cortex in visuomotor control: what have we learned from neuroimaging? Neuropsychologia 44: 2668–2684, 2006
  • Culham JC, Danckert SL, DeSouza JF, Gati JS, Menon RS, Goodale MA. Visually guided grasping produces fMRI activation in dorsal but not ventral stream brain areas. Exp Brain Res 153: 180–189, 2003
  • Disbrow E, Litinas E, Recanzone GH, Padberg J, Krubitzer L. Cortical connections of the secondary somatosensory area and the parietal ventral area in macaque monkeys. J Comp Neurol 462: 382–399, 2003
  • Driver J, Noesselt T. Multisensory interplay reveals crossmodal influences on “sensory-specific” brain regions, neural responses, and judgments. Neuron 57: 11–23, 2008
  • Duffy FH, Burchfiel JL. Somatosensory system: organizational hierarchy from single units in monkey area 5. Science 172: 273–275, 1971
  • Duhamel JR, Colby CL, Goldberg ME. Ventral intraparietal area of the macaque: congruent visual and somatic response properties. J Neurophysiol 79: 126–136, 1998
  • Duvernoy HM. The Human Brain. Surface, Blood Supply and Three-Dimensional Sectional Anatomy (2nd ed.). Vienna: Springer-Verlag, 1999
  • Ehrsson HH, Fagergren E, Forssberg H. Differential fronto-parietal activation depending on force used in a precision grip task: an fMRI study. J Neurophysiol 85: 2613–2623, 2001
  • Ehrsson HH, Fagergren A, Jonsson T, Westling G, Johansson RS, Forssberg H. Cortical activity in precision- versus power-grip tasks: an fMRI study. J Neurophysiol 83: 528–536, 2000
  • Ehrsson HH, Holmes NP, Passingham RE. Touching a rubber hand: feeling of body ownership is associated with activity in multisensory brain areas. J Neurosci 25: 10564–10573, 2005
  • Ehrsson HH, Spence C, Passingham RE. That's my hand! Activity in premotor cortex reflects feeling of ownership of a limb. Science 305: 875–877, 2004
  • Ehrsson HH, Wiech K, Weiskopf N, Dolan RJ, Passingham RE. Threatening a rubber hand that you feel is yours elicits a cortical anxiety response. Proc Natl Acad Sci USA 104: 9828–9833, 2007
  • Eickhoff SB, Jbabdi S, Caspers S, Laird AR, Fox PT, Zilles K, Behrens TEJ. Anatomical and functional connectivity of cytoarchitectonic areas within the human parietal operculum. J Neurosci 30: 6409–6421, 2010
  • Eickhoff SB, Schleicher A, Zilles K, Amunts K. The human parietal operculum. I. Cytoarchitectonic mapping of subdivisions. Cereb Cortex 16: 254–267, 2006
  • Eickhoff SB, Stephan KE, Mohlberg H, Grefkes C, Fink GR, Amunts K, Zilles K. A new SPM toolbox for combining probabilistic cytoarchitectonic maps and functional imaging data. Neuroimage 25: 1325–1335, 2005
  • Eidelberg D, Galaburda AM. Inferior parietal lobule. Divergent architectonic asymmetries in the human brain. Arch Neurol 41: 843–852, 1984
  • Ernst MO, Bülthoff HH. Merging the senses into a robust percept. Trends Cogn Sci 8: 162–169, 2004
  • Fogassi L, Raos V, Franchi G, Gallese V, Luppino G, Matelli M. Visual responses in the dorsal premotor area F2 of the macaque monkey. Exp Brain Res 128: 194–199, 1999
  • Fox PT, Raichle ME, Thach WT. Functional mapping of the human cerebellum using positron emission tomography. Proc Natl Acad Sci USA 82: 7462–7466, 1985
  • Friston K, Penny WD, Glaser DE. Conjunction revisited. Neuroimage 25: 661–667, 2005
  • Friston K, Zarahn E, Josephs O, Henson RN, Dale AM. Stochastic designs in event-related fMRI. Neuroimage 10: 607–619, 1999
  • Gao JH, Parsons LM, Bower JM, Xiong J, Li J, Fox PT. Cerebellum implicated in sensory acquisition and discrimination rather than motor control. Science 272: 545–547, 1996
  • Ghosh S, Gattera R. A comparison of the ipsilateral cortical projections to the dorsal and ventral subdivisions of the macaque premotor cortex. Somatosens Mot Res 12: 359–378, 1995
  • Glickstein M, Gerrits N, Kralj-Hans I, Mercier B, Stein J, Voogd J. Visual pontocerebellar projections in the macaque. J Comp Neurol 349: 51–72, 1994
  • Godschalk M, Lemon RN, Kuypers HG, Ronday HK. Cortical afferents and efferents of monkey postarcuate area: an anatomical and electrophysiological study. Exp Brain Res 56: 410–424, 1984
  • Goebel R, van Atteveldt N. Multisensory functional magnetic resonance imaging: a future perspective. Exp Brain Res 198: 153–164, 2009
  • Graziano MS, Botvinick M. How the brain represents the body: insights from neurophysiology and psychology. In: Common Mechanisms in Perception and Action: Attention and Performance XIX, edited by Prinz W, Hommel B. Oxford, UK: Oxford Univ. Press, 2002, p. 136–157
  • Graziano MS, Cooke DF, Taylor CS. Coding the location of the arm by sight. Science 290: 1782–1786, 2000
  • Graziano MS, Gandhi S. Location of the polysensory zone in the precentral gyrus of monkeys. Exp Brain Res 135: 259–266, 2000
  • Graziano MS, Gross CG. A bimodal map of space: somatosensory receptive fields in the macaque putamen with corresponding visual receptive fields. Exp Brain Res 97: 96–109, 1993
  • Graziano MS, Hu XT, Gross CG. Visuo-spatial properties of ventral premotor cortex. J Neurophysiol 77: 2268–2292, 1997
  • Graziano MS, Taylor CS, Moore T. Complex movements evoked by microstimulation of precentral cortex. Neuron 34: 841–851, 2002
  • Grefkes C, Fink GR. The functional organization of the intraparietal sulcus in humans and monkeys. J Anat 207: 3–17, 2005
  • Hadjikhani N, Roland PE. Cross-modal transfer of information between the tactile and the visual representations in the human brain: a positron emission tomographic study. J Neurosci 18: 1072–1084, 1998
  • Hagura N, Oouchida Y, Aramaki Y, Okada T, Matsumura M, Sadato N, Naito E. Visuokinesthetic perception of hand movements is mediated by cerebro-cerebellar interaction between the left cerebellum and right parietal cortex. Cereb Cortex 19: 176–186, 2009
  • Hyvärinen J. Regional distributions of functions in parietal association area 7 of the monkey. Brain Res 206: 287–303, 1981
  • Hyvärinen J. Posterior parietal lobe of the primate brain. Physiol Rev 62: 1060–1129, 1982
  • Hyvärinen J, Poranen A. Function of the parietal associative area 7 as revealed from cellular discharges in alert monkeys. Brain 97: 673–692, 1974
  • Iriki A, Tanaka M, Iwamura Y. Coding of modified body schema during tool use by macaque postcentral neurones. Neuroreport 7: 2325–2330, 1996
  • Ishida H, Nakajima K, Inase M, Murata A. Shared mapping of own and others' bodies in visuotactile bimodal area of monkey parietal cortex. J Cogn Neurosci 22: 83–96, 2010
  • Iwamura Y. Hierarchical somatosensory processing. Curr Opin Neurobiol 8: 522–528, 1998
  • Johnson PB, Ferraina S, Bianchi L, Caminiti R. Cortical networks for visual reaching: physiological and anatomical organization of frontal and parietal lobe arm regions. Cereb Cortex 6: 102–119, 1996
  • Kavounoudias A, Roll JP, Anton JL, Nazarian B, Roth M, Roll R. Proprio-tactile integration for kinesthetic perception: an fMRI study. Neuropsychologia 46: 567–575, 2008
  • Kawashima R, O'Sullivan BT, Roland PE. Positron-emission tomography studies of cross-modality inhibition in selective attentional tasks: closing the “mind's eye.” Proc Natl Acad Sci USA 92: 5969–5972, 1995
  • Kertzman C, Schwarz U, Zeffiro TA, Hallett M. The role of posterior parietal cortex in visually guided reaching movements in humans. Exp Brain Res 114: 170–183, 1997
  • Krams M, Rushworth MF, Deiber MP, Frackowiak RS, Passingham RE. The preparation, execution and suppression of copied movements in the human brain. Exp Brain Res 120: 386–398, 1998
  • Kuhtz-Buschbeck JP, Ehrsson HH, Forssberg H. Human brain activity in the control of fine static precision grip forces: an fMRI study. Eur J Neurosci 14: 382–390, 2001
  • Künzle H. An autoradiographic analysis of the efferent connections from premotor and adjacent prefrontal regions (areas 6 and 9) in Macaca fascicularis. Brain Behav Evol 15: 185–234, 1978
  • Laurienti PJ, Perrault TJ, Stanford TR, Wallace MT, Stein BE. On the use of superadditivity as a metric for characterizing multisensory integration in functional neuroimaging studies. Exp Brain Res 166: 289–297, 2005
  • Ledberg A, O'Sullivan BT, Kinomura S, Roland PE. Somatosensory activation of the parietal operculum of man. A PET study. Eur J Neurosci 40: 47–55, 1995
  • Lloyd DM, Shore DI, Spence C, Calvert GA. Multisensory representation of limb position in human premotor cortex. Nat Neurosci 6: 17–18, 2003
  • Logothetis NK. What we can do and what we cannot do with fMRI. Nature 453: 869–878, 2008
  • Logothetis NK, Pauls J, Augath M, Trinath T, Oeltermann A. Neurophysiological investigation of the basis of the fMRI signal. Nature 412: 150–157, 2001
  • Luppino G, Rizzolatti G. The organization of the frontal motor cortex. News Physiol Sci 15: 219–224, 2000
  • Makin TR, Holmes NP, Ehrsson HH. On the other hand: dummy hands and peripersonal space. Behav Brain Res 191: 1–10, 2008
  • Makin TR, Holmes NP, Zohary E. Is that near my hand? Multisensory representation of peripersonal space in human intraparietal sulcus. J Neurosci 27: 731–740, 2007
  • Matelli M, Camarda R, Glickstein M, Rizzolatti G. Afferent and efferent projections of the inferior area 6 in the macaque monkey. J Comp Neurol 251: 281–298, 1986
  • Mountcastle VB, Lynch JC, Georgopoulos A, Sakata H, Acuna C. Posterior parietal association cortex of the monkey: command functions for operations within extrapersonal space. J Neurophysiol 38: 871–908, 1975
  • Murphy JT, MacKay WA, Johnson F. Responses of cerebellar cortical neurons to dynamic proprioceptive inputs from forelimb muscles. J Neurophysiol 36: 711–723, 1973
  • Naghavi HR, Eriksson J, Larsson A, Nyberg L. The claustrum/insula region integrates conceptually related sounds and pictures. Neurosci Lett 422: 77–80, 2007
  • Naito E, Ehrsson HH, Geyer S, Zilles K, Roland PE. Illusory arm movements activate cortical motor areas: a positron emission tomography study. J Neurosci 19: 6134–6144, 1999
  • Naito E, Roland PE, Grefkes C, Choi HJ, Eickhoff S, Geyer S, Zilles K, Ehrsson HH. Dominance of the right hemisphere and role of area 2 in human kinesthesia. J Neurophysiol 93: 1020–1034, 2005
  • Naumer MJ, Ratz L, Yalachkov Y, Polony A, Doehrmann O, van de Ven V, Müller NG, Kaiser J, Hein G. Visuohaptic convergence in a corticocerebellar network. Eur J Neurosci 31: 1730–1736, 2010
  • Neal JW. The callosal connections of area 7b, PF in the monkey. Brain Res 514: 159–162, 1990
  • Neal JW, Pearson RC, Powell TP. The cortico-cortical connections of area 7b, PF, in the parietal lobe of the monkey. Brain Res 419: 341–346, 1987
  • Neal JW, Pearson RC, Powell TP. The ipsilateral corticocortical connections of area 7 with the frontal lobe in the monkey. Brain Res 509: 31–40, 1990a
  • Neal JW, Pearson RC, Powell TP. The ipsilateral cortico-cortical connections of area 7b, PF, in the parietal and temporal lobes of the monkey. Brain Res 524: 119–132, 1990b
  • Nichols T, Brett M, Andersson J, Wager T, Poline JB. Valid conjunction inference with the minimum statistic. Neuroimage 25: 653–660, 2005
  • O'Sullivan BT, Roland PE, Kawashima R. A PET study of somatosensory discrimination in man. Microgeometry versus macrogeometry. Eur J Neurosci 6: 137–148, 1994
  • Pandya DN, Kuypers HG. Cortico-cortical connections in the rhesus monkey. Brain Res 13: 13–36, 1969
  • Parthasarathy HB, Schall JD, Graybiel AM. Distributed but convergent ordering of corticostriatal projections: analysis of the frontal eye field and the supplementary eye field in the macaque monkey. J Neurosci 12: 4468–4488, 1992
  • Pasalar S, Ro T, Beauchamp MS. TMS of posterior parietal cortex disrupts visual tactile multisensory integration. Eur J Neurosci 31: 1783–1790, 2010
  • Pearson RC, Powell TP. The projection of the primary somatic sensory cortex upon area 5 in the monkey. Brain Res 356: 89–107, 1985
  • Petkova VI, Ehrsson HH. If I were you: perceptual illusion of body swapping. PLoS One 3: e3832, 2008
  • Price CJ, Friston K. Cognitive conjunction: a new approach to brain activation experiments. Neuroimage 5: 261–270, 1997
  • Ramnani N. The primate cortico-cerebellar system: anatomy and function. Nat Rev Neurosci 7: 511–522, 2006
  • Rizzolatti G, Fadiga L, Fogassi L, Gallese V. The space around us. Science 277: 190–191, 1997
  • Rizzolatti G, Gentilucci M, Fogassi L, Luppino G, Matelli M, Ponzoni-Maggi S. Neurons related to goal-directed motor acts in inferior area 6 of the macaque monkey. Exp Brain Res 67: 220–224, 1987
  • Rizzolatti G, Luppino G, Matelli M. The organization of the cortical motor system: new concepts. Electroencephalogr Clin Neurophysiol 106: 283–296, 1998
  • Rizzolatti G, Scandolara C, Matelli M, Gentilucci M. Afferent properties of periarcuate neurons in macaque monkeys. II. Visual responses. Behav Brain Res 2: 147–163, 1981
  • Robinson CJ, Burton H. Somatic submodality distribution within the second somatosensory (SII), 7b, retroinsular, postauditory, and granular insular cortical areas of M. fascicularis. J Comp Neurol 192: 93–108, 1980
  • Roland PE, O'Sullivan B, Kawashima R. Shape and roughness activate different somatosensory areas in the human brain. Proc Natl Acad Sci USA 95: 3295–3300, 1998
  • Sakata H, Takaoka Y, Kawarasaki A, Shibutani H. Somatosensory properties of neurons in the superior parietal cortex (area 5) of the rhesus monkey. Brain Res 64: 85–102, 1973
  • Schmahmann JD, Sherman JC. The cerebellar cognitive affective syndrome. Brain 121: 561–579, 1998
  • Scott SH, Sergio LE, Kalaska JF. Reaching movements with similar hand paths but different arm orientations. II. Activity of individual cells in dorsal premotor cortex and parietal area 5. J Neurophysiol 78: 2413–2426, 1997
  • Sereno MI, Huang RS. A human parietal face area contains aligned head-centered visual and tactile maps. Nat Neurosci 9: 1337–1343, 2006
  • Snider RS, Stowell A. Receiving areas of the tactile, auditory, and visual systems in the cerebellum. J Neurophysiol 7: 331–357, 1944
  • Stanford TR, Quessy S, Stein BE. Evaluating the operations underlying multisensory integration in the cat superior colliculus. J Neurosci 25: 6499–6508, 2005
  • Stanford TR, Stein BE. Superadditivity in multisensory integration: putting the computation in context. Neuroreport 18: 787–792, 2007
  • Stein BE, Meredith MA. The Merging of the Senses. Cambridge, MA: MIT Press, 1993
  • Stein BE, Stanford TR. Multisensory integration: current issues from the perspective of the single neuron. Nat Rev Neurosci 9: 255–266, 2008
  • Stein BE, Stanford TR, Ramachandran R, Perrault TJ, Rowland BA. Challenges in quantifying multisensory integration: alternative criteria, models, and inverse effectiveness. Exp Brain Res 198: 113–126, 2009
  • Stevenson RA, Geoghegan ML, James TW. Superadditive BOLD activation in superior temporal sulcus with threshold non-speech objects. Exp Brain Res 179: 85–95, 2007
  • Stevenson RA, Kim S, James TW. An additive-factors design to disambiguate neuronal and areal convergence: measuring multisensory interactions between audio, visual and haptic sensory streams using fMRI. Exp Brain Res 198: 183–194, 2009
  • Strick PL, Dum RP, Fiez JA. Cerebellum and nonmotor functions. Annu Rev Neurosci 32: 413–434, 2009
  • Sultan F, Glickstein M. The cerebellum: comparative and animal studies. Cerebellum 6: 168–176, 2007
  • Tal N, Amedi A. Multisensory visual-tactile object related network in humans: insights gained using a novel crossmodal adaptation approach. Exp Brain Res 198: 165–182, 2009
  • Taoka M, Toda T, Iriki A, Tanaka M, Iwamura Y. Bilateral receptive field neurons in the hindlimb region of the postcentral somatosensory cortex in awake macaque monkeys. Exp Brain Res 134: 139–146, 2000
  • Tsakiris M. My body in the brain: a neurocognitive model of body-ownership. Neuropsychologia 48: 703–712, 2010
  • van Kan PLE, Gibson AR, Houk JC. Movement-related inputs to intermediate cerebellum of the monkey. J Neurophysiol 69: 74–94, 1993
  • Wallace MT, Stein BE. Sensory and multisensory responses in the newborn monkey superior colliculus. J Neurosci 21: 8886–8894, 2001
  • Wallace MT, Wilkinson LK, Stein BE. Representation and integration of multiple sensory inputs in primate superior colliculus. J Neurophysiol 76: 1246–1266, 1996
  • Weber JT, Yin TC. Subcortical projections of the inferior parietal cortex (area 7) in the stump-tailed monkey. J Comp Neurol 224: 206–230, 1984
  • Weiller C, Jüptner M, Fellows S, Rijntjes M, Leonhardt G, Kiebel S, Müller S, Diener HC, Thilmann AF. Brain representation of active and passive movements. Neuroimage 4: 105–110, 1996
