51 results match your criteria: "crossmodal enhancement"

Crossmodal Phase Reset and Evoked Responses Provide Complementary Mechanisms for the Influence of Visual Speech in Auditory Cortex.

J Neurosci 2020 10 6;40(44):8530-8542. Epub 2020 Oct 6.

Department of Neurosurgery, Donald and Barbara Zucker School of Medicine at Hofstra/Northwell, Hempstead, New York 11549.

Natural conversation is multisensory: when we can see the speaker's face, visual speech cues improve our comprehension. The neuronal mechanisms underlying this phenomenon remain unclear. The two main alternatives are visually mediated phase modulation of neuronal oscillations (excitability fluctuations) in auditory neurons and visual input-evoked responses in auditory neurons.

October 2020

Seeing objects improves our hearing of the sounds they make.

Neurosci Conscious 2020 9;2020(1):niaa014. Epub 2020 Aug 9.

Brain and Creativity Institute, University of Southern California, 3620A McClintock Avenue, Los Angeles, CA 90089, USA.

It has been established that lip reading improves the perception of auditory speech. But does seeing objects themselves help us better hear the sounds they make? Here we report a series of psychophysical experiments in humans showing that the visual enhancement of auditory sensitivity is not confined to speech. We further show that the crossmodal enhancement was associated with the conscious visualization of the stimulus: we can better hear the sounds an object makes when we are conscious of seeing that object.


Food and beverage flavour pairing: A critical review of the literature.

Charles Spence

Food Res Int 2020 07 28;133:109124. Epub 2020 Feb 28.

Crossmodal Research Laboratory, Oxford University, UK.

The recent explosion of interest in the topic of flavour pairing has been driven, at least in part, by the now-discredited food-pairing hypothesis, along with the emergence of the new field of computational gastronomy. Many chefs, sommeliers, mixologists, and drinks brands, not to mention a few food brands, have become increasingly interested in moving the discussions that they have with their consumers beyond the traditional focus solely on food and wine pairings. Here, two key approaches to pairing that might help to explain/justify those food and beverage combinations that the consumer is likely to appreciate are outlined.


β adrenergic receptor modulated signaling in glioma models: promoting β adrenergic receptor-β arrestin scaffold-mediated activation of extracellular-regulated kinase 1/2 may prove to be a panacea in the treatment of intracranial and spinal malignancy and extra-neuraxial carcinoma.

Mol Biol Rep 2020 Jun 18;47(6):4631-4650. Epub 2020 Apr 18.

Department of Neurological Surgery, University of California, San Francisco, 505 Parnassus Avenue, Box-0112, San Francisco, CA, 94143, USA.

Neoplastically transformed astrocytes express functionally active cell surface β adrenergic receptors (βARs). Treatment of glioma models in vitro and in vivo with β adrenergic agonists variably amplifies or attenuates cellular proliferation; in the majority of in vivo models, however, β adrenergic agonists reduce cellular proliferation.


Multisensory Enhancement of Odor Object Processing in Primary Olfactory Cortex.

Neuroscience 2019 10 29;418:254-265. Epub 2019 Aug 29.

Department of Clinical Neuroscience, Karolinska Institutet, Nobels väg 9, 17177, Stockholm, Sweden; Monell Chemical Senses Center, 3500 Market St, Philadelphia, PA 19104, USA; Department of Psychology, University of Pennsylvania, 3720 Walnut St., Philadelphia, PA 19104-6241, USA; Stockholm University Brain Imaging Centre, Stockholm University, Universitetsvägen 10 C, 10691 Stockholm, Sweden.

Identification of an object based on its odor alone is inherently difficult, but becomes easier when other senses provide supporting cues. This suggests that crossmodal sensory input facilitates neural processing of olfactory object information; however, direct evidence is still lacking. Here, we tested the effect of multisensory stimulation on information processing in the human posterior piriform cortex (PPC), a region linked to olfactory object encoding.

October 2019

Adult-Onset Hearing Impairment Induces Layer-Specific Cortical Reorganization: Evidence of Crossmodal Plasticity and Central Gain Enhancement.

Cereb Cortex 2019 05;29(5):1875-1888

Department of Anatomy and Cell Biology, Schulich School of Medicine and Dentistry, University of Western Ontario, London, Ontario, Canada.

Adult-onset hearing impairment can lead to hyperactivity in the auditory pathway (i.e., central gain enhancement) as well as increased cortical responsiveness to nonauditory stimuli (i.e., crossmodal plasticity).


The intraparietal sulcus governs multisensory integration of audiovisual information based on task difficulty.

Hum Brain Mapp 2018 03 12;39(3):1313-1326. Epub 2017 Dec 12.

Department of Clinical Neuroscience, Karolinska Institutet, Stockholm, Sweden.

Object recognition benefits maximally from multimodal sensory input when stimulus presentation is noisy or degraded. Whether this advantage can be attributed specifically to the extent of overlap in object-related information, or rather, to object-unspecific enhancement due to the mere presence of additional sensory stimulation, remains unclear. Further, the cortical processing differences driving increased multisensory integration (MSI) for degraded compared with clear information remain poorly understood.


Measuring multisensory integration: from reaction times to spike counts.

Sci Rep 2017 06 8;7(1):3023. Epub 2017 Jun 8.

Jacobs University Bremen, Life Sciences & Chemistry, Bremen, 28759, Germany.

A neuron is categorized as "multisensory" if there is a statistically significant difference between the response evoked, e.g., by a crossmodal stimulus combination and that evoked by the most effective of its components separately.


Experience with crossmodal statistics reduces the sensitivity for audio-visual temporal asynchrony.

Sci Rep 2017 05 3;7(1):1486. Epub 2017 May 3.

Biological Psychology and Neuropsychology, Faculty for Psychology and Human Movement Science, University of Hamburg, Von-Melle-Park 11, 20146, Hamburg, Germany.

Bayesian models propose that multisensory integration depends on both sensory evidence (the likelihood) and priors indicating whether or not two inputs belong to the same event. The present study manipulated the prior for dynamic auditory and visual stimuli to co-occur and tested the predicted enhancement of multisensory binding as assessed with a simultaneity judgment task. In an initial learning phase participants were exposed to a subset of auditory-visual combinations.


A cellular mechanism for inverse effectiveness in multisensory integration.

Elife 2017 03 18;6. Epub 2017 Mar 18.

Department of Neuroscience, Brown University, Providence, United States.

To build a coherent view of the external world, an organism needs to integrate multiple types of sensory information from different sources, a process known as multisensory integration (MSI). Previously, we showed that the temporal dependence of MSI in the optic tectum of tadpoles is mediated by the network dynamics of the recruitment of local inhibition by sensory input (Felch et al., 2016).


The informativity of sound modulates crossmodal facilitation of visual discrimination: a fMRI study.

Neuroreport 2017 Jan;28(2):63-68

Department of Computer Science and Technology, School of Computer Science and Technology, Changchun University of Science and Technology, Changchun; Department of Radiology, Shengjing Hospital of China Medical University, Shenyang; Department of Biomedical Instrumentation, School of Biomedical Engineering, Capital Medical University, Beijing, China.

Many studies have investigated behavioral crossmodal facilitation when a visual stimulus is accompanied by a concurrent task-irrelevant sound. Lippert and colleagues reported that a concurrent task-irrelevant sound reduced the uncertainty of the timing of the visual display and improved perceptual responses (informative sound). However, the neural mechanism by which the informativity of sound affects crossmodal facilitation of visual discrimination remained unclear.

January 2017

Neural mechanisms underlying touch-induced visual perceptual suppression: An fMRI study.

Sci Rep 2016 11 22;6:37301. Epub 2016 Nov 22.

Developmental Disorders Section, Department of Rehabilitation for Brain Functions, Research Institute of National Rehabilitation Center for Persons with Disabilities, 4-1, Namiki, Tokorozawa-shi, Saitama, 359-8555 Japan.

Crossmodal studies have demonstrated inhibitory as well as facilitatory neural effects in higher sensory association and primary sensory cortices. A recent human behavioral study reported touch-induced visual perceptual suppression (TIVS). Here, we introduced an experimental setting in which TIVS could occur and investigated brain activities underlying visuo-tactile interactions using a functional magnetic resonance imaging technique.

November 2016

Reduced frontal theta oscillations indicate altered crossmodal prediction error processing in schizophrenia.

J Neurophysiol 2016 09 29;116(3):1396-407. Epub 2016 Jun 29.

Department of Psychiatry and Psychotherapy, Charité-Universitätsmedizin Berlin Hospital, St. Hedwig Hospital, Berlin, Germany.

Our brain generates predictions about forthcoming stimuli and compares predicted with incoming input. Failures in predicting events might contribute to hallucinations and delusions in schizophrenia (SZ). When a stimulus violates prediction, neural activity that reflects prediction error (PE) processing is found.

September 2016

Quantifying and comparing the pattern of thalamic and cortical projections to the posterior auditory field in hearing and deaf cats.

J Comp Neurol 2016 10 14;524(15):3042-63. Epub 2016 Apr 14.

Cerebral Systems Laboratory, University of Western Ontario, London, Ontario, Canada, N6A 5C2.

Following sensory loss, compensatory crossmodal reorganization occurs such that the remaining modalities are functionally enhanced. For example, behavioral evidence suggests that peripheral visual localization is better in deaf than in normal hearing animals, and that this enhancement is mediated by recruitment of the posterior auditory field (PAF), an area that is typically involved in localization of sounds in normal hearing animals. To characterize the anatomical changes that underlie this phenomenon, we identified the thalamic and cortical projections to the PAF in hearing cats and those with early- and late-onset deafness.

October 2016

On the relative contributions of multisensory integration and crossmodal exogenous spatial attention to multisensory response enhancement.

Acta Psychol (Amst) 2015 Nov 2;162:20-8. Epub 2015 Oct 2.

Utrecht University, Department of Experimental Psychology, Helmholtz Institute, Utrecht, The Netherlands.

Two processes that can give rise to multisensory response enhancement (MRE) are multisensory integration (MSI) and crossmodal exogenous spatial attention. It is, however, currently unclear what the relative contribution of each of these is to MRE. We investigated this issue using two tasks that are generally assumed to measure MSI (a redundant target effect task) and crossmodal exogenous spatial attention (a spatial cueing task).

November 2015

Olfactory-visual integration facilitates perception of subthreshold negative emotion.

Neuropsychologia 2015 Oct 8;77:288-97. Epub 2015 Sep 8.

Department of Psychology, Florida State University, 1107 W. Call St., Tallahassee, FL 32304, USA.

A fast-growing literature of multisensory emotion integration notwithstanding, the chemical senses, intimately associated with emotion, have been largely overlooked. Moreover, an ecologically highly relevant principle of "inverse effectiveness", rendering maximal integration efficacy with impoverished sensory input, remains to be assessed in emotion integration. Presenting minute, subthreshold negative (vs. …

October 2015

Crossmodal enhancement in the LOC for visuohaptic object recognition over development.

Neuropsychologia 2015 Oct 10;77:76-89. Epub 2015 Aug 10.

Cognitive Science Program, Indiana University, Bloomington, USA; Department of Psychological and Brain Sciences, Indiana University, Bloomington, USA; Program in Neuroscience, Indiana University, Bloomington, USA.

Research has provided strong evidence of multisensory convergence of visual and haptic information within the visual cortex. These studies implement crossmodal matching paradigms to examine how systems use information from different sensory modalities for object recognition. Developmentally, behavioral evidence of visuohaptic crossmodal processing has suggested that communication within sensory systems develops earlier than across systems; nonetheless, it is unknown how the neural mechanisms driving these behavioral effects develop.

October 2015

A matter of attention: Crossmodal congruence enhances and impairs performance in a novel trimodal matching paradigm.

Neuropsychologia 2016 07 21;88:113-122. Epub 2015 Jul 21.

Department of Neurophysiology and Pathophysiology, University Medical Center Hamburg-Eppendorf, Martinistr. 52, 20246 Hamburg, Germany.

A novel crossmodal matching paradigm including vision, audition, and somatosensation was developed in order to investigate the interaction between attention and crossmodal congruence in multisensory integration. To that end, all three modalities were stimulated concurrently while a bimodal focus was defined blockwise. Congruence between stimulus intensity changes in the attended modalities had to be evaluated.


On the 'visual' in 'audio-visual integration': a hypothesis concerning visual pathways.

Exp Brain Res 2014 Jun 4;232(6):1631-8. Epub 2014 Apr 4.

Department of Brain and Cognitive Sciences, Center for Visual Science, University of Rochester, Rochester, NY, USA.

Crossmodal interaction conferring enhancement in sensory processing is now widely accepted. Such benefit is often exemplified by neural response amplification reported in physiological studies conducted with animals, which parallel behavioural demonstrations of sound-driven improvement in visual tasks in humans. Yet, a good deal of controversy still surrounds the nature and interpretation of these human psychophysical studies.


Directing eye gaze enhances auditory spatial cue discrimination.

Curr Biol 2014 Mar 13;24(7):748-52. Epub 2014 Mar 13.

Institute for Learning and Brain Sciences, University of Washington, 1715 NE Columbia Road, Portage Bay Building, Box 357988, Seattle, WA 98195, USA; Department of Speech and Hearing Sciences, University of Washington, 1417 NE 42nd Street, Eagleson Hall, Box 354875, Seattle, WA 98105, USA.

The present study demonstrates, for the first time, a specific enhancement of auditory spatial cue discrimination due to eye gaze. Whereas the region of sharpest visual acuity, called the fovea, can be directed at will by moving one's eyes, auditory spatial information is derived primarily from head-related acoustic cues. Past auditory studies have found better discrimination in front of the head [1-3] but have not manipulated subjects' gaze, thus overlooking potential oculomotor influences.


Shifting attention between the space of the body and external space: electrophysiological correlates of visual-nociceptive crossmodal spatial attention.

Psychophysiology 2014 May 3;51(5):464-77. Epub 2014 Mar 3.

Department of Experimental Clinical and Health Psychology, Ghent University, Ghent, Belgium.

The study tested whether nociceptive stimuli applied to a body limb can orient spatial attention in external space toward visual stimuli delivered close to that limb. Nociceptive stimuli were applied to either the left or the right hand. Task-relevant visual stimuli were delivered at the location adjacent to the stimulated hand (70% valid trials) or adjacent to the other hand (30% invalid trials).


Crossmodal enhancement of visual orientation discrimination by looming sounds requires functional activation of primary visual areas: a case study.

Neuropsychologia 2014 Apr 15;56:350-8. Epub 2014 Feb 15.

CsrNC, Centro studi e ricerche in Neuroscienze Cognitive, Alma Mater Studiorum - Università di Bologna, Polo Scientifico-Didattico di Cesena, 47521 Cesena, Italy; Dipartimento di Psicologia, Alma Mater Studiorum - Università di Bologna, 40127 Bologna, Italy.

Approaching or looming sounds are salient, potentially threatening stimuli with particular impact on visual processing. The early crossmodal effects by looming sounds (Romei, Murray, Cappe, & Thut, 2009) and their selective impact on visual orientation discrimination (Leo, Romei, Freeman, Ladavas, & Driver, 2011) suggest that these multisensory interactions may take place already within low-level visual cortices. To investigate this hypothesis, we tested a patient (SDV) with bilateral occipital lesion and spared residual portions of V1/V2.


Crossmodal induction of thalamocortical potentiation leads to enhanced information processing in the auditory cortex.

Neuron 2014 Feb;81(3):664-73

Institute for Systems Research, University of Maryland, College Park, MD 20742, USA; Department of Biology, University of Maryland, College Park, MD 20742, USA.

Sensory systems do not work in isolation; instead, they show interactions that are specifically uncovered during sensory loss. To identify and characterize these interactions, we investigated whether visual deprivation leads to functional enhancement in primary auditory cortex (A1). We compared sound-evoked responses of A1 neurons in visually deprived animals to those from normally reared animals.

February 2014

Feeling better: separate pathways for targeted enhancement of spatial and temporal touch.

Psychol Sci 2014 Feb 3;25(2):555-65. Epub 2014 Jan 3.

Johns Hopkins University School of Medicine.

People perceive spatial form and temporal frequency through touch. Although distinct somatosensory neurons represent spatial and temporal information, these neural populations are intermixed throughout the somatosensory system. Here, we show that spatial and temporal touch can be dissociated and separately enhanced via cortical pathways that are normally associated with vision and audition.

February 2014

Simultaneous and preceding sounds enhance rapid visual targets: Evidence from the attentional blink.

Adv Cogn Psychol 2013 20;9(3):130-42. Epub 2013 Sep 20.

Neuropsychology Lab, Department of Psychology, Carl von Ossietzky University Oldenburg, Germany.

Presenting two targets in a rapid visual stream will frequently result in the second target (T2) being missed when presented shortly after the first target (T1). This so-called attentional blink (AB) phenomenon can be reduced by various experimental manipulations. This study investigated the effect of combining T2 with a non-specific sound, played either simultaneously with T2 or preceding T2 by a fixed latency.

October 2013

Delineating prefrontal cortex region contributions to crossmodal object recognition in rats.

Cereb Cortex 2014 Aug 15;24(8):2108-19. Epub 2013 Mar 15.

Department of Psychology and Neuroscience Program, University of Guelph, Guelph, ON, Canada.

In the present study, we assessed the involvement of the prefrontal cortex (PFC) in the ability of rats to perform crossmodal (tactile-to-visual) object recognition tasks. We tested rats with 3 different types of bilateral excitotoxic lesions: (1) Large PFC lesions, including the medial PFC (mPFC) and ventral and lateral regions of the orbitofrontal cortex (OFC); (2) selective mPFC lesions; and (3) selective OFC lesions. Rats were tested on 2 versions of crossmodal object recognition (CMOR): (1) The original CMOR task, which uses a tactile-only sample phase and a visual-only choice phase; and (2) a "multimodal pre-exposure" version (PE/CMOR), in which simultaneous pre-exposure to the tactile and visual features of an object facilitates CMOR performance over longer memory delays.


Crossmodal influences on early somatosensory processing: interaction of vision, touch, and task-relevance.

Exp Brain Res 2013 May 3;226(4):503-12. Epub 2013 Mar 3.

Department of Kinesiology, BMH 3031, University of Waterloo, 200 University Ave. W, Waterloo, ON, N2L 3G1, Canada.

Previous research suggests that somatosensory cortex is subject to modulation based on the relevancy of incoming somatosensory stimuli to behavioural goals. Recent fMRI findings provide evidence for modulation of primary somatosensory cortex when simultaneous visual and tactile stimuli were relevant to the performance of a motor task. The present study aimed to (1) determine the temporal characteristics of this modulation using event-related potentials (ERPs) and (2) investigate the role of task-relevance in mediating such a modulation.


Focused attention vs. crossmodal signals paradigm: deriving predictions from the time-window-of-integration model.

Front Integr Neurosci 2012 29;6:62. Epub 2012 Aug 29.

Department of Psychology, Carl von Ossietzky Universitaet Oldenburg, Oldenburg, Germany.

In the crossmodal signals paradigm (CSP) participants are instructed to respond to a set of stimuli from different modalities, presented more or less simultaneously, as soon as a stimulus from any modality has been detected. In the focused attention paradigm (FAP), on the other hand, responses should only be made to a stimulus from a pre-defined target modality and stimuli from non-target modalities should be ignored. Whichever paradigm is being applied, a typical result is that responses tend to be faster to crossmodal stimuli than to unimodal stimuli, a phenomenon often referred to as "crossmodal interaction."

October 2012

The role of active locomotion in space perception.

Naohide Yamamoto

Cogn Process 2012 Aug;13 Suppl 1:S365-8

Department of Psychology, Cleveland State University, 2121 Euclid Avenue, Cleveland, OH 44115, USA.

It has been shown that active control of locomotion increases accuracy and precision of nonvisual space perception, but psychological mechanisms of this enhancement are poorly understood. The present study explored a hypothesis that active control of locomotion enhances space perception by facilitating crossmodal interaction between visual and nonvisual spatial information. In an experiment, blindfolded participants walked along a linear path under one of the following two conditions: (1) They walked by themselves following a guide rope and (2) they were led by an experimenter.


Synchronous sounds enhance visual sensitivity without reducing target uncertainty.

Seeing Perceiving 2011;24(6):623-38

Department of Experimental Psychology, University of Oxford, Oxford, OX1 3UD, UK.

We examined the crossmodal effect of the presentation of a simultaneous sound on visual detection and discrimination sensitivity using the equivalent noise paradigm (Dosher and Lu, 1998). In each trial, a tilted Gabor patch was presented in either the first or second of two intervals embedded in dynamic 2D white noise with one of seven possible contrast levels. The results revealed that participants' visual detection and discrimination sensitivity were both enhanced by the presentation of a simultaneous sound, though only close to the noise level at which participants' target contrast thresholds started to increase with increasing noise contrast.
