Publications by authors named "Claudia Bonmassar"

5 Publications


Eye-movement patterns to social and non-social cues in early deaf adults.

Q J Exp Psychol (Hove) 2021 Mar 17:1747021821998511. Epub 2021 Mar 17.

Centre for Mind/Brain Sciences (CIMeC), University of Trento, Rovereto, Italy.

Previous research on covert orienting to the periphery suggested that early profound deaf adults were less susceptible to uninformative gaze cues, though they were equally or more affected by non-social arrow cues. The aim of this work was to investigate whether spontaneous eye-movement behaviour helps explain the reduced impact of the social cue in deaf adults. We tracked the gaze of 25 early profound deaf and 25 age-matched hearing observers performing a peripheral discrimination task with uninformative central cue (gaze vs arrow), stimulus-onset asynchrony (250 vs 750 ms), and cue validity (valid vs invalid) as within-subject factors. In both groups, the cue effect on reaction time (RT) was comparable for the two cues, although deaf observers responded significantly more slowly than hearing controls. While deaf and hearing observers' eye-movement patterns looked similar when the cue was presented in isolation, deaf participants made significantly more eye movements than hearing controls once the discrimination target appeared. Notably, further analysis of eye movements in the deaf group revealed that, independent of cue type, cue validity affected saccade landing position, whereas saccade latency was not modulated by these factors. Saccade landing position was also strongly related to the magnitude of the validity effect on RT: the greater the difference in saccade landing position between invalid and valid trials, the greater the difference in manual RT between invalid and valid trials. This work suggests that the contribution of overt selection in central cueing of attention is more prominent in deaf adults and helps determine manual performance, irrespective of cue type.
Source: http://dx.doi.org/10.1177/1747021821998511
March 2021

The role of eye movements in manual responses to social and nonsocial cues.

Atten Percept Psychophys 2019 Jul;81(5):1236-1252

Center for Mind/Brain Sciences (CIMeC), University of Trento, Rovereto, Italy.

Gaze and arrow cues cause covert attention shifts even when they are uninformative. Nonetheless, it is unclear to what extent oculomotor behavior influences manual responses to social and nonsocial stimuli. In two experiments, we tracked participants' gaze during a cueing task with nonpredictive gaze and arrow cues. In Experiment 1, the discrimination task was easy and eye movements were not necessary, whereas in Experiment 2 they were instrumental in identifying the target. Validity effects on manual response time (RT) were similar for the two cues in both experiments, though in the presence of eye movements observers were overall slower to respond to the arrow cue than to the gaze cue. Cue direction affected saccadic performance before the discrimination target was presented and throughout the duration of the trial. Furthermore, we found evidence of a distinct impact of cue type on different oculomotor components: saccade latencies were affected by cue type, both before and after target onset, whereas saccade landing positions were not. Critically, the manual validity effect was predicted by the landing position of the initial eye movement. This work suggests that the relationship between eye movements and attention is not straightforward. In the presence of overt selection, saccade latency was related to the overall speed of the manual response, while saccade landing position was closely related to manual performance in response to the different cues.
Source: http://dx.doi.org/10.3758/s13414-019-01669-9
July 2019

Multiple oscillatory rhythms determine the temporal organization of perception.

Proc Natl Acad Sci U S A 2017 Dec;114(51):13435-13440. Epub 2017 Dec 4.

Center for Mind/Brain Sciences, University of Trento, 38068 Rovereto, Italy.

Incoming sensory input is condensed by our perceptual system to optimally represent and store information. In the temporal domain, this process has been described in terms of temporal windows (TWs) of integration/segregation, in which the phase of ongoing neural oscillations determines whether two stimuli are integrated into a single percept or segregated into separate events. However, TWs can vary substantially, raising the question of whether different TWs map onto unique oscillations or, rather, reflect a single, general fluctuation in cortical excitability (e.g., in the alpha band). We used multivariate decoding of electroencephalography (EEG) data to investigate perception of stimuli that either repeated in the same location (two-flash fusion) or moved in space (apparent motion). By manipulating the interstimulus interval (ISI), we created bistable stimuli that caused subjects to perceive either integration (fusion/apparent motion) or segregation (two unrelated flashes). Training a searchlight classifier across the whole channel/frequency/time space, we found that the perceptual outcome (integration vs. segregation) could be reliably decoded from the phase of prestimulus oscillations in right parieto-occipital channels. The highest decoding accuracy for the two-flash fusion task (ISI = 40 ms) was evident in the phase of alpha oscillations (8-10 Hz), while the highest decoding accuracy for the apparent motion task (ISI = 120 ms) was evident in the phase of theta oscillations (6-7 Hz). These results reveal a precise relationship between specific TW durations and specific oscillations. Such oscillations at different frequencies may provide a hierarchical framework for the temporal organization of perception.
Source: http://dx.doi.org/10.1073/pnas.1714522114
PMC: http://www.ncbi.nlm.nih.gov/pmc/articles/PMC5754799
December 2017

Multisensory Interference in Early Deaf Adults.

J Deaf Stud Deaf Educ 2017 Oct;22(4):422-433

Center for Mind/Brain Sciences (CIMeC), University of Trento, Corso Bettini, 31, Rovereto TN 38068, Italy.

Multisensory interactions in deaf cognition are largely unexplored. Unisensory studies suggest that behavioral/neural changes may be more prominent for visual than for tactile processing in early deaf adults. Here we test whether such an asymmetry results in increased saliency of vision over touch during visuo-tactile interactions. Twenty-three early deaf and 25 hearing adults performed two consecutive visuo-tactile spatial interference tasks. Participants responded either to the elevation of the tactile target while ignoring a concurrent visual distractor at central or peripheral locations (respond to touch/ignore vision), or they performed the opposite task (respond to vision/ignore touch). Multisensory spatial interference emerged in both tasks for both groups. Crucially, deaf participants showed increased interference compared with hearing adults when they attempted to respond to tactile targets and ignore visual distractors, with enhanced difficulties for ipsilateral visual distractors. Analyses of task order revealed that in deaf adults, interference of visual distractors on tactile targets was much stronger when this task followed the one in which vision was behaviorally relevant (respond to vision/ignore touch). These novel results suggest that behavioral/neural changes related to early deafness determine enhanced visual dominance during visuo-tactile multisensory conflict.
Source: http://dx.doi.org/10.1093/deafed/enx025
October 2017

Motivation and short-term memory in visual search: Attention's accelerator revisited.

Cortex 2018 May;102:45-56. Epub 2017 Jul 13.

Center for Mind/Brain Sciences (CiMeC), University of Trento, Italy.

A cue indicating the possibility of cash reward will cause participants to perform memory-based visual search more efficiently. A recent study suggested that this performance benefit might reflect the use of multiple memory systems: when needed, participants may maintain the to-be-remembered object in both long-term and short-term visual memory, with this redundancy benefiting target identification during search (Reinhart, McClenahan, & Woodman, 2016). Here we test this compelling hypothesis. We had participants complete a memory-based visual search task involving a reward cue that either preceded presentation of the to-be-remembered target (pre-cue) or followed it (retro-cue). Following earlier work, we tracked memory representation using two components of the event-related potential (ERP): the contralateral delay activity (CDA), reflecting short-term visual memory, and the anterior P170, reflecting long-term storage. We additionally tracked attentional preparation and deployment using the contingent negative variation (CNV) and the N2pc, respectively. Results show that only the reward pre-cue impacted our ERP indices of memory. However, both types of cue elicited a robust CNV, reflecting an influence on task preparation; both had equivalent impact on the deployment of attention to the target, as indexed by the N2pc; and both had equivalent impact on visual search behavior. Reward prospect thus influences memory-guided visual search, but this influence does not appear to be necessarily mediated by a change in the visual memory representations indexed by the CDA. Our results demonstrate that the impact of motivation on search is not a simple product of improved memory for target templates.
Source: http://dx.doi.org/10.1016/j.cortex.2017.06.022
May 2018