Publications by authors named "Simon Rigoulot"

19 Publications


About time: Ageing influences neural markers of temporal predictability.

Biol Psychol 2021 07 11;163:108135. Epub 2021 Jun 11.

International Laboratory for Brain, Music, and Sound Research (BRAMS), Montreal, Canada; Dept. of Psychology, University of Montreal, Montreal, Canada; Centre for Research on Brain, Language and Music (CRBLM), Montreal, Canada; Dept. of Cognitive Psychology, University of Economics and Human Sciences in Warsaw, Warsaw, Poland.

Timing abilities help organize the temporal structure of events but are known to change systematically with age. Yet how the neuronal signature of temporal predictability changes across the age span remains unclear. Younger (n = 21; mean age 23.1 years) and older adults (n = 21; mean age 68.5 years) performed an auditory oddball task consisting of isochronous and random sound sequences. Results confirm an altered P50 response in older compared to younger participants. P50 amplitudes differed between isochronous and random temporal structures in the younger group; in the older group, this difference emerged in the P200. These results suggest less efficient sensory gating in older adults in both isochronous and random auditory sequences. N100 amplitudes were more negative for deviant tones. P300 amplitudes were parietally enhanced in younger, but not in older, adults. In younger participants, the P50 results confirm that this component marks temporal predictability, indicating sensitive gating of temporally regular sound sequences.
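To make the sequence manipulation concrete, here is a minimal sketch of how isochronous and randomly timed oddball sequences might be generated; the tone count, SOA, jitter range, and deviant probability are illustrative assumptions, not the study's parameters.

```python
import numpy as np

rng = np.random.default_rng(0)

def make_oddball_sequence(n_tones=200, soa=0.6, jitter=0.2,
                          p_deviant=0.1, isochronous=True):
    """Return tone onset times (s) and a deviant mask for one sequence."""
    if isochronous:
        onsets = np.arange(n_tones) * soa             # fixed SOA: predictable timing
    else:
        soas = rng.uniform(soa - jitter, soa + jitter, n_tones)
        onsets = np.cumsum(soas)                      # jittered SOA: random timing
    deviants = rng.random(n_tones) < p_deviant        # rare deviant tones
    return onsets, deviants

onsets, deviants = make_oddball_sequence(isochronous=False)
print(f"{deviants.sum()} deviants among {len(onsets)} tones")
```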
Source: http://dx.doi.org/10.1016/j.biopsycho.2021.108135
July 2021

Unattended Emotional Prosody Affects Visual Processing of Facial Expressions in Mandarin-Speaking Chinese: A Comparison With English-Speaking Canadians.

J Cross Cult Psychol 2021 Apr 15;52(3):275-294. Epub 2021 Feb 15.

McGill University, Montréal, QC, Canada.

Emotional cues from different modalities have to be integrated during communication, a process that can be shaped by an individual's cultural background. We explored this issue in 25 Chinese participants by examining how listening to emotional prosody in Mandarin influenced their gaze at emotional faces in a modified visual search task. We also conducted a cross-cultural comparison between the data of this study and those of our previous work on English-speaking Canadians, which used analogous methodology. In both studies, eye movements were recorded as participants scanned an array of four faces portraying fearful, angry, happy, and neutral expressions while passively listening to a pseudo-utterance expressing one of the four emotions (a Mandarin utterance in this study; an English utterance in our previous study). The frequency and duration of fixations to each face were analyzed during the 5 seconds after face onset, both while the speech was present (early time window) and after the utterance ended (late time window). During the late window, Chinese participants looked more frequently and longer at faces conveying emotions congruent with the speech, consistent with findings from English-speaking Canadians. The cross-cultural comparison further showed that Chinese, but not Canadian, participants looked more frequently and longer at angry faces, which may signal potential conflicts and social threats. We hypothesize that socio-cultural norms related to harmony maintenance in Eastern cultures promoted Chinese participants' heightened sensitivity to, and deeper processing of, angry cues, highlighting culture-specific patterns in how individuals scan their social environment during emotion processing.
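As a rough illustration of the eye-movement measures described above, the sketch below aggregates fixation frequency and total duration per emotion in an early (speech present) versus late (post-utterance) window; the table layout, column names, and window boundary are hypothetical stand-ins.

```python
import pandas as pd

# Hypothetical fixation table: one row per fixation, with onset time (ms,
# relative to face-array onset), duration (ms), and the fixated face's emotion.
fixations = pd.DataFrame({
    "onset_ms": [120, 900, 1600, 2400, 3100],
    "dur_ms": [310, 250, 420, 180, 500],
    "face_emotion": ["anger", "fear", "anger", "happy", "anger"],
})

speech_offset_ms = 1250  # assumed utterance duration, for the example only

fixations["window"] = fixations["onset_ms"].apply(
    lambda t: "early" if t < speech_offset_ms else "late")

summary = (fixations
           .groupby(["window", "face_emotion"])["dur_ms"]
           .agg(n_fixations="count", total_dur_ms="sum"))
print(summary)
```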
Source: http://dx.doi.org/10.1177/0022022121990897
PMC: http://www.ncbi.nlm.nih.gov/pmc/articles/PMC8053741
April 2021

Neurophysiological correlates of sexually evocative speech.

Biol Psychol 2020 07 23;154:107909. Epub 2020 May 23.

McGill University, School of Communication Sciences and Disorders, 2001 McGill College, 8th Floor, Montreal, QC, H3A 1G1, Canada.

Speakers modulate their voice (prosody) to communicate non-literal meanings, such as sexual innuendo (She inspected his package this morning, where "package" could refer to a man's penis). Here, we analyzed event-related potentials to illuminate how listeners use prosody to interpret sexual innuendo and what neurocognitive processes are involved. Participants listened to third-party statements with literal or 'sexual' interpretations, uttered in an unmarked or sexually evocative tone. Analyses revealed: (1) rapid neural differentiation of neutral vs. sexual prosody from utterance onset; (2) an N400-like response differentiating contextually constrained vs. unconstrained utterances following the critical word (reflecting integration of prosody and word meaning); and (3) a selectively increased negativity in response to sexual innuendo around 600 ms after the critical word. Findings show that the brain quickly integrates prosodic and lexical-semantic information to form an impression of what the speaker is communicating, triggering a unique response to sexual innuendos, consistent with their high social relevance.
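A minimal sketch of one step in this kind of ERP analysis: computing the mean amplitude in a latency window time-locked to the critical word for each condition. The condition names, channel count, and window bounds are illustrative, and the data are random placeholders.

```python
import numpy as np

sfreq = 500                                   # Hz, assumed sampling rate
times = np.arange(-0.2, 1.0, 1 / sfreq)       # s, relative to critical word onset

# Placeholder condition averages: condition -> (n_channels, n_times) array
rng = np.random.default_rng(0)
erps = {cond: rng.normal(scale=1e-6, size=(64, times.size))
        for cond in ["literal_neutral", "literal_evocative",
                     "innuendo_neutral", "innuendo_evocative"]}

def window_mean(erp, tmin, tmax):
    """Mean amplitude across channels within a latency window."""
    mask = (times >= tmin) & (times <= tmax)
    return erp[:, mask].mean()

for cond, erp in erps.items():                # late window around 600 ms
    print(cond, window_mean(erp, 0.55, 0.75))
```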
Source: http://dx.doi.org/10.1016/j.biopsycho.2020.107909
July 2020

Auditory repetition suppression alterations in relation to cognitive functioning in fragile X syndrome: a combined EEG and machine learning approach.

J Neurodev Disord 2018 01 29;10(1). Epub 2018 Jan 29.

Neuroscience of Early Development (NED), 90 Avenue Vincent-D'indy, Montreal, QC, H2V 2S9, Canada.

Background: Fragile X syndrome (FXS) is a neurodevelopmental genetic disorder causing cognitive and behavioural deficits. Repetition suppression (RS), a learning phenomenon in which stimulus repetitions result in diminished brain activity, has been found to be impaired in FXS. Alterations in RS have been associated with behavioural problems in FXS; however, relations between RS and intellectual functioning have not yet been elucidated.

Methods: EEG was recorded in 14 FXS participants and 25 neurotypical controls during an auditory habituation paradigm using repeatedly presented pseudowords. Non-phase-locked signal energy was compared across presentations and between groups using linear mixed models (LMMs) in order to investigate RS effects across repetitions and brain areas, as well as a possible relation to non-verbal IQ (NVIQ) in FXS. In addition, we explored group differences according to NVIQ and probed the feasibility of training a support vector machine to predict cognitive functioning levels across FXS participants based on single-trial RS features.

Results: LMM analyses showed that repetition effects differ between groups (FXS vs. controls) as well as with respect to NVIQ in FXS. When exploring group differences in RS patterns, we found that neurotypical controls revealed the expected pattern of RS between the first and second presentations of a pseudoword. More importantly, while FXS participants in the ≤ 42 NVIQ group showed no RS, the > 42 NVIQ group showed a delayed RS response after several presentations. Concordantly, single-trial estimates of repetition effects over the first four repetitions provided the highest decoding accuracies in the classification between the FXS participant groups.

Conclusion: Electrophysiological measures of repetition effects provide a non-invasive and unbiased measure of brain responses sensitive to cognitive functioning levels, which may be useful for clinical trials in FXS.
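As a rough sketch of the decoding approach mentioned in the Methods and Results, the following trains a cross-validated linear support vector machine to separate the two NVIQ groups from single-trial repetition-effect features; the feature matrix and labels here are synthetic stand-ins, not the study's data.

```python
import numpy as np
from sklearn.model_selection import StratifiedKFold, cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

rng = np.random.default_rng(42)

# Synthetic stand-in: one row per trial, one column per single-trial
# repetition effect over the first four repetitions.
X = rng.normal(size=(140, 4))
y = rng.integers(0, 2, size=140)          # 0: NVIQ <= 42, 1: NVIQ > 42

clf = make_pipeline(StandardScaler(), SVC(kernel="linear"))
cv = StratifiedKFold(n_splits=5, shuffle=True, random_state=0)
scores = cross_val_score(clf, X, y, cv=cv)
print(f"decoding accuracy: {scores.mean():.2f} +/- {scores.std():.2f}")
```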
Source: http://dx.doi.org/10.1186/s11689-018-9223-3
PMC: http://www.ncbi.nlm.nih.gov/pmc/articles/PMC5789548
January 2018

Effects of musical expertise on oscillatory brain activity in response to emotional sounds.

Neuropsychologia 2017 Aug 15;103:96-105. Epub 2017 Jul 15.

Centre for Research on Brain, Language, and Music (CRBLM), Montreal, Canada; Department of Psychology, Université de Montréal, Montreal, Quebec, Canada; Department of Psychiatry, McGill University & Douglas Mental Health University Institute, Montreal, Quebec, Canada; International Laboratory for Brain, Music, and Sound Research (BRAMS), Montreal, Quebec, Canada.

Emotions can be conveyed through a variety of channels in the auditory domain, be it via music, non-linguistic vocalizations, or speech prosody. Moreover, recent studies suggest that expertise in one sound category can influence the processing of emotional sounds in other categories: musicians have been found to process emotional musical and vocal sounds more efficiently than non-musicians. However, the neural correlates of these modulations, especially their time course, are not well understood. Consequently, we focused here on how the neural processing of emotional information varies as a function of sound category and the expertise of participants. The electroencephalogram (EEG) of 20 non-musicians and 17 musicians was recorded while they listened to vocal (speech and vocalizations) and musical sounds. The amplitude of EEG oscillatory activity in the theta, alpha, beta, and gamma bands was quantified, and Independent Component Analysis (ICA) was used to identify underlying components of brain activity in each band. Category differences were found in the theta and alpha bands, due to larger responses to music and speech than to vocalizations, and in posterior beta, mainly due to differential processing of speech. In addition, we observed greater activation in frontal theta and alpha for musicians than for non-musicians, as well as an interaction between expertise and the emotional content of sounds in frontal alpha. The results reflect musicians' expertise in recognizing emotion-conveying music, which seems to generalize to emotional expressions conveyed by the human voice, in line with previous accounts of expertise effects on the processing of musical and vocal sounds.
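For concreteness, the sketch below shows one conventional way to quantify band-limited EEG power with Welch's method; the band edges follow common conventions, the data are synthetic, and the study's ICA step on the resulting band activity is omitted.

```python
import numpy as np
from scipy.signal import welch

sfreq = 512                                    # Hz, assumed
eeg = np.random.randn(32, 10 * sfreq)          # synthetic (channels, samples)

bands = {"theta": (4, 8), "alpha": (8, 13),
         "beta": (13, 30), "gamma": (30, 45)}  # conventional edges, Hz

freqs, psd = welch(eeg, fs=sfreq, nperseg=2 * sfreq)  # PSD per channel

for name, (lo, hi) in bands.items():
    mask = (freqs >= lo) & (freqs < hi)
    band_power = psd[:, mask].mean(axis=1)     # mean power per channel
    print(name, band_power.mean())
```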
Source: http://dx.doi.org/10.1016/j.neuropsychologia.2017.07.014
August 2017

Altered visual repetition suppression in Fragile X Syndrome: New evidence from ERPs and oscillatory activity.

Int J Dev Neurosci 2017 Jun 19;59:52-59. Epub 2017 Mar 19.

Departement de Psychologie, Université de Montréal, Montreal, Canada; Neuroscience of Early Development (NED), Montreal, Canada; Centre de Recherche en Neuropsychologie et Cognition (CERNEC), Montreal, Canada; Research Center of the CHU Ste-Justine Mother and Child University Hospital Center, Université de Montreal, Quebec, Canada; International Laboratory for Brain, Music and Sound Research (BRAMS), Montreal, Quebec, Canada.

Fragile X Syndrome (FXS) is a neurodevelopmental genetic disorder associated with cognitive and behavioural deficits. In particular, neuronal habituation processes have been shown to be altered in FXS patients. Yet, while such deficits have primarily been explored using auditory stimuli, less is known about the visual modality. Here, we investigated the putative alteration of repetition suppression for faces in FXS patients compared to age-matched controls. Electroencephalographic (EEG) signals were acquired while participants were presented with 18 different faces, each repeated ten times successively. The repetition suppression effect was probed by comparing the brain responses to the first and second presentations, based on task-evoked event-related potentials (ERPs) as well as on task-induced oscillatory activity. We found different patterns of habituation for controls and patients in both ERPs and oscillatory power. While the N170 was not affected by face repetition in controls, it was altered in FXS patients. Conversely, a repetition suppression effect observed in the theta band (4-8 Hz) over frontal and parieto-occipital areas in controls was not seen in FXS patients. These results provide the first evidence for diminished ERP and oscillatory habituation effects in response to face repetitions in FXS. These findings extend previous observations of impairments in learning mechanisms and may be linked to deficits in the maturation processes of synapses caused by the mutation. The present study contributes to bridging the gap between animal models of synaptic plasticity dysfunctions and human research in FXS.
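A minimal sketch of how a repetition suppression contrast might be computed: theta power on the first versus second presentation, summarized with a normalized index and a paired test. All values are synthetic placeholders.

```python
import numpy as np
from scipy.stats import ttest_rel

rng = np.random.default_rng(1)

# Synthetic theta power, one value per participant, for the first and
# second presentation of a face.
theta_pres1 = rng.normal(10.0, 2.0, size=20)
theta_pres2 = theta_pres1 - rng.normal(1.0, 0.5, size=20)  # built-in suppression

# Normalized repetition suppression index per participant
rs_index = (theta_pres1 - theta_pres2) / (theta_pres1 + theta_pres2)

t, p = ttest_rel(theta_pres1, theta_pres2)
print(f"mean RS index = {rs_index.mean():.3f}, t = {t:.2f}, p = {p:.3g}")
```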
Source: http://dx.doi.org/10.1016/j.ijdevneu.2017.03.008
June 2017

Early selectivity for vocal and musical sounds: electrophysiological evidence from an adaptation paradigm.

Eur J Neurosci 2016 11 13;44(10):2786-2794. Epub 2016 Oct 13.

Centre for Research on Brain, Language and Music (CRBLM), Montreal, QC, Canada.

There is growing interest in characterizing the neural basis of music perception and, in particular, in assessing how similar, or not, it is to that of speech. To further explore this question, we employed an EEG adaptation paradigm in which we compared responses to short sounds belonging to the same category, either speech (pseudo-sentences) or music (piano or violin), depending on whether they were immediately preceded by a same- or different-category sound. We observed a larger reduction in the magnitude of the N100 component in response to musical sounds when they were preceded by music (either the same or a different instrument) than by speech. In contrast, the N100 amplitude was not affected by the preceding stimulus category in the case of speech. For the P200 component, we observed a reduction in amplitude when speech sounds were preceded by speech, compared to music. No such decrease was found when we compared the responses to musical sounds. These differences in the processing of speech and music are consistent with the proposal that some degree of category selectivity for these two classes of complex stimuli already occurs at early stages of auditory processing, possibly subserved by partly separated neuronal populations.
Source: http://dx.doi.org/10.1111/ejn.13391
November 2016

Cultural immersion alters emotion perception: Neurophysiological evidence from Chinese immigrants to Canada.

Soc Neurosci 2017 12 23;12(6):685-700. Epub 2016 Sep 23.

School of Communication Sciences and Disorders, McGill University, Montréal, Canada.

To explore how cultural immersion modulates emotion processing, this study examined how Chinese immigrants to Canada process multisensory emotional expressions, compared with existing data from two groups, Chinese and North Americans. Stroop and Oddball paradigms were employed to examine different stages of emotion processing. The Stroop task presented face-voice pairs expressing congruent/incongruent emotions, and participants actively judged the emotion of one modality while ignoring the other. A significant effect of cultural immersion was observed in the immigrants' behavioral performance, which showed greater interference from to-be-ignored faces, comparable to what was observed in North Americans. However, this effect was absent in their N400 data, which retained the same pattern as the Chinese. In the Oddball task, where immigrants passively viewed facial expressions with/without simultaneous vocal emotions, they exhibited a larger visual MMN for faces accompanied by voices, again mirroring patterns observed in Chinese. Correlation analyses indicated that the duration of the immigrants' residence in Canada was associated with neural patterns (N400 and visual mismatch negativity) more closely resembling those of North Americans. Our data suggest that in multisensory emotion processing, adapting to a new culture first leads to behavioral accommodation followed by alterations in brain activity, providing new evidence of humans' neurocognitive plasticity in communication.
Source: http://dx.doi.org/10.1080/17470919.2016.1231713
December 2017

Cultural differences in on-line sensitivity to emotional voices: comparing East and West.

Front Hum Neurosci 2015 29;9:311. Epub 2015 May 29.

School of Communication Sciences and Disorders, Centre for Research on Brain, Language, and Music (CRBLM), McGill University, Montréal, QC, Canada.

Evidence that culture modulates on-line neural responses to the emotional meanings encoded by vocal and facial expressions was demonstrated recently in a study comparing English North Americans and Chinese (Liu et al., 2015). Here, we compared how individuals from these two cultures passively respond to emotional cues from faces and voices using an Oddball task. Participants viewed in-group emotional faces, with or without simultaneous vocal expressions, while performing a face-irrelevant visual task as the EEG was recorded. A significantly larger visual Mismatch Negativity (vMMN) was observed for Chinese vs. English participants when faces were accompanied by voices, suggesting that Chinese were influenced to a larger extent by task-irrelevant vocal cues. These data highlight further differences in how adults from East Asian vs. Western cultures process socio-emotional cues, arguing that distinct cultural practices in communication (e.g., display rules) shape neurocognitive activity associated with the early perception and integration of multi-sensory emotional cues.
Source: http://dx.doi.org/10.3389/fnhum.2015.00311
PMC: http://www.ncbi.nlm.nih.gov/pmc/articles/PMC4448034
June 2015

Culture modulates the brain response to human expressions of emotion: electrophysiological evidence.

Neuropsychologia 2015 Jan 2;67:1-13. Epub 2014 Dec 2.

School of Communication Sciences and Disorders, McGill University, Montréal, Québec, Canada; International Laboratory for Brain, Music and Sound Research (BRAMS), Centre for Research on Brain, Language, and Music (CRBLM), McGill University, Montréal, Québec, Canada.

To understand how culture modulates on-line neural responses to social information, this study compared how individuals from two distinct cultural groups, English-speaking North Americans and Chinese, process the emotional meanings of multi-sensory stimuli, as indexed by both behavioural (accuracy) and event-related potential (N400) measures. In an emotional Stroop-like task, participants were presented with face-voice pairs expressing congruent or incongruent emotions in conditions where they judged the emotion of one modality while ignoring the other (face or voice focus task). Results indicated that while both groups were sensitive to emotional differences between channels (with lower accuracy and higher N400 amplitudes for incongruent face-voice pairs), there were marked group differences in how intruding facial or vocal cues affected accuracy and N400 amplitudes, with English participants showing greater interference from irrelevant faces than Chinese. Our data illuminate distinct biases in how adults from East Asian versus Western cultures process socio-emotional cues, supplying new evidence that cultural learning modulates not only behaviour but also the neurocognitive response to different features of multi-channel emotion expressions.
Source: http://dx.doi.org/10.1016/j.neuropsychologia.2014.11.034
January 2015

Neural correlates of inferring speaker sincerity from white lies: an event-related potential source localization study.

Brain Res 2014 May 18;1565:48-62. Epub 2014 Apr 18.

McGill University, Faculty of Medicine, School of Communication Sciences and Disorders, 1266 Avenue des Pins Ouest, Montreal QC, Canada H3G 1A8; McGill Centre for Research on Brain, Language and Music (CRBLM), Canada. Electronic address: http://www.mcgill.ca/pell_lab.

During social interactions, listeners weigh the importance of linguistic and extra-linguistic speech cues (prosody) to infer the true intentions of the speaker in reference to what is actually said. In this study, we investigated what brain processes allow listeners to detect when a spoken compliment is meant to be sincere (true compliment) or not ("white lie"). Electroencephalograms of 29 participants were recorded while they listened to Question-Response pairs, where the response was expressed in either a sincere or insincere tone (e.g., "So, what did you think of my presentation?"/"I found it really interesting."). Participants judged whether the response was sincere or not. Behavioral results showed that prosody could be effectively used to discern the intended sincerity of compliments. Analysis of temporal and spatial characteristics of event-related potentials (P200, N400, P600) uncovered significant effects of prosody on P600 amplitudes, which were greater in response to sincere versus insincere compliments. Using low resolution brain electromagnetic tomography (LORETA), we determined that the anatomical sources of this activity were likely located in the (left) insula, consistent with previous reports of insular activity in the perception of lies and concealments. These data extend knowledge of the neurocognitive mechanisms that permit context-appropriate inferences about speaker feelings and intentions during interpersonal communication.
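The study localized sources with LORETA; as a hedged illustration, MNE-Python offers the closely related sLORETA solver. The sketch below runs it on MNE's bundled sample dataset (file names follow that dataset's layout), which stands in for the study's own recordings and head model.

```python
import mne
from mne.datasets import sample
from mne.minimum_norm import make_inverse_operator, apply_inverse

data_path = sample.data_path()                 # downloads the sample data if needed
meg_dir = data_path / "MEG" / "sample"

evoked = mne.read_evokeds(meg_dir / "sample_audvis-ave.fif", condition=0)
noise_cov = mne.read_cov(meg_dir / "sample_audvis-cov.fif")
fwd = mne.read_forward_solution(meg_dir / "sample_audvis-meg-eeg-oct-6-fwd.fif")

inv = make_inverse_operator(evoked.info, fwd, noise_cov)
stc = apply_inverse(evoked, inv, lambda2=1.0 / 9.0, method="sLORETA")
print(stc.data.shape)                          # (n_sources, n_times)
```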
Source: http://dx.doi.org/10.1016/j.brainres.2014.04.022
May 2014

Feeling backwards? How temporal order in speech affects the time course of vocal emotion recognition.

Front Psychol 2013 24;4:367. Epub 2013 Jun 24.

Faculty of Medicine, School of Communication Sciences and Disorders, McGill University, Montreal, QC, Canada; McGill Centre for Research on Brain, Language and Music, Montreal, QC, Canada.

Recent studies suggest that the time course for recognizing vocal expressions of basic emotion in speech varies significantly by emotion type, implying that listeners uncover acoustic evidence about emotions at different rates in speech (e.g., fear is recognized most quickly whereas happiness and disgust are recognized relatively slowly; Pell and Kotz, 2011). To investigate whether vocal emotion recognition is largely dictated by the amount of time listeners are exposed to speech or the position of critical emotional cues in the utterance, 40 English participants judged the meaning of emotionally-inflected pseudo-utterances presented in a gating paradigm, where utterances were gated as a function of their syllable structure in segments of increasing duration from the end of the utterance (i.e., gated syllable-by-syllable from the offset rather than the onset of the stimulus). Accuracy for detecting six target emotions in each gate condition and the mean identification point for each emotion in milliseconds were analyzed and compared to results from Pell and Kotz (2011). We again found significant emotion-specific differences in the time needed to accurately recognize emotions from speech prosody, and new evidence that utterance-final syllables tended to facilitate listeners' accuracy in many conditions when compared to utterance-initial syllables. The time needed to recognize fear, anger, sadness, and neutral from speech cues was not influenced by how utterances were gated, although happiness and disgust were recognized significantly faster when listeners heard the end of utterances first. Our data provide new clues about the relative time course for recognizing vocally-expressed emotions within the 400-1200 ms time window, while highlighting that emotion recognition from prosody can be shaped by the temporal properties of speech.
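To illustrate the identification-point measure used in gating studies, here is a small sketch that finds the first gate from which responses remain correct and converts it to cumulative exposure time; the responses and gate durations are hypothetical.

```python
# Hypothetical gate-by-gate responses for one stimulus (True = target emotion
# correctly identified at that gate); durations in ms are illustrative.
responses = [False, False, True, True, True, True, True]
gate_durations_ms = [180, 170, 190, 200, 210, 190, 180]

def identification_point(responses, durations):
    """First gate after which responses stay correct, as cumulative ms heard."""
    for g in range(len(responses)):
        if all(responses[g:]):            # correct from gate g onward
            return sum(durations[:g + 1])
    return None                           # never reliably identified

print(identification_point(responses, gate_durations_ms))  # -> 540
```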
Source: http://dx.doi.org/10.3389/fpsyg.2013.00367
PMC: http://www.ncbi.nlm.nih.gov/pmc/articles/PMC3690349
June 2013

Implicit emotional processing in peripheral vision: behavioral and neural evidence.

Neuropsychologia 2012 Oct 28;50(12):2887-2896. Epub 2012 Aug 28.

Université de Lille Nord de France, F-59000, France; Neurosciences Fonctionnelles et Pathologies, Université de Lille 2, F-59000, France; Neurosciences, Université de Lille 1, F-59000, France.

Emotional facial expressions (EFE) are efficiently processed when both attention and gaze are focused on them. However, what kind of processing persists when EFE are the target of neither attention nor gaze remains largely unknown. Consequently, in this experiment we investigated whether the implicit processing of faces displayed in the far periphery could still be modulated by their emotional expression. Happy, fearful, and neutral faces appeared randomly for 300 ms at four peripheral locations on a panoramic screen (15° and 30° in the right and left visual fields). Reaction times and electrophysiological responses were recorded from 32 participants who had to categorize these faces according to their gender. A decrease in behavioral performance was found specifically for happy and fearful faces, probably because the emotional content was automatically processed and interfered with information necessary for the task. A spatio-temporal principal component analysis of the electrophysiological data confirmed an enhancement of early activity in occipito-temporal areas for emotional faces in comparison with neutral ones. Overall, these data show implicit processing of EFE despite the strong decrease of visual performance with eccentricity. The present research therefore suggests that EFE can be automatically detected in peripheral vision, confirming humans' ability to process emotional saliency under very impoverished viewing conditions.
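As a simplified illustration of the decomposition step, the sketch below applies a temporal PCA to a synthetic ERP matrix with scikit-learn; published spatio-temporal PCA pipelines chain temporal and spatial decompositions, usually with factor rotation, which is omitted here.

```python
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(7)

# Synthetic ERP matrix: rows = subject x electrode x condition averages,
# columns = time points.
n_obs, n_times = 320, 400
erp_matrix = rng.normal(size=(n_obs, n_times))

pca = PCA(n_components=10)
scores = pca.fit_transform(erp_matrix)    # factor scores per observation
loadings = pca.components_                # temporal waveforms (factors x time)

print(pca.explained_variance_ratio_[:3])  # variance carried by first factors
```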
Source: http://dx.doi.org/10.1016/j.neuropsychologia.2012.08.015
October 2012

Electrophysiological correlates of enhanced perceptual processes and attentional capture by emotional faces in social anxiety.

Brain Res 2012 Jun 26;1460:50-62. Epub 2012 Apr 26.

Institut de Recherche en Psychologie, Université Catholique de Louvain, Louvain-La-Neuve, Belgium.

Behavioural studies have used spatial cueing designs extensively to investigate emotional biases in individuals exhibiting clinical and sub-clinical anxiety. However, the neural processes underlying the generation of these biases remain largely unknown. In this study, people who scored unusually high or low on scales of social anxiety performed a spatial cueing task. They were asked to discriminate the orientation of arrows appearing at the location previously occupied by a lateralised cue (a face displaying an emotional or a neutral expression) or at the empty location. The results showed that the perceptual encoding of faces, indexed by the P1, and the mobilisation of attentional resources, reflected in the P2 at occipital sites, were modulated by social anxiety. These modulations were directly linked to social anxiety level but not to trait anxiety. By contrast, later cognitive stages and behavioural performance were not modulated by social anxiety, supporting the theorized dissociation between efficiency and effectiveness in anxiety.
Source: http://dx.doi.org/10.1016/j.brainres.2012.04.034
June 2012

Seeing emotion with your ears: emotional prosody implicitly guides visual attention to faces.

PLoS One 2012 30;7(1):e30740. Epub 2012 Jan 30.

McGill University, Faculty of Medicine, School of Communication Sciences and Disorders, Montreal, Quebec, Canada.

Interpersonal communication involves the processing of multimodal emotional cues, particularly facial expressions (visual modality) and emotional speech prosody (auditory modality) which can interact during information processing. Here, we investigated whether the implicit processing of emotional prosody systematically influences gaze behavior to facial expressions of emotion. We analyzed the eye movements of 31 participants as they scanned a visual array of four emotional faces portraying fear, anger, happiness, and neutrality, while listening to an emotionally-inflected pseudo-utterance (Someone migged the pazing) uttered in a congruent or incongruent tone. Participants heard the emotional utterance during the first 1250 milliseconds of a five-second visual array and then performed an immediate recall decision about the face they had just seen. The frequency and duration of first saccades and of total looks in three temporal windows ([0-1250 ms], [1250-2500 ms], [2500-5000 ms]) were analyzed according to the emotional content of faces and voices. Results showed that participants looked longer and more frequently at faces that matched the prosody in all three time windows (emotion congruency effect), although this effect was often emotion-specific (with greatest effects for fear). Effects of prosody on visual attention to faces persisted over time and could be detected long after the auditory information was no longer present. These data imply that emotional prosody is processed automatically during communication and that these cues play a critical role in how humans respond to related visual cues in the environment, such as facial expressions.
Source: http://journals.plos.org/plosone/article?id=10.1371/journal.pone.0030740
PMC: http://www.ncbi.nlm.nih.gov/pmc/articles/PMC3268762
June 2012

Fearful faces impact in peripheral vision: behavioral and neural evidence.

Neuropsychologia 2011 Jun 29;49(7):2013-21. Epub 2011 Mar 29.

Université de Lille Nord de France, F-59000, France.

Many studies have provided evidence that the emotional content of visual stimuli modulates behavioral performance and neuronal activity. Surprisingly, these studies were carried out with stimuli presented in the center of the visual field, whereas the majority of visual events first appear in the peripheral visual field. In this study, we assessed the impact of fearful facial expressions presented in the near and far periphery. Sixteen participants were asked to categorize fearful and neutral faces projected at four peripheral visual locations (15° and 30° of eccentricity on the right and left sides of the visual field) while reaction times and event-related potentials (ERPs) were recorded. ERPs were analyzed by means of spatio-temporal principal component and baseline-to-peak methods. Behavioral data confirmed the decrease of performance with eccentricity and showed that fearful faces induced shorter reaction times than neutral ones. Electrophysiological data revealed that the spatial position and the emotional content of faces modulated ERP components. In particular, the amplitude of the N170 was enhanced by fearful facial expressions. These findings shed light on how visual eccentricity modulates the processing of emotional faces and suggest that, despite impoverished visual conditions, preferential neural coding of fearful facial expressions persists in far peripheral vision. The emotional content of faces could therefore contribute to their foveal or attentional capture, as in social interactions.
Source: http://dx.doi.org/10.1016/j.neuropsychologia.2011.03.031
June 2011

Early brain-body impact of emotional arousal.

Front Hum Neurosci 2010 19;4:33. Epub 2010 Apr 19.

Université de Lille Nord de France, Lille, France.

Current research in affective neuroscience suggests that the emotional content of visual stimuli activates brain-body responses that could be critical to general health and physical disease. The aim of this study was to develop an integrated neurophysiological approach linking central and peripheral markers of nervous activity during the presentation of natural scenes, in order to determine the temporal stages of brain processing related to the bodily impact of emotions. More specifically, whole-head magnetoencephalogram (MEG) data and the skin conductance response (SCR), a reliable autonomic marker of central activation, were recorded in healthy volunteers during the presentation of emotional (unpleasant and pleasant) and neutral pictures selected from the International Affective Picture System (IAPS). Analyses of event-related magnetic fields (ERFs) revealed greater activity at 180 ms in an occipitotemporal component for emotional pictures than for their neutral counterparts. More importantly, these early effects of emotional arousal on cerebral activity were significantly correlated with later increases in SCR magnitude. For the first time, a neuromagnetic cortical component linked to a well-documented bodily marker of emotional arousal, namely the SCR, was identified and localized. This finding sheds light on the time course of the brain-body interaction in emotional arousal and provides new insights into the neural bases of complex and reciprocal mind-body links.
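A minimal sketch of the brain-body correlation described above: relating an ERF amplitude measure around 180 ms to SCR magnitude with a Pearson correlation. The numbers are synthetic, with the coupling built in for illustration.

```python
import numpy as np
from scipy.stats import pearsonr

rng = np.random.default_rng(3)

# Synthetic per-subject measures: occipitotemporal ERF amplitude at ~180 ms
# (fT, illustrative) and SCR magnitude (microsiemens).
erf_180 = rng.normal(50.0, 10.0, size=24)
scr_mag = 0.05 * erf_180 + rng.normal(0.0, 0.4, size=24)  # built-in coupling

r, p = pearsonr(erf_180, scr_mag)
print(f"r = {r:.2f}, p = {p:.3g}")
```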
Source: http://dx.doi.org/10.3389/fnhum.2010.00033
PMC: http://www.ncbi.nlm.nih.gov/pmc/articles/PMC2859881
July 2011

Peripherally presented emotional scenes: a spatiotemporal analysis of early ERP responses.

Brain Topogr 2008 Jun 12;20(4):216-23. Epub 2008 Mar 12.

Neurosciences Fonctionnelles & Pathologies, CNRS, UMR 8160, Service EFV, CHRU-Université de Lille II, 59037 Lille, France.

Recent findings from event-related potential (ERP) studies have provided strong evidence that centrally presented emotional pictures can be used to assess affective processing. Moreover, several studies showed that emotionally charged stimuli may automatically attract attention even when they are not consciously identified. Such viewing conditions are comparable to those typical of peripheral vision, which is known for its low spatial resolution. The aim of the present study was to characterize, at the behavioral and neural levels, the impact of emotional visual scenes presented in peripheral vision. Eighteen participants were asked to categorize neutral and unpleasant pictures presented at central (0°) and peripheral eccentricities (-30° and +30°) while ERPs were recorded from 63 electrodes. ERPs were analysed by means of spatio-temporal principal component analyses (PCA) in order to evaluate influences of the emotional content on ERP components for each spatial position (central vs. peripheral). The main results highlight that affective modulation of early ERP components exists for both centrally and peripherally presented pictures. These findings suggest that, at far peripheral eccentricities as in central vision, the brain engages specific resources to process emotional information.
Source: http://dx.doi.org/10.1007/s10548-008-0050-9
June 2008

Arousal and valence effects on event-related P3a and P3b during emotional categorization.

Int J Psychophysiol 2006 Jun 13;60(3):315-22. Epub 2005 Oct 13.

Neurosciences Cognitives, Bât SN4.1, Université de Lille 1, Villeneuve d'Ascq, France.

Due to the adaptive value of emotional situations, categorizing along the valence dimension may be supported by critical brain functions. The present study examined emotion-cognition relationships by focusing on the influence of an emotional categorization task on the cognitive processing induced by an oddball-like paradigm. Event-related potentials (ERPs) were recorded from subjects explicitly asked to categorize, along the valence dimension (unpleasant, neutral, or pleasant), deviant target pictures embedded in a train of standard stimuli. Late positivities evoked in response to the target pictures were decomposed into a P3a and a P3b, and topographical differences were observed according to the valence content of the stimuli. The P3a showed enhanced amplitudes at posterior sites in response to unpleasant pictures compared to both neutral and pleasant pictures. This effect is interpreted as a negativity bias related to attentional processing. The P3b component was sensitive to the arousal value of the stimulation, with higher amplitudes at several posterior sites for both types of emotional pictures. Moreover, unpleasant pictures evoked smaller amplitudes than pleasant ones at fronto-central sites. Thus, the context-updating process may be differentially modulated by the affective arousal and valence of the stimulus. The present study supports the assumption that, during emotional categorization, the emotional content of the stimulus may modulate the reorientation of attention and the subsequent updating process in a specific way.
Source: http://dx.doi.org/10.1016/j.ijpsycho.2005.06.006
June 2006