Pell M.D., McGill University | Monetta L., Laval University | Rothermich K., McGill University | Kotz S.A., University of Manchester | and 3 more authors
Neuropsychology | Year: 2014

Objective: Our study assessed how nondemented patients with Parkinson's disease (PD) interpret the affective and mental states of others from spoken language (adopt a "theory of mind") in ecologically valid social contexts. A secondary goal was to examine the relationship between emotion processing, mentalizing, and executive functions in PD during interpersonal communication.

Method: Fifteen adults with PD and 16 healthy adults completed The Awareness of Social Inference Test, a standardized tool composed of videotaped vignettes of everyday social interactions (McDonald, Flanagan, Rollins, & Kinch, 2003). Individual subtests assessed participants' ability to recognize basic emotions, to infer speaker intentions (sincerity, lies, sarcasm) from verbal and nonverbal cues, and to judge speaker knowledge, beliefs, and feelings. A comprehensive neuropsychological evaluation was also conducted.

Results: Patients with mild-moderate PD were impaired in the ability to infer "enriched" social intentions, such as sarcasm or lies, from nonliteral remarks; in contrast, adults with and without PD showed a similar capacity to recognize emotions and social intentions meant to be literal. In the PD group, difficulties using theory of mind to draw complex social inferences were significantly correlated with limitations in working memory and executive functioning.

Conclusions: In early PD, functional compromise of the frontal-striatal-dorsal system yields impairments in social perception and in understanding nonliteral speaker intentions that draw upon cognitive theory of mind. Deficits in social perception in PD are exacerbated by a decline in executive resources, which could hamper the strategic deployment of attention to the multiple information sources necessary to infer social intentions. © 2014 American Psychological Association.

Rigoulot S., McGill University | Rigoulot S., McGill Center for Research on Brain | Pell M.D., McGill University | Pell M.D., McGill Center for Research on Brain
PLoS ONE | Year: 2012

Interpersonal communication involves the processing of multimodal emotional cues, particularly facial expressions (visual modality) and emotional speech prosody (auditory modality), which can interact during information processing. Here, we investigated whether the implicit processing of emotional prosody systematically influences gaze behavior toward facial expressions of emotion. We analyzed the eye movements of 31 participants as they scanned a visual array of four emotional faces portraying fear, anger, happiness, and neutrality, while listening to an emotionally-inflected pseudo-utterance ("Someone migged the pazing") uttered in a congruent or incongruent tone. Participants heard the emotional utterance during the first 1250 milliseconds of a five-second visual array and then performed an immediate recall decision about the face they had just seen. The frequency and duration of first saccades and of total looks in three temporal windows ([0-1250 ms], [1250-2500 ms], [2500-5000 ms]) were analyzed according to the emotional content of faces and voices. Results showed that participants looked longer and more frequently at faces that matched the prosody in all three time windows (emotion congruency effect), although this effect was often emotion-specific (with greatest effects for fear). Effects of prosody on visual attention to faces persisted over time and could be detected long after the auditory information was no longer present. These data imply that emotional prosody is processed automatically during communication and that these cues play a critical role in how humans respond to related visual cues in the environment, such as facial expressions. © 2012 Rigoulot, Pell.

Rigoulot S., McGill University | Rigoulot S., McGill Center for Research on Brain | Pell M.D., McGill University | Pell M.D., McGill Center for Research on Brain
Speech Communication | Year: 2014

Previous eye-tracking studies have found that listening to emotionally-inflected utterances guides visual behavior toward an emotionally congruent face (e.g., Rigoulot and Pell, 2012). Here, we investigated in more detail whether emotional speech prosody influences how participants scan and fixate specific features of an emotional face that is congruent or incongruent with the prosody. Twenty-one participants viewed individual faces expressing fear, sadness, disgust, or happiness while listening to an emotionally-inflected pseudo-utterance spoken in a congruent or incongruent prosody. Participants judged whether the emotional meaning of the face and voice were the same or different (match/mismatch). Results confirm that there were significant effects of prosody congruency on eye movements when participants scanned a face, although these varied by emotion type; a matching prosody promoted more frequent looks to the upper part of fearful and sad facial expressions, whereas visual attention to the upper and lower regions of happy (and to some extent disgusted) faces was more evenly distributed. These data suggest ways that vocal emotion cues guide how humans process facial expressions, potentially facilitating the recognition of salient visual cues and the formation of a holistic impression of intended meanings during interpersonal events. © 2014 Elsevier B.V. All rights reserved.

Rigoulot S., McGill University | Rigoulot S., McGill Center for Research on Brain | Fish K., McGill University | Fish K., McGill Center for Research on Brain | and 2 more authors
Brain Research | Year: 2014

During social interactions, listeners weigh the importance of linguistic and extra-linguistic speech cues (prosody) to infer the true intentions of the speaker relative to what is actually said. In this study, we investigated which brain processes allow listeners to detect whether a spoken compliment is meant to be sincere (a true compliment) or not (a "white lie"). Electroencephalograms of 29 participants were recorded while they listened to question-response pairs in which the response was expressed in either a sincere or an insincere tone (e.g., "So, what did you think of my presentation?" / "I found it really interesting."). Participants judged whether the response was sincere or not. Behavioral results showed that prosody could be effectively used to discern the intended sincerity of compliments. Analysis of the temporal and spatial characteristics of event-related potentials (P200, N400, P600) uncovered significant effects of prosody on P600 amplitudes, which were greater in response to sincere than to insincere compliments. Using low resolution brain electromagnetic tomography (LORETA), we determined that the anatomical sources of this activity were likely located in the (left) insula, consistent with previous reports of insular activity in the perception of lies and concealment. These data extend knowledge of the neurocognitive mechanisms that permit context-appropriate inferences about speaker feelings and intentions during interpersonal communication. © 2014 Elsevier B.V.
