Abad A., Spoken Language Systems Laboratory |
Pompili A., Spoken Language Systems Laboratory |
Pompili A., University of Lisbon |
Costa A., Spoken Language Systems Laboratory |
And 7 more authors.
Computer Speech and Language | Year: 2013
One of the most common effects of aphasia is difficulty recalling names or words. Word retrieval problems are typically treated through word naming therapeutic exercises; in fact, the frequency and intensity of speech therapy are key factors in the recovery of lost communication abilities. Speech and language technology can therefore make a relevant contribution to the development of automatic therapy methods. In this work, we present an on-line system designed to behave as a virtual therapist, incorporating automatic speech recognition technology that allows aphasia patients to perform word naming training exercises. We focus on the study of the automatic word naming detector module and on its utility for both global evaluation and treatment. For that purpose, we collected a database of word naming therapy sessions with aphasic native speakers of Portuguese. Despite the varied patient characteristics and speech quality conditions of the collected data, encouraging results were obtained thanks to a calibration method that uses each patient's word naming ability to adapt automatically to that patient's speech particularities. © 2012 Elsevier Ltd. All rights reserved.
Martins M.D., University of Vienna |
Martins M.D., Language Research Laboratory |
Laaha S., University of Vienna |
Freiberger E.M., University of Vienna |
And 3 more authors.
Cognition | Year: 2014
The ability to understand and generate hierarchical structures is a crucial component of human cognition, evident in language, music, mathematics and problem solving. Recursion is a particularly useful mechanism for generating complex hierarchies by means of self-embedding rules. In the visual domain, fractals are recursive structures in which simple transformation rules generate hierarchies of infinite depth. Research on how children acquire these rules can provide valuable insight into the cognitive requirements and learning constraints of recursion. Here, we used fractals to investigate the acquisition of recursion in the visual domain, and probed for correlations with grammar comprehension and general intelligence. We compared second graders (n = 26) and fourth graders (n = 26) in their ability to represent two types of rules for generating hierarchical structures: recursive rules, on the one hand, which generate new hierarchical levels; and iterative rules, on the other hand, which merely insert items within hierarchies without generating new levels. We found that the majority of fourth graders, but not second graders, were able to represent both recursive and iterative rules. This difference was partially accounted for by second graders' impairment in detecting hierarchical mistakes, and correlated with between-grade differences in grammar comprehension tasks. Empirically, recursion and iteration also differed in at least one crucial aspect: while the ability to learn recursive rules seemed to depend on the previous acquisition of simple iterative representations, the opposite was not true, i.e., children were able to acquire iterative rules before they acquired recursive representations. These results suggest that the acquisition of recursion in vision follows learning constraints similar to the acquisition of recursion in language, and that both domains share cognitive resources involved in hierarchical processing. © 2014 The Authors.
Martins M.D.J.D., University of Vienna |
Martins M.D.J.D., Humboldt University of Berlin |
Martins M.D.J.D., Max Planck Institute for Human Cognitive and Brain Sciences |
Martins M.D.J.D., Language Research Laboratory |
And 3 more authors.
Cognitive Psychology | Year: 2015
The ability to form and use recursive representations while processing hierarchical structures has been hypothesized to rely on language abilities. If so, linguistic resources should inevitably be activated while representing recursion in non-linguistic domains. In this study we use a dual-task paradigm to assess whether verbal resources are required to perform a visual recursion task. We tested participants across four conditions: (1) visual recursion only, (2) visual recursion with motor interference (sequential finger tapping), (3) visual recursion with verbal interference (low load), and (4) visual recursion with verbal interference (high load). Our results show that the ability to acquire and use visual recursive representations is not affected by the presence of verbal or motor interference tasks. Our finding that visual recursion can be represented without access to verbal resources suggests that recursion is available independently of language processing abilities. © 2015 Elsevier Inc.
Ravignani A., University of Vienna |
Martins M., University of Vienna |
Martins M., Language Research Laboratory |
Fitch W.T., University of Vienna
Behavioral and Brain Sciences | Year: 2014
Ackermann et al.'s arguments in the target article need sharpening and rethinking at both mechanistic and evolutionary levels. First, the authors' evolutionary arguments are inconsistent with recent evidence concerning nonhuman animal rhythmic abilities. Second, prosodic intonation conveys much more complex linguistic information than mere emotional expression. Finally, human adults' basal ganglia have a considerably wider role in speech modulation than Ackermann et al. surmise. Copyright © Cambridge University Press 2014.
Martins M.J., University of Vienna |
Martins M.J., Language Research Laboratory |
Fischmeister F.P., Medical University of Vienna |
Puig-Waldmuller E., University of Vienna |
And 5 more authors.
NeuroImage | Year: 2014
Hierarchical structures play a central role in many aspects of human cognition, prominently including both language and music. In this study we addressed hierarchy in the visual domain, using a novel paradigm based on fractal images. Fractals are self-similar patterns generated by repeating the same simple rule at multiple hierarchical levels. Our hypothesis was that the brain uses different resources for processing hierarchies depending on whether it applies a "fractal" or a "non-fractal" cognitive strategy. We analyzed the neural circuits activated by these complex hierarchical patterns in an event-related fMRI study of 40 healthy subjects. Brain activation was compared across three different tasks: a similarity task, and two hierarchical tasks in which subjects were asked to recognize the repetition of a rule operating transformations either within an existing hierarchical level, or generating new hierarchical levels. Similar hierarchical images were generated by both rules and target images were identical. We found that when processing visual hierarchies, engagement in both hierarchical tasks activated the visual dorsal stream (occipito-parietal cortex, intraparietal sulcus and dorsolateral prefrontal cortex). In addition, the level-generating task specifically activated circuits related to the integration of spatial and categorical information, and to the integration of items in contexts (posterior cingulate cortex, retrosplenial cortex, and medial, ventral and anterior regions of temporal cortex). These findings provide interesting new clues about the cognitive mechanisms involved in the generation of new hierarchical levels as required for fractals. © 2014 Elsevier Inc.
Fitch W.T., University of Vienna |
Martins M.D., University of Vienna |
Martins M.D., Language Research Laboratory
Annals of the New York Academy of Sciences | Year: 2014
Sixty years ago, Karl Lashley suggested that complex action sequences, from simple motor acts to language and music, are a fundamental but neglected aspect of neural function. Lashley demonstrated the inadequacy of then-standard models of associative chaining, positing a more flexible and generalized "syntax of action" necessary to encompass key aspects of language and music. He suggested that hierarchy in language and music builds upon a more basic sequential action system, and provided several concrete hypotheses about the nature of this system. Here, we review a diverse set of modern data concerning musical, linguistic, and other action processing, finding them largely consistent with an updated neuroanatomical version of Lashley's hypotheses. In particular, the lateral premotor cortex, including Broca's area, plays important roles in hierarchical processing in language, music, and at least some action sequences. Although the precise computational function of the lateral prefrontal regions in action syntax remains debated, Lashley's notion that this cortical region implements a working-memory buffer or stack scannable by posterior and subcortical brain regions is consistent with considerable experimental data. © 2014 The Authors. Annals of the New York Academy of Sciences published by Wiley Periodicals Inc. on behalf of The New York Academy of Sciences.
PubMed | University of Vienna and Language Research Laboratory
Type: Journal Article | Journal: Behavior research methods | Year: 2016
We describe a new method to explore recursive cognition in the visual domain. We define recursion as the ability to represent multiple hierarchical levels using the same rule, entailing the ability to generate new levels beyond those previously encountered. With this definition, recursion can be distinguished from general hierarchical embedding. To investigate this recursion/hierarchy distinction in the visual domain, we developed two novel methods: the Visual Recursion Task (VRT), in which an inferred rule is used to represent new hierarchical levels, and the Embedded Iteration Task (EIT), in which additional elements are added to an existing hierarchical level. We found that adult humans can represent recursion in the visuo-spatial domain, and that this ability is distinct from both general intelligence and the ability to represent iterative processes embedded within hierarchical structures. Compared with embedded iteration, visual recursion correlated positively with other recursive planning tasks (Tower of Hanoi), but not with specific visuo-spatial resources (spatial short-term memory and working memory). We conclude that humans are able to use recursive representations to process complex visuo-spatial hierarchies and that our visual recursion task taps into specific cognitive resources. This method opens exciting opportunities to explore the relationship between visual recursion and language.
Martins M.J., Language Research Laboratory |
Moura B.L., Language Research Laboratory |
Martins I.P., Language Research Laboratory |
Figueira M.L., Psychiatric Unit of Santa Maria Hospital |
Prkachin K.M., University of Northern British Columbia
Psychiatry Research | Year: 2011
Patients with schizophrenia tend to neglect their own pain and are known to have impairments in the processing of facial expressions. However, sensitivity to dynamic expressions of pain has not been studied in these patients. Our goal was to test this ability in schizophrenia and to probe the underlying cognitive processes. We hypothesized that patients would have a reduced sensitivity to expressions of pain and that this impairment would correlate with deficits in attention, working memory and basic emotion recognition, and with positive symptoms. We applied a battery of tests composed of the Comprehensive Affect Testing System (CATS), Sensitivity to Expressions of Pain (STEP), Toulouse-Piéron, Stroop and Digit Span tests to two groups: 27 patients with a diagnosis of schizophrenia and 27 healthy volunteers, matched on age, education and gender. Symptoms were assessed using the Brief Psychiatric Rating Scale. Sensitivity to expressions of pain was found to be impaired in schizophrenia, and a bias toward attributing lower pain intensities may be present at some discrimination levels. STEP performance was correlated with working memory but not with Affect Naming or attention. These findings may contribute to the improvement of cognitive remediation strategies. © 2011 Elsevier Ireland Ltd.
Rybarczyk Y., New University of Lisbon |
Fonseca J., Language Research Laboratory
Assistive Technology Research Series | Year: 2011
This paper describes a tangible user interface (TUI) built for written and spoken comprehension therapy in aphasic patients. The tool works with the Trackmate system and an application specially designed from clinical tools developed by speech and language therapists. The software implements a series of tasks that ask the patient to identify tagged objects from a set and place them on the sensing table to be recognized by the TUI system. At the end of each exercise, the percentage of correct identifications is saved into a database, which records patients' performance by time and task type. This information technology (IT) was chosen and adapted to aphasia rehabilitation to take advantage of the manipulation of physical objects, in order to ensure an effective transfer of training exercises into everyday life activities. © 2011 The authors and IOS Press. All rights reserved.
Rybarczyk Y., New University of Lisbon |
Fonseca J., Language Research Laboratory |
Martins R., New University of Lisbon
Assistive Technology Research Series | Year: 2013
The game described in this article was developed for the treatment of lusophone aphasic patients. Various information technologies were used to create a multimedia rehabilitation platform. The objective of this software is to provide a complementary tool to classical speech therapy, one that enhances the patient's recovery through the completion of exercises adapted to the different symptoms of the disease. The principal features of the game are: i) a realistic 3D virtual environment that enables interaction with modeled objects, and ii) a dynamic interface that allows the addition of new therapeutic tasks, yielding a customizable and easily upgradable platform. One of the main scientific contributions of this project is that it is the only product of this sort tailored to the Portuguese-speaking aphasic population. © 2013 The authors and IOS Press. All rights reserved.