Messina, Italy

Plebe A., v. Concezione | Mazzone M., Laboratory of Cognitive Science | de la Cruz V., v. Concezione
Cognitive Computation | Year: 2010

Humans come to recognize an infinite variety of natural and man-made objects in their lifetime and make use of sounds to identify and categorize them. How does this lifelong learning process begin? Many hypotheses have been proposed to explain the learning of first words, some emerging from particular characteristics observed in child development. One is the peculiar trend in the speed with which words are learned, which has been referred to in the literature as "fast mapping". We present a neural network model trained in stages that parallel developmental ones and that simulates cortical processes of self-organization during an early, crucial stage of first word learning. This is done by taking into account strictly visual and acoustic perceptions only. The results obtained show evidence, in the artificial maps used in the model, of the emergence of cortical functions similar to those found in their biological correlates in the brain. Evidence of non-catastrophic fast mapping, based on the quantity of objects and labels gradually learned by the model, is also found. We interpret these results as meaning that early stages of first word learning may be explained by strictly perceptual learning processes, coupled with cortical processes of self-organization and of fast mapping. Specialized word-learning mechanisms thus need not be invoked, at least not at an early word-learning stage. © 2010 Springer Science+Business Media, LLC.
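The cortical self-organization the abstract appeals to is of the Kohonen-map kind: units compete for each input, and the winner and its neighbours move toward it until map regions specialize. The following is a minimal sketch of that mechanism only; the grid size, learning-rate and neighbourhood schedules, and the toy data are illustrative assumptions, not the paper's parameters.

```python
import math
import random

def train_som(data, grid=5, dim=2, epochs=30, seed=0):
    """Train a tiny 1-D self-organizing (Kohonen-style) map on 2-D inputs."""
    rng = random.Random(seed)
    # Each map unit holds a weight vector, initialized randomly.
    weights = [[rng.random() for _ in range(dim)] for _ in range(grid)]
    for epoch in range(epochs):
        lr = 0.5 * (1 - epoch / epochs)                      # decaying learning rate
        radius = max(1.0, (grid / 2) * (1 - epoch / epochs))  # shrinking neighbourhood
        for x in data:
            # Best-matching unit: the unit whose weights are closest to the input.
            bmu = min(range(grid),
                      key=lambda i: sum((weights[i][d] - x[d]) ** 2 for d in range(dim)))
            for i in range(grid):
                # Gaussian neighbourhood: units near the winner are pulled toward
                # the input too, which is what orders the map topographically.
                h = math.exp(-((i - bmu) ** 2) / (2 * radius ** 2))
                for d in range(dim):
                    weights[i][d] += lr * h * (x[d] - weights[i][d])
    return weights

# Two well-separated input clusters; after training, different regions of the
# map respond to different clusters, i.e. the map has self-organized.
data = [(0.1, 0.1), (0.12, 0.08), (0.9, 0.9), (0.88, 0.92)]
w = train_som(data)
```

After training, the best-matching unit for an input from one cluster differs from that for the other cluster, which is the specialization the model's visual and auditory maps are described as developing.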

Plebe A., v. Concezione | Mazzone M., Laboratory of Cognitive Science | De La Cruz V.M., v. Concezione
Neural Network World | Year: 2011

One crucial step in the construction of the human representation of the world is found at the boundary between two basic stimuli: visual experience and the sounds of language. In the developmental stage when the ability to recognize objects consolidates, and that of segmenting streams of sounds into familiar chunks emerges, the mind gradually grasps the idea that utterances are related to the visible entities of the world. The model presented here is an attempt to reproduce this process, in its basic form, by simulating the visual and auditory pathways, along with a portion of the prefrontal cortex putatively responsible for more abstract representations of object classes. Simulations have been performed with the model, using a set of images of 100 real-world objects seen from many different viewpoints and waveforms of labels for various classes of objects. Categorization processes with and without language are then compared. © ICS AS CR 2011.
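The grasp that "utterances are related to the visible entities of the world" amounts to learning associations between co-active visual and auditory representations. A toy stand-in for that map-to-map linking is Hebbian co-activation counting: each (object view, heard label) pairing strengthens one connection, and naming retrieves the strongest link. All names and data below are illustrative assumptions, not the paper's stimuli.

```python
from collections import defaultdict

def hebbian_associate(pairs):
    """Accumulate association strengths from co-activated (visual, label) pairs."""
    strength = defaultdict(float)
    for visual_unit, label in pairs:
        strength[(visual_unit, label)] += 1.0  # Hebbian increment on co-activation
    return strength

def name_object(strength, visual_unit, labels):
    # Naming: pick the label most strongly associated with the visual unit.
    return max(labels, key=lambda l: strength[(visual_unit, l)])

# Each object is paired with its label over repeated exposures; a single
# stray mispairing does not override the dominant association.
experience = [("cup", "cup"), ("cup", "cup"), ("cup", "ball"),
              ("ball", "ball"), ("ball", "ball")]
s = hebbian_associate(experience)
```

Comparing categorization with and without such label links is then a matter of querying the visual representation alone versus the visual representation plus its learned associations.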
