A stentrode, the size of a matchstick and implanted inside a blood vessel near the brain, is seen in this January 20, 2016 handout photo released by the University of Melbourne. REUTERS/University of Melbourne/Handout via Reuters

SYDNEY (Reuters) - Thought-controlled prosthetic limbs, wheelchairs and computers may be available within a decade, say Australian scientists who plan to begin human trials next year of a high-tech implant that can pick up and transmit signals from the brain.

The device, called a stentrode, is the size of a matchstick and has already been tested in animals. Implanted inside a blood vessel near the brain, it uses a web of small electrodes to pick up neural signals and convert them into electrical commands that, the scientists hope, may one day allow paralyzed patients to control a bionic limb or wheelchair.

"The big breakthrough is that we now have a minimally invasive brain-computer interface device which is potentially practical for long-term use," said Terry O'Brien, head of medicine at the Department of Medicine and Neurology at the University of Melbourne.

The current method for accessing brain signals requires complex open-brain surgery and becomes less effective over several months, which means it is rarely used, he said. The stentrode is less invasive because it can be inserted through a vein in a patient's neck and guided to a blood vessel near the brain.

The animal trial tested the stentrode's ability to pick up neural signals, not the conversion of those signals into movement of bionic limbs, which is established technology.

Dr Ganesh Naik of the University of Technology Sydney, who is not involved in the project, said animal trials did not always translate into successful human trials. "If it functions as it should at the (human) trial, it will be a massive breakthrough," Naik said.
Other potential uses for the stentrode include monitoring the brain signals of people with epilepsy to detect oncoming seizures. If successful, the device could also allow a patient to communicate through a computer, said Professor Clive May of the Florey Institute of Neuroscience and Mental Health, who is working on the project.

"People would need to be trained in how to think the right thoughts to make it work, like learning to play music. You need to learn it, but once you do, it becomes natural," May said.

The device was developed by Melbourne University, the Royal Melbourne Hospital and the Florey Institute of Neuroscience and Mental Health. The project is funded by both the Australian government and the U.S. military, which sees potential benefits for paraplegic veterans.
Turner S.J., University of Melbourne | Hildebrand M.S., University of Melbourne | Block S., La Trobe University | Damiano J., University of Melbourne | and 9 more authors.
American Journal of Medical Genetics, Part A | Year: 2013
Relatively little is known about the neurobiological basis of speech disorders although genetic determinants are increasingly recognized. The first gene for primary speech disorder was FOXP2, identified in a large, informative family with verbal and oral dyspraxia. Subsequently, many de novo and familial cases with a severe speech disorder associated with FOXP2 mutations have been reported. These mutations include sequencing alterations, translocations, uniparental disomy, and genomic copy number variants. We studied eight probands with speech disorder and their families. Family members were phenotyped using a comprehensive assessment of speech, oral motor function, language, literacy skills, and cognition. Coding regions of FOXP2 were screened to identify novel variants. Segregation of the variant was determined in the probands' families. Variants were identified in two probands. One child with severe motor speech disorder had a small de novo intragenic FOXP2 deletion. His phenotype included features of childhood apraxia of speech and dysarthria, oral motor dyspraxia, receptive and expressive language disorder, and literacy difficulties. The other variant was found in a family in two of three family members with stuttering, and also in the mother with oral motor impairment. This variant was considered a benign polymorphism as it was predicted to be non-pathogenic with in silico tools and found in database controls. This is the first report of a small intragenic deletion of FOXP2 that is likely to be the cause of severe motor speech disorder associated with language and literacy problems. © 2013 Wiley Periodicals, Inc.
One of the most basic responsibilities parents have is to feed their babies. Compared with the smorgasbord of possibilities later in life, food options for early babyhood are thankfully quite limited: The entire menu is either breast milk or formula (or often, both). While the benefits of breast milk are clear, infant formula has come a long way since the days of concoctions brewed up in the kitchen with raw cow milk. (Side note: Please don't do that.)

In the United States, the Food and Drug Administration keeps an eagle eye on formula recipes and their safe preparation. And one important ingredient in the recipe is iron. Overall, iron has been a clutch addition, drastically dropping the rates of anemia caused by iron deficiency.

But that success story may not be so straightforward. Some nutrition experts say that formula makers are adding more iron than necessary, and that this extra iron may not be harmless. A provocative opinion article published in October suggests that excess iron during infancy might actually be dangerous. In the article, scientists raise the possibility that too much iron early in life can kick off a chain of events that leaves the brain vulnerable decades later to neurodegenerative diseases such as Parkinson's.

The idea, proposed in the September Nature Reviews Neurology, is speculative: no human data exist to make that claim. But just as low iron levels are dangerous, it's not hard to imagine that high levels are too, says article coauthor Dominic Hare, an analytical neurochemist at the University of Technology Sydney and the Florey Institute of Neuroscience and Mental Health in Melbourne. "My concern is that it's entirely possible that this may be a case of too much of a good thing," he says.

Before you swear off iron for you and your baby, please consider this: Iron is absolutely essential for growing bodies. The element isn't just crucial for keeping the body humming along; iron is also needed to build it.
Nowhere is this more obvious than in the developing brain. If babies don't get enough iron, their brain cells have trouble forming connections and insulating the ones they have. Severe iron deficiency during infancy can lead to permanent mental and physical impairments. "I've spent a lifetime of research on the brain and behavior effects of insufficient iron," says pediatrician Betsy Lozoff of the University of Michigan in Ann Arbor. "There are dozens and dozens of studies that show that's problematic."

Doctors have been aware of these consequences for decades. In the 1930s, anemia caused by iron deficiency was rampant among infants. Thirty years later, public health officials took action, and by the mid-1960s, several iron-supplemented formulas had appeared on the market. In 1969, the American Academy of Pediatrics issued guidelines stating that formula ought to be fortified with iron.

This intervention worked, and in many cases spectacularly well. In the '70s and '80s, rates of iron-deficiency anemia started dropping. "Iron fortification of infant foods is one of the big public health successes," Lozoff says.

Yet this victory may be carrying some extra baggage, Hare and his colleagues write. In young babies, the blood-brain barrier may not be fully sealed. Excess iron in the body could slip through this leaky barrier and reach the brain, they propose. Studies in animals have shown that lots of iron early in life leads to higher iron levels in the brain later.

And that may be concerning, Hare says: Some studies have found troublesome links between high iron levels in the brain and certain brain diseases. Iron piles up in nerve cells in the substantia nigra, the brain area that's decimated in Parkinson's disease. Alzheimer's plaques made of the sticky amyloid-beta protein are lined with iron. And iron accumulation in the brain has been linked to flare-ups of multiple sclerosis.
It's not clear whether iron problems actually contribute to these disorders or whether iron is just a marker of them, but the link is definitely worth exploring. So far, we don't know whether an iron overload early in life might influence disorders that strike in old age. But there is a small hint about iron's effects during childhood, and it comes from one of Lozoff's studies.

She and her colleagues followed 473 Chilean infants who received formula with either low levels of iron (about 2.3 milligrams of iron per liter) or regular levels (about 12.7 milligrams per liter). (Most formula in the United States has 10 to 12 milligrams of iron per liter.) Ten years later, children who received the higher-iron formula scored worse on tests of spatial memory and visual-motor integration than the children who received the low-iron formula, Lozoff and colleagues reported in JAMA Pediatrics in 2012.

The lower scores came mainly from a small number of children who had the highest levels of hemoglobin (a proxy for iron). So the idea is that for babies who already get plenty of iron, adding even more may be harmful. Because the results seem to be driven by just a handful of children (about 13 kids in each group), the study gives only a preliminary look at the issue. "That is a call for more research rather than the basis for any change in policy, because it's only a few kids," Lozoff says.

What's more, bodies have evolved precise ways of regulating their iron stores. "If extra iron is given, iron absorption will decrease," says pediatric gastroenterologist Robert Baker of the University at Buffalo. Given this tight bodily oversight, it's not clear how much iron actually gets into the brain.

Yet the fact remains that the standard levels of iron in U.S. formulas are probably higher than necessary. European countries use formula with about half the iron (between 4 and 7 milligrams per liter) and have similar rates of iron-deficiency anemia, Hare says.
A recent panel of iron nutrition experts agreed that "current levels of iron fortification of infant formulas in the U.S. are not optimal and do not reflect current evidence for iron requirements in this age group." Their new recommendations, published in October in a supplement to the Journal of Pediatrics, suggest dropping iron intake at birth and then gradually increasing it as the baby grows.

For the first three months of life, babies would get by with the iron accumulated during gestation; no fortification needed, the panel wrote. After three months of age, infants ought to start taking formula that has between 2 and 4 milligrams of iron per liter, the recommendations suggest. Then from six to 12 months of age, infants should get between 4 and 8 milligrams of iron per liter. Those recommendations are for healthy, full-term babies. Babies born prematurely or at low birth weights may have different needs, the committee wrote.

Clearly, iron deficiency can be disastrous, particularly for developing babies. And there's no doubt that decades of fortification have reduced anemia. But Hare and colleagues raise a provocative issue: Just because some iron is good doesn't mean more is better. Perhaps iron overload comes with its own risks that remain to be seen. The first wave of infants who got iron-fortified formula are still relatively young and won't hit their 60s until the 2030s.

So for now, Hare's idea about the dangers of iron overload remains just that: an idea. It's an idea that I find compelling, but the data aren't there yet. "We are working on this as fast as we can," Hare says. But following the bread crumbs from infant feeding to elderly brain changes is tough. And in the end, the trail may disappear.

It's hard to sit with uncertainty, even more so when that uncertainty relates to what to feed your baby. The best we can do is to keep asking questions, and to keep encouraging scientists, policy people and formula makers to do the same.
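For readers who like the panel's age-stratified recommendations laid out at a glance, they can be summarized as a simple lookup. The sketch below is purely illustrative (the function name and structure are mine, not the panel's); it encodes only the ranges reported above, which apply to healthy, full-term babies and are not medical advice.

```python
def recommended_formula_iron_mg_per_l(age_months: float) -> tuple[float, float]:
    """Illustrative lookup of the panel's suggested iron-fortification
    ranges (milligrams of iron per liter of formula) by infant age,
    as reported in the article. For healthy, full-term babies only."""
    if age_months < 3:
        # Rely on iron stores accumulated during gestation; no fortification.
        return (0.0, 0.0)
    elif age_months < 6:
        return (2.0, 4.0)
    elif age_months <= 12:
        return (4.0, 8.0)
    raise ValueError("recommendations cover only the first 12 months")

# Example: a 7-month-old falls in the 4-8 mg/L band.
print(recommended_formula_iron_mg_per_l(7))  # (4.0, 8.0)
```

For comparison, the article notes that most U.S. formula today contains 10 to 12 mg/L at every age, above even the top of the panel's 6-to-12-month band.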
News Article | September 18, 2015
In a 2011 study, scientists found signs of depression and post-traumatic stress disorder (PTSD) in chimpanzees that had been used in laboratory research, orphaned, trapped by snares, or been part of illegal trade. Stressful events can even leave marks on animals' genes. In 2014, researchers found that African grey parrots that were housed alone suffered more genetic damage than parrots that were housed in pairs...

"All you can do with animals is to observe them," says (University of Mississippi neurogenetics researcher Eric) Vallender. "Imagine if you could study mental disorders in humans only by observing them. It would be really hard to tell what's going on in their brain."

Faced with these obstacles, scientists have begun looking at animals' genes. "A lot of mental disorders can be quite different. But what we do know is that they have a very, very strong genetic component to them," says Jess Nithianantharajah of the Florey Institute of Neuroscience and Mental Health in Melbourne, Australia.

All mental disorders, from depression to schizophrenia, involve abnormal behaviours. Those behaviours are influenced by genes, just like other behaviours. So the idea is to identify genes that can cause abnormal behaviours in humans and other animals. By tracing the origins of these genes, we can trace the origins of mental disorders.
Faced with a shortage of the essential nutrient selenium, the brain and the testes duke it out. In selenium-depleted male mice, the testes hog the trace element, leaving the brain in the lurch, scientists report in the Nov. 18 Journal of Neuroscience.

The results are some of the first to show competition between two organs for trace nutrients, says analytical neurochemist Dominic Hare of the University of Technology Sydney and the Florey Institute of Neuroscience and Mental Health in Melbourne. In addition to uncovering this brain-testes scuffle, the study "highlights that selenium in the brain is something we can't continue to ignore," he says.

About two dozen proteins in the body contain selenium, a nonmetallic chemical element. Some of these proteins are antioxidants that keep harmful molecules called free radicals from causing trouble. Male mice without enough selenium have brain abnormalities that lead to movement problems and seizures, neuroscientist Matthew Pitts of the University of Hawaii at Manoa and colleagues found.

In some experiments, Pitts and his colleagues depleted selenium by interfering with genes. Male mice engineered to lack two genes that produce proteins required for the body to properly use selenium had trouble balancing on a rotating rod and moving in an open field. In their brains, a particular group of nerve cells called parvalbumin interneurons didn't mature normally.

But removing the selenium-hungry testes via castration before puberty improved these symptoms, leaving more selenium for the brain, the team found. Selenium levels in the brains of these castrated mice were higher than those in uncastrated mice (though not as high as in females). The results "really suggest that there is some competition going on" in the males, Pitts says.

Because selenium is known to be important for both fertility and the brain, the results make sense, says biochemist Lutz Schomburg of Charité-University Medical School Berlin.
"Taking out the brain or the testes will likely benefit the other organ," he says. "The former experiment is impossible to do but the latter has now nicely been conducted."

Schomburg cautions that the results aren't necessarily relevant for people, who aren't likely to be as selenium-deprived as the mice in the experiment. "Under normal conditions, the competition between testes and brain is not existent," he says. That's in part because most people's diets contain plenty of selenium. The nutrient is found in crops grown in soil with plentiful selenium, such as in the Great Plains in the United States. Brazil nuts are packed with selenium, as are tuna, halibut and sardines.

Yet some people in parts of China, New Zealand and Europe have low selenium intake, Pitts says. Differences in selenium levels in the body, due either to diet or to genetic traits, may play a role in psychiatric disorders such as schizophrenia, he speculates. While that idea is unconfirmed, a hint comes from an earlier study that found that people with schizophrenia had reduced activity of the gene that encodes a protein that helps deliver selenium to where it is needed. Early-onset schizophrenia is also more prevalent in males. "In this way, males could be more at risk, because they have an additional organ sucking up resources that could be going to the brain," Pitts says.